Overview
On Site
Accepts corp to corp applications
Contract - W2
Contract - 6 Month(s)
Skills
Google Cloud Platform
SQL
AWS
Java
API
Azure
NoSQL
GCP
Apache Flink
Apache Cassandra
Scala
Amazon Web Services
Solution architecture
Network
Job Details
Job Title: Solution Architect
Location: Multiple locations (WA, GA, TX)
Work mode: On-site (can be converted to a hybrid model based on the consultant's performance)
Rate: $65/Hour on Corp-to-Corp (C2C)
Interview Mode: Video
Number of rounds: 3
Project duration: 6+ Months to start with
Job Summary:
A telecom client is looking for an experienced solution architect with expertise in Apache Flink, Cassandra, and Kafka for data streaming architecture. The architect will help solve existing big data problems and design new custom streaming architecture; the client is currently having difficulty scaling its big data architecture to support new use cases. Network-level API and infrastructure knowledge in the telecommunications industry is preferred. The role involves solving complex technical problems using industry standards and guidelines, and performing R&D and proofs of concept.
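As a rough illustration of the Kafka-to-Flink streaming work described above, the sketch below shows a minimal Flink job that reads events from a Kafka topic. The broker address, topic name, consumer group, and trivial transformation are placeholder assumptions, not details of the client's environment; a real job would parse, key, window, and aggregate the stream before writing results to Cassandra.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class NetworkEventPipeline {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source: broker address, topic, and group id are placeholders.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka-broker:9092")
                .setTopics("network-events")
                .setGroupId("streaming-poc")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> events = env.fromSource(
                source, WatermarkStrategy.noWatermarks(), "kafka-network-events");

        // Placeholder transformation; a production job would parse, window,
        // and aggregate here, then sink the results to Cassandra.
        events.map(String::toUpperCase).print();

        env.execute("network-event-pipeline");
    }
}
```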
Responsibilities:
Architect Data Streaming Solutions:
Design and implement robust, scalable, and efficient data streaming architecture leveraging Apache Flink, Kafka, and Cassandra.
Build custom streaming solutions that address the client's needs for real-time data processing and analytics.
Solve Big Data Challenges:
Analyze and resolve existing big data architecture issues, including performance bottlenecks and scaling inefficiencies.
Collaborate with cross-functional teams to ensure the system's architecture supports existing and future business use cases.
Scalability & Performance Optimization:
Identify limitations in the current data architecture and design solutions that scale efficiently as data volumes grow.
Recommend and implement optimizations to the client's big data pipeline for enhanced throughput, latency, and reliability.
Data Pipeline and Integration:
Oversee the integration of data from various sources, ensuring consistent and reliable data flow into the streaming architecture.
Support integration with external systems and APIs for real-time data consumption and processing (see the sketch after this list).
Mentoring & Best Practices:
Provide architectural guidance and best practices for developers, including writing efficient and maintainable code.
Mentor the team in the areas of stream processing frameworks, distributed databases, and message-driven architectures.
Collaboration & Stakeholder Engagement:
Work closely with business stakeholders to understand their requirements and translate them into technical solutions.
Collaborate with data scientists, engineers, and product teams to align the architecture with business goals.
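As a minimal sketch of the API-integration responsibility referenced above, the example below fetches a payload from a hypothetical external HTTP endpoint and publishes it to a Kafka topic. The endpoint URL, topic name, and broker address are assumptions for illustration only; a production bridge would poll or subscribe continuously and handle errors and retries.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ApiToKafkaBridge {
    public static void main(String[] args) throws Exception {
        // Kafka producer configuration; the broker address is a placeholder.
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        HttpClient http = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.com/network/metrics")) // hypothetical endpoint
                .GET()
                .build();

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Fetch one payload from the external API and publish it to Kafka.
            HttpResponse<String> response =
                    http.send(request, HttpResponse.BodyHandlers.ofString());
            producer.send(new ProducerRecord<>("network-metrics", response.body()));
            producer.flush();
        }
    }
}
```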
Technical Skills:
Programming languages: Java, Scala, or Python
Cloud platforms (AWS, Google Cloud Platform, or Azure)
Containerization (Docker, Kubernetes)
Strong expertise in Apache Flink, Apache Kafka, Cassandra, and other data streaming technologies.
In-depth knowledge of distributed computing, real-time data streaming, and big data platforms.
Data pipeline orchestration tools: Apache NiFi, Airflow, or others.
Databases: SQL and NoSQL.
Experience:
5+ years of experience in designing and architecting solutions in large-scale data streaming environments.
Hands-on experience in optimizing data architectures for scalability, performance, and fault tolerance.
Proven track record in solving complex big data issues, including performance bottlenecks, data inconsistency, and scaling problems.
Problem Solving & Analytical Skills:
Strong ability to diagnose and troubleshoot issues in complex data architectures.
Experience in solving real-time processing challenges and designing custom solutions for streaming data pipelines.
Soft Skills:
Excellent communication skills, both written and verbal, with the ability to explain technical concepts to non-technical stakeholders.
Strong leadership and mentoring capabilities to help guide the development team.
A proactive mindset and the ability to think critically and propose innovative solutions.
Preferred Qualifications:
Certifications in cloud platforms (AWS Certified Solutions Architect, Google Professional Cloud Architect, etc.) are desired.
Experience with NoSQL databases like HBase or MongoDB.
Experience with event-driven architectures and microservices.