Overview
Hybrid: minimum 2 days onsite per week
$60 - $70
Accepts corp to corp applications
Contract - W2
Skills
BigData
Hadoop
Hive
Hbase
AWS
Java
Python
Scala
SQL
Job Details
Job Description:
About the Role: Our team is responsible for handling large volumes of data: integrating data from various systems, creating the required pipelines and schedules, and processing streaming data for real-time applications.
What you will do here:
- Design and build all batch and real-time components, data pipelines, and integrations for our MLOps platform on AWS.
- Collaborate with geographically distributed cross-functional teams.
- Participate in the on-call rotation with the rest of the team to handle production incidents.
What you'll need to succeed - Must-have skills:
- 2+ years of Big Data experience.
- Strong Python/Scala/Java programming skills.
- Strong understanding of and experience with Spark.
- Good exposure to Kafka.
- AWS experience is a must: EMR, ECS, Elasticsearch, S3, CloudWatch, Lambda, Step Functions.
- Good communication skills for working with various stakeholders on operational support.
- DevOps experience is a must: CI/CD, Jenkins builds, Docker, ECR repositories.
Nice to have skills:
- Experience with AWS SageMaker is a plus.
- Experience with Terraform is good to have.
- Experience with various Python packaging options such as Wheel, PEX, or Conda.
- Experience with various automated testing frameworks is a plus.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.