Data Engineer With Scala/Spark

Overview

Remote
Depends on Experience
Contract - W2

Skills

Data Engineering
ETL
MySQL
Postgres
SDLC
Scala
big data

Job Details

Job Title: Data Engineer With Scala/Spark

Location: Remote

Contract Type: C2C or W2

Job Description:

  • Need a senior engineer with 10+ years of experience.

  • Must be able to design, build, and deploy data transformation and ETL jobs in the Azure or AWS cloud (see the illustrative sketch after this list).

  • Work with large-scale datasets.

  • Use various external APIs to enrich data.

  • Set up database tables so that analytics users can consume the data collected by the Data Engineering team.

  • Work with big data technologies to improve data availability and data quality in the cloud (Azure or AWS).
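
As a rough illustration of the duties above, the sketch below shows a minimal Scala/Spark ETL job: read raw events from cloud storage, clean and aggregate them, and write a curated, partitioned dataset for downstream use. The storage paths, column names, and object name are hypothetical placeholders, not details taken from this posting.

```scala
// Minimal, illustrative Scala/Spark ETL sketch (paths and schema are placeholders).
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SampleEtlJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sample-etl-job")
      .getOrCreate()

    // Raw input, e.g. an ADLS/S3 landing zone (placeholder path).
    val raw = spark.read
      .option("header", "true")
      .csv("abfss://landing@example.dfs.core.windows.net/events/")

    // Basic cleansing and a daily event count per customer.
    val curated = raw
      .filter(col("event_type").isNotNull)
      .withColumn("event_date", to_date(col("event_ts")))
      .groupBy(col("customer_id"), col("event_date"))
      .agg(count(lit(1)).as("event_count"))

    // Write partitioned Parquet for downstream analytics consumers.
    curated.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("abfss://curated@example.dfs.core.windows.net/daily_events/")

    spark.stop()
  }
}
```

Partitioning the curated output by date is one way to keep downstream scans over very large datasets selective.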

Skills Required:

- Strong technical experience in Scala and Spark for designing, creating, and maintaining applications.

- Experience with cloud data platforms such as Databricks.

- Experience with the Azure cloud (or a similar cloud such as AWS).

- Strong SQL skills, with commensurate experience on a large database platform.

- Experience implementing ETL routines over datasets of 100M+ records in a performant and scalable manner.

- Experience with relational database design (MySQL, Postgres, etc.); see the publishing sketch after this list.

- Experience with the complete SDLC process and Agile methodology.

- Strong oral and written communication skills.
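
Likewise, the following sketch illustrates one way a curated Spark DataFrame might be published to a relational table (Postgres in this example) for analytics users to query. The JDBC URL, credential handling, and table name are assumptions for illustration only, not requirements from this posting.

```scala
// Illustrative sketch: expose a curated DataFrame to analytics users via a
// Postgres table. Connection details and the table name are hypothetical.
import java.util.Properties
import org.apache.spark.sql.{DataFrame, SaveMode}

object PublishToPostgres {
  def publish(curated: DataFrame): Unit = {
    val props = new Properties()
    props.setProperty("user", sys.env.getOrElse("DB_USER", "analytics"))
    props.setProperty("password", sys.env.getOrElse("DB_PASSWORD", ""))
    props.setProperty("driver", "org.postgresql.Driver")

    // Append the latest batch into a reporting table that BI users query directly.
    curated.write
      .mode(SaveMode.Append)
      .jdbc("jdbc:postgresql://db.example.com:5432/analytics", "daily_events", props)
  }
}
```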
