Overview
Remote
On Site
$100k - $135k
Full Time
Skills
Big data
Analytics
Computer science
Data engineering
Python
Scala
Cloud computing
Amazon Web Services
Microsoft Azure
Real-time
Data processing
Machine Learning (ML)
Software deployment
Data
Databricks
Apache Spark
Optimization
Software development
Extract, transform, load (ETL)
Design
Documentation
Reporting
Job Details
We are currently seeking a Senior Data Engineer with Databricks expertise for a cutting-edge start-up.
The company is focused on leveraging big data and analytics to provide actionable insights to its clients. As a Senior Data Engineer, you will be instrumental in designing, building, and maintaining scalable data pipelines using Databricks and other cloud-based platforms.
The ideal candidate has extensive experience in data engineering, working with large-scale data infrastructures, and optimizing ETL processes.
Required Skills & Experience:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- At least 5 years of experience in data engineering, with a focus on building data pipelines.
- Proven experience with Databricks and Apache Spark.
- Strong proficiency in Python or Scala.
- Experience with cloud platforms such as AWS or Azure.
- Expertise in building and optimizing ETL processes.
Desired Skills & Experience:
- Experience with real-time data processing.
- Knowledge of ML model deployment within data pipelines.
- Familiarity with Delta Lake and data lake architectures.
What You Will Be Doing:
Tech Breakdown:
- 60% Databricks & Spark Development
- 40% ETL Optimization
Daily Responsibilities:
- 70% Hands-On Coding
- 20% Data Pipeline Design & Development
- 10% Documentation & Reporting