Overview
Remote
$50 - $65
Contract - W2
Contract - 08 Month(s)
Skills
AWS
Snowflake
Spark
ETL
SQL
Data processing
Pipelines
Databricks
Migration
Redshift to Databricks
Data integrity
Job Details
Title: Software Engineer, Cloud Platform
Location: 100% Remote
Duration: 8+ Months
Overview:
Opportunity to design and optimize large-scale data processing pipelines while working on a high-impact migration project critical to the company's data infrastructure.
Responsibilities:
- Lead the migration from Redshift to Databricks, ensuring data integrity and minimal downtime.
- Work with key partners like data platform, security, and network engineering to define and implement ETL pipelines and Databricks workflows for engineering teams and business intelligence.
- Understand, document, and communicate best practices for product engineering teams working in public cloud environments.
- Provide technical guidance and mentorship to junior engineers via code review and design docs.
- Contribute to improving the Databricks infrastructure stack by optimizing workflows and implementing best practices.
- Ensure BI dashboards, reporting pipelines, and financial data processing are successfully transitioned to Databricks.
- Establish ownership structures for ongoing data processing, quality assurance, and monitoring post-migration.
- Contribute code improvements to the DBX (Databricks) infrastructure stack.
Requirements
- AWS expertise in both development and production, with a strong understanding of cloud automation and best practices.
- 5+ years of SQL experience (Spark and Snowflake-specific SQL is a plus).
- Experience designing, building, and maintaining data processing systems at scale.
- Strong Python development skills with experience in data engineering and scripting.
- Proficiency in Terraform, at least at an operational level; experience deploying and managing Terraform as a service is a plus.
- Understanding of Service-Oriented Architecture (SOA) and best practices for distributed systems.
- 5+ years of experience with schema design, dimensional data modeling, and scalable data architecture.
- Proven track record of managing, communicating, and collaborating with internal teams on data platform plans and capabilities.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.