Senior Data Engineer

Overview

Remote
$120,000 - $160,000
Full Time

Skills

AWS
Snowflake
ETL
Go
Python
TypeScript

Job Details

Position Overview:
The Senior Data Engineer will be responsible for designing,
implementing, and maintaining robust data pipelines and integrations. You will ensure
the reliability, scalability, and cost-efficiency of our data systems while enabling internal
teams to access and utilize data effectively. Your expertise in AWS Glue, S3, and
standard ETL processes will be critical in managing and optimizing our data
infrastructure. Your experience with data systems leveraging Apache Kafka for
streaming will be vital.
Key Responsibilities:
Design, develop, and manage data pipelines and ETL processes that load data into Snowflake Data Cloud using Snowpipe.
Ensure data reliability, integrity, and security across all data systems.
Collaborate with engineering and data science teams to support their data needs and enable rapid innovation.
Optimize data warehouse performance, storage, and costs in Snowflake.
Implement and maintain data integration solutions using AWS Glue and S3.
Monitor and troubleshoot data pipelines to ensure high availability and performance.
Develop and enforce best practices for data management, including data governance and compliance.
Analyze and manage costs associated with data storage and processing, ensuring efficient use of resources.
Stay current on industry trends and emerging technologies to continually improve our data infrastructure.
Qualifications:
Bachelor's degree in Computer Science, Data Science, Business Information Systems, or a related field; or equivalent work experience
Proven experience as a Data Engineer or in a similar role, with a focus on data warehouse management and ETL processes
Extensive experience with Snowflake Data Cloud and Snowpipe, including architecture, deployment, and management at an enterprise scale
Extensive experience with dbt or similar data transformation tooling
Proficient coding in Go and/or Python
Strong knowledge of AWS services, particularly Kinesis, Glue, and S3
Experience with Apache Kafka or Pulsar
Proficiency in SQL and experience with data modeling and database design
Solid understanding of big data technologies and frameworks
Experience with data integration tools and techniques
Strong problem-solving skills and the ability to work under pressure in a fast-paced environment
Excellent communication and collaboration skills, with a proactive and self-driven approach
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.