Data Engineer

Overview

Remote
Hybrid
$70 - $90/hr
Contract - W2
Contract - 12 Month(s)

Skills

GitHub
UDF
Python
Programming Languages
Kubernetes
Extract
Transform
Load
Snowflake Schema
Testing
ELT
Docker
Data Warehouse
Data Modeling
Scripting
SQL
Continuous Delivery
Continuous Integration
Access Control

Job Details

Data Engineer
Duration: 6-12 months

Rate is $70-90/hr.

W2 only (no third-party, 1099, or C2C arrangements will be considered)

No H-1B transfers or sponsorship. No C2C.

Location:

Mountain View, CA
Position is preferably hybrid, but could be remote. PDT/PST time zone preferred if remote.

Description:
As a Data Engineer, you will collaborate with the team to build data pipelines and to orchestrate and manage the Snowflake data warehouse. You will play a key role in ensuring that existing pipelines meet SLAs and in integrating CI/CD practices into our operations.
Responsibilities:
Set up and administer the Snowflake environment: role-based access control (roles and users), warehouses, and resource monitors.
Develop GitHub CI/CD pipelines and deployments to dev and production environments; set up CI/CD pipelines with Kubernetes and Docker.
Write and maintain the scripts necessary for data ingestion, orchestrated in Airflow.
Develop testing frameworks to validate data accuracy and data consistency at each layer.
Required skills:
Proficiency in programming languages such as Python.
Strong analytical and problem-solving skills.
Familiarity with data modeling and ELT/ETL processes.
Proficient in SQL.
Familiarity with setting up UDFs and stored procedures.
Familiarity with data ingestion tools (e.g., Fivetran, HVR, Airbyte).

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.