Senior Data Engineer - LOS ANGELES, CA (Hybrid) - 12-Month Contract

Overview

Hybrid
Contract - W2

Skills

Data Lake
Storage
Big data
Apache Spark
PySpark
Extract, transform, load (ETL)
ELT
Design
Programming languages
SQL
Scala
R
ERD
Performance tuning
Database
Microsoft Power BI
Tableau
Data governance
RBAC
Data security
Regulatory Compliance
Auditing
Enterprise architecture
Leadership
Cloud computing
Data engineering
Databricks
API management
Management
Microsoft Azure
Data modeling
Python
Data
DevOps
Continuous integration
Continuous delivery
Terraform
Data warehouse
OLAP
OLTP
Business intelligence

Job Details

Title: Senior Data Engineer
Location: LOS ANGELES, CA (Hybrid)
Duration: 12+ months Contract

Skills Preferred :
  • Cloud Platforms: Deep understanding of the Azure ecosystem, including Azure Data Factory, Data Lake Storage, Blob Storage, Power Apps, and Azure Functions. Additionally, in-depth understanding and implementation of API management tools such as Apigee.
  • Big Data Technologies: Proficiency in Databricks, Spark, PySpark, Scala, and SQL.
  • Data Engineering Fundamentals: Expertise in ETL/ELT processes, data pipelines, data modeling, schema design, and data warehousing.
  • Programming Languages: Strong Python and SQL skills; knowledge of other languages such as Scala or R is beneficial.
  • Data Warehousing and Business Intelligence: Strong ERD concepts, designs, and patterns; understanding of OLAP/OLTP systems, performance tuning, database server concepts, and BI tools (Power BI, Tableau).
  • Data Governance: Strong understanding of RBAC/ABAC, data lineage, data leak prevention, data security, and compliance; deep understanding and implementation knowledge of audit and monitoring in the cloud.

Experience Preferred :
  • A minimum of seven (7) years of applying Enterprise Architecture principles, with at least five (5) years in a lead capacity.
  • Additionally, the candidate must have a proven track record in the following areas:
  • Cloud and Data Engineering: Minimum of five (5) years of hands-on experience with Azure Data Factory, Databricks, API management, and managing Azure resources.
  • Demonstrated ability to automate Azure data resources using DataOps principles and tools.
  • Data Modeling and Development: Minimum of five (5) years of experience developing data models and pipelines using Python.
  • Data Platform: Minimum of five (5) years of experience working with Lakehouse platforms.
  • DevOps and Infrastructure as Code: Minimum of three (3) years of experience in CI/CD pipelines and infrastructure automation using tools such as Terraform.
  • Terraform experience preferred.
  • Data Warehouse: Minimum of five (5) years of experience with data warehousing systems, OLAP/OLTP systems, and integration of BI tools.
Education Preferred :
This classification requires the possession of a bachelor's degree in an IT-related or Engineering field.