Full-Time Role :: Sr Databricks Engineer - Cleveland, OH or Alpharetta, GA (Hybrid)


Overview

On Site
$DOE
Full Time

Skills

SAFe
Extract
Transform
Load
Workflow
Metadata Management
Data Storage
Data Security
Analytical Skill
Performance Tuning
Cost Management
Data Quality
Data Integrity
Management
Data Engineering
Data Architecture
Databricks
PySpark
Scala
SQL
Data Processing
RBAC
Access Control
Unity Catalog
Data Governance
Regulatory Compliance
HIPAA
Auditing
Cloud Computing
Microsoft Azure
Amazon Web Services
Google Cloud
Google Cloud Platform
Continuous Integration
Continuous Delivery
DevOps
Business Intelligence
Tableau
Microsoft Power BI
Machine Learning (ML)
Problem Solving
Conflict Resolution
Communication
Collaboration
Sage

Job Details

Hi,

Hope you are doing great today, and that you are safe and healthy!

This is Bindu from Sage IT Inc. We have an opportunity with one of our clients; please find the job description below and let me know if you have any suitable profiles.

Role: Sr Databricks Engineer - 10+ Years' Experience

Location: Cleveland, OH or Alpharetta, GA (Hybrid)

Top required skills: Databricks, Unity Catalog, and strong experience with Delta Live Tables (DLT)

We are looking for a skilled Sr. Data Engineer with expertise in Databricks and Unity Catalog to design, implement, and manage scalable data solutions.

Key Responsibilities:

  • Design and implement scalable data pipelines and ETL workflows using Databricks.
  • Implement Unity Catalog for data governance, access control, and metadata management across multiple workspaces.
  • Develop Delta Lake architectures for optimized data storage and retrieval.
  • Establish best practices for data security, compliance, and lineage tracking in Unity Catalog.
  • Optimize data lakehouse architecture for performance and cost efficiency.
  • Collaborate with data scientists, engineers, and business teams to support analytical workloads.
  • Monitor and troubleshoot Databricks clusters, performance tuning, and cost management.
  • Implement data quality frameworks and observability solutions to maintain high data integrity.
  • Work with Azure/AWS/Google Cloud Platform cloud environments to deploy and manage data solutions.

Required Skills & Qualifications:

  • 10+ years of experience in data engineering, data architecture, or cloud data solutions.
  • Strong hands-on experience with Databricks and Unity Catalog.
  • Expertise in PySpark, Scala, or SQL for data processing.
  • Deep understanding of Delta Lake, Lakehouse architecture, and data partitioning strategies.
  • Experience with RBAC, ABAC, and access control mechanisms within Unity Catalog.
  • Knowledge of data governance, compliance standards (GDPR, HIPAA, etc.), and audit logging.
  • Familiarity with cloud platforms (Azure, AWS, or Google Cloud Platform) and their respective data services.
  • Strong understanding of CI/CD pipelines, DevOps, and Infrastructure as Code (IaC).
  • Experience integrating BI tools (Tableau, Power BI, Looker) and ML frameworks is a plus.
  • Excellent problem-solving, communication, and collaboration skills.

Bindu

linkedin.com/in/ganji-bindu-2bb312158

Sage IT Inc.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Sage IT Inc