Overview
Hybrid, 3 days Onsite
Depends on Experience
Contract - W2
Contract - 12 Month(s)
Skills
Apache Airflow
Django
Data Processing
Management
Docker
Good Clinical Practice
Google Cloud Platform
PySpark
Python
Workflow Scheduling
Kubernetes
Google Cloud
Job Details
Job Description:
5+ years of professional experience in Python.
Design, develop, and maintain robust and scalable data pipelines using Python and PySpark.
Experience with Django and containers (Docker, Kubernetes).
Implement and manage data workflows using Apache Airflow for scheduling, monitoring, and alerting.
Develop and deploy data processing solutions on Google Cloud Platform (GCP).