Overview
Hybrid: 3 days a week onsite
Depends on Experience
Contract - W2
Skills
Databricks
AWS
Python
PySpark
Job Details
Title: Data Engineer
Location: Parsippany, NJ (Hybrid)
Needs:
- AWS
- Databricks
- Python
- PySpark
Nice to Haves:
- IVR
- Genesys
- Salesforce
Job Description
- What are the Day-to-Day Responsibilities?
- Utilizing Databricks as the primary data warehouse; hands-on building of data pipelines using the Databricks medallion architecture
- Working with the AWS cloud for cloud-based data engineering, including experience configuring S3 and IAM
- Working in a contact center environment, primarily with Genesys Cloud and Salesforce CRM
- Utilizing languages such as Python, SQL, and PySpark.
- How will this resource be measured?
- Ability to build data pipelines and support AWS migration
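As a conceptual sketch of the medallion (bronze → silver → gold) pattern the responsibilities above describe, here is a minimal Python illustration. It uses plain dicts and lists in place of Databricks Delta tables, and the table and column names (call records, agents) are illustrative assumptions, not taken from the job description:

```python
# Bronze layer: raw events ingested as-is (e.g., from an S3 landing zone),
# including malformed records.
bronze = [
    {"call_id": "1", "duration": "300", "agent": "A"},
    {"call_id": "2", "duration": None, "agent": "B"},   # bad record
    {"call_id": "3", "duration": "120", "agent": "A"},
]

# Silver layer: cleaned and typed -- drop invalid rows, cast fields.
silver = [
    {"call_id": r["call_id"], "duration": int(r["duration"]), "agent": r["agent"]}
    for r in bronze
    if r["duration"] is not None
]

# Gold layer: business-level aggregate -- total call time per agent.
gold = {}
for r in silver:
    gold[r["agent"]] = gold.get(r["agent"], 0) + r["duration"]

print(gold)  # {'A': 420}
```

In a real Databricks pipeline each layer would be a Delta table written with PySpark DataFrame transformations, but the layering idea is the same: raw in bronze, validated in silver, aggregated for consumption in gold.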
Technical Aptitude
- 3-5 Must Haves Skills & Technologies (Break down each skill or tech stack/ flexibility levels):
- Databricks: 8+ years of experience (must have)
- AWS Cloud: 8+ years of experience configuring S3, IAM, Comprehend, Lambda, DynamoDB, EventBridge, and related services such as Kinesis and Firehose
- Python: 8+ years
- On-prem experience with SQL Server: 8+ years
- Experience building data pipelines: 8+ years
- How important is the environment they have worked in previously (i.e., enterprise level, industry specific, domain specific)? Enterprise-level experience is not important to the manager, but the candidate definitely needs the years of experience and sufficient exposure to Databricks and AWS.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.