Data Engineer

Overview

On Site
$50 - $60
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Data Engineer
Python
Pyspark
GCP
Google Cloud Platform

Job Details

W2 Role

Job Title: Data Engineer

Type: Contract, 12+ months

Location: New York City, New York or Phoenix, Arizona (onsite at either location)

Note: There are two positions:

1. Core data engineering expertise: building data pipelines using PySpark, BigQuery, and Python

2. Experience working on real-time pipelines, with exposure to APIs

JD:

We are looking for a highly skilled engineer with solid experience building big data and Google Cloud Platform-based real-time data pipelines, as well as REST APIs with Java frameworks. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.
This role will be focused on the delivery of innovative solutions to satisfy the needs of our business. As an agile team we work closely with our business partners to understand what they require, and we strive to continuously improve as a team.

Technical Skills

1. Core Data Engineering Skills

  • Proficiency with Google Cloud Platform's big data tools, including:
  • BigQuery: for data warehousing and SQL analytics.
  • Dataproc: for running Spark and Hadoop clusters.
  • Dataflow: for stream and batch data processing (high-level understanding).
  • Pub/Sub: for real-time messaging and event ingestion (high-level understanding).
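As a rough sketch of the real-time ingestion pattern the Pub/Sub bullet refers to (a production consumer would use the google-cloud-pubsub streaming-pull subscriber; here a standard-library queue stands in for the topic so the example is self-contained, and all names are illustrative):

```python
import json
import queue

def consume(messages: "queue.Queue", handler, timeout: float = 0.1) -> int:
    """Drain JSON messages from a queue, acknowledging each one after the
    handler succeeds. With real Pub/Sub this loop is replaced by a
    subscriber callback; the ack maps to message.ack()."""
    processed = 0
    while True:
        try:
            raw = messages.get(timeout=timeout)
        except queue.Empty:
            break  # no more messages within the timeout window
        event = json.loads(raw)   # decode the event payload
        handler(event)            # business logic goes here
        messages.task_done()      # "ack" the message
        processed += 1
    return processed

if __name__ == "__main__":
    q = queue.Queue()
    for i in range(3):
        q.put(json.dumps({"event_id": i, "amount": 10 * i}))
    seen = []
    print(consume(q, seen.append))  # -> 3
```

The ack-after-handler ordering is the key design point: if the handler raises, the message is not acknowledged and a real broker would redeliver it.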

Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions.
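A minimal sketch of the batch side of such a pipeline (plain Python standing in for a PySpark groupBy-style aggregation, with hypothetical field and column names; a production job would run on Dataproc and load the result into a BigQuery table):

```python
from collections import defaultdict

def transform(events):
    """Aggregate raw events into per-user totals -- the same shape as a
    PySpark groupBy().agg(sum(...)) step, kept in plain Python so the
    sketch runs anywhere."""
    totals = defaultdict(float)
    for e in events:
        totals[e["user"]] += e["amount"]
    # Rows shaped for a warehouse load (e.g. a table with columns
    # `user` and `total`); the schema here is illustrative only.
    return [{"user": u, "total": t} for u, t in sorted(totals.items())]

if __name__ == "__main__":
    raw = [
        {"user": "a", "amount": 5.0},
        {"user": "b", "amount": 2.5},
        {"user": "a", "amount": 1.0},
    ]
    print(transform(raw))
    # -> [{'user': 'a', 'total': 6.0}, {'user': 'b', 'total': 2.5}]
```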

  2. Programming and Scripting
  • Strong coding skills in SQL and Java.
  • Familiarity with APIs and SDKs for Google Cloud Platform services to build custom data solutions.
  3. Cloud Infrastructure
  • Understanding of Google Cloud Platform services such as Cloud Storage, Compute Engine, and Cloud Functions.
  • Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional but good to have).
  4. DevOps and CI/CD
  • Experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools.
  • Monitoring and logging tools such as Cloud Monitoring and Cloud Logging for production workflows.
  5. Backend Development (Spring Boot & Java)
  • Design and develop RESTful APIs and microservices using Spring Boot.
  • Implement business logic, security, authentication (JWT/OAuth), and database operations.
  • Work with relational and NoSQL databases (MySQL, PostgreSQL, MongoDB, Cloud SQL).
  • Optimize backend performance, scalability, and maintainability.
  • Implement unit testing and integration testing.

Soft Skills

  1. Innovation and Problem-Solving
  • Ability to think creatively and design innovative solutions for complex data challenges.
  • Experience in prototyping and experimenting with cutting-edge Google Cloud Platform tools or third-party integrations.
  • Strong analytical mindset to transform raw data into actionable insights.
  2. Collaboration
  • Teamwork: Ability to collaborate effectively with data analysts and business stakeholders.
  • Communication: Strong verbal and written communication skills to explain technical concepts to non-technical audiences.
  3. Adaptability and Continuous Learning
  • Open to exploring new Google Cloud Platform features and rapidly adapting to changes in cloud technology.