Big Data Developer

Overview

On Site
Depends on Experience
Contract - W2
100% Travel

Skills

Attention to Detail
Big Data
BigQuery
Collaboration
Computer Science
Containerization
Data Integration
Data Modeling
Data Pipelines
Data Science
Data Warehousing
Database Design
Docker
ETL
Google Cloud
Hadoop
Java
Kafka
Kubernetes
Machine Learning
Performance Tuning
Problem-Solving
Python
Redshift
Scala
Scalability

Job Details

Key Responsibilities:

  • Data Pipeline Development: Design, develop, and maintain robust data pipelines using big data technologies such as Hadoop, Spark, or Flink.
  • Data Integration: Integrate data from multiple sources, ensuring data quality and consistency.
  • Data Modeling: Develop and optimize data models to support business requirements and analytical needs.
  • Performance Tuning: Optimize data processing workflows for performance, scalability, and cost-effectiveness.
  • Collaboration: Work with cross-functional teams including data engineers, data scientists, and business analysts to understand data requirements and deliver solutions.
  • Troubleshooting: Identify and resolve issues related to data processing, quality, and performance.
  • Documentation: Maintain clear and comprehensive documentation of data pipelines, processes, and systems.

Qualifications:

  • Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field; an advanced degree is a plus.
  • Experience: 7 years of experience in big data development or a related role.
  • Technical Skills:
    • Proficiency in big data technologies such as Hadoop, Spark, Kafka, and Flink.
    • Strong programming skills in languages such as Java, Scala, Python, or SQL.
    • Experience with data warehousing solutions like Amazon Redshift, Google BigQuery, or Snowflake.
    • Familiarity with ETL tools and data integration platforms.
    • Knowledge of data modeling techniques and database design.
  • Soft Skills:
    • Excellent problem-solving skills and attention to detail.
    • Strong communication and collaboration skills.
    • Ability to work independently and manage multiple tasks effectively.

Preferred Qualifications:

  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Knowledge of machine learning and data analytics concepts.
  • Familiarity with containerization and orchestration tools like Docker and Kubernetes.