Job Details
Note: This is a W2 requirement.
Title: AWS Data Engineer
Location: Torrance, CA (Work Type: Hybrid - less than 60%)
Duration: April 2025 – October 31, 2025, with the possibility of an extension
Travel: 5%
The IT Data Integration Engineer / AWS Data Engineer is tasked with the design, development, and management of data integration processes to ensure seamless data flow and accessibility across the organization. This role is pivotal in integrating data from diverse sources, transforming it to meet business requirements, and loading it into target systems such as data warehouses or data lakes. The aim is to support the organization's data-driven decision-making by providing high-quality, consistent, and accessible data.

Daily tasks performed will include the following:
- Develop and Maintain Data Integration Solutions:
- Design and implement data integration workflows using AWS Glue/EMR, Lambda, and Redshift.
- Demonstrate proficiency in PySpark, Apache Spark, and Python for processing large datasets.
- Ensure data is accurately and efficiently extracted, transformed, and loaded into target systems.
- Ensure Data Quality and Integrity:
- Validate and cleanse data to maintain high data quality.
- Implement monitoring, validation, and error-handling mechanisms within data pipelines.
- Optimize Data Integration Processes:
- Enhance the performance, scalability, and cost-efficiency of data integration workflows on AWS cloud infrastructure to meet SLAs.
- Identify and resolve performance bottlenecks by fine-tuning queries and optimizing data processing to improve Redshift performance.
- Regularly review and refine integration processes to improve efficiency.
- Support Business Intelligence and Analytics:
- Translate business requirements into technical specifications and coded data pipelines.
- Ensure timely availability of integrated data for business intelligence and analytics.
- Collaborate with data analysts and business stakeholders to meet their data requirements.
- Document all data integration processes, workflows, and technical & system specifications.
- Ensure compliance with data governance policies, industry standards, and regulatory requirements.
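The extract–validate–transform–load cycle with error handling described above can be sketched in miniature. In the role itself this would be PySpark running on Glue or EMR with S3 sources and a Redshift sink; the plain-Python sketch below only illustrates the pattern, and all field names and data are illustrative assumptions, not anything from the posting:

```python
import csv
import io

def run_pipeline(raw_csv: str):
    """Minimal ETL sketch: extract rows, validate/cleanse, transform, load.

    Invalid rows are routed to an error list rather than silently dropped,
    mirroring the monitoring and error-handling requirement above.
    """
    # Extract: read source records (an in-memory CSV standing in for S3).
    reader = csv.DictReader(io.StringIO(raw_csv))

    loaded, errors = [], []
    for row in reader:
        # Validate: the amount must parse as a number.
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            errors.append(row)   # quarantine bad rows for later inspection
            continue
        # Validate: a customer_id must be present.
        if not row.get("customer_id"):
            errors.append(row)
            continue
        # Transform: cleanse and normalize to the target schema.
        loaded.append({
            "customer_id": row["customer_id"].strip(),
            "amount_usd": round(amount, 2),
        })

    # Load: in production this would be a Redshift COPY or a Glue sink;
    # here we return the cleansed records plus the quarantined rows.
    return loaded, errors

raw = "customer_id,amount\n c1 ,10.5\n,7\nc3,oops\nc4,3"
good, bad = run_pipeline(raw)
```

Routing failed rows to a quarantine list instead of discarding them is what makes the data-quality metrics and alerting mentioned above possible: the ratio of `bad` to total rows is itself a pipeline health signal.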
Required Skills and Expertise:
- Bachelor's degree in computer science, information technology, or a related field.
- A master's degree is advantageous.
- 7-10+ years of experience in the following areas:
- Data engineering
- Database design
- ETL processes
- 5+ years of hands-on experience with Python and PySpark.
- 5+ years of hands-on experience with AWS tools and technologies (e.g., S3, EMR, Glue, Athena, Redshift, Postgres, RDS, and Lambda).
- 3+ years of experience working with databases, data marts, and data warehouses.
- Proven experience in ETL development, system integration, and CI/CD implementation.
- Experience building complex database objects to move changed data across multiple environments.
- Solid understanding of data security, privacy, and compliance.
- Excellent problem-solving and communication skills; collaborates effectively with cross-functional teams.
- Skilled in Agile development processes, including sprint planning, stand-ups, and retrospectives.
- Provide technical guidance and mentorship to junior developers.
- Strong attention to detail and a genuine commitment to data quality.
- Possess a continuous learning mindset to keep up with evolving technologies and best practices in data engineering.