Overview
On Site
depends on experience
Full Time
Skills
Data Processing
Collaboration
Real-time
Data Analysis
Technical Support
System Testing
Data Marts
Service Level Management
Data Migration
Data Integration
Testing
Release Management
Post-production
Application Development
ELT
Programming Languages
Scala
Java
Data Warehouse
Data Modeling
Privacy
Encryption
Access Control
Cloud Computing
Data Lake
Data Engineering
Big Data
Apache Hadoop
Apache Spark
Database
Apache Kafka
API
Agile
Data Flow
Conflict Resolution
Problem Solving
Snowflake Schema
Amazon Web Services
Amazon S3
Amazon DynamoDB
Amazon Redshift
Spectrum
Amazon SageMaker
Scripting
Python
SQL
PySpark
Apache Hive
Extract
Transform
Load
Informatica
SAP BODS
IBM InfoSphere DataStage
Datastage
Job Details
Overview
This is a remote role that may only be hired in the following locations: NC, NJ, FL, AZ
This position is responsible for development/support of Enterprise data processing systems through enhancement of related software and processes. Develops software and procedures that improve processing capabilities. Analyzes, codes, tests, and implements in coordination with management, associates, and end users. Serves as production system support by resolving issues and ensuring ongoing functionality. May oversee the work of less experienced analysts or assist in special projects as needed.
Responsibilities
- Data Integration - Provides data integration services for all batch and real-time data movement into the data warehouse, data platforms (data lake), and dependent data marts (cloud/on-prem)
- Data Analysis - Sources, compiles, and interprets data. Performs analysis for accuracy and efficiency and effectively communicates analysis output.
- Technical Proficiency - Provides technical support by performing coding, ensuring processes run smoothly, and working to continuously improve processing capabilities. Works closely with technical and operation teams to support their business objectives.
- System Testing - Thoroughly tests data integration, data warehouse, and data mart components to ensure accuracy, completeness, and overall efficiency. Evaluates and conveys testing results. Performs coding and assists in implementing system modifications and enhancements.
- Ensures that new solutions meet defined technical, functional, and service-level requirements and standards, and manages the end-to-end lifecycle for new data migration, data integration, and data-services-oriented solutions.
- Additionally, the Senior Data Software Engineer will engage with partners in testing, release management, and operations to ensure quality of code development, deployment, and post-production support.
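The batch data-movement responsibilities above follow the classic extract-transform-load shape. A minimal, dependency-free sketch of that pattern in plain Python (the schema, field names, and in-memory "table" are hypothetical stand-ins for a real source and warehouse):

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw source records (CSV text standing in for an S3 object)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize types and drop records that fail validation."""
    out = []
    for row in rows:
        try:
            out.append({"id": int(row["id"]), "amount": round(float(row["amount"]), 2)})
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad records to a reject path
    return out


def load(rows: list[dict], target: list) -> int:
    """Load: append to the target store (a list standing in for a warehouse table)."""
    target.extend(rows)
    return len(rows)


raw = "id,amount\n1,10.5\n2,oops\n3,7.25\n"
table: list[dict] = []
loaded = load(transform(extract(raw)), table)  # the malformed row is rejected
```

In a production pipeline the same three stages would typically be expressed with the tools this role names, e.g. PySpark reading from S3 and writing to Snowflake or Redshift, rather than in-memory lists.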
Qualifications
Bachelor's Degree and 4 years of experience in Software application development and maintenance OR High School Diploma or GED and 8 years of experience in Software application development and maintenance
Preferred areas of experience:
- 5+ years of experience designing and building data pipelines/ETL frameworks with cloud-based data technologies (AWS cloud services skills preferred)
- 8+ years of experience creating and maintaining ETL processes and architecting complex data pipelines, with knowledge of data modeling techniques and high-volume ETL/ELT design
- 3+ years of experience with data warehouse/data lake solutions using Snowflake and AWS Redshift
- 5+ years of experience with ETL and data ingestion tools such as Informatica IICS, AWS Glue, and SAP BODS
- Experience with programming languages such as ANSI SQL, Python, Scala, and/or Java
- Extensive knowledge of data warehouse principles, design, and data modeling concepts.
- 1+ years of experience designing data-privacy constructs that enable data tokenization, encryption, and data access controls on a public-cloud data lake
- 3+ years of experience with data-engineering-related AWS services such as Lake Formation, Glue, S3, Lambda, and DynamoDB, plus other AWS accelerators
- 3+ years of experience with Big Data technologies such as Hadoop, Hive, Spark, NoSQL databases, Kafka, and APIs
- Experience leading a team of engineers across geographies
- Uses agile engineering practices and a range of data development technologies to rapidly deliver creative and efficient data products
- Identifies inefficiencies, optimizes processes and data flows, and recommends improvements
- Communicates with developers across teams, both for ad hoc problem solving and for check-ins and discussions with other initiatives
Technical skills: Snowflake; AWS services such as S3, Glue, Lambda, DynamoDB, Redshift, Redshift Spectrum, Lake Formation, SageMaker, QuickSight, and Athena
Scripting and languages: Python, SQL, PySpark, Hive
ETL tools such as Informatica IICS, SAP BODS, and DataStage
#LI-XG1
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.