Programmer Analyst 6 - Data Engineer (In-person Interview)

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 month(s)
No Travel Required

Skills

Oracle
AWS
Databricks
Data Engineer
Developing complex database systems
Elasticsearch
Kibana
Python/Scala
Extract, Transform, and Load (ETL) processes and developing data pipelines
CMM/CMMI Level 3 methods and practices
Agile Development Processes
Creating CI/CD pipelines using Azure DevOps

Job Details

Role: Programmer Analyst 6 - Data Engineer

Location: Lansing, MI

Duration: 12+ Months

In-Person Interviews - Onsite 2 days per week REQUIRED, CANDIDATES MUST BE LOCAL TO MICHIGAN.

Position Summary:

  • Lead the design and development of scalable and high-performance solutions using AWS services.
  • Experience with Databricks, Elasticsearch, Kibana, and S3.
  • Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
  • Write clean, maintainable, and efficient code in Python/Scala.
  • Experience with AWS cloud-based application development.
  • Experience with Electronic Health Records (EHR) HL7 solutions.
  • Implement and manage Elasticsearch for efficient data retrieval and analysis.
  • Experience with data warehousing, data visualization tools, and data integrity.
  • Execute full software development life cycle (SDLC) including experience in gathering requirements and writing functional/technical specifications for complex projects.
  • Excellent knowledge of designing both logical and physical database models.
  • Develop database objects, including stored procedures and functions.
  • Extensive knowledge of source control tools such as Git.
  • Develop software design documents and work with stakeholders for review and approval.
  • Exposure to flowcharts, screen layouts, and documentation to ensure logical flow of the system requirements.
  • Experience working on large agile projects.
  • Experience with or knowledge of creating CI/CD pipelines using Azure DevOps.

Skill Descriptions:

  • 12+ years developing complex database systems.
  • 8+ years Databricks.
  • 8+ years using Elasticsearch and Kibana.
  • 8+ years using Python/Scala.
  • 8+ years Oracle.
  • 5+ years' experience with Extract, Transform, and Load (ETL) processes and developing data pipelines.
  • 5+ years' experience with AWS.
  • 5+ years' experience with data warehousing, data visualization tools, and data integrity.
  • 5+ years using CMM/CMMI Level 3 methods and practices.
  • 5+ years implementing agile development processes, including test-driven development.
  • 3+ years' experience with or knowledge of creating CI/CD pipelines using Azure DevOps (nice to have).