Job Details
We're looking for a Senior Data Engineer who can build and deploy data solutions for various financial business domains by driving architecture and design, building data pipelines, supporting data solutions, and creating new or leveraging existing data framework capabilities to acquire, transform, stream, and integrate data. The candidate should also be able to lead and mentor a team of data engineers, collaborate with business stakeholders, and convert business requirements into technical requirements.
Primary responsibilities include:
- Building robust data pipelines to collect, process, and compute different metrics from various financial sources, adhering to quality and development standards.
- Designing application architecture / technical design and articulating the data frameworks to team members and data engineers.
- Collaborating with cross-functional team members as necessary and arriving at optimal solutions that meet data demands.
- Deploying high-performing, bug-free code in production.
- Demonstrating a strong understanding of database design principles and data modeling techniques, and the ability to translate business requirements into scalable data solutions.
- Driving technical requirements of data frameworks and building solutions to meet them.
- Identifying opportunities to expand existing frameworks, as well as creating new frameworks that can be built once and reused across multiple use cases.
- Executing unit tests of data populated in the target data container, validating expected results, and ensuring quality and accuracy. Coordinating with business users for User Acceptance Testing and with the Operations team for code deployment to upper environments.
- Following change management stipulations on path-to-production requirements and strictly adhering to compliance and regulatory needs.
Required Skills/Experience:
At least 7 years of experience with the technologies listed below.
- Working experience with streaming frameworks such as Apache Spark, Beam, or Flink
- Working experience with message brokers/messaging platforms such as Apache Kafka
- Proven track record of working with microservices and batch sources
- Hands-on experience with a programming language such as Java or Scala
- Excellent SQL development skills
- Working experience with ETL development tools such as Talend or DataStage would be a plus
- Working experience with Java Spring Boot and React, TypeScript, or Angular for UI would be a plus
- Deep understanding of cloud offerings for data processing (AWS would be a plus)
- Ability to work with AWS S3, Redshift, and other databases such as Postgres, Oracle, and MongoDB