Overview
On Site
Depends on Experience
Full Time
Skills
Leadership
Information Technology
High availability
Collaboration
Governance
Specification
Software development
Unstructured data
Relational databases
NoSQL
Flat file
Scalability
Data Lake
Extract, transform, load (ETL)
ELT
Data loading
Policies
Reporting
Data quality
Data validation
Metadata management
Regulatory Compliance
Transformation
Workflow
Documentation
Mentorship
Quality assurance
Planning
Microsoft
Real-time
Data management
Amazon Web Services
OCI
Design
Data centers
Cloud computing
Software deployment
scikit-learn
TensorFlow
PyTorch
Microsoft Azure
Machine Learning (ML)
Training
Information systems
Computer science
Incident management
Cyber security
TCP/IP
Security architecture
Hardening
Operating systems
Database
Scheduling
Apache Parquet
XML
JSON
Data modeling
Analytics
Business intelligence
Data integration
Database design
SQL
PL/SQL
Oracle database administration
Performance tuning
Data governance
Data integrity
Data warehouse
Writing
Python
Management
Data
Communication
Analytical skill
Problem solving
Oracle
Security clearance
Project coordination
Innovation
Recruiting
Job Details
About Our Company:
Delmock Technologies, Inc. (DTI) is a leading HUBZone business in Baltimore, known for delivering sophisticated IT and health solutions with a commitment to ethics, expertise, and superior service. Actively engaged in the local community, DTI creates opportunities for talented residents while maintaining a stellar reputation as an award-winning contractor, earning accolades such as the Government Choice Award for IRS Systems Modernizations.
Location: This position is hybrid, based in Laurel, MD 20707.
Role:
Delmock Technologies, Inc. is seeking a Senior Data Engineer with expertise in Azure Synapse Analytics and Microsoft Fabric to design, develop, and implement scalable data solutions that support analytics and reporting needs. The role involves creating, optimizing, and managing data pipelines to move and transform data efficiently across the Azure ecosystem. The candidate will be responsible for setting up and managing data lakes to store large volumes of structured and unstructured data, ensuring high availability and security. They will collaborate with cross-functional teams to gather data requirements and create efficient, scalable architectures. Strong experience in ETL development, data modeling, and cloud technologies such as Azure Data Factory, Azure Data Lake, and Synapse Analytics is essential. The candidate will also ensure data quality, security, and compliance with governance standards.
Responsibilities:
- Develop requirements for information systems from a project's inception to its conclusion.
- Develop specifications for moderately and highly complex systems.
- Lead other team members in preparing input and test data for the proposed system.
- Provide technical and administrative direction for personnel performing software development tasks, including reviewing work products for correctness and for adherence to the design concept and user standards.
- Create and optimize complex data pipelines using Azure Data Factory, Synapse Analytics, and other Azure tools to extract, transform, and load data efficiently.
- Implement and maintain Azure Data Lake solutions to store large volumes of structured and unstructured data, ensuring scalability, performance, and security.
- Integrate data from various sources, including relational databases, NoSQL databases, APIs, and flat files, into the Azure environment for analysis and reporting.
- Design and develop robust data architectures, optimizing for performance and scalability in Azure Synapse Analytics and Azure Data Lake environments.
- Develop efficient ETL/ELT processes using Azure Data Factory or other Azure tools to ensure timely and accurate data loading and transformation.
- Ensure data pipelines run smoothly by monitoring, troubleshooting, and resolving issues to minimize downtime and data inconsistencies.
- Continuously optimize data pipelines and query performance, especially within Azure Synapse, to handle large datasets and complex transformations efficiently.
- Work closely with data scientists, analysts, and business teams to understand data requirements and deliver scalable data solutions that support analytics needs.
- Implement and enforce security best practices, ensuring data lakes, pipelines, and analytics solutions comply with Azure security standards and data governance policies.
- Design and implement logical and physical data models that support high-performance querying and reporting within Azure Synapse.
- Implement data quality checks, data validation processes, and error handling within data pipelines to ensure accuracy and consistency of data (a minimal validation sketch follows this list).
- Ensure adherence to data governance frameworks, managing data lineage, metadata, and ensuring compliance with organizational and regulatory requirements.
- Implement data partitioning and indexing strategies to improve query performance within data lakes and Synapse.
- Automate data ingestion, transformation, and processing tasks to ensure efficient and scalable data workflows within the Azure environment.
- Create and maintain detailed documentation for data architectures, pipelines, processes, and data models, ensuring transparency and ease of maintenance.
- Provide technical guidance and mentorship to junior data engineers, sharing best practices and ensuring adherence to high-quality engineering standards.
- Monitor resource utilization in Azure environments, planning for future data growth and ensuring efficient use of cloud resources.
- Continuously stay informed on the latest features and best practices in Azure Synapse Analytics, Microsoft Fabric, and the Azure ecosystem, implementing improvements as needed.
- Implement real-time data ingestion and processing pipelines using technologies like Azure Stream Analytics and Event Hubs.
- Design and implement a data mesh architecture to support decentralized data ownership and self-service data infrastructure, ensuring scalable and flexible data management across the organization.
- Architect and manage multi-cloud data solutions, integrating data across different cloud platforms (e.g., AWS, OCI) with Azure Synapse for a unified data and analytics ecosystem.
- Design and manage hybrid data architectures that integrate on-premises data centers with Azure cloud environments, ensuring seamless data movement and synchronization between cloud and on-prem systems.
- Utilize advanced data cataloging tools such as Azure Purview to create an enterprise-wide data catalog, enabling efficient data discovery and usage across various teams.
- Create and automate end-to-end machine learning pipelines that integrate data ingestion, feature engineering, model training, and deployment using Azure ML, Python (scikit-learn, TensorFlow, PyTorch), and Azure Synapse Analytics.
- Utilize Python-based data augmentation techniques or synthetic data generation (e.g., GANs or SMOTE) to enrich datasets for machine learning training, especially where data is limited or imbalanced (a brief SMOTE sketch follows this list).
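As a concrete illustration of the data-quality and error-handling work described above, here is a minimal sketch in Python using pandas. The column names and checks are hypothetical placeholders, not requirements of this role:

```python
import pandas as pd

def validate_batch(df: pd.DataFrame) -> pd.DataFrame:
    """Minimal data-quality gate for a pipeline step.

    Column names ('customer_id', 'reading_value') are hypothetical
    placeholders for whatever schema the real pipeline carries.
    """
    errors = []
    if df["customer_id"].isna().any():
        errors.append("null customer_id values")
    if df.duplicated(subset=["customer_id"]).any():
        errors.append("duplicate customer_id values")
    if (df["reading_value"] < 0).any():
        errors.append("negative reading_value values")
    if errors:
        # Fail fast so the orchestrator (e.g., an Azure Data Factory
        # pipeline run) surfaces the failure instead of loading bad data.
        raise ValueError("validation failed: " + "; ".join(errors))
    return df
```

In practice a gate like this would sit between ingestion and load, so a failed batch stops the pipeline rather than propagating inconsistent data downstream.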
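The final bullet names SMOTE as one option for enriching imbalanced training data. Here is a minimal sketch using scikit-learn and the imbalanced-learn library; the synthetic dataset and parameters are illustrative assumptions, not project specifics:

```python
from collections import Counter

from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification

# Illustrative imbalanced dataset: roughly 95% majority, 5% minority class.
X, y = make_classification(
    n_samples=2_000,
    n_features=10,
    weights=[0.95, 0.05],
    random_state=42,
)
print("before:", Counter(y))

# SMOTE synthesizes new minority-class samples by interpolating between
# existing minority examples and their nearest neighbors.
X_resampled, y_resampled = SMOTE(random_state=42).fit_resample(X, y)
print("after:", Counter(y_resampled))
```

The resampled arrays would then feed the model-training stage of an end-to-end ML pipeline (e.g., an Azure ML training step).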
Minimum Requirements:
- Bachelor's degree in Information Systems, Computer Science, Engineering, Business, or a related scientific or technical field, and a minimum of six (6) years of relevant experience.
- Must possess security certifications and/or Database Administrator certifications.
- Possesses and applies comprehensive knowledge across key tasks and high-impact assignments.
- Plans and leads major technology assignments.
- In-depth knowledge of the incident response lifecycle, working independently to investigate and respond effectively to cybersecurity incidents.
- Knowledge of the TCP/IP protocol suite; security architecture; and securing and hardening operating systems, networks, databases, and applications.
- Strong knowledge of Python for creating and scheduling data pipelines.
- Strong knowledge of the Medallion architecture (a brief bronze-to-silver sketch follows this list).
- Experience in setting up Parquet and Delta file structures.
- Experience working with unstructured data sources.
- Strong knowledge of consuming and exposing data in formats such as XML and JSON.
- 5+ years of experience designing and implementing data solutions and creating data pipelines for enterprise-level applications.
- Demonstrated experience working on large-scale data projects in diverse team environments with a focus on analytics, business intelligence, and enterprise systems.
- Extensive experience with data modeling and database design.
- Proven expertise in implementing enterprise-wide analytics and business intelligence solutions, including data integration from multiple systems into a single data repository.
- Deep understanding of database design principles, SQL, PL/SQL, and Oracle database management systems, including performance optimization and troubleshooting.
- Familiarity with data governance frameworks, ensuring data integrity, quality, and security within an enterprise context.
- Strong experience in creating data lakes and data warehouses.
- Strong knowledge of writing Python code to create and manage data pipelines.
- Excellent verbal and written communication skills, with the ability to work closely with stakeholders to translate business needs into technical solutions.
- Strong analytical skills and problem-solving abilities, especially when working with large, complex datasets.
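For reference on the Medallion, Parquet/Delta, and JSON items above, here is a minimal PySpark sketch of a bronze-to-silver hop. The paths, column names, and session setup are illustrative assumptions; it presumes a Spark environment with Delta Lake available, such as an Azure Synapse or Microsoft Fabric Spark pool:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw JSON as-is, preserving source fidelity.
# The lake paths below are hypothetical placeholders.
raw = spark.read.json("abfss://lake@account.dfs.core.windows.net/landing/readings/")
raw.write.format("delta").mode("append").save("/lake/bronze/readings")

# Silver: cleanse and conform - deduplicate, enforce types, and drop
# obviously bad records before the data is exposed for analytics.
bronze = spark.read.format("delta").load("/lake/bronze/readings")
silver = (
    bronze.dropDuplicates(["reading_id"])  # hypothetical key column
    .withColumn("reading_ts", F.to_timestamp("reading_ts"))
    .filter(F.col("reading_value").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("/lake/silver/readings")
```

A gold layer would follow the same pattern, aggregating silver tables into reporting-ready models for Synapse or Power BI.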
Preferred/Nice to Have Skills:
- Experience in the water and wastewater industry and an understanding of Oracle Utilities applications are preferred.
Clearance:
- The ability to pass a background check is required.
Recently ranked as high as #3 among HUBZone Companies in a GOVWIN survey, DTI offers a dynamic environment for those passionate about impactful projects, community involvement, and contributing to a top-ranking Federal project support team.
At DTI, we balance continuous growth and innovation with a strong dedication to corporate social responsibility. Join our talented team and be part of a company that values both professional excellence and community impact. Explore the exciting career opportunities awaiting you at DTI!
DTI is committed to hiring and maintaining a diverse workforce. We are an equal opportunity employer making decisions without regard to race, color, religion, sex, national origin, age, veteran status, disability, or any other protected class.