Cloud Data Engineer - Sr. Solution Specialist

  • Dallas, TX
  • Posted 44 days ago | Updated 2 hours ago

Overview

On Site
Full Time

Skills

Business requirements
Project planning
Development testing
Specification
Documentation
User interface design
Process flow
Software development
Apache Spark
Analytical skill
Real-time
Data processing
Robotics
Decision-making
Strategy
Artificial intelligence
Data management
Governance
Unstructured data
Science
Management
Operational efficiency
Sourcing
Analytics
Data engineering
Data Analysis
Reporting
Statistics
Database
Apache Cassandra
MySQL
Snowflake
PostgreSQL
SQL Azure
Data warehouse
Databricks
Design
Data extraction
Transformation
Writing
Data
Scripting
SQL
Apache Kafka
Cloud computing
Amazon EC2
Amazon S3
Amazon EMR
Amazon Kinesis
Amazon RDS
Amazon Redshift
Redshift Spectrum
API
Computer science
Information Technology
Computer engineering
Security clearance
Amazon Web Services
Microsoft Azure
Google Cloud Platform
Google Cloud
Programming languages
Scala
PySpark
Python
MapReduce
MPP
Agile
Sprint
Scrum

Job Details

Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a daily collaborative environment with a think-tank feel, where you can share new ideas with your colleagues - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of the typical Delivery Center.

Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon. Interested? Read more about our opportunity below ...

Work you'll do/Responsibilities
  • Work with the team to evaluate business needs and priorities, liaise with key business partners and address team needs related to data systems and management
  • Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components and enhancements
  • Participate in project planning; identify milestones, deliverables, and resource requirements; track activities and task execution
  • Generate design, development, and test plans; detailed functional specification documents; user interface designs; and process flow charts for execution of programming
  • Develop data pipelines / APIs using Python, SQL, and potentially Spark on AWS, Azure, or Google Cloud Platform
  • Use an analytical, data-driven approach to develop a deep understanding of fast-changing business needs
  • Build large-scale batch and real-time data pipelines with data processing frameworks on AWS, Azure, or Google Cloud Platform
  • Move data from on-premises systems to the cloud and perform cloud data conversions
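To give a flavor of the pipeline work described above, here is a minimal, illustrative extract-transform-load pass in plain Python, with an in-memory SQLite database standing in for both the source system and the warehouse table. All table and column names (`raw_events`, `user_totals`, etc.) are hypothetical, not from the posting:

```python
import sqlite3

def run_batch_pipeline(conn):
    """One batch ETL pass: read raw events, aggregate per user,
    and load the result into a reporting table."""
    cur = conn.cursor()
    # Extract: the raw source table (stand-in for an on-prem system)
    cur.execute("CREATE TABLE IF NOT EXISTS raw_events (user_id TEXT, amount REAL)")
    # Transform + load: rebuild the per-user aggregate table
    cur.execute("DROP TABLE IF EXISTS user_totals")
    cur.execute(
        """
        CREATE TABLE user_totals AS
        SELECT user_id, SUM(amount) AS total, COUNT(*) AS n_events
        FROM raw_events
        GROUP BY user_id
        """
    )
    conn.commit()

# Usage: seed a few raw rows, run the pipeline, read the aggregate back
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 2.5)],
)
run_batch_pipeline(conn)
rows = conn.execute(
    "SELECT user_id, total, n_events FROM user_totals ORDER BY user_id"
).fetchall()
# rows -> [('a', 15.0, 2), ('b', 2.5, 1)]
```

In production this shape would typically be expressed with Spark/PySpark jobs, orchestrated schedules, and managed cloud services (e.g. Glue, EMR, Redshift) rather than SQLite, but the extract-transform-load structure is the same.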

The Team

Artificial Intelligence & Data Engineering:

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.

The Artificial Intelligence & Data Engineering team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

Artificial Intelligence & Data Engineering will work with our clients to:
  • Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms
  • Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
  • Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements

Qualifications

Required
  • 6+ years of experience in data engineering with an emphasis on data analytics and reporting
  • 6+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), or others
  • 6+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one Database Platform (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Data Warehouse, Databricks, etc.)
  • 6+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines
  • 6+ years of experience with one or more of the following scripting languages or technologies: Python, SQL, Kafka, and/or others
  • 6+ years of experience designing and building solutions utilizing various cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.
  • Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline, or equivalent experience
  • Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.
  • Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve. This may include overnight travel.
  • Must be able to obtain the required level of security clearance for this role
  • Must live within a commutable distance (approximately 100-mile radius) of one of the following Delivery locations: Atlanta, GA; Charlotte, NC; Dallas, TX; Gilbert, AZ; Houston, TX; Lake Mary, FL; Mechanicsburg, PA; Philadelphia, PA, with the ability to commute to the assigned location for the day, without the need for overnight accommodations
  • Expectation to co-locate in your designated Delivery location up to 30% of the time based on business needs. This may include a maximum of 10% overnight client/project travel

Preferred
  • AWS, Azure and/or Google Cloud Platform Certification
  • Master's degree or higher
  • Expertise in one or more programming languages, preferably Scala, PySpark and/or Python
  • Experience working with either a MapReduce or an MPP system at any size/scale
  • Experience working with agile development methodologies such as Scrum and sprint-based delivery

Information for applicants with a need for accommodation:
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Deloitte