Data Infrastructure Engineer

Overview

Remote
Depends on Experience
Full Time

Skills

Apache Flink
Elasticsearch
Docker
Kubernetes

Job Details

Who we are:

ShorePoint is a fast-growing, industry-recognized, and award-winning cybersecurity services firm focused on high-profile, high-threat, private- and public-sector customers who demand experience and proven security models to protect their data. ShorePoint subscribes to a work-hard, play-hard mentality and celebrates individual and company successes. We are passionate about our mission and about going above and beyond to deliver for our customers. We are equally passionate about an environment that supports creativity, accountability, diversity, inclusion, and a focus on giving back to our community.

The Perks:

As recognized members of the Cyber Elite, we work together in partnership to defend our nation's critical infrastructure while building meaningful and exciting career development opportunities in a culture tailored to individual technical and professional growth. We are committed to the belief that our team members do their best work when they are happy and well cared for. In support of this philosophy, we offer a comprehensive benefits package through major health care carriers. Highlighted benefits include 18 days of PTO, 11 holidays, 80% of insurance premiums covered, a 401(k), continuing education, and certification maintenance and reimbursement.

Who we're looking for:

We are seeking a Data Infrastructure Engineer with experience providing support in a dynamic, fast-paced, public-sector environment. This is a unique opportunity to shape the growth, development, and culture of an exciting and fast-growing company in the cybersecurity market. The Data Infrastructure Engineer will be exposed to all aspects of supporting a federal client and will be encouraged to grow as the organization expands.

What you'll be doing:

  • Integrate multiple Commercial Off-the-Shelf (COTS) and open-source products, software configuration packages, and custom code into a single solution tailored to meet customer requirements.
  • Work as part of an agile development team to conduct systems design, analysis, and development of the solution.
  • Create data pipelines and implement ETL processes for both stream processing (such as tool and sensor log data) and batch processing (such as daily vulnerability updates), integrating functionality between applications and leveraging APIs and scripting languages (Python preferred); see the illustrative sketch after this list.
  • Conduct data modeling and hands-on configuration, tuning, and operation of distributed data storage stacks, with a focus on Elasticsearch and Kafka.
  • Query data, including aggregations, calculations, and metrics produced from the data; design and implement data visualizations.
  • Engage in all agile ceremonies, including backlog grooming, demos, and retrospectives.
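
To give a flavor of the pipeline work described above, here is a minimal sketch of a streaming ETL step that reads tool/sensor log events from Kafka and indexes them into Elasticsearch. It is illustrative only: the broker address, consumer group, topic, index, and field names are hypothetical, and it assumes the confluent-kafka and elasticsearch Python client libraries.

    import json

    from confluent_kafka import Consumer
    from elasticsearch import Elasticsearch

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # assumption: local broker
        "group.id": "sensor-etl",                # hypothetical consumer group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["sensor-logs"])          # hypothetical topic name

    es = Elasticsearch("http://localhost:9200")  # assumption: local cluster

    def transform(record: dict) -> dict:
        # Minimal "T" in ETL: keep only the fields downstream consumers need.
        return {
            "timestamp": record.get("ts"),
            "host": record.get("host"),
            "severity": record.get("severity", "info"),
            "message": record.get("msg"),
        }

    try:
        while True:
            msg = consumer.poll(timeout=1.0)     # wait up to 1s for an event
            if msg is None or msg.error():
                continue
            doc = transform(json.loads(msg.value()))
            es.index(index="sensor-logs", document=doc)  # hypothetical index
    finally:
        consumer.close()

In practice this loop would batch writes (for example with elasticsearch.helpers.bulk) rather than index one document per event, but the shape of the consume-transform-index cycle is the same.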

What you need to know:

  • Demonstrated experience writing well-structured code and applications using coding best practices to deliver enterprise applications.
  • Proven experience developing robust, scalable data pipelines and integrations.
  • Solid understanding of application architecture and interfaces, as well as experience with data modeling.
  • Expertise in working with streaming data and implementing real-time data processing solutions.
  • Ability to develop and deploy in containerized environments (Docker, Kubernetes).
  • Experience with open-source tools such as Kafka, Logstash, Beats, Elasticsearch, and Kibana, or commercial equivalents such as Splunk; see the query sketch after this list.
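
As a concrete example of the Elasticsearch skills listed above, the following sketch runs an aggregation that turns raw events into a metric. It is illustrative only: the cluster address, index, and field names are hypothetical, and it assumes the official elasticsearch Python client.

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")  # assumption: local cluster

    # Count events per severity over the last 24 hours -- the kind of
    # aggregation a Kibana visualization would be built on.
    resp = es.search(
        index="sensor-logs",                     # hypothetical index name
        size=0,                                  # metrics only, no raw hits
        query={"range": {"timestamp": {"gte": "now-24h"}}},
        aggs={"by_severity": {"terms": {"field": "severity.keyword"}}},
    )

    for bucket in resp["aggregations"]["by_severity"]["buckets"]:
        print(f"{bucket['key']}: {bucket['doc_count']} events")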

Must-haves:

  • 5-7 years of relevant experience.
  • Strong written and verbal communication skills.
  • Strong documentation skills.
  • Experience leveraging data processing technologies such as Apache Kafka and Elasticsearch.
  • Skilled problem-solver with a strong ability to troubleshoot complex data pipeline issues.
  • Ability to work with cross-functional teams.
  • Ability to obtain agency-required clearance.

Beneficial to have the following:

  • BS in Computer Science, Information Systems, Mathematics, Engineering, or a related degree.
  • Industry related certifications.
  • Experience with cloud platforms (AWS, Azure, Google Cloud Platform).
  • Familiarity with cybersecurity concepts and tools.
  • Experience with real-time data processing frameworks (e.g., Apache Flink, Apache Spark).

Where it's done:

  • Remote (Herndon, VA).