Solutions Architect Principal

  • REMOTE WORK, CA

Overview

Remote
On Site
USD 120,001.00 - 160,000.00 per year
Full Time

Skills

Data Analysis
Data Processing
Product Engineering
Optimization
Apache Kafka
Apache Flink
Streaming
Data Collection
Documentation
Data Quality
Analytics
Security Clearance
Science
Information Systems
Mathematics
Python
Docker
Collaboration
Machine Learning (ML)
Algorithms
Data Modeling
Real-time
Kubernetes
Network Security
Grafana
Orchestration
Access Control
Encryption
Vulnerability Management
Amazon Web Services
Microsoft Azure
DevOps
SQL
Database
Big Data
Apache Spark
PySpark
Unit Testing
Linux
Shell Scripting
Extract
Transform
Load
Cloud Computing
End-user Training
Presentations
Communication
Security+
Cisco Certifications
Apache NiFi
SAP BASIS
Information Technology
Systems Engineering
FOCUS

Job Details

Job ID:

Location: REMOTE WORK, CA, US

Date Posted: 2025-02-07

Category: Engineering and Sciences

Subcategory: Solutions Architect

Schedule: Full-time

Shift: Day Job

Travel: No

Minimum Clearance Required: Top Secret

Clearance Level Must Be Able to Obtain: None

Potential for Remote Work: No

Description

We're searching for a Senior Data Scientist who understands streaming data analytics, real-time tracking, and machine learning to lead and execute complex data initiatives. In this role, you will design and develop solutions that enable real-time data processing and tracking of user behaviors, transactions, and other events at scale. You will work with product, engineering, and data teams to develop models that process and analyze streaming data from a variety of sources (such as web interactions and transaction logs). Your insights will guide product decisions, optimization initiatives, and customer experiences.

Remote options are available for candidates who are local to Camp Smith, HI; San Diego, CA; or Washington, DC.

Essential Functions/Job Duties:
  • Design, develop, and optimize scalable real-time data pipelines for streaming data using tools such as Apache Kafka, Apache Flink, Spark Streaming, and others.
  • Implement real-time tracking technologies to capture events, interactions, and behaviors across various platforms.
  • Lead the development and deployment of machine learning models to process, classify, and predict real-time data.
  • Collaborate with product teams to define key tracking metrics and ensure data collection aligns with business objectives.
  • Directly support deployment of software on Kubernetes and Docker.
  • Produce documentation for clients to assist with custom applications.
  • Provide tutorials and technical information for product documentation.
  • Develop front-end and back-end applications using Python.
  • Apply thorough knowledge of indexing and search methodologies.
  • Develop algorithms for event-based tracking and behavioral analytics, ensuring data quality, integrity, and security.
  • Stay up to date with the latest trends in real-time analytics, machine learning, and big data technologies.

Qualifications

Required Qualifications:
  • Must have the ability to obtain a TS/SCI clearance.
  • Bachelor of Science in Information Systems / Information Technology / Mathematics or related field. In lieu of a degree, four (4) additional years of work experience may be considered.
  • Must have five (5) or more years of hands-on experience with Python, Docker, and containerized applications in both development and production environments.
  • IAT Level 1 certification (CompTIA Security+ and CCNA); may be obtained within six (6) months of hire.
  • Flexibility to work on-site and collaborate with team members and mission partners.
  • Travel up to 25%.
  • Strong expertise in machine learning algorithms and data modeling, with experience applying them to large, real-time datasets.
  • Strong knowledge of Kubernetes networking, security, and monitoring tools (e.g., Calico, Istio, Prometheus, Grafana).
  • Proficiency in container orchestration using Helm and Kustomize.
  • Understanding of security best practices in regulated environments (e.g., access controls, encryption, vulnerability management).
  • Must have experience with Cloud technologies such as AWS or Azure.
  • Understanding of DevOps practices and infrastructure.
  • Must have three (3) or more years of SQL/database integration experience.
  • Experience with Big Data technologies (Spark/PySpark).
  • Familiarity with unit testing software.
  • Experience with Linux commands and shell scripting.
  • Experience building data-driven applications with an understanding of ETL and data pipelines that support them.
  • Ability to deliver customer training and presentations.

Verbal Communications:
  • Demonstrates clear and concise verbal communication skills.
  • Active communicator: gives full attention to what other people are saying, takes time to understand the points being made, and asks questions as appropriate.

Desired Experience and Certifications:
  • IAT Level 1 certification (CompTIA Security+ and CCNA).
  • Familiarity with Apache NiFi.

Target salary range: $120,001 - $160,000. The estimate displayed represents the typical salary range for this position based on experience and other factors.

SAIC accepts applications on an ongoing basis and there is no deadline.

Covid Policy: SAIC does not require COVID-19 vaccinations or boosters. Customer site vaccination requirements must be followed when work is performed at a customer site.


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About SAIC