Title: Enterprise Data & Analytics - Platform Administrator Location: Dallas, TX; Phoenix, AZ What you'll do: As the Data Platform Administrator, you are accountable for maintaining and administering data governance, data management, and analytics platforms. Your responsibilities include the design and configuration of the suites of applications and tools that enable Western Alliance Bank to be data driven. You are highly skilled in troubleshooting issues end to end in collaboration with network engineers
Position: Data Engineer Location: Plano, Texas Job Type: Full-Time (W2 only) Visa: H1B, EADs Experience: 8+ years Job Description: In this role, the resource is expected to: Understand existing workflows and underlying frameworks. Migrate these existing workflows to the new Capital One-specific data ingestion framework. Collaborate with and across Agile teams to gather metadata to meet current data governance standards. Perform unit tests and conduct reviews with other team members to make sure you
1 round interview process, conducted virtually. This position does have the opportunity to go perm, and the manager is open to visa candidates. The top 3-5 skills are below! 1. Must be an expert with Puppet 2. Must have advanced working knowledge of Ansible 3. Must have advanced working knowledge of VMware vRealize Automation 4. Depth in working with Windows 2022, RHEL 8, and Ubuntu 22 & 24 5. Advanced automation with scripts (Bash/PowerShell)
Job Title: C++ Engineer MUST HAVE: C, C++, Oracle Tuxedo, Oracle Pro*C, Oracle Pro*C++, SQL, PL/SQL, Linux, Shell Scripting NICE TO HAVE: Java Swing EXPERIENCE LEVEL: Senior Level DESCRIPTION: Loup Logistics has a BPA-level application, UOP, which is a homegrown TMS system used to handle the planning and back-office support of Loup's Transload product. UOP was created in the late 1990s and consists of multiple technologies. The front end of UOP is a Java Swing
Location: Dallas, TX Salary: Negotiable Description: Our client is currently seeking a Kubernetes Platform Engineer [ Additional Description ] Title: Kubernetes Platform Engineer Location: Irving, TX (Hybrid) Position Type: Contract JOB DESCRIPTION: Targeted Years of Experience: 10+ years Responsibilities: Build and deploy a highly scalable K8s container platform. Automate the build of containerized systems with CI/CD, GitOps tools, Helm charts, and more. Setup and installation of containeri
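The kind of deployment work described above is usually driven by Helm charts or GitOps manifests; as a minimal sketch, assuming a hypothetical service name and image registry, a Kubernetes apps/v1 Deployment can be built programmatically like this:

```python
import json

def deployment_manifest(name, image, replicas=2):
    """Build a minimal Kubernetes apps/v1 Deployment manifest as a dict.

    In practice this structure would be rendered from a Helm chart or a
    GitOps repo rather than hand-built; names and images here are
    illustrative placeholders.
    """
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

# Hypothetical service and registry, for illustration only.
manifest = deployment_manifest(
    "api-gateway", "registry.example.com/api-gateway:1.4.2", replicas=3
)
print(json.dumps(manifest, indent=2))
```

The same dict could be serialized to YAML and applied with `kubectl apply -f`, or templated as a Helm chart for per-environment overrides.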
Job Description: Technical Project Manager in Westlake, TX (Onsite from Day 1) - Permanent/Full time Key Skills: Project Management, ETL, Informatica, Hadoop, IICS, Tableau, Google Cloud Platform, DevOps, Python, Shell Scripting, Control-M. Job Description: Good knowledge of data projects, development and support; knowledge of ETL, BI, and cloud technologies such as IICS, Tableau, and Google Cloud Platform. Knowledge of DevOps, Python, Shell Scripting, Control-M, SQL, Perl, Business Objects, Hadoop,
Job#: 2030422 Job Description: To apply, please email a Word resume to Tatiana at Title: Information Security Engineer Location: Must be onsite in any of these locations: Dallas, TX; Phoenix, AZ; Charlotte, NC; Columbus, OH; Des Moines, IA; Minneapolis, MN; Raleigh, NC; San Antonio, TX; St. Louis, MO Duration: 1 year contract with potential to extend The Vulnerability Assessment Team (VAT) requires additional resources to ensure proper implementation of new Private Cloud Platforms (PCP), enabling the team to
The Expertise and Skills You Bring: Bachelor's degree in Computer Science or a similar field, Master's a plus. 5+ years of development experience with strong knowledge of q/KDB+ programming, Python, scripting (Shell/Perl), Unix or Linux, and SQL/relational databases such as Oracle is required. A deep understanding of various programming languages. Hands-on experience in application deployment automation and pipelines. Experience implementing projects as a team in Agile environments (Kanban and Scrum). Experience
Job Description: Minimum 8 years of experience in Data Management, specialized in analytical data warehouses. Minimum 3 years of hands-on experience with the Snowflake cloud warehouse and its features. Good knowledge of AWS cloud architecture and its services. Experience in creating and analyzing SQL and PL/SQL procedures for data integration. Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake SQL. Developing Unix shell scripts to extract and load data
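The ETL work described above can be sketched in miniature; this example uses the standard-library sqlite3 module as a stand-in for a Snowflake connection (the table and column names are illustrative, not from the posting), so the transform-and-load pattern stays self-contained and runnable:

```python
import sqlite3

# Stand-in for a warehouse connection (Snowflake in the posting);
# sqlite3 keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")

# Extract: rows landed in a staging table by an upstream load.
conn.execute("CREATE TABLE staging_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [(1, 120.0, "shipped"), (2, 75.5, "pending"), (3, 210.0, "shipped")],
)

# Transform + load: aggregate staged rows into a reporting table, the
# kind of step a PL/SQL procedure or Snowflake SQL task would perform.
conn.execute("CREATE TABLE orders_by_status (status TEXT, total REAL)")
conn.execute(
    """INSERT INTO orders_by_status
       SELECT status, SUM(amount) FROM staging_orders GROUP BY status"""
)

totals = dict(conn.execute("SELECT status, total FROM orders_by_status"))
print(totals)
```

Against a real warehouse the connection line would change (e.g. a Snowflake connector), but the staged extract, SQL transform, and load into a reporting table follow the same shape.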
Job Description: Need a minimum of 10 years' experience. Experience in implementation, configuration, and customization of the Identity and Access Management system SailPoint IIQ. Experience in Access Request, Certification, Provisioning, User Life Cycle Management processes, workflows, and role-based access control. Experience with implementing application onboarding, custom rules, forms, workflows, and configuring various types of access certifications in IIQ. Experience with web services frameworks
Role: Data Engineers Location: Plano, TX - locals only Rate: based on experience; no H1B Looking for a focus on #Spark #PySpark #Python #AWS Qualifications: 5+ years of experience in application development, including Python, SQL, Scala, or Java. 2+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 3+ years of experience with distributed data computing tools (Kafka, Spark, Flink, etc.). 2+ years of experience working on real-time data and streaming applications. 2+ years of experience w
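The real-time streaming experience asked for above centers on windowed aggregation; as a framework-free sketch of the core idea (Spark or Flink would handle the same logic at scale, with distribution and fault tolerance), a tumbling-window count looks like this:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed-size (tumbling) windows
    and count occurrences per key -- the core idea behind streaming
    aggregations in Spark/Flink, shown here without any framework.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = ts - (ts % window_seconds)
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

# Illustrative events: (epoch-second timestamp, event type).
events = [(5, "click"), (30, "view"), (61, "click"), (65, "click"), (130, "view")]
result = tumbling_window_counts(events, window_seconds=60)
```

In Spark Structured Streaming the equivalent would be a `groupBy(window(...), col)` over a streaming DataFrame; the windowing arithmetic is the same.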
About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 9
Title: APIGEE Platform Engineer Location: Irving, TX (Remote work possible for exceptional candidates in EST or CST hours) Duration: Long Term Primary Skills: API Administrator (not developer). APIGEE - OPDK and/or Hybrid. Manage the API platform. Managing and resolving tickets. Production support: on-call rotation (looking at Elastic or Splunk logs to diagnose and troubleshoot the problem if it is on the platform side). Cloud AWS (CI/CD, Jenkins, etc.). Unix, Linux, Shell Scripting -- Thanks, Rajkumar, Ph: x 1
Our client is looking for a Snowflake Data Management Specialist with a minimum of 8 years of experience specializing in analytical data warehousing. The ideal candidate will have at least 3 years of hands-on experience with the Snowflake cloud warehouse and its features, along with a good understanding of AWS cloud architecture and its services. Key Responsibilities: Develop and analyze SQL and PL/SQL procedures for data integration. Build ETL pipelines in and out of the data warehouse using Python and
Database Engineer Dallas, TX Long Term Onsite Database engineer who has used the data migration tool Talend, along with Oracle, PostgreSQL, JSON, and Talend ETL expertise. Good knowledge of Oracle views, stored procedures, triggers, materialized views, etc. Good knowledge of PostgreSQL and jsonb processing. Hands-on experience in SQL and query tuning for performance. Good to have: Oracle to PostgreSQL data migration experience. Unix and shell script knowledge.
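A common step in the Oracle-to-PostgreSQL migration work mentioned above is serializing relational rows as JSON documents for a jsonb column; as a minimal sketch, using sqlite3 as a stand-in for the Oracle source and an illustrative table name:

```python
import json
import sqlite3

# Stand-in source database (Oracle in the posting); sqlite3 keeps the
# sketch runnable. Table and columns are illustrative placeholders.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
src.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Acme", "Dallas"), (2, "Globex", "Plano")],
)

def rows_to_json_docs(conn):
    """Serialize each row as a JSON document -- the shape you would
    insert into a PostgreSQL jsonb column during migration."""
    cur = conn.execute("SELECT id, name, city FROM customers")
    cols = [d[0] for d in cur.description]
    return [json.dumps(dict(zip(cols, row)), sort_keys=True) for row in cur]

docs = rows_to_json_docs(src)
```

On the PostgreSQL side, each string in `docs` could be bound to an `INSERT ... VALUES (%s::jsonb)` statement, after which jsonb operators (`->`, `->>`, `@>`) apply directly.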
Title: Azure DevOps Engineer Location: Plano, TX (onsite) Duration: 12 months+ Candidate must work from the office from Day 1 Job Description: Scripting experience using PowerShell, Groovy, and Python. Minimum 9 years of experience with Azure DevOps and related tools, such as GitHub, Jenkins/GitHub Actions, Artifactory, Ansible/Octopus Deploy, AppDynamics, Splunk, or equivalent. Minimum of 4 years of experience with Dockerfile and image creation, running containers on Kubernetes or using Do
Job Title: APIGEE Platform Engineer Duration: Long Term Location: Irving, TX (Hybrid mode) Interview: Telephonic & F2F/Skype Description: Responsibilities: SME in APIGEE (OPDK and/or Hybrid) operations, implementation, troubleshooting, monitoring, and management. Analyzes incidents and coordinates with application owners to help troubleshoot and fix issues related to the APIGEE platform. Performs platform upgrades, base image upgrades, certificate updates, scaling, and other maintenance activities on a r
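Incident triage like the above usually starts by checking whether errors are platform-wide or isolated to one API proxy; as a sketch, assuming hypothetical access-log lines (real triage would query Splunk or Elasticsearch rather than parse raw text), counting 5xx responses per proxy looks like this:

```python
import re
from collections import Counter

# Hypothetical API-proxy access-log lines, for illustration only.
log_lines = [
    "2024-05-01T10:00:01Z proxy=orders-v1 status=200 latency_ms=42",
    "2024-05-01T10:00:02Z proxy=orders-v1 status=503 latency_ms=9",
    "2024-05-01T10:00:03Z proxy=payments-v1 status=503 latency_ms=11",
    "2024-05-01T10:00:04Z proxy=orders-v1 status=503 latency_ms=8",
]

pattern = re.compile(r"proxy=(\S+) status=(\d{3})")

# Count 5xx responses per proxy: a spike on every proxy suggests a
# platform-side fault; a spike on one proxy points at that application.
errors = Counter(
    m.group(1)
    for line in log_lines
    if (m := pattern.search(line)) and m.group(2).startswith("5")
)
```

The same grouping is what a Splunk `stats count by proxy` or an Elasticsearch terms aggregation would produce against the live logs.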
We are looking to hire a Sr. DevOps Lead / Architect for Dallas, TX / Pittsburgh, PA Required Skill Set: Experience: 10-15 years as a DevOps Lead in an IC role. Languages: Python, shell scripts, SQL, PySpark, Java (optional), React or AngularJS. Databases: HDFS, Oracle, Teradata, big data, Parquet files, ELK, Graph DB. OS: Linux, Unix, Windows - working on secured VM setup and clusters. Tools: OpenShift Container Platform, Jenkins, Kubernetes, Bitbucket, PuTTY, SecureCRT, ETL tools like Teradata, Info
2 round interview process. The first will be virtual and the second will be in person with a coding assessment. What you'll do: Developer working on maintenance of product applications. Collaborates with leaders, business analysts, project managers, IT architects, technical leads, and other developers, along with internal customers, to understand requirements and develop according to business needs. Maintains and enhances existing enterprise services, applications, and platforms using domain
Location: Richardson, TX Description: Transport Engineer | Richardson, TX (Fully Onsite) | 12+ Months Project Description: 5x per week in office As a member of the Systems and Maintenance Engineering (S&ME) Intelligent Edge Network Lab Governance Team, you'll be responsible for creating and executing support strategies for S&ME Lab resources. Primary responsibilities will include: Implement lab test bed setup based on requirements from IT, Planning, and Maintenance Engineering teams Ensuring