Hadoop Jobs

1 - 20 of 836 Jobs

Data Engineer Senior - Capital Markets Technology (MongoDB, Hadoop, Python, Tableau)

PNC Financial Services

Pittsburgh, Pennsylvania, USA

Full-time

Job Profile Position Overview: At PNC, our people are our greatest differentiator and competitive advantage in the markets we serve. We are all united in delivering the best experience for our customers. We work together each day to foster an inclusive workplace culture where all of our employees feel respected, valued and have an opportunity to contribute to the company's success. As a Data Engineer Senior within PNC's Capital Markets Technology organization, you will be based in Pittsburgh, PA

Java Developer with Hive, Spark, Hadoop and Google Cloud Platform

Spiceorb

Sunnyvale, California, USA

Contract, Third Party

Spiceorb is hiring a Java Developer with Hive, Spark, Hadoop and Google Cloud Platform for an immediate start, working 2-3 days hybrid onsite in Sunnyvale, CA. All of the listed skills are a must; we are looking for a candidate with a combination of Java and data skills.

Hadoop Java Developer

Collabera LLC

Charlotte, North Carolina, USA

Contract

Must Haves: 5+ years of Hadoop development experience; 3+ years of Java application programming experience; experience in multiple database platforms: SQL Server, Oracle, Exadata, Big Data

Hadoop/Exadata Developer

Oloop Technology Solutions

Kennesaw, Georgia, USA

Contract

Job Title: Hadoop/Exadata Developer. Location: Chicago, IL; Kennesaw, GA; Richmond, VA. Job Duration: 12 months CTH. The Software Developer will support Internal Employee Compensation Projects within the Collibra application. Must Have: Oracle Exadata/Hadoop; Unix shell scripts; data quality tools (any); familiarity with writing rules, rule migration, etc., for Collibra DQ; familiarity with data movement (ETL/ELT); PL/SQL, T-SQL; hands-on REST API experience. Good to Have: Collibra DQ tools; Data Warehouse/ETL.

Data Engineer with Hadoop, Spark/Scala/Java

Xoriant Corporation

San Francisco, California, USA

Contract

Data Engineer. San Francisco, CA (hybrid). Long-term contract. We have a new demand for a data engineer in San Francisco who needs to work out of the office 3 days a week and should do an in-person interview. The core skill needed is Apache Spark and a programming language (preferably Scala with Java, or else Python); skill level expected: excellence. The secondary skill needed is some experience with distributed computing, Big Data, Oozie, Postgres, etc.; skill level expected: good.
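To gauge the core Spark-plus-JVM-language skill this posting asks for, a minimal Spark batch job written with the Java API might look like the sketch below; the input path, column names, and output location are illustrative assumptions rather than details from the role.

```java
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Minimal Spark batch job: read a CSV, aggregate, and write Parquet.
// Paths and column names are hypothetical.
public class TradeAggregation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("TradeAggregation")
                .getOrCreate();

        Dataset<Row> trades = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("hdfs:///data/raw/trades.csv");

        // Total traded quantity per symbol.
        Dataset<Row> bySymbol = trades
                .groupBy(col("symbol"))
                .agg(sum(col("quantity")).alias("total_quantity"));

        bySymbol.write()
                .mode("overwrite")
                .parquet("hdfs:///data/curated/trades_by_symbol");

        spark.stop();
    }
}
```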

BI Hadoop Developer

Avance Consulting

Santa Clara, California, USA

Full-time

Job Title: BI Hadoop Developer. Location: Santa Clara, CA. Description: We need a Hadoop developer with a focus on BI on Hadoop, who should be able to demonstrate work experience in Hive/Impala and Spark. Strong hands-on Spark skills are a must, along with working knowledge of Python, shell scripting and Bash. Must have work experience with Hadoop data warehouse/data lake implementations. Candidates will be required to provide references and will be asked to demonstrate coding.
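As a rough illustration of the "BI on Hadoop" work described here, a Hive-warehouse query run through Spark might look like the following sketch; the database, table, and column names are invented for the example.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Run an aggregation over a Hive-managed table from Spark.
// Database, table, and column names are hypothetical.
public class DailySalesReport {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("DailySalesReport")
                .enableHiveSupport()   // resolve tables through the Hive metastore
                .getOrCreate();

        Dataset<Row> report = spark.sql(
                "SELECT sale_date, SUM(amount) AS total_amount " +
                "FROM sales.daily_transactions " +
                "GROUP BY sale_date " +
                "ORDER BY sale_date");

        report.show(20, false);
        spark.stop();
    }
}
```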

Hadoop Admin

Guardians Infotech, LLC

Austin, Texas, USA

Third Party, Contract

A 10-year IT profile: a very strong Hadoop Admin who has worked for good clients, with a minimum of 7 to 8 years of experience as a Hadoop Administrator.

Google Cloud Platform Developer with Hadoop / Hive

Avance Consulting

Dearborn, Michigan, USA

Full-time

Google Cloud Platform Developer with Hadoop/Hive. Location: Dearborn, MI (FTE). Years of experience: 5 to 7. Job Description: Develop a comprehensive migration strategy for our data infrastructure and workloads to Google Cloud Platform. Handle complex migrations from legacy Teradata warehousing solutions or on-prem Hadoop/Hive to BigQuery in Google Cloud Platform. Implement cloud-based solutions based on various Google Cloud Platform services and technologies such as BigQuery, Dataflow, Dataproc
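One plausible shape of a single Hive-to-BigQuery migration step, assuming the job runs on Dataproc with the standard spark-bigquery connector on the classpath, is sketched below; the project, dataset, table, and staging-bucket names are placeholders, not details from the posting.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Copy one legacy Hive table into BigQuery via the spark-bigquery connector.
// All resource names below are placeholders.
public class HiveToBigQuery {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("HiveToBigQuery")
                .enableHiveSupport()   // read the legacy table through the Hive metastore
                .getOrCreate();

        Dataset<Row> orders = spark.table("warehouse.orders");

        orders.write()
                .format("bigquery")                          // provided by the spark-bigquery connector
                .option("table", "my-project.analytics.orders")
                .option("temporaryGcsBucket", "my-staging-bucket")
                .mode("overwrite")
                .save();

        spark.stop();
    }
}
```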

Hadoop Administration

TalentLynk Technologies

Seattle, Washington, USA

Contract, Third Party

Supporting an HDInsight-adopted customer as DRI support, helping the customer provide resolutions to end customers by resolving ICMs. Responsibilities include: troubleshooting production issues, identifying the root cause, and providing mitigation; monitoring cluster health for top customers and providing support; providing infrastructure management, live-site engineering and network management; and supporting operations and incident management, product customer technical support, remote monitoring, and service

Hadoop Developer | Newark DE, Atlanta GA, Plano TX, Pennington NJ, Charlotte NC

Sarian Solutions

Charlotte, North Carolina, USA

Full-time

Hadoop Developer. Location: Newark DE, Atlanta GA, Plano TX, Pennington NJ, Charlotte NC. Full-time. Skill: Hadoop Developer, experienced in Hadoop and Unix shell scripting. 7+ years of experience in data warehousing architectural approaches and a minimum of 3 years of experience in Big Data (Cloudera). Exposure to and strong working knowledge of distributed systems. Excellent understanding of client service models and customer orientation in service delivery. Ability to grasp the 'big picture' of a solution

Hadoop Developer

Sarian Solutions

Charlotte, North Carolina, USA

Full-time

Technical/Functional Skills: Good understanding of Hadoop concepts, including the file system and MapReduce. Hands-on experience with the Spark framework, Unix scripting, Hive queries, and writing UDFs in Hive; theoretical knowledge and POCs alone will not suffice. Good knowledge of the Software Development Life Cycle and the Project Development Lifecycle. The associate should be able to work independently and should have strong debugging skills in both Hive and Spark. The associate should have experience developing large scale
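For the "writing UDFs in Hive" requirement, a minimal Hive UDF in Java could look like the sketch below; the package, class, function, and JAR names are illustrative.

```java
package com.example.hive;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Simple Hive UDF that upper-cases a string column.
// Registered in Hive (names are hypothetical) with:
//   ADD JAR my-udfs.jar;
//   CREATE TEMPORARY FUNCTION to_upper AS 'com.example.hive.ToUpperUDF';
//   SELECT to_upper(customer_name) FROM customers;
public class ToUpperUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;   // pass NULLs through unchanged
        }
        return new Text(input.toString().toUpperCase());
    }
}
```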

Hadoop Admin

Infosys Technologies Ltd

Charlotte, North Carolina, USA

Full-time

Infosys is seeking a Hadoop/MapR Administrator. In this role, you will enable digital transformation for our clients in a global delivery model, research technologies independently, recommend appropriate solutions, and contribute to technology-specific best practices and standards. You will be responsible for interfacing with key stakeholders and applying your technical proficiency across different stages of the Software Development Life Cycle. You will be part of a learning culture, where teamwork

Hadoop Java Feature Lead

Bank Of America

Plano, Texas, USA

Full-time

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds

Hadoop Administrator

4A IT Services LLC

St. Louis, Missouri, USA

Full-time

Job Title: Hadoop Administrator. Location: St. Louis, MO. Role Type: Contract. Contract Length: 36 Months. How to Apply: Please send your resume and contact information to Recruiters at 4aitservices dot com. Job Description/Responsibilities: As a Hadoop Administrator, you will be responsible for the design, implementation, maintenance, and support of our Hadoop infrastructure, with a focus on AWS (Amazon Web Services) and EMR (Elastic MapReduce). You will collaborate closely with cross-functional teams

Informatica with Hadoop @ Charlotte NC, Dallas TX

Okaya Inc

Dallas, Texas, USA

Full-time

Position: Informatica with Hadoop. Location: Charlotte NC, Dallas TX. Role Type: Full-Time Employment. 10+ years of IT experience. At least 6+ years of experience in ETL with the Informatica PowerCenter tool. Should be strong in data warehousing concepts. Should have good experience with Oracle (database). Should have experience with Autosys. Good experience in Unix scripting. Should have Agile working experience. Banking domain experience is good to have.

Hadoop Support Lead - Local to Cleveland or Pittsburgh

CGI

Pittsburgh, Pennsylvania, USA

Full-time

Hadoop Support Lead - Local to Cleveland or Pittsburgh. Position Description: CGI is looking for an experienced Lead Hadoop Support Engineer to join our team. The candidate will serve as a Lead Engineer in our Data Management department and should possess deep knowledge of the Hadoop ecosystem, including Sqoop, Spark, Hive, HDFS, Impala and Kafka. Ideally the candidate has 5+ years of experience administering and supporting CDP, and experience in a production support role leading a large team is required. Position

Google Cloud Platform Developer with Hadoop / Hive

Headway Tek Inc

Dearborn, Michigan, USA

Full-time, Part-time, Contract, Third Party

Job Title: Google Cloud Platform Developer with Hadoop/Hive. Location: Dearborn, MI (FTE), day 1 onsite. FTE: 110-120K/annum. Years of experience: 5 to 7. Job Description: Develop a comprehensive migration strategy for our data infrastructure and workloads to Google Cloud Platform. Handle complex migrations from legacy Teradata warehousing solutions or on-prem Hadoop/Hive to BigQuery in Google Cloud Platform. Implement cloud-based solutions based on various Google Cloud Platform services

BigData Hadoop Engineer

LTIMindtree

Scottsdale, Arizona, USA

Full-time

About Us: LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 700 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 9

Senior Hadoop Developer

SSI People

Cleveland, Ohio, USA

Contract

Responsibilities: Support ongoing application enhancements, new deployments, and development of new functions, capabilities, and enhancements to the existing code. Required Skills: Experience managing Kafka-based dependent Java libraries. Experience with Python as used for all data engineering tasks like ingest, ETL, and aggregation. Experience with Scala, an object-oriented programming language, for data processing. Familiarity with the Hadoop technology stack and its associated utilities: Hive, PySpark
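For the Kafka piece of this role, a minimal Java producer is sketched below; the broker address, topic name, and payload are made-up examples.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Minimal Kafka producer; broker, topic, and payload are hypothetical.
public class IngestEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("ingest-events", "order-123", "{\"amount\": 42.0}"));
            producer.flush();   // ensure the record is sent before the producer closes
        }
    }
}
```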

Hadoop Developer with Java - W2 - Hybrid

Saksoft

Charlotte, North Carolina, USA

Contract

Hadoop Developer with Java. Charlotte, NC; 3 days onsite / 2 remote; 12+ months contract. Requirements: 60-70% Hadoop, 30-40% Java. Hadoop: setting up and understanding the data. Java experience (1-2 years minimum): Spring, Hibernate, Spring Boot, JavaScript, Ajax, CSS, HTML. Reporting experience: Tableau, Power BI, MicroStrategy, etc. Top Skills: 5+ years of hands-on software development experience. Experience building web applications using J2EE technologies and frameworks like Spring. Experience building RESTful and
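For the Spring Boot and REST portion of this role, a minimal endpoint might look like the sketch below; the application name, route, and response body are illustrative, and the Hadoop-backed data access is only hinted at in a comment.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Minimal Spring Boot REST endpoint; names and payload are hypothetical.
@SpringBootApplication
@RestController
public class ReportServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(ReportServiceApplication.class, args);
    }

    @GetMapping("/reports/{id}")
    public String getReport(@PathVariable String id) {
        // In a real service this would query a Hive/Hadoop-backed store
        // and feed a reporting front end such as Tableau or Power BI.
        return "{\"id\": \"" + id + "\", \"status\": \"ready\"}";
    }
}
```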