Hadoop Developer | ConnectingBridge.com

Hadoop Developer

Job Summary

 

Primary Responsibilities:

 

Hands-on Development in Hadoop

Design and develop data flows using Hive, Python, Spark, and Scala

Translate complex functional and technical requirements into detailed design

Perform analysis of large data sets (~1TB) and uncover insights

Leverage and propose best practices and standards

Work with service delivery (support) team on transition and stabilization

Work in an AWS environment; Elastic MapReduce (EMR) is the preferred flavor of Hadoop and is expected to be used

Understand the design and implementation of Hadoop solutions

Provide Hadoop architectural oversight

Utilize the Cloudera Hadoop Distribution scheduler, the Bamboo CI server, and the Control-M workload automation job scheduler

Able to work in a collaborative Agile environment, utilizing Jira and Confluence

Required Qualifications:

 

5-7 years of Java and Hadoop development experience

Proficient in AWS Big Data Solutions

Hands-on experience with AWS cloud services (e.g., EC2, S3, EBS, RDS, and VPC)

3+ years of experience with Scala, Python, Spark, and HBase

Preferred Qualifications:

 

Experience implementing Hadoop with Mainframe Systems

DB2 experience (helpful)

Financial background with an understanding of basic financial and accounting processes

Strong teamwork and communication skills

Must be a team player and able to operate under direction when needed

AWS Big Data Certification

Job Type: Contract

 

Experience:

 

Hadoop: 5 years (Required)

Spark: 3 years (Required)

Scala: 3 years (Required)

Location: Indianapolis
Region: Indiana
Employment type: Corp to Corp
Years of experience: 3 - 5 years
Required languages: English
Required degree level: Bachelor's degree