Thursday 1 March 2018

Need Sr. Hadoop Developer

We are looking for a Senior Hadoop Developer with experience in Hadoop ecosystem tools such as Hive, Kafka, Spark and Spark Streaming, HDFS, and YARN, along with Python and NoSQL databases.

Experience: 8 years

Location: Charlotte, NC


Positions: 1


Rate: $50/hr, depending on experience.


No OPT. H1B copy and DL required for submission.


Duration: Long Term


Responsibilities:

Codes and performs unit and integration testing across Hadoop ecosystem tools such as Hive, Kafka, Spark and Spark Streaming, HDFS, and YARN, applying Python and NoSQL database knowledge to ensure proper and efficient execution and adherence to business and technical requirements (a minimal sketch of such a test appears after this list)

Codes, tests, and debugs new software or makes enhancements to existing software

Designs and writes programs according to functional and non-functional requirements

Leads code review sessions to validate adherence to development standards

Reviews and provides input into development standards

Develops and maintains technical documentation

Develops and deploys data integration solutions

Collaborates with internal customers, technical and architecture teams to solve complex software problems

Works closely with all technical development teams on optimal utilization of Big Data solutions and Apache Open Source Software.

Provides general system users and management with system analysis and feedback
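
For illustration only (not part of the posting): a minimal sketch of a unit test for a simple Spark DataFrame transformation, using pytest and PySpark. The function name, column names, and threshold are hypothetical examples, not anything specified by the role.

import pytest
from pyspark.sql import SparkSession
from pyspark.sql.functions import col


def filter_large_orders(df, threshold=100.0):
    # Keep only rows whose 'amount' column exceeds the threshold (hypothetical logic)
    return df.filter(col("amount") > threshold)


@pytest.fixture(scope="module")
def spark():
    # Local Spark session for testing; stopped after the module's tests finish
    session = (SparkSession.builder
               .master("local[1]")
               .appName("unit-test-sketch")
               .getOrCreate())
    yield session
    session.stop()


def test_filter_large_orders(spark):
    # Two sample rows; only the one above the threshold should remain
    df = spark.createDataFrame([("a", 50.0), ("b", 150.0)], ["order_id", "amount"])
    result = filter_large_orders(df).collect()
    assert [row.order_id for row in result] == ["b"]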

Requirements:

6+ years of experience with coding, testing, and design

3 years of Hadoop development, with experience processing data from Apache Kafka using Apache Spark Streaming (see the sketch after this list).

2 years of working knowledge of Hive, with experience in the Python programming language.
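
For illustration only: a minimal sketch of the kind of pipeline described above, reading a Kafka topic with Spark Structured Streaming in Python. The broker address, topic name, and payload schema are hypothetical placeholders, and the spark-sql-kafka connector package is assumed to be on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = (SparkSession.builder
         .appName("kafka-streaming-sketch")
         .getOrCreate())

# Assumed JSON payload layout for the example
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read the Kafka topic as a streaming DataFrame
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
       .option("subscribe", "events")                       # placeholder topic
       .load())

# Kafka values arrive as bytes; parse the JSON payload into columns
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Write parsed records to the console sink for demonstration;
# a real job might write to HDFS or a Hive table instead
query = (events.writeStream
         .format("console")
         .outputMode("append")
         .start())

query.awaitTermination()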

Thanks

David

