Hi,

 

We have an opening for a “Hadoop Developer” with our client at Overland 
Park, Kansas.

 

Contract: 6 months

 

If you’re interested, please send me your updated resume as soon as possible.

 

2+ years’ experience working with the Big Data ecosystem, including tools 
such as Hadoop, MapReduce, YARN, Hive/Impala/Presto, Pig, Spark, Kafka, 
Sqoop, Hue, and Storm, is preferred.

•     Knowledgeable in techniques for designing Hadoop-based file layouts 
optimized to meet business needs

•     Understands the tradeoffs between different approaches to Hadoop file 
design

•     Experience with performance-optimization techniques for both data 
loading and data retrieval

•     Experience with NoSQL databases: HBase, Apache Phoenix, Apache 
Cassandra, Vertica, or MongoDB

•     Able to translate business requirements into logical and physical 
file structure designs

•     Ability to build and test solutions in a rapid and iterative manner

•     Ability to articulate the reasons behind design choices

•     Designing, building, installing, configuring, and supporting Hadoop

•     Translate complex functional and technical requirements into detailed 
design.

•     Perform analysis of vast data stores and uncover insights.

•     Maintain security and data privacy.

•     Good knowledge of data warehousing (DW) and business intelligence (BI)

•     Help build new Hadoop clusters or manage existing ones

•     Ability to write MapReduce jobs

 

Strong technical expertise in most of the following:

 

Hadoop (Hortonworks distribution)

Talend for Big Data

Apache Hive

Apache Phoenix

Apache Spark

Apache HBase

Apache Sqoop

Hue

Kafka

SQL and NoSQL data stores

Linux

•     Strong communication skills, both written and oral

•     Excellent teamwork and interpersonal skills

•     Potential and ability to lead small engagements or work streams 
within large engagements

•     Aptitude for troubleshooting and problem-solving

•     Strong technical skills, including an understanding of software 
development principles

•     Hands-on programming experience

 

•     Hadoop development and implementation.

•     Loading from disparate data sets.

•     Pre-processing using Hive and Pig.


Thanks and Regards,

Abdullah K. – Senior Technical Recruiter

Aptino, Inc.

+1 (817) 440-4150

www.aptino.com


-- 
You received this message because you are subscribed to the Google Groups 
"RESOURCE OPTIONS, INC." group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/resourceoptions.
For more options, visit https://groups.google.com/d/optout.
