Hi,


I hope you are doing well!



Please review the requirements below and, if you are interested, reply
with your updated resume and contact details as soon as possible to
[email protected]



*Role: Hadoop Developer*



*Location: Charlotte, NC*

*Duration: Long term (12+ months)*

*Start Date: Immediate*

*Rate: DOE / Market*



*Skill set:*



*Data: *

·         RDBMS, OLAP, OLTP concepts

·         Modeling design considerations and optimization

*Development: *

·         File management on UNIX

·         Application performance monitoring and troubleshooting on UNIX

·         SH / KSH

·         Java

·         Python

·         SQL

·         ETL tools (DataStage, Informatica)

*Big Data/Hadoop: *

·         Hadoop Certified Developer (Cloudera preferred; Hortonworks, etc.)

Flume

·         agent design, development, and optimization

·         sourcing from flat files to HDFS files, HBase, and serialized
file formats

·         advanced concepts including durable channels, fault tolerance,
and interceptors

·         access metrics, e.g. via MBeans, for active monitoring

HBase

·         conceptual and physical differences compared to RDBMS

·         design, development, and creation of HBase schemas

·         advanced techniques including blocksize configuration, in-memory
column families, and compression

HDFS

·         file storage concepts and optimization

Hive / Impala

·         design, development, and creation of metastore tables, views

·         optimization including indexing, partitioning, compression, and
serialization

MapReduce

·         application processing and parallelization concepts

·         streaming API for development of non-Java MR applications

·         advanced development techniques using ToolRunner, distributed
cache, logging

·         advanced development techniques using custom partitioners,
combiners, and formats

·         unit testing using MRUnit

·         common algorithms: sort, search, indexing, co-occurrence

Oozie

·         design, development, and execution of complex scalable and
fault-tolerant workflows

·         linking and coordination of HDFS, MR, Java, Sqoop, Hive, SSH, and
other actions and sub-workflows to build application pipelines for data
ingestion and processing

·         application failure analysis and troubleshooting

·         advanced design techniques including graceful failure and
recovery, parameterization, parallel action execution, and decision control

·         coordinator design, development, and execution for automated
workflow execution

·         bundle design, development, and execution for advanced
coordinator control

Sqoop

·         data imports and exports, configuration and optimization thereof

·         design and limitations; incremental imports



Thanks & Regards,



Guru

Work: (732) 917-4981

-- 
You received this message because you are subscribed to the Google Groups 
"American Vendor--IT Consulting" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/sap-vendor.
For more options, visit https://groups.google.com/d/optout.
