Hope you are doing well!
Here is our implementing partner's requirement. Please go through the
requirement below and send us suitable consultants with their updated
resumes, rates, and contact details.
*Role:* Hadoop Admin with 3 years' experience with Hortonworks
*Location:* Cranston, RI
*Work Duration:* 6+ months
*Years of Experience:* 10+
*Note:* We need a photo ID and a visa copy (H1B).
*Job Description: Required Skills*
· Manage scalable Hadoop virtual and physical cluster environments.
· Manage the backup and disaster recovery for Hadoop data.
· Optimize and tune the Hadoop environments to meet performance requirements.
· Install and configure monitoring tools for all the critical Hadoop
systems and services.
· Work in tandem with big data developers to design use-case-specific,
scalable, supportable infrastructure.
· Performance analysis and debugging of slow-running development and
production jobs.
· Solid technical understanding of services such as HBase, Kafka, Spark,
Hive, HDFS, YARN, and Ambari.
· Work with Linux server admin team in administering the server hardware
and operating system.
· Assist with development and maintenance of the system documentation.
· Create and publish various production metrics including system
performance and reliability information to systems owners and management.
· Coordinate root cause analysis (RCA) efforts to minimize future system
outages.
· Isilon based Hadoop experience is a plus.
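
The responsibilities above center on day-to-day cluster health monitoring. As a
reference sketch (these require a running Hadoop cluster, and the NameNode
service ID `nn1` is a sample value, not from the posting), the routine checks
map to a handful of standard Apache Hadoop CLI calls:

```shell
# Sketch of routine cluster health checks; not runnable stand-alone.
hdfs dfsadmin -report               # DataNode capacity, remaining space, dead nodes
hdfs fsck / -files -blocks          # missing, corrupt, or under-replicated blocks
yarn node -list -all                # NodeManager states (RUNNING, LOST, UNHEALTHY)
hdfs haadmin -getServiceState nn1   # NameNode HA state (active/standby)
```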
1) At least 3 years' experience installing and configuring Hortonworks Hadoop
clusters.
2) Backup and recovery of HDFS (the Java-based distributed file system).
3) MySQL databases used by the cluster.
4) Configure and maintain HA of HDFS, the YARN (Yet Another Resource
Negotiator) ResourceManager, MapReduce, Hive, HBase, Kafka, and Spark.
5) Experience setting up Kerberos and administering a Kerberized cluster.
6) Ranger service configuration and policy setup (authorization).
7) Knox service configuration (reverse proxy, policy enforcement for Hadoop
REST APIs).
8) Strong understanding of and experience using ODBC/JDBC with various
clients like Tableau and MicroStrategy and server components like Hive/Spark.
9) Monitor and tune cluster component performance.
10) Very strong skill in using various Linux commands for day-to-day
operations and the ability to analyze and create shell scripts.
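
As an illustration of the day-to-day shell scripting item 10 calls for (a
hypothetical sketch, not part of the requirement; the log format
"timestamp service LEVEL message" and the function name `error_summary` are
assumptions), a one-function script can summarize error counts per service:

```shell
# Hypothetical sketch: count ERROR lines per service in a log file whose
# lines look like "<timestamp> <service> <LEVEL> <message...>".
error_summary() {
  awk '$3 == "ERROR" { count[$2]++ }
       END { for (s in count) printf "%s %d\n", s, count[s] }' "$1" | sort
}
```

For example, `error_summary /var/log/hadoop/hadoop.log` prints one
"service count" line per service that logged an error.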
· Demonstrated experience working with vendors and user communities to
research and test new technologies that enhance the technical capabilities
of the existing Hadoop cluster.
· Demonstrated experience working with Hadoop architects and big data
users to implement new Hadoop ecosystem technologies that support business
needs.
· Ability and desire to “think out of the box” to not only meet
requirements but exceed them whenever possible.
· Ability to multi-task in a fast-paced environment and complete assigned
tasks on time.
· Ability to effectively interact with team members using strong verbal and
written communication skills.
· Self-motivated and enthusiastic when working on difficult problems.
Nityo Infotech Corp.
666 Plainsboro Road,
Plainsboro, NJ 08536
*Santosh Kumar *
Desk: 609-853-0818, Ext. 2170
Fax: 609-799-5746