Hi,

Hope you are doing well!

Here is our implementing partner's requirement. Please go through the
requirement below and send us suitable consultants with their updated
resumes, rates, and contact details.



*Role:* Hadoop Admin with 3 years' Hortonworks experience

*Location:* Cranston, RI

*Work Duration:* 6+ months

*Years of Experience:* 10+

*Note:* We need a photo ID and visa copy (H1B).


*Job Description: Required Skills*
Position Responsibilities:
· Manage scalable Hadoop virtual and physical cluster environments.
· Manage the backup and disaster recovery for Hadoop data (a minimal
sketch follows this list).
· Optimize and tune the Hadoop environments to meet performance
requirements.
· Install and configure monitoring tools for all the critical Hadoop
systems and services.
· Work in tandem with big data developers to design use-case-specific,
scalable, supportable infrastructure.
· Analyze performance and debug slow-running development and production
processes.
· Maintain a solid technical understanding of services such as HBase,
Kafka, Spark, Hive, HDFS, YARN, and Ambari.
· Work with the Linux server admin team in administering the server
hardware and operating system.
· Assist with developing and maintaining system documentation.
· Create and publish various production metrics including system
performance and reliability information to systems owners and management.
· Coordinate root cause analysis (RCA) efforts to minimize future system
issues.
· Isilon-based Hadoop experience is a plus.
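
As a minimal illustration of the backup/DR responsibility above, the
sketch below snapshots an HDFS directory and mirrors it to a
disaster-recovery cluster with DistCp. The source path, DR NameNode
address, and snapshot naming are hypothetical placeholders, not details
from this requirement:

    #!/bin/bash
    # Sketch: snapshot an HDFS directory, then mirror it to a DR cluster.
    # SRC_DIR must be snapshottable; enable that once with:
    #   hdfs dfsadmin -allowSnapshot /data/warehouse
    SRC_DIR=/data/warehouse                    # hypothetical source path
    DR_NN=hdfs://backup-nn.example.com:8020    # hypothetical DR NameNode
    SNAP="backup-$(date +%Y%m%d)"

    # Take a read-consistent snapshot of the source directory.
    hdfs dfs -createSnapshot "$SRC_DIR" "$SNAP" || exit 1

    # Mirror the snapshot to the DR cluster; -update copies only what
    # changed, -p preserves ownership, permissions, and timestamps.
    hadoop distcp -update -p "$SRC_DIR/.snapshot/$SNAP" "$DR_NN$SRC_DIR"

Snapshots keep the DistCp source immutable for the duration of the copy,
which is why the two are commonly paired for backup windows.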
Technical Qualifications:
1) At least 3 years' experience installing and configuring Hortonworks
Hadoop clusters
2) Backup and recovery of the HDFS file system (the Java-based
distributed file system)
3) MySQL databases used by the cluster
4) Configure and maintain HA for HDFS, the YARN (Yet Another Resource
Negotiator) ResourceManager, MapReduce, Hive, HBase, Kafka, and Spark
5) Experience setting up Kerberos and administering a Kerberized cluster
6) Ranger service configuration and policy setup (authorization)
7) Knox service configuration (reverse proxy and policy enforcement for
Hadoop; access control)
8) Strong understanding of and experience with ODBC/JDBC, connecting
clients such as Tableau and MicroStrategy to server components such as
Hive/Spark and HBase (exercised in the sketch after this list)
9) Monitor and tune cluster component performance.
10) Very strong skills with Linux commands for day-to-day operations,
and the ability to analyze and write shell scripts (see the health-check
sketch after this list)
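
As a hedged illustration of items 4, 5, 8, and 10, the shell sketch
below authenticates with a keytab and runs a basic daily health check on
a Kerberized cluster. The keytab path, principals, hostnames, realm, and
the nn1/nn2 service IDs are assumptions based on common Hortonworks
conventions, not details taken from this requirement:

    #!/bin/bash
    # Sketch: daily health check for a Kerberized Hadoop cluster.
    # Authenticate first (hypothetical keytab path and principal).
    kinit -kt /etc/security/keytabs/hdfs.headless.keytab \
        hdfs@EXAMPLE.COM || exit 1

    # HDFS capacity, live/dead DataNodes, under-replicated blocks.
    hdfs dfsadmin -report | head -n 20

    # NameNode HA (item 4): confirm which NameNode is active, assuming
    # the conventional nn1/nn2 service IDs from hdfs-site.xml.
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2

    # YARN: list any NodeManagers reporting as unhealthy.
    yarn node -list -states UNHEALTHY

    # Filesystem consistency summary (corrupt or missing blocks).
    hdfs fsck / | tail -n 10

    # JDBC smoke test against HiveServer2 (item 8); the host and Hive
    # service principal here are hypothetical.
    beeline -u "jdbc:hive2://hive.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
        -e "SHOW DATABASES;"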
General Qualifications:
· Demonstrated experience working with vendors and user communities to
research and test new technologies that enhance the technical
capabilities of the existing Hadoop cluster.
· Demonstrated experience working with Hadoop architects and big data
users to implement new Hadoop ecosystem technologies that support a
multi-tenant cluster.
· Ability and desire to “think outside the box” to not only meet
requirements but exceed them whenever possible.
· Ability to multi-task in a fast-paced environment and complete assigned
tasks on time.
· Ability to effectively interact with team members using strong verbal and
written communication skills.
· Self-motivated and enthusiastic when working with difficult problems and
tight deadlines.

Nityo Infotech Corp.
666 Plainsboro Road, Suite 1285
Plainsboro, NJ 08536

*Santosh Kumar*
*Technical Recruiter*
Desk: 609-853-0818, Ext. 2170
Fax: 609-799-5746
kuntal.sant...@nityo.com
www.nityo.com
