Hi All,

Hope you are doing well.

Please go through the requirement and send an updated resume to
*srika...@softhq.com*

*Job Title: Hadoop Administrator*

*Duration: 6+ Months.*

*Location: Providence, Rhode Island.*

*Interview: Phone and Skype.*

*Job Description:*

 Manage scalable Hadoop virtual and physical cluster environments.

 Manage the backup and disaster recovery for Hadoop data.

 Optimize and tune the Hadoop environments to meet performance
requirements.

 Install and configure monitoring tools for all the critical Hadoop systems
and services.

 Work in tandem with big data developers to design use-case-specific,
scalable, supportable infrastructure.

 Provide responsive support for day-to-day requests from development,
support, and business analyst teams.

 Perform performance analysis and debugging of slow-running development and
production processes.

 Maintain a solid technical understanding of services such as HBase, Kafka,
Spark, Hive, HDFS, YARN, and Ambari.

 Work with Linux server admin team in administering the server hardware and
operating system.

 Assist with developing and maintaining system documentation.

 Create and publish various production metrics, including system performance
and reliability information, to system owners and management.

 Perform ongoing capacity management forecasts including timing and budget
considerations.

 Coordinate root cause analysis (RCA) efforts to minimize future system
issues.

 Mentor, develop and train other systems operations staff members as
needed.

 Provide off-hours support as required.

 Isilon-based Hadoop experience is a plus.

*Technical Qualifications:*

1) 10 years of IT experience, with at least 3 years of experience installing
and configuring Hortonworks Hadoop clusters.

2) Backup and recovery of the HDFS file system (a Java-based distributed
filesystem).

3) MySQL databases used by the cluster.

4) Configure and maintain HA for HDFS, the YARN (Yet Another Resource
Negotiator) ResourceManager, MapReduce, Hive, HBase, Kafka, and Spark.

5) Experience setting up Kerberos and administering a kerberized cluster.

6) Ranger service configuration and policy setup (authorization).

7) Knox service configuration (reverse proxy, policy enforcement, and access
control for Hadoop).

8) Strong understanding of and experience using ODBC/JDBC with various
clients such as Tableau and MicroStrategy, and server components such as
Hive, Spark, and HBase.

9) Diagnose and resolve individual service issues.

10) Monitor and tune cluster component performance.

11) Very strong skills in using Linux commands for day-to-day operations,
and the ability to analyze and create shell scripts.

*General Qualifications:*

 Demonstrated experience working with vendors and user communities to
research and test new technologies that enhance the technical capabilities
of the existing Hadoop cluster.

 Demonstrated experience working with Hadoop architects and big data users
to implement new Hadoop ecosystem technologies to support a multi-tenant
cluster.

 Ability and desire to “think out of the box” to not only meet requirements
but exceed them whenever possible.

 Ability to multi-task in a fast-paced environment and complete assigned
tasks on time.

 Ability to effectively interact with team members using strong verbal and
written communication skills.

 Self-motivated and enthusiastic when working with difficult problems and
tight deadlines.


*Thanks and Regards,*

*Srikanth*

*IT Recruiter*

*Soft HQ INC.*

*Email Id: srika...@softhq.com*

*Desk Num: 858-658-9200 Ext: 622*
