*Please send replies/profiles to mohan.ga...@somasoftsol.com*
*-All our requirements are direct client requirements.*
*-Please share resumes only after you discuss with your consultant.*
*-Resumes without a LinkedIn ID & 2 professional references will not be
considered.*

*Title: Hadoop Admin*

*Location: Santa Clara, CA*

*Duration: Long Term*

*Position Description:*

Our client, the leading provider of online marketing software and services
to the restaurant industry, is seeking an experienced Hadoop Infrastructure
Administrator to join their outstanding software development team in Santa
Clara, CA.

The person will be responsible for the implementation and ongoing
maintenance of the Hadoop big data infrastructure.

The candidate will work within a team and will interact daily with
Engineers, Data Analysts, BI Engineers, and the Services team.

The Hadoop Administrator will need strong analytical and organizational
skills and strict attention to detail.

The successful candidate will adhere to established standards and provide
input to develop new standards as needed.

The Hadoop Administrator must be able to work independently with minimal
supervision, have strong communication skills, be self-driven, and be an
effective team player.



*Job Responsibilities:*

Responsible for implementation and ongoing administration of Hadoop
infrastructure.

Cluster maintenance, including administering, monitoring, tuning, and
troubleshooting.

Design, implement, and maintain security; perform data capacity and node
forecasting and planning.

Providing hardware architectural guidance, planning and estimating cluster
capacity, and creating roadmaps for Hadoop cluster deployment (a
capacity-estimation sketch follows this list).

Working closely with the engineering, infrastructure, network, database,
and business intelligence teams to ensure availability.
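The capacity forecasting mentioned above usually reduces to simple
arithmetic over ingest rate, replication factor, retention window, and
headroom. The sketch below is a hypothetical Python illustration only; the
ingest rate, retention period, and per-node disk size are made-up
assumptions, not figures or a method from this posting.

```python
# Hypothetical back-of-the-envelope HDFS capacity estimate.
# All inputs below are illustrative assumptions, not figures from this posting.
import math


def raw_storage_tb(daily_ingest_tb: float,
                   retention_days: int,
                   replication_factor: int = 3,
                   headroom_fraction: float = 0.25) -> float:
    """Raw disk (TB) needed for the retained, replicated data set,
    keeping `headroom_fraction` of capacity free for temp/shuffle space."""
    logical_tb = daily_ingest_tb * retention_days
    replicated_tb = logical_tb * replication_factor
    return replicated_tb / (1.0 - headroom_fraction)


def nodes_needed(total_raw_tb: float, disk_tb_per_node: float = 48.0) -> int:
    """Round up to whole data nodes for a given per-node disk capacity."""
    return math.ceil(total_raw_tb / disk_tb_per_node)


if __name__ == "__main__":
    raw = raw_storage_tb(daily_ingest_tb=0.5, retention_days=365)
    print(f"Raw storage needed: {raw:.1f} TB")       # ~730 TB with these assumptions
    print(f"Data nodes needed:  {nodes_needed(raw)}")  # 16 nodes at 48 TB/node
```

With the example inputs (0.5 TB/day ingest, 365-day retention, 3x
replication, 25% headroom, 48 TB disk per node) the sketch reports roughly
730 TB of raw disk and 16 data nodes; real forecasts would also account for
compression and growth.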



*Candidate Profile:*

BS in Computer Science or a related area; 7-10 years of system
administration, networking, and virtualization experience.

3+ years of experience in Hadoop and NoSQL infrastructure administration,
preferably with the MapR distribution.

Manage extracting, loading, and transforming data into and out of Hadoop,
primarily using Hive, Sqoop, and distcp (a sketch of this workflow follows
this list).

Proficiency with shell scripts and cluster monitoring tools like Ganglia.

Experience with deployment tools such as Puppet, Chef, and Ansible.

Familiarity with YARN, Hive, Pig, Sqoop, HBase, Spark, and Kafka.

On-call production support experience.

Proficiency with agile or lean development practices

Excellent technical and organizational skills

Excellent written and verbal communication skills

Work independently with minimal supervision
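As a hedged illustration of the Hive/Sqoop/distcp data movement mentioned
in the profile above, here is a minimal Python sketch that shells out to
the stock command-line tools. The connection string, credentials file,
table names, and HDFS paths are hypothetical placeholders, not details from
this posting.

```python
# Illustrative only: wraps the stock sqoop / hadoop / hive CLIs with subprocess.
# The connection string, credentials file, table names, and HDFS paths are
# made-up placeholders, not details from this posting.
import subprocess


def run(cmd: list[str]) -> None:
    """Echo a command, run it, and fail loudly on a non-zero exit code."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


# 1. Extract a relational table into HDFS with Sqoop.
run([
    "sqoop", "import",
    "--connect", "jdbc:mysql://dbhost.example.com/sales",
    "--username", "etl_user",
    "--password-file", "/user/etl/.db_password",
    "--table", "orders",
    "--target-dir", "/data/raw/orders",
    "--num-mappers", "4",
])

# 2. Copy the raw files to a second cluster with DistCp.
run([
    "hadoop", "distcp",
    "hdfs://prod-nn:8020/data/raw/orders",
    "hdfs://dr-nn:8020/data/raw/orders",
])

# 3. Load the files into a Hive staging table.
run([
    "hive", "-e",
    "LOAD DATA INPATH '/data/raw/orders' INTO TABLE staging.orders;",
])
```

The same three steps are just as commonly run from a plain shell script or
a scheduler; the wrapper only adds command echoing and fail-fast error
handling.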



Top 4 skill sets / technologies for the ideal candidate:

1. Cloud Infrastructure Management, SaaS

2. Hadoop Infrastructure administration

3. Experience working in an Agile environment

4. Hadoop/HBase/Hive/Spark/Tableau/Kafka

Technologies that we use include: Java, Hadoop/MapReduce, Flume, Spark,
Kafka, HBase, Drill, MemSQL, Pig, Hive, Talend, Tableau integration, and ETL.

Thanks & Regards,




*Mohan Ganti*

Mail: mohan.ga...@somasoftsol.com


*Disclaimer: We respect your online privacy. This is not an unsolicited
mail. Under Bill 1618 Title III passed by the 105th U.S. Congress, this
mail cannot be considered spam as long as we include contact information
and a method to be removed from our mailing list. If you are not interested
in receiving our emails, please reply with "REMOVE" in the subject line and
mention all email addresses to be removed.*
