*Please send replies to [email protected]*
*- All our requirements are direct client requirements.*
*- Please share resumes only after you have discussed the role with your consultant.*
*- Only resumes with a LinkedIn URL and two professional references will be processed.*
Hadoop Administrator
Saint Louis, MO
Long Term

*Responsibilities:*
§ Responsible for implementation and ongoing administration of Hadoop infrastructure.
§ Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments.
§ Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users (see the onboarding sketch at the end of this message).
§ Cluster maintenance, including creation and removal of nodes, using tools such as Ganglia, Nagios, Cloudera Manager Enterprise, and Dell OpenManage.
§ Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
§ Screening Hadoop cluster job performance and capacity planning (see the monitoring sketch at the end of this message).
§ Monitoring Hadoop cluster connectivity and security.
§ Managing and reviewing Hadoop log files.
§ File system management and monitoring.
§ HDFS support and maintenance.
§ Teaming diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
§ Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.

*Responsibilities Performed by Hadoop Administrator:*
§ Data modelling, design, and implementation based on recognized standards.
§ Software installation and configuration.
§ Database backup and recovery.
§ Database connectivity and security.
§ Performance monitoring and tuning.
§ Disk space management.
§ Software patches and upgrades.
§ Automation of manual tasks.

*Skills Required:*
§ General operational expertise, including good troubleshooting skills and an understanding of a system's capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
§ Hadoop ecosystem skills such as HBase, Hive, Pig, Mahout, etc.
§ Most essential: the ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure the cluster, and take backups.
§ Good knowledge of Linux, as Hadoop runs on Linux.
§ Familiarity with open-source configuration management and deployment tools such as Puppet or Chef, and with Linux scripting.
§ Knowledge of troubleshooting core Java applications is a plus.

Thanks & Regards,
*Mohan Ganti*
Mail: [email protected]
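For candidates unfamiliar with the user-onboarding duty listed above, here is a minimal sketch of what that flow can look like. It is illustrative only: the username alice, the realm EXAMPLE.COM, and the paths are hypothetical, and it assumes the kadmin and hdfs command-line tools are on the PATH with suitable admin credentials.

```python
#!/usr/bin/env python3
"""Illustrative sketch: onboard a new Hadoop user (hypothetical names/realm)."""
import subprocess

USER = "alice"         # hypothetical new user
REALM = "EXAMPLE.COM"  # hypothetical Kerberos realm

def run(cmd):
    """Run a command, echoing it first; raise on failure."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Create the Linux account on the gateway/edge node (needs root).
run(["useradd", "-m", USER])

# 2. Create a Kerberos principal with a random key (assumes the caller
#    holds kadmin admin credentials).
run(["kadmin", "-q", f"addprinc -randkey {USER}@{REALM}"])

# 3. Create the user's HDFS home directory and hand over ownership
#    (run as the HDFS superuser).
run(["hdfs", "dfs", "-mkdir", "-p", f"/user/{USER}"])
run(["hdfs", "dfs", "-chown", f"{USER}:{USER}", f"/user/{USER}"])

# 4. Smoke test: the new home directory should now be listable.
run(["hdfs", "dfs", "-ls", f"/user/{USER}"])
```

A real onboarding script would also verify Hive and Pig access, for example by running a trivial query as the new user, and would typically be driven by Puppet or Chef rather than run by hand.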
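Likewise, a minimal sketch of the kind of cluster health check the monitoring bullets describe, polling the NameNode's JMX endpoint. The hostname and port are assumptions (9870 is the Hadoop 3.x default web UI port); production monitoring would normally run through Nagios, Ganglia, or Cloudera Manager, as listed above.

```python
#!/usr/bin/env python3
"""Illustrative sketch: poll NameNode JMX for basic HDFS health figures."""
import json
import urllib.request

# Assumed NameNode web address; adjust for your cluster.
NAMENODE = "http://namenode.example.com:9870"

def jmx_bean(qry):
    """Fetch one JMX bean from the NameNode as a dict."""
    url = f"{NAMENODE}/jmx?qry={qry}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)["beans"][0]

fs = jmx_bean("Hadoop:service=NameNode,name=FSNamesystem")
state = jmx_bean("Hadoop:service=NameNode,name=FSNamesystemState")

used_pct = 100.0 * fs["CapacityUsed"] / fs["CapacityTotal"]
print(f"live datanodes         : {state['NumLiveDataNodes']}")
print(f"capacity used          : {used_pct:.1f}%")
print(f"missing blocks         : {fs['MissingBlocks']}")
print(f"under-replicated blocks: {fs['UnderReplicatedBlocks']}")

# Flag obvious problems for the on-call admin.
if fs["MissingBlocks"] > 0 or used_pct > 85.0:
    print("WARNING: cluster needs attention")
```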
