Dear Candidate,

We have an urgent opening for the position described below. Please go through the job description and let me know if you are comfortable with it, and also send me your consultant's updated resume.
Title: Sr. Hadoop Design, Implementation and Support Admin (some architectural experience needed)
Location: Bowie, MD
Duration: Long-term contract
Note: H1B candidates must provide a visa copy

We have immediate openings for technical professionals in the Big Data arena with our direct client in the DC / Maryland area. We develop cutting-edge software solutions that are helping to revolutionize the informatics industry. We are seeking technical and business professionals with advanced leadership skills to join our tight-knit team at our headquarters in Maryland. This is an opportunity to work with fellow best-in-class IT professionals to deploy new business solutions using the latest Big Data technologies, including a wide array of open-source tools. This position requires extensive experience on the Hadoop platform using Sqoop, Pig, Hive, and Flume to design, build, and support highly scalable data processing pipelines.

Hadoop Administrator Responsibilities:
· Work with data architects to plan and deploy new Hadoop environments and expand existing Hadoop clusters. Design Big Data solutions capable of supporting and processing large sets of structured, semi-structured, and unstructured data.
· Provide administration, management, and support for large-scale Big Data platforms on the Hadoop ecosystem, including cluster capacity planning, maintenance, performance tuning, and troubleshooting.
· Install, configure, support, and manage Hadoop clusters using Apache, Cloudera (CDH3, CDH4), and YARN distributions. Install and configure Hadoop ecosystem components such as Sqoop, Pig, Hive, Flume, and HBase, along with the Hadoop daemons on the cluster.
· Monitor and follow proper backup and recovery strategies for high availability.
· Configure property files such as core-site.xml, hdfs-site.xml, and mapred-site.xml (a brief code sketch follows the Required Skills list below).
· Monitor multiple Hadoop cluster environments using Ganglia and Nagios, and monitor workload, job performance, and capacity using Cloudera Manager.
· Define and schedule all Hadoop/Hive/Sqoop/HBase jobs.
· Import and export data between web servers and HDFS using various tools.

Required Skills:
· Extensive experience in business intelligence, data warehousing, analytics, and Big Data.
· Experience with hardware architectural guidance, planning and estimating cluster capacity, and creating roadmaps for Hadoop cluster deployment.
· Expertise in the design, installation, configuration, and administration of Hadoop ecosystem components such as Sqoop, Pig, Hive, Flume, and HBase, along with the Hadoop daemons on the cluster.
· Working knowledge of capacity planning, performance tuning, and optimization of the Hadoop environment.
· Experience with HDFS data storage and supporting MapReduce jobs.
· Experience commissioning, decommissioning, balancing, and managing nodes on Hadoop clusters.
· Experience with Hadoop cluster capacity planning, maintenance, performance tuning, and troubleshooting. Good understanding of partitioning concepts and the file formats supported in Hive and Pig.
· Experience importing and exporting data with Sqoop between HDFS and relational database systems or mainframes.
· Hands-on experience with data analytics tools such as Splunk, Cognos, and Tableau.
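As a quick illustration of the configuration work mentioned above (core-site.xml, hdfs-site.xml), the minimal Java sketch below reads those settings through Hadoop's client API and lists the HDFS root directory as a connectivity check. The property keys are standard Hadoop keys; the classpath location and the fallback replication value of "3" are illustrative assumptions, not requirements of the role.

```java
// Minimal illustrative sketch: reading cluster settings that an admin would
// maintain in core-site.xml / hdfs-site.xml, then listing the HDFS root as a
// basic connectivity check. Property keys are standard Hadoop keys; the
// fallback value "3" is only an illustrative default.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConfigCheck {
    public static void main(String[] args) throws IOException {
        // Picks up the *-site.xml files from the classpath
        // (typically /etc/hadoop/conf on a CDH node).
        Configuration conf = new Configuration();

        // fs.defaultFS (core-site.xml) names the NameNode endpoint;
        // dfs.replication (hdfs-site.xml) is the default block replication.
        System.out.println("fs.defaultFS    = " + conf.get("fs.defaultFS"));
        System.out.println("dfs.replication = " + conf.get("dfs.replication", "3"));

        // List the HDFS root directory as a simple health/connectivity check.
        try (FileSystem fs = FileSystem.get(conf)) {
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
        }
    }
}
```

The same check can be run from the shell with `hdfs dfs -ls /`; the Java form is shown because it is how jobs and monitoring code typically consume the property files an administrator maintains.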
Rohit Bhasin | Lead Recruiter | Apetan Consulting LLC
Tel: 201-620-9700, Ext. 121 | Fax: 201-526-6869
Mail: 72 Van Reipen Ave, PMB #255, Jersey City, NJ 07306
Corp. Office: 15 Union Avenue, Office #6, Rutherford, New Jersey 07070
[email protected] | www.apetan.com
Facebook: http://www.facebook.com/Apetanconsulting | LinkedIn: http://www.linkedin.com/company/apetan-consulting-llc?trk=top_nav_home | Twitter: http://twitter.com/ApetanLLC
"Forget all the reasons why it won't work and believe the one reason it will work."
