Hi,

Please look up the position below and, if you feel it is a good fit, send me your updated resume.
Role: Hadoop Developer
Location: Iowa City, IA
Duration: 4+ months
Interview: Phone / Skype

Must have 8+ years' total IT development experience, including 5+ years using Hadoop.

Job Responsibilities:
· Document detailed system specifications; design, develop, and test using development tools and continuous integration processes.
· Handle ad-hoc requests; understand and maintain internally developed applications.
· Lead efforts, oversee work results, provide training, and serve as a technical resource for other developers.

Job Requirements:
· Development experience in Big Data architecture and the Hortonworks Hadoop stack, both HDP and HDF.
· 5+ years of hands-on development experience with Spark, Pig, Python, Storm, Kafka, Hive, HDFS, Java, and Sqoop.
· At least one year of experience with NiFi/MiNiFi, Atlas, and Ranger.
· Assist with and help lead proofs of concept as Big Data technology evolves.
· Ensure solutions adhere to security and data best practices, including Atlas, Ranger, and Kerberos processes.
· Translate, load, and present disparate data sets in multiple formats/sources, including JSON.
· Exposure to ETL and data warehousing.
· Experience and detailed knowledge of Hadoop development using DevOps tools such as Jenkins and Git.
· Hadoop Certified Developer (Hortonworks preferred).
· Advanced concepts, including durable channels, fault tolerance, and interceptors.
· Advanced techniques, including block size configuration, in-memory column families, and compression.
· Optimization, including indexing, partitioning, compression, and serialization.
· Streaming API for development.
· Design, development, and execution of complex, scalable, fault-tolerant workflows.
· Application failure analysis and troubleshooting.
· Advanced design techniques, including graceful failure and recovery, parameterization, parallel action execution (parallelization), and decision control.
Optional skills:
· Exposure to MongoDB.
· Understanding of, and prior work experience in, the education industry.
· Ability to configure Cloudbreak and Ambari Blueprints to automate the scaling and deployment of the HDFS components of the Hortonworks platform.
· Experience with Docker and AWS required.

Neha Saral
VSG Business Solutions
221 Cornwell Dr, Bear, DE
n...@vsgbusinesssolutions.com
Phone: 302-261-3207 Ext: 107
GTalk: neha....@gmail.com

--
You received this message because you are subscribed to the Google Groups "Android Developers" group.