Profiles Needed ASAP!
Big Data DevOps
Location: Warren, NJ
Requirements:
• Hadoop, Hive, HBase, Flume, Spark, Storm, Kafka, Ambari, Nagios, Ganglia, Cloudera, Mahout, Talend, Sqoop, Oozie, Python, Java, Pig
• Bachelor's or Master's degree in Computer Science, Computer
Engineering, or related field
• 6+ years of experience performing DevOps primarily on Hadoop ecosystems using the stack elements above, having owned or maintained two or more distinct running systems in that period
• 6+ years of scripting experience with Python, R, Scala, Pig, Oozie, Java, or similar
• 3+ years of recent experience designing or maintaining secured environments using Kerberos, PKI, ACLs, etc.
• 2+ years of ETL experience with tools like Flume, Sqoop, Talend, or similar
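For context on the ETL line above, here is a minimal sketch in plain Python of the kind of extract-transform-load hand-off this role involves; the file names and fields are hypothetical, and a production job would use Flume, Sqoop, or Talend against live systems rather than local files:

import csv
import json
from pathlib import Path

# Hypothetical paths; a real job would pull from a database via Sqoop
# or a log stream via Flume instead of a local CSV file.
SOURCE = Path("orders.csv")
TARGET = Path("orders.jsonl")

def run_etl(source: Path, target: Path) -> int:
    """Extract rows from CSV, normalize fields, load as JSON lines."""
    count = 0
    with source.open(newline="") as src, target.open("w") as dst:
        for row in csv.DictReader(src):
            # Transform step: trim whitespace and lowercase the keys.
            clean = {k.strip().lower(): v.strip() for k, v in row.items()}
            dst.write(json.dumps(clean) + "\n")
            count += 1
    return count

if __name__ == "__main__":
    print(f"Loaded {run_etl(SOURCE, TARGET)} rows")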
Responsibilities:
• Streamline and enhance the day-to-day operational workflow of an enterprise-level Hadoop environment.
• Continuously monitor, measure, and debug the performance of a system streaming GBs of data per day, focusing on data-driven metrics and on the reliability and verification of data flows (see the brief sketch after this list).
• Work closely with the Big Data Architect and business owners to ensure system performance is consistent with the intended design and business cases, while looking for ways to simplify processes; improve data ingestion, analysis, and delivery; and optimize the use of resources.
• Identify improvements, risks, challenges, and strategies for the platform as it develops and grows.
• Create and present reports, presentations, and visualizations to technical leads and executives, demonstrating the functionality of the platform and explaining its operational behavior.
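To illustrate the monitoring responsibility above, a minimal Python sketch of the kind of throughput metric an engineer on this team might surface; the batch numbers are fabricated, and real figures would come from Kafka or Flume metrics or the Cloudera Manager API:

def throughput_report(byte_counts, window_seconds):
    """Summarize ingest throughput from per-batch byte counts."""
    total = sum(byte_counts)
    mb_per_sec = total / window_seconds / 1_000_000
    gb_per_day = mb_per_sec * 86_400 / 1_000
    return {
        "total_bytes": total,
        "mb_per_sec": round(mb_per_sec, 2),
        "gb_per_day_rate": round(gb_per_day, 2),
    }

if __name__ == "__main__":
    # Fabricated sample: three ten-minute batches of 250-310 MB each,
    # which works out to roughly 120 GB/day at a steady rate.
    sample = [250_000_000, 310_000_000, 280_000_000]
    print(throughput_report(sample, window_seconds=600))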
Required Skills:
• Thorough knowledge of the Hadoop ecosystem and distributed computing, including but not limited to Hadoop, Hive, HBase, MapReduce, Zookeeper, YARN, Flume, Tez, Spark, Storm, Kafka, Ambari, Mahout, Flink, Talend, Sqoop, Oozie, and Zeppelin
• Expert at writing and debugging in multiple scripting languages (R, Python, Java, Pig, Oozie) for low-level processing, task scheduling, analytics, and similar work
• Deep understanding of multiple Linux distributions (RHEL required) running in the cloud, in containers, or on bare metal
• Expert in monitoring and debugging tools and practices such as Ganglia, Nagios, and Cloudera Manager, and capable of surfacing performance metrics and other KPIs to leadership in operational summaries and checkpoints (a sample check follows this list).
• Knowledge of modern security best practices and techniques
for encrypting data in transit and at rest, protecting data privacy without
sacrificing performance or data analysis capabilities
• Knowledge and experience interacting with application servers
and web servers such as Nginx, Redis, IBM WebSphere, Tomcat, WebLogic, etc.
• Experience with ETL applications and techniques using Flume,
Sqoop, Talend, Sybase, etc.
• Experience with virtualization technologies and cloud platforms
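For the monitoring-tools line above, a minimal Nagios-style check sketched in Python; the exit-code convention (0 OK, 1 WARNING, 2 CRITICAL, 3 UNKNOWN) is the standard Nagios plugin protocol, while the disk-usage thresholds here are hypothetical:

import shutil
import sys

# Standard Nagios plugin exit codes.
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

def check_disk(path="/", warn_pct=80.0, crit_pct=90.0):
    """Print a Nagios-style status line for disk usage on `path`."""
    try:
        usage = shutil.disk_usage(path)
    except OSError as err:
        print(f"DISK UNKNOWN - {err}")
        return UNKNOWN
    used_pct = 100.0 * (usage.total - usage.free) / usage.total
    if used_pct >= crit_pct:
        status, code = "CRITICAL", CRITICAL
    elif used_pct >= warn_pct:
        status, code = "WARNING", WARNING
    else:
        status, code = "OK", OK
    print(f"DISK {status} - {used_pct:.1f}% used on {path}")
    return code

if __name__ == "__main__":
    sys.exit(check_disk())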
Other Desired Skills:
• Excellent interpersonal, oral, and written communication skills
• Highly motivated and success-driven
• Comfortable working in a fast-paced, Agile, competitive environment
• Ability to work independently and in group environments
• Ability to problem solve effectively and efficiently
Golden Resource Inc - WBE Certified
www.goldenresource.com
Twitter: https://twitter.com/GoldenResource
"Building Bridges Between Business and Technology."
LinkedIn: http://www.linkedin.com/in/rishisyal/