Duties & Responsibilities:
· 3+ years of Hadoop experience in setup, configuration, or management
· Design and implement data storage, schemas, and partitioning as appropriate for Hadoop and related technologies such as HBase, Hive, and Pig.
· Identify, assess, and recommend appropriate solutions; advise the customer on cluster requirements and limitations by applying industry best practices and expertise in emerging technologies, risk mitigation, and continuity planning for backup and recovery.
· Possess advanced Linux and Hadoop system administration skills, including networking, shell scripting, and system automation.
· Provide enterprise-level information technology recommendations and solutions in support of customer requirements.
· Use customer-defined data sources and prototype processes to satisfy proof-of-concept requirements.
· Develop design patterns for specific data processing jobs.
· Test various scenarios for optimized cluster performance and reporting.
· Prepare and deliver presentations to communicate the deployment process for the proof of concept.
· Serve as a point of contact between internal and external customers and program management.
· Ensure accurate documentation of technical specifications.
Required:
· 11 years of overall IT experience, which may include internships or work done overseas
· Hadoop experience with the setup, configuration, or management of a multi-node (50- to 100-node) Hadoop cluster, specifically with Cloudera’s Hadoop distribution.
· Experience with Hadoop technologies such as Pig, Hive, and HBase.
· Experience with Kerberos and securing Hadoop clusters.
· Experience with systems monitoring tools (e.g., Nagios) to help tune, configure, and administer clusters.
· Certified Linux system administrator (e.g., Red Hat Certified System Administrator)
· Bachelor's degree from an accredited college in a related discipline, or equivalent experience/combined education, with 11 years of professional experience.
· 5+ years Linux or Unix System Administrator experience highly preferred