*Role: Hadoop Administrator & Developer*

*Location: Columbus, OH*

*Duration: 6 Months*

*Rate: $50/hr*



*Primary Skills: Hadoop Administration, Linux shell scripting, Hive*

*Secondary Skills: Python or Spark*



*Hadoop Administrator & Developer:* This position combines Hadoop administration with development work on setting up and running ongoing data ingestion tasks. The candidate is responsible for Hortonworks Hadoop tools configuration, cluster configuration, LDAP integration, Kerberos security implementation, monitoring utilities, and performance tuning for the Hadoop cluster. The Hadoop administrator needs hands-on experience with the Hortonworks Ambari console, and with setting up proper security, encryption, and access controls using LDAP, Apache Ranger, and other security tools on a Hortonworks Hadoop cluster. The role is also responsible for developing common reusable services in Hadoop. The successful candidate should have end-to-end experience with ingestion tasks: file/table ingestion, ingestion file formats, Hive structures, encryption, applying the right compression, and access setup in Ranger. Experience with either Spark or Python is required; any experience with the NiFi tool is a plus. The developer should also have experience choosing the right file format and partitioning mechanism, creating Hive structures, and writing Hive SQL.
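
To make the ingestion expectations concrete, below is a minimal PySpark sketch of one such task; the file path, delimiter, column name, database and table names are all hypothetical, and an actual job would follow the team's own ingestion framework and standards:

    from pyspark.sql import SparkSession, functions as F

    # Hive support is required so that saveAsTable creates a real Hive table.
    spark = (SparkSession.builder
             .appName("file-ingest-example")   # hypothetical job name
             .enableHiveSupport()
             .getOrCreate())

    # Read a pipe-delimited landing file (path and delimiter are assumptions).
    df = (spark.read
          .option("header", "true")
          .option("sep", "|")
          .csv("hdfs:///landing/customers/2024-01-01/"))

    # Add a partition column for the load date (value is an assumption).
    df = df.withColumn("load_date", F.lit("2024-01-01"))

    # Write as Snappy-compressed ORC, partitioned by load_date, into a
    # Hive-managed table; Ranger policies would then control read access.
    (df.write
       .mode("append")
       .format("orc")
       .option("compression", "snappy")
       .partitionBy("load_date")
       .saveAsTable("raw_db.customers"))

The same structure (file format, compression, partition column) could equally be expressed as Hive DDL; the choice depends on the ingestion framework in use.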



*Skills & Experience*

·         *3 years of Hadoop administration experience*

·         *5 years of experience with Linux and Linux shell scripting*

·         *3 years of experience with Python or Spark programming*

·         *3 years of experience with development of a common ingestion
framework, Hive structure creation, compression and encryption steps*

·         *3 years of experience with Hive, including creation of
schema structures, partitioning & performance tuning*

·         *4 years of experience with shell scripting*

·         Experience with various file formats, including Parquet and ORC, and
with compression options

·         Experience with NiFi, HBase, Spark, Pig, Storm, etc.

·         Experience setting up encryption zones in Hadoop (see the sketch after this list)

·         Experience with Kerberos setup on Hadoop clusters

·         Experience setting up security for Hadoop users using LDAP
profiles & the Apache Ranger tool

·         Performance tuning of Hadoop clusters

·         Tune settings in Ambari for Hive, HDFS, etc. for performance

·         Develop synchronization processes between production and DR systems

·         Monitor Hadoop cluster job performance and handle capacity planning

·         Monitor Hadoop cluster connectivity and security

·         Manage and review Hadoop log files

·         File system management and monitoring

·         HDFS support and maintenance

·         Work with the infrastructure team on Hadoop patches and tool installs

·         Point of contact for vendor escalations
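
As a concrete illustration of the encryption-zone item above, here is a minimal sketch, assuming a cluster with HDFS transparent encryption and a configured Hadoop KMS; the key name and path are hypothetical, and in practice these steps are often scripted directly in the shell rather than wrapped in Python:

    import subprocess

    def run(cmd):
        """Run a cluster CLI command and fail loudly if it returns non-zero."""
        subprocess.run(cmd, check=True)

    # Create an encryption key in the Hadoop KMS (key name is hypothetical).
    run(["hadoop", "key", "create", "ingest_key"])

    # Create the (empty) directory that will become the encryption zone.
    run(["hdfs", "dfs", "-mkdir", "-p", "/data/secure"])

    # Mark the directory as an encryption zone backed by the key.
    run(["hdfs", "crypto", "-createZone", "-keyName", "ingest_key",
         "-path", "/data/secure"])

    # Confirm the zone is registered.
    run(["hdfs", "crypto", "-listZones"])

Files written under /data/secure are then encrypted and decrypted transparently; read and write access to the zone would still be governed through the Ranger and LDAP policies described above.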





*Thanks & Regards,*



*Muhammad Imran Khan*

*Senior Technical Recruiter*

*Saicon Consultants, Inc.*

(408) 216-2646 Ext 160 (W)

(913) 273-0058 (F)

 Email: [email protected]



*SBA 8(a) Certified/WBE/MBE/DBE/SDB*

*Inc. 500 Company – 2006, 2007, 2008, 2009 & 2010*

*Ranked #1 "Fastest-Growing Area Businesses" - Kansas City Business
Journal - 2006*

*Ranked in Top 10 of Corporate 100 - Ingram's - 2006 & 2007*

*CMMI Level 3*

*ISO 9001:2008 Certified*
