Hello,

Hope you are doing well!

This is a very urgent opening. Please send your matching available candidates to [email protected].

 

Role: Hadoop Admin
Location: Atlanta GA, Omaha NE, and Wilmington DE
Duration: Long term
Interview: Phone / Skype

 

Green Card holders, EAD-GC holders, or US Citizens

Job Description
Incredible opportunity for extremely talented, self-motivated Big Data professionals interested in building and operating the latest cutting-edge technology in the rebirth of a progressive Fortune 500 company. We are looking for the best and brightest thought leaders who are action-oriented and driven to succeed in a dynamic, collaborative environment. Get in now to put your fingerprint on our Big Data environment.
• Responsible for the build-out, day-to-day management, and support of Big Data clusters based on Hadoop and other NoSQL technologies
• Work collaboratively with different teams to administer the build-out and support of Big Data clusters, including capacity planning, cluster setup, performance tuning, and monitoring
• Set up, configure, and maintain security for Big Data clusters
• Focus on automation to streamline tasks and manage the Big Data product support stack
• Responsible for cluster availability
• Administer and troubleshoot ETL processes for Big Data clusters
• Implement system-wide monitoring, alerts, and automated recovery
• Apply expertise in system and data administration functions for a complex, multi-system, multi-platform network on UNIX- and Linux-based platforms
• Leverage experience diagnosing network performance issues
• Support development and production deployments
• Participate in a 24x7 production on-call rotation
• Stay abreast of current Hadoop releases, including compatibility issues with operating systems, new functionality, and utilities
• Identify and initiate resolutions to user problems and concerns associated with Big Data functionality
Candidate Requirements
Candidates must be self-starters with strong communication and collaboration skills. They must have strong hands-on experience working in Big Data, as well as the technical propensity to thrive in a cutting-edge technology environment.
• Experience building and supporting large-scale Hadoop and NoSQL environments, including design, capacity planning, cluster setup, performance tuning, and monitoring
• Hands-on experience with the Hadoop ecosystem, including HDFS, MapReduce, HBase, ZooKeeper, Pig, Hadoop Streaming, Sqoop, Oozie, Flume, and Hive
• Experience with NoSQL platforms, including Cassandra and MongoDB
• Java programming and Python scripting experience
• Strong networking background with a good understanding of TCP/IP, firewalls, and DNS
• Shell scripting experience
• Strong collaboration skills; able to communicate all aspects of the job requirements, including the creation of formal documentation
Preferred Qualifications
• BS/MS degree in Computer Science or a related field, and/or relevant job experience
• Experience with the Cloudera and Hortonworks distributions of Hadoop
• Experience with the DataStax distribution of Cassandra
• Knowledge of third-party ETL products, including Informatica and DataStage
• A deep understanding of Hadoop design principles, cluster connectivity, security, and the factors that affect distributed-system performance

Title
  Cassandra/DataStax Admin
  Location: Atlanta GA, Omaha NE, and Wilmington DE
  Rate: $60-65/hr

Description
  Same as the Hadoop Admin role above.


Samir Reddy
Sr. Technical Recruiter
Rider Consulting Inc.
50 Cragwood Rd, Suite 205, South Plainfield, NJ 07080
Gtalk: Samir.recruiters
[email protected]

 

This email was sent using GroupMail - http://group-mail.com/

-- 
You received this message because you are subscribed to the Google Groups 
"US_IT.Groups" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/us_itgroups.
For more options, visit https://groups.google.com/d/optout.
