*DevOps Cloudera Consultant*

*Location: Westlake, TX*

*Duration: 6 months*

*Mode: Phone and Skype*



Skills: *Automation, Cloudera, Hadoop, Kafka, DevOps, Puppet or Chef, NoSQL*



We are in immediate need of a Cloudera Automation Consultant in Westlake,
TX. This is an infrastructure DevOps role using Cloudera inside a Big Data
team. Please see the job details below and let me know if you have anyone
who could be a good fit.





This role provides an exciting opportunity to roll out a new strategic
initiative within the firm: the Enterprise Infrastructure Big Data Service.
The Big Data Infrastructure DevOps Consultant serves as a development and
support expert with responsibility for the design, development, automation,
testing, support, and administration of the Enterprise Infrastructure Big
Data Service. This will involve building and supporting a general-purpose
data analytics platform used by the client's data scientist community. The
role requires experience with both *Cloudera's Enterprise Data Hub* and
*Kafka*.



This role combines DevOps work and support administration. The position
requires a strong background in computer architecture, software development,
data management systems, and distributed computing, along with a solid
understanding of the open-source technology ecosystem. The ideal candidate
will have technical expertise, customer engagement skills, excellent
communication skills, and a passion for organizing and analyzing data.



*Primary Responsibilities*

   - Development, support, and maintenance of the infrastructure platform
   and application lifecycle
   - Design, development and implementation of automation innovations
   - Development of automated testing scripts
   - Building and nurturing relationships with Fidelity’s developer and
   system administration communities
   - Contribution to all phases of the application lifecycle:
   requirements, development, testing, implementation, and support
   - Responding and providing guidance to customers of the Big Data platform
   - Defining and implementing integration points with existing technology
   systems
   - Interacting with and participating in open-source software communities
   - Researching and remaining current on big data technology and industry
   trends and innovations
   - Participating in a 24x7 on-call support rotation



*Education and Experience*

   - B.S. in Computer Science or equivalent
   - Master’s degree is a plus
   - 5+ years application development or systems administration experience
   - 5+ years of development experience in one or more of the following
   languages: Java, C++, Perl, Python
   - Experience deploying or managing open-source software
   - Strong experience with any Linux distribution
   - 2+ years Hadoop experience
   - Database administration experience a plus



*Skills and Knowledge*

   - Certification and experience working in *Hadoop*
   - Certification and working experience in a *NoSQL* database
   - Experience with *Splunk/HUNK* or Solr solutions and dashboards running
   on Big Data technologies such as Hadoop
   - Strong background in Linux/Unix administration
   - *Agile Scrum* or *Kanban* experience
   - Global team experience
   - Experience with automation/configuration management using *Chef,
   Puppet* or an equivalent
   - Experience programming with CI/CD technologies is a plus
   - Knowledge of designing scalable distributed systems
   - Awareness of both current and developing technologies
   - Strong desire to innovate and develop future technology


-- 
Best Regards,
Samarth Mishra
[email protected]
201-620-9700 x125
