Hi All,

Please check and let me know

DevOps Engineer

Location - San Jose, CA

Mode of interview - Phone, then F2F/Skype

Duration - Long-term project



DevOps Engineer to work in San Jose, CA on a long-term project.  The ideal
candidate will have experience with continuous-integration tools, Java
programming, and Kafka, and will have worked extensively in a Linux
environment.  The items listed under "Must Have" below are absolute
requirements; the candidate must have hands-on experience in each of those
areas.





Must Have:

·         2 to 5+ years' experience in monitoring, SCM (Puppet/Ansible/Docker,
etc.), and integration with tools like Jenkins

·         5-10+ years of core Java programming (OOD, OOP, collections,
concurrency, GC, threads and multi-core programming, functional
programming with lambdas, jsvc, JUnit testing, etc.)

·         Expertise with Marathon (orchestration platform) and Mesos
(cluster manager), as well as ZooKeeper

·         Strong experience with Kafka (streaming data)

·         Shell scripting experience

·         Linux – Expertise with Linux (CentOS and RHEL)

·         Source Control - Experience with tools such as Git for source
control, and with GitHub or Stash for web-based repository hosting

·         Expertise with virtualization using VMware, Hyper-V, or
OpenStack.





Plus:


   - Experience with Open Source Integration
   - Containers/Virtual Systems – Expertise with Docker container
   technology and the Docker ecosystem: Kubernetes and Swarm (orchestration);
   Docker Registry, Docker Hub, Quay.io (registries); cAdvisor, Datadog,
   Sysdig, AppFormix (monitoring).
   - Database Management – Expertise with Postgres, MySQL, and NoSQL databases
   - End-to-End DevOps Process - End-to-end experience deploying
   cloud-based solutions using the above tools
   - Monitoring and Log Management - Experience with tools such as Nagios
   for monitoring, and with Splunk, Loggly, or Elasticsearch for log
   management
   - Message Bus / Big Data – Expertise with Kafka (high-speed message
   bus), Flink and Spark Streaming (stream processing for big data), and
   Spark and Hadoop/HBase (batch processing for big data)



Day to Day: The engineer will join the CSTG team to work on the CSI
project. CSI is an innovative project that provides enterprise users with
just-in-time business insights from real-time data generated by devices
and business operations. It captures data as it is generated, and it
enables complex computational topologies to operate on real-time data
streams across a wide range of connected networks and host servers.


Thanks

Nikhil Prasad

nik...@apetan.com

201-620-9700*130

Apetan Consulting LLC

-- 
You received this message because you are subscribed to the Google Groups 
"Citrix and Sap problems" group.