Hi,


I have some urgent requirements from my client. Please send me your updated
resume along with your hourly rate / yearly salary expectations if you are
interested. If you are not interested, please feel free to share this
position with friends who may be a good fit.



*Big Data Developer*

*Location: Bay Area, CA*



*Skype drive on Dec 8th. Interview slots are 10 AM to 1 PM PST.*



*Must Have:*

·         8 to 10 years of IT experience, with at least 2 to 4 years of
hands-on experience in Big Data.

·         Ability to work directly with customers' technical resources to
devise and recommend solutions based on the understood requirements.

·         Experience working in complex Big Data environments, including
parallel streaming platform build-outs.

·         Experience with microservices / REST APIs.

·         Hands-on programming experience with Kafka, Avro, Spark, Hadoop,
Python, and Scala/Java.

·         DevOps experience.



*Good To Have:*

·         Elasticsearch

·         AWS or other cloud platform experience.

·         Experience with relational (SQL), MPP, and NoSQL databases.

·         Experience with one or more statistical and machine learning
packages or frameworks, such as R, scikit-learn, Spark MLlib, or TensorFlow.





*Hadoop Administrator*

*Bay Area, CA*



*Required Skills:*

·         Responsible for the implementation and ongoing administration of
Hadoop infrastructure.

·         Aligning with the systems engineering team to propose and deploy
the new hardware and software environments required for Hadoop, and to
expand existing environments.

·         Working with data delivery teams to set up new Hadoop users. This
includes setting up Linux users, setting up Kerberos principals, and
testing HDFS, Hive, Pig, and MapReduce access for the new users.

·         Cluster maintenance, including the creation and removal of nodes,
using tools such as Ganglia, Nagios, Cloudera Manager Enterprise, and Dell
OpenManage.

·         Performance tuning of Hadoop clusters and Hadoop MapReduce
routines.

·         Screening Hadoop cluster job performance and capacity planning.

·         Monitoring Hadoop cluster connectivity and security.

·         Managing and reviewing Hadoop log files.

·         File system management and monitoring.

·         HDFS support and maintenance.

·         Diligently teaming with the infrastructure, network, database,
application and business intelligence teams to guarantee high data quality
and availability.

·         Collaborating with application teams to install operating system
and Hadoop updates, patches, and version upgrades when required.

·         Point of contact for vendor escalations.







*Thanks and Regards,*

*Dev Chauhan*

Sr. IT Recruiter
Ph: 609-853-0818 Ext: 2290

*[email protected]* <[email protected]>

*www.nityo.com* <http://www.nityo.com/>

-- 
You received this message because you are subscribed to the Google Groups 
"BlazeAdvisorUserGroup" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/blazeadvisorusergroup.
For more options, visit https://groups.google.com/d/optout.