*Please SEND RESUMES only to*

*[email protected] <[email protected]>*



*Please READ & SUBMIT*



*All Skills are a MUST*





*Data Modeller (10+ Years of Experience) – Raritan, NJ – 6+ Months Contract*



*MUST HAVE SKILLS* – 10+ years of experience, 2-3 Big Data implementations,
HDFS, Hive, Spark, Sqoop, Kafka, NiFi, Python, PySpark, Java



*Job Description *

•        Bachelor's degree in Computer Science, Software Engineering,
Information Technology, or a related field is required.

•        10-12 years of overall experience in architecting and building
large-scale, distributed big data solutions.

•        Experience in at least 2-3 Big Data implementation projects.

•        Solid experience in Hadoop ecosystem development, including HDFS,
Hive, Spark, Sqoop, Kafka, NiFi, real-time streaming technologies, and the
broader big data open-source stack.

•        Working experience with the Cloudera distribution is preferred.

•        Must have experience in Python, PySpark, and Java.

•        Must possess excellent communication skills.

•        Strong analytical, technical, and troubleshooting skills.

•        Experience leading teams and/or managing workloads for team
members.

•        Nice to have: working experience with Informatica BDM and StreamSets.



Rufus Christopher

Senior IT Recruiter
Desk: 734-610-8001
Email: *[email protected]* <[email protected]>
