*Role: Big Data / Hadoop Lead*

*Location:  Santa Clara, CA*

*Duration:  6 – 12 Months*

·         Minimum 10 years of experience

·         Strong understanding of Hadoop production environments,
including HDFS, YARN, Hive, NiFi, Ranger, Atlas, and Spark

·         Good understanding of data warehousing concepts and relational
star-schema database designs

·         Ability to develop and implement ETL frameworks using Python
and Spark

·         Good knowledge of SQL and relational database models

·         Knowledge of Git repository branching and code versioning is a plus

·         Understanding of machine learning models is a plus

-- 
Thanks & Regards,

*Srikanth*

*HCL Global Systems Inc*

*Desk: 248-473-0720 Ext: 179*

*Email: [email protected]*

-- 
You received this message because you are subscribed to the Google Groups 
"project managment" group.