Please make sure to provide your Passport Copy / No.: .......


Role: Big Data Engineer / Administrator (HDFS, Spark)

Location: San Jose, CA

Duration: 12+ Months

Client: TCS (www.tcs.com)



Mandatory:

• HDFS - deep expertise (Must)

• Apache Spark - deep expertise (Must)

• Python / Scala - one of them required

• Apache Hadoop, Pig

• HBase



Must Be Aware Of:

Containerization of Big Data systems using Docker / Kubernetes / other orchestrators
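As a rough illustration of what this looks like in practice, a Spark job can be packaged on the community `apache/spark` base image (the tag, the `my_job.py` script, and its path are placeholders, not details from this posting):

```dockerfile
# Illustrative sketch only: containerizing a Spark application.
# Pin the tag to the Spark version actually used in the cluster.
FROM apache/spark:3.5.0

# Copy the application code into the image (path is hypothetical).
COPY my_job.py /opt/app/my_job.py

# Run the job via spark-submit in local mode as a smoke test;
# in production the --master would point at YARN or Kubernetes.
ENTRYPOINT ["/opt/spark/bin/spark-submit", "--master", "local[*]", "/opt/app/my_job.py"]
```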



Technical/Functional Skills:

* Develop scripts and administer the Big Data stack.
* Data structures (how to use them), programming skills, and fundamentals (e.g., indexing) - key requirement.
* Write Spark programs: SparkContext, refactoring - key requirement.
* Programming proficiency in Python / Scala.
* Install, design, and administer the Big Data stack, including but not limited to Apache Hadoop, HDFS, Apache Spark, Pig, and HBase.
* Manage Hadoop configurations for storage and performance efficiency.
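For context, managing Hadoop configurations for storage and performance typically means tuning properties like the following in `hdfs-site.xml` (the property names are stock HDFS settings; the values here are placeholder examples, not recommendations from this posting):

```xml
<!-- Illustrative hdfs-site.xml fragment; tune values per workload. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value><!-- replicas per block: durability vs. storage cost -->
  </property>
  <property>
    <name>dfs.blocksize</name>
    <value>268435456</value><!-- 256 MB blocks favor large sequential scans -->
  </property>
  <property>
    <name>dfs.namenode.handler.count</name>
    <value>100</value><!-- NameNode RPC handler threads for busy clusters -->
  </property>
</configuration>
```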





Thanks & Regards,

Neha Gupta

Team Lead

Desk: 609-853-0818 Ext. 2105

[email protected]


www.nityo.com
