*Hi,*

*I have the below requirement with my preferred vendor. Please let me know
if you have any suitable profiles.*

*Please send resumes to [email protected]; Direct: 201-308-8704*

*Role: Tableau Developer with strong experience in Big Data and Talend*

*Location: Harrisburg, PA*

*Contract *

*Max Rate: $55/hr on C2C*

*Implementation partner: Tech Mahindra (No OPTs)*


*Note: Please don't send resumes with a 1986 date of birth.*

*Need an expert Tableau developer with Big Data and Talend skills*

*JD:*

•        7+ years of IT experience

•        3+ years of experience with Hive/HQL and Pig

•        3+ years of Big Data/Hadoop ETL experience using Hive/Pig/Oozie

•        3+ years of traditional ETL experience on RDBMS

•        Expertise with Talend in Hadoop Environment is a MUST.

•        Strong Linux shell scripting and Linux knowledge. Strong expertise
working in, and understanding of, Big Data technologies, with a strong focus
on Hortonworks.

•        Strong knowledge of Flume, Sqoop, Hive, Pig, MapReduce, Oozie, and
YARN applications, with hands-on experience in all or most of these.

•        Hands-on Talend development and integration experience using
Talend enterprise integration.

o   Must have excellent and in-depth knowledge of SQL, PL/SQL, and stored
procedures

o   Must have experience with software design patterns and best practices

o   Ability to create normalized/de-normalized database schemas

o   Ability to perform ETL operations/reporting from multiple sources, both
internal and external, using appropriate tools.

o   Demonstrated ability to implement and troubleshoot backup solutions,
database security, user management, and data maintenance tasks.

o   Experience with Data Integration/Business Intelligence tools such as
Pentaho, Talend, Informatica, etc.

o   Experience in analyzing text and streams with emerging Hadoop-based big
data and NoSQL technologies.

o   Hands-on experience with running Pig and Hive queries.

o   Analyzing data with Hive, Pig, and HBase; data scrubbing and processing
with Oozie. Importing and exporting data using Sqoop from HDFS to relational
database systems/mainframes and vice versa. Loading data into HDFS.
Developing MapReduce programs to format the data.
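
As a rough illustration of the last item only (developing a MapReduce program to format data), below is a minimal, map-only Hadoop job sketch that rewrites pipe-delimited records as tab-delimited text before the output is consumed from HDFS (for example by Hive). This is not part of the requirement; the class name, record layout, and delimiters are assumptions made purely for illustration.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class FormatRecordsJob {

    // Map-only job: each input line such as "id|name|amount" is emitted as
    // "id<TAB>name<TAB>amount". No reducer is needed for a pure reformatting pass.
    public static class FormatMapper extends Mapper<Object, Text, NullWritable, Text> {

        private final Text out = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString().trim();
            if (line.isEmpty()) {
                return; // skip blank lines
            }
            // Replace pipe delimiters with tabs (an assumed input format).
            out.set(line.replace('|', '\t'));
            context.write(NullWritable.get(), out);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "format-records");
        job.setJarByClass(FormatRecordsJob.class);
        job.setMapperClass(FormatMapper.class);
        job.setNumReduceTasks(0); // map-only: mapper output goes straight to HDFS
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}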
