Hi,

Hope you are doing great.

Subject: Talend Tech Lead
Location: Southlake, TX
Duration: 6-12 month contract

*Reply to me at [email protected]*

Skills Required:
Talend, Cloudera Hadoop, HDFS, HBase, Impala, Flume, Avro (nested), Parquet
(nested), XML/XSD, Hive, HCatalog, Cloudera Navigator, Pig, SQL, Oozie,
Data Modeling

Experience:
• At least 4 years of experience in Big Data (Hadoop) and DW/BI technologies
and tools.
• At least 4 years of experience in the software development life cycle.
• At least 4 years of experience in project life cycle activities on
development and maintenance projects.
• At least 3 years of experience in design and architecture review.
• At least 2 years of hands-on experience with optimization and performance
tuning of ETL code on Talend.
• Good understanding of Linux system internals, file systems, and
administration.
• Hands-on experience with shell scripting.
• Experience with data modeling concepts: star schema / dimensional
modeling; relational (ER) design a plus.
• Ability to work in a team in diverse, multi-stakeholder environments.
• Experience in the Airline (Travel/Transportation) domain.
• Analytical skills.
• Experience with, and a desire to work in, a global delivery environment.

Qualifications
• Bachelor’s degree or foreign equivalent required from an accredited
institution. Will also consider three years of progressive experience in
the specialty in lieu of every year of education.
• At least 4 years of experience with the Talend suite of products.

Job Description:
- You will be responsible for the design and development of ETL/ELT
technical solutions and data integration solutions using Talend and the
Cloudera Hadoop big data platform.
- Provide technical direction to the onshore and offshore developer teams.
- Apply a strong understanding of complex business problems to ensure
projects leverage the appropriate technology and the technical design
enables delivery of a comprehensive solution.
- Responsibilities will include defining technical requirements, including
source-to-target mappings; loading data into HDFS using multiple ingestion
patterns, including real-time and batch; enabling Data Lake ingestion;
transforming data across multiple file formats (such as XML, nested Avro,
and nested Parquet); and automating, debugging, scheduling, and
productionizing jobs.

Preferred background: Airline industry; migration of capability from
Teradata to Hadoop; working with different source systems.

Skills nice to have:
- Spark, Python/Scala
- Extensive SQL
- Data Modeling
- Architecting, developing, implementing, and maintaining real-time and
batch Big Data solutions using Cloudera Hadoop
Any of the additional skills and experience below would be an advantage:
- Data Mining/Data Warehousing/Business Intelligence
- Unix/Linux
- Governance and lineage tools; ETL tools (ODI)
- MPP systems
- Java
- NoSQL

Regards,
Jack Anderson - Sr. Technical IT Recruiter
ITBrainiac Inc.,
[email protected]
Direct: 609-935-3773 x103
116 Village Blvd, Suite 200 - Princeton, NJ 08540
www.itbrainiac.com

-- 
You received this message because you are subscribed to the Google Groups "SAP 
BASIS" group.
