Hi,

Hope you are doing well.

Kindly send suitable profiles for this requirement ASAP.



*Role:* ETL Lead

*Location:* Philadelphia (initially remote)



*Looking for an ETL Lead Developer. Requirements:*



• 8-10+ years of experience in Information Technology, with proficiency in
ETL design/development and data warehouse implementation/development.

• Experienced in the design, development and implementation of large-scale
projects in the financial, shipping and retail industries using data
warehousing ETL tools (Pentaho) and business intelligence tools.

• Knowledge of the Software Development Lifecycle (SDLC), Agile, and the
Application Maintenance Change Process (AMCP).

• Excellent data analysis skills.

• Experience architecting and building data warehouse and business
intelligence systems, including ETL using the Pentaho BI Suite (Pentaho
Data Integration / Kettle).

• Hands-on experience with data warehouse star schema modeling, snowflake
modeling, fact and dimension tables, and physical and logical data modeling
(a minimal schema sketch follows).
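
For illustration only, a minimal sketch of a star schema: one fact table
with foreign keys into two dimension tables. The table and column names are
hypothetical and not part of the requirement.

```python
import sqlite3

# Minimal star-schema sketch: a sales fact table referencing customer and
# date dimensions. All names are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,  -- e.g. 20240131
    calendar_date TEXT,
    year          INTEGER,
    month         INTEGER
);
CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    quantity     INTEGER,
    amount       REAL
);
""")
conn.close()
```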

• Installed and configured Pentaho BI Server on different operating systems
such as Red Hat Linux and Windows Server.

• Hands-on experience with the whole ETL (Extract, Transform & Load)
process.

• Experience creating ETL transformations and jobs using the Pentaho Kettle
Spoon designer and Pentaho Data Integration, and scheduling them on Pentaho
BI Server (a command-line sketch follows).
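
As a hedged sketch of how a Kettle transformation can be launched outside
Spoon (for example from a cron job), assuming a local PDI install; the
pan.sh path and the .ktr file name below are hypothetical:

```python
import subprocess

# Hypothetical paths: adjust to the actual PDI installation and .ktr file.
PAN = "/opt/pentaho/data-integration/pan.sh"
TRANSFORMATION = "/etl/transformations/load_sales.ktr"

# Pan is PDI's command-line runner for transformations; -file points at the
# .ktr and -level controls log verbosity. A zero exit code means success.
result = subprocess.run(
    [PAN, f"-file={TRANSFORMATION}", "-level=Basic"],
    capture_output=True, text=True,
)
print(result.stdout)
if result.returncode != 0:
    raise SystemExit(f"Transformation failed with exit code {result.returncode}")
```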

• Used a wide range of steps in Pentaho transformations, including Row
Normaliser, Row Denormaliser, Database Lookup, Database Join, Calculator,
Add Sequence and Add Constants, plus various input and output steps for
data sources such as database tables, Access, text files, Excel and CSV
files (the normalise/denormalise idea is sketched below).
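
The Row Normaliser / Row Denormaliser steps roughly correspond to
unpivoting and pivoting rows. Purely to illustrate that idea, a small
pandas sketch with made-up column names:

```python
import pandas as pd

# Wide (denormalised) data: one row per product, one column per quarter.
wide = pd.DataFrame({
    "product": ["A", "B"],
    "q1_sales": [100, 80],
    "q2_sales": [120, 90],
})

# "Row Normaliser"-style unpivot: one row per (product, quarter) pair.
long = wide.melt(id_vars="product", var_name="quarter", value_name="sales")

# "Row Denormaliser"-style pivot back to one row per product.
wide_again = long.pivot(index="product", columns="quarter",
                        values="sales").reset_index()

print(long)
print(wide_again)
```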

• Experience integrating Kettle (ETL) with Hadoop and various NoSQL data
stores through the Pentaho Big Data Plugin, a Kettle plugin that provides
connectors to HDFS, MapReduce, HBase, Cassandra, MongoDB and CouchDB and
works across Pentaho Data Integration.

• Loaded unstructured data into the Hadoop Distributed File System (HDFS);
see the sketch below.
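
Outside of the Kettle Hadoop connectors, a common way to land raw files in
HDFS is the `hdfs dfs` CLI. A minimal sketch; the local and HDFS paths are
hypothetical:

```python
import subprocess

# Hypothetical paths for illustration.
LOCAL_FILE = "/data/raw/clickstream_2024-01-31.log"
HDFS_DIR = "/warehouse/raw/clickstream"

# Create the target directory (no-op if it already exists), then copy the
# file in, overwriting any existing copy.
subprocess.run(["hdfs", "dfs", "-mkdir", "-p", HDFS_DIR], check=True)
subprocess.run(["hdfs", "dfs", "-put", "-f", LOCAL_FILE, HDFS_DIR], check=True)
```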

• Experience performing data masking/protection using Pentaho Data
Integration (Kettle); the general idea is sketched below.
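
In PDI this is normally done with transformation steps; purely to
illustrate the kind of masking meant here, a small Python sketch that
replaces a PII column with a salted hash. The file name, column name and
salt handling are hypothetical.

```python
import csv
import hashlib

SALT = b"example-salt"  # hypothetical; in practice taken from a secret store

def mask(value: str) -> str:
    """Deterministically mask a PII value with a salted SHA-256 hash."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

with open("customers.csv", newline="") as src, \
        open("customers_masked.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["email"] = mask(row["email"])  # hypothetical PII column
        writer.writerow(row)
```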

• Experience writing shell scripts for various ETL needs.

• Deep knowledge of RDBMSs (SQL Server, MySQL, DB2, etc.) and NoSQL
databases such as MongoDB, DynamoDB and Cassandra.

• Proficient in writing SQL statements, complex stored procedures, dynamic
SQL queries, batches, scripts, functions, triggers, views and cursors, and
in query optimization.

• Quick to understand relational source database systems and data models in
order to build accurate transformation logic for data migration and data
integration.

• Supply chain knowledge is a plus.

• Motivated team player with excellent communication, interpersonal,
analytical and problem-solving skills.



*Thanks & Regards,*

*Shanmukha* | *Recruiter*

*3S Business Corporation*

*Office: 281-823-9222, ext. 547*

*shanmukha.gud...@3sbc.com*

*Hangouts:* *shanmukha3...@gmail.com*

*Richmond Avenue | Houston, TX 77082*
