I have the following need for an Informatica / ETL Developer. Please let me know if you have anyone who meets all of the requirements below.
Position - Informatica / ETL Developer
Experience - 3 - 5 years
Location - Metropark, NJ
Length - 6-month contract
Interview Process - 1st: Phone Interview / 2nd: On-Site Interview

The ETL Developer / Tech Lead will be responsible for the delivery of all ETL artifacts from the design phase through the delivery phase. They will be 100% accountable for the quality of all ETL code within their control. They will work with their business partners to ensure the business requirements are met for specific data integration projects throughout the organization.

Expected Responsibilities:
• Expertise in extracting, cleansing, validating, and integrating data from multiple systems.
• Hands-on experience translating business-specific rules into functional specs.
• Work with the Data Integration Architect to create ETL standards and high-level and low-level design documents.
• Strong experience designing parallel jobs, server jobs, job sequencers, and batch jobs in DataStage 8.1.2 and 8.5.
• Strong experience in Informatica ETL, including complex mappings, transformations, and workflows.
• Extensive work with partitioning: splitting data streams into sub-chunks to distribute the load evenly across all available processors for maximum throughput (see the sketch after this list).
• Strong knowledge of data warehouse architecture, star schemas, snowflake schemas, fact tables, and dimension tables.
• Experience integrating various data sources such as Teradata, DB2 UDB, SQL Server, Oracle, Sybase, and MS Access.
• Strong experience writing database procedures and functions and fine-tuning SQL for performance optimization.
• Good experience writing shell scripts to automate file manipulation and data-loading procedures.
• Responsible for production support and on-call duty for data integration applications.
• Hands-on experience tuning mappings and identifying and resolving performance bottlenecks at various levels, such as source and target mappings.
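The partitioning point above is the same divide-and-process-in-parallel idea regardless of the ETL tool. As a minimal illustrative sketch in plain Python (the row source, chunk size, and transform below are assumptions for illustration, not part of this requisition):

    # Split a stream of rows into sub-chunks and spread the work across all
    # available processors -- a minimal sketch of the partitioning idea above.
    from multiprocessing import Pool, cpu_count

    def transform(row):
        # Stand-in for whatever per-row transformation the job requires.
        return row * 2

    def process_chunk(chunk):
        # Each worker transforms one sub-chunk of the stream.
        return [transform(row) for row in chunk]

    def partition(rows, chunk_size):
        # Break the incoming rows into fixed-size sub-chunks.
        return [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

    if __name__ == "__main__":
        rows = list(range(1_000_000))          # assumed source data
        chunks = partition(rows, 50_000)
        with Pool(processes=cpu_count()) as pool:
            # Chunks are handed out across workers, one chunk per worker at a time.
            results = pool.map(process_chunk, chunks)
        loaded = [row for chunk in results for row in chunk]
        print(f"{len(loaded)} rows processed on {cpu_count()} processors")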
Thanks & Best Regards,
Bajireddy
346 Georges Road, Suite # 1, Dayton, NJ 08810
Ph: (732) 438-1906 Ext: 109 | Fax: (732) 438-6973
Email: [email protected] | Gmail: [email protected]
www.nihaki.com / MWBE & SBA certified company
For open positions please visit www.nihaki.com/careers.asp
