*Hi,*

*Hope you are doing well.*

*Please find the requirement below.*



*Position: DW ETL Developer*

*Location: Chicago, IL*

*Duration: Contract, 6+ months*



*Please submit local candidates or candidates who are able to come in for a
face-to-face interview.*



Send resumes to *bv...@activesoftinc.com*


*Job description:*



*Required skills:*

Java / Python / Sqoop / Flume / Kafka / Pig / Hive / (DataStage or similar ETL
tool) / HBase / NoSQL / Datameer



This position is responsible for the design of data movement into and
throughout the TIL, including but not limited to the Operational Data Store,
Atomic Data Warehouse, Dimensional Data Warehouse, and Master Data
Management. The role also involves mentoring designers on detailed design,
and developing the enterprise design view and applying it at the project
level.

Essential Functions:

• Review all project-level data movement designs for adherence to standards
and best practices
• Suggest changes to project-level designs
• Develop new data movement design patterns where required
• Guide the coding and testing of standard, reusable data movement components

Requirements:

• Strong analytical and problem-solving skills



Build distributed, scalable, and reliable data pipelines that ingest and
process data at scale and in real time.

Collaborate with other teams to design and develop data tools that support
both operations and product use cases.

Source huge volumes of data from diverse data platforms into the Hadoop
platform.

Perform offline analysis of large data sets using components from the
Hadoop ecosystem.

Evaluate big data technologies and prototype solutions to improve our data
processing architecture.

Knowledge of the healthcare domain is an added advantage.



*Candidate Profile: *



8+ years of hands-on programming experience, with 3+ years on the Hadoop
platform

Proficiency with Java and at least one scripting language, such as Python

J2EE, EJB, WAS deployments, RESTful services

Good grasp of data movement approaches and techniques and when to apply
them

Strong hands-on experience with databases such as DB2 and Teradata

Flair for data, schemas, and data models, and for bringing efficiency to the
big data life cycle

Ability to acquire, compute, store, and provision various types of datasets
on the Hadoop platform

Understanding of various visualization platforms (Tableau, QlikView, and
others)

Strong object-oriented design and analysis skills

Excellent technical and organizational skills

Excellent written and verbal communication skills

Top skill sets / technologies:

Java / Python

Sqoop / Flume / Kafka / Pig / Hive / (DataStage or similar ETL tool) / HBase /
NoSQL / Datameer / MapReduce / Spark

Data integration / data management / data visualization experience



*We would appreciate it if you could send your updated resume with the
following details:*

·         Full Name:

·         Contact #:

·         Email:

·         Current location:

·         Relocation:

·         Currently working:

·         Availability:

·         Visa Status:

·         Skype id:





*Thanks & Regards,*



Venu Boddupally

*Active Soft Inc.*

|| Accelerate Your Success ||

Phone: 404-496-4368 * 113
bv...@activesoftinc.com
www.activesoftinc.com
