*!! Must have 2 years of experience with Hadoop !!*


*Informatica Developer with Hadoop | Dunwoody, GA | Long-Term Contract*

*Description:*
We are looking for a Big Data Engineer (BDE) with Business Intelligence
experience to join our Data Services team. The BDE will work with their
associated delivery teams to build project solutions using Java, Python,
SQL, and internal tools on Hadoop and Netezza. As a fully functioning scrum
team member, you will be responsible for a variety of tasks needed to
complete the work, including analysis, design, programming, and testing.

*Specific Responsibilities include, but are not limited to:*
* Gather and process raw, structured, semi-structured, and unstructured
data at scale, including writing scripts, developing programmatic
interfaces against web APIs, scraping web pages, processing Twitter feeds,
etc.
* Design, review, implement, and optimize data transformation processes in
Hadoop (primary) using Java, Python, Scala, and/or Hive
* Design, review, implement, and optimize data transformation processes in
Oracle and Netezza using ETL/Informatica
* Analyze data needs and recommend data models, or changes to existing
data models, while supporting reporting
* Turn data into metrics and reporting through visualization tools
* Create solutions for data movement or transformation using Java, Python,
Scala, and/or Hive
* Test and prototype new data integration tools, techniques, and
methodologies
* Actively participate in all assigned tasks/projects, multi-tasking
easily between projects and assignments
* Take ownership of assigned pieces of functionality and ensure their
on-time, high-quality completion and maintenance
* Respond quickly and effectively to production issues and take
responsibility for seeing those issues through to resolution
* Proactively communicate project status to the program team and delivery
team
* Mentor less experienced team members
* Collaborate with other development teams
* Present project replays and relevant external learnings to the local
team in the form of Knowledge Transfer sessions
* Stay current in your technical area of expertise by actively engaging in
both internal and external learning opportunities

*Qualifications*
* Bachelor's degree in Computer Science, Information Services, Mathematics,
Statistics, or another applicable area from a four-year university, or
equivalent industry experience
* 5+ years maintaining and developing software in a large-scale data or web
application environment such as Oracle and Netezza
* 2+ years in business intelligence, including big data, with an advanced
understanding of EDW concepts: warehousing, data movement, and data
transformation
* 3-5 years developing for a Hadoop environment (including data
transformations)
* 1+ years in Agile methodology
* Expert knowledge of SQL
* Good knowledge of object-oriented analysis, design, and programming in
Java or Python
* Solid understanding of software engineering basics, including data and
architecture in areas such as Site Activity
* High performer able to demonstrate higher-level technical expertise on a
daily basis

*Preferred Qualifications*
* Cloudera Hadoop
* Hive
* Pig
* SOLR
* MicroStrategy SDK and report development
* AWS
* Redshift
* Cloud technologies
* Platfora
* Datameer

My Best,
Kavi Johnson
*Sr. Technical Recruiter, CAT Technology, Inc.*
*[email protected]*
Voice: 201-257-5081 Ext 358
Fax: 201-342-2385
*www.catamerica.com*

-- 
You received this message because you are subscribed to the Google Groups "SAP 
Workflow" group.