*Hi, please read carefully. I am in desperate need of your help; I will
add you to my personal list.*

*Please reply to harry.radcliffATpanzersols.com*

*Position: Senior ETL Consultant with Hadoop*

*Duration: 6 Months*
*Location: Irvine, CA*

*Must provide an H1 copy or EAD/GC copy*

*Must have strong Hadoop-related experience.*

Qualifications
Required Skills and Experience:
• A BS degree in Computer Science, a related technical field, or equivalent
work experience; Master's preferred.
Minimum of three years’ experience with the following:
• Experience architecting and integrating the Hadoop platform with
traditional RDBMS data warehouses.
• Experience with major Hadoop distributions like Cloudera (preferred),
HortonWorks, MapR, BigInsights, or Amazon EMR is essential.
• Experience with ETL tools such as Informatica.
• Experience developing within the Hadoop platform, including Java
MapReduce, Hive, Pig, and Pig UDF development (a brief UDF sketch follows
this list).
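
For context only: a Pig UDF of the kind named in the last bullet is
typically a small Java class extending EvalFunc. The sketch below is a
minimal, hypothetical example (the class name TrimToUpper is illustrative,
not from the requisition) that trims and upper-cases a string field:

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    public class TrimToUpper extends EvalFunc<String> {
        // Pig calls exec() once per input tuple; returning null emits null.
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            return ((String) input.get(0)).trim().toUpperCase();
        }
    }

In a Pig script it would be invoked roughly as REGISTER my-udfs.jar;
followed by B = FOREACH A GENERATE TrimToUpper(name); (jar and field names
assumed for illustration).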

Job Description:
The individual must be capable of understanding the Hadoop ecosystem, the
complex object design, and the underlying data model of the system. The
individual must be comfortable developing data-centric applications using
Hadoop tools, Netezza, Informatica, Informatica BDE (Big Data Edition),
Hive, MapReduce, and Spark; able to develop ETL packages; and expected to
develop queries and stored procedures involving complex database
structures and the Hadoop Distributed File System (HDFS). The individual
must have excellent communication skills, work well in a team environment,
enjoy solving complex problems, and be able to work in a fast-paced
environment.
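
By way of illustration of the Java MapReduce development mentioned above,
the canonical word-count job reads text from an HDFS input directory and
writes aggregated counts back to HDFS. This is a generic sketch, not code
specific to this engagement:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {
        // Mapper: emit (token, 1) for each whitespace-separated token.
        public static class TokenMapper
                extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();
            @Override
            protected void map(Object key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                for (String tok : value.toString().split("\\s+")) {
                    if (!tok.isEmpty()) { word.set(tok); ctx.write(word, ONE); }
                }
            }
        }

        // Reducer (also usable as combiner): sum the counts per token.
        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> vals,
                    Context ctx) throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : vals) sum += v.get();
                ctx.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input
            FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }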

Responsibilities:
• Define technical scope and objectives through research and participation
in requirements-gathering and definition of processes
• Gather and process raw, structured, semi-structured, and unstructured
data at scale, including writing scripts, developing programmatic
interfaces against web APIs, processing web logs, handling real-time
feeds, etc. (see the ingestion sketch after this list)
• Design, review, implement and optimize data transformation processes in
the Hadoop (primary) and Informatica ecosystems
• Test and prototype new data integration tools, techniques and
methodologies
• Adhere to all applicable development policies, procedures and standards
• Participate in functional test planning and testing for the assigned
application integrations, functional areas and projects.
• Work with the team in an Agile/SCRUM environment to ensure a quality
product is delivered
• Respond rapidly and work cross-functionally to deliver appropriate
resolution of technical, procedural, and operational issues.
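
As a rough illustration of the web-API ingestion bullet above, the sketch
below pulls a payload over HTTP (Java 11+ HttpClient) and lands it in HDFS
through the Hadoop FileSystem API. The endpoint URL and HDFS path are
placeholders, not details from this posting:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.charset.StandardCharsets;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ApiToHdfs {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint and target path; substitute real values.
            String endpoint = args.length > 0 ? args[0]
                    : "https://example.com/api/events";
            String target = args.length > 1 ? args[1]
                    : "/data/raw/events.json";

            // Fetch the raw payload over HTTP.
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request =
                    HttpRequest.newBuilder(URI.create(endpoint)).GET().build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            // Write the payload into HDFS; Configuration picks up
            // core-site.xml from the classpath.
            Configuration conf = new Configuration();
            try (FileSystem fs = FileSystem.get(conf);
                 FSDataOutputStream out = fs.create(new Path(target), true)) {
                out.write(response.body().getBytes(StandardCharsets.UTF_8));
            }
        }
    }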

Preferred Skills & Experience:
• Experience with logical, 3NF, or dimensional data models.
• Experience with NoSQL databases like HBase, Cassandra, Redis and MongoDB.
• Experience with Hadoop ecosystem technologies like Flume, Kafka and Spark.
• Experience with Netezza and Oracle.
• Experience with Informatica Big Data Edition.
• Certifications from Cloudera, HortonWorks and/or MapR.
• Knowledge of Java SE, Java EE, JMS, XML, XSL, web services, and other
application-integration technologies.
• Familiarity with business intelligence tools and platforms like Datameer,
Platfora, Tableau, and Microstrategy is a plus.

Best Regards,
*Harry Radcliff | Technical Recruiter | Panzer Solutions LLC*
Direct: 203-652-1444 Ext 174 | Fax: 292-286-1457
