*Hi,*

*Hope you are doing well.*

*Please send resumes to [email protected]*



*Role: Sr. Big Data Engineer (MapReduce, Hadoop)*

*Location: Bridgewater, NJ*

*Duration: Long Term*

*Rate: $65/hr on C2C*



*Looking for GC (Green Card) and USC (US Citizen) candidates*

*Note: For H-1B candidates, the consultant's I-797 or ID proof is required at submission.*



*Responsibilities*

·      Design, build, and maintain the full data pipeline, from how data is
collected and processed through to reporting and analytics, for a group of
cloud products

·      Design events and metrics, and develop customer-facing reports and
dashboards on BI tools such as Jasper, Tableau, and Cognos based on customer
requirements

·      Develop and support ETL jobs with MapReduce/Spark/Impala

·      Import data from external sources into the big data platform,
including log parsing

·      Export data from the big data platform or integrate it with the BI platform

·      Detect data abnormalities or quality issues and their root causes;
drive data quality improvements across development teams

·      Partner with Product Managers and other stakeholders to understand
their use cases and business goals

·      Provide advice and education in the usage and interpretation of data
to product managers and development teams

·      Evolve the way we organize data in the big data platform



*Requirements*

·      3+ years of experience writing efficient, complex SQL and Hive queries
(joins, unions)

·      2+ years ETL experience working with Hive/Hue and building MapReduce
jobs

·      3+ years programming experience (Java and Python/other scripting
languages)

·      Hands on and deep experience with schema design and dimensional data
modeling

·      Experience with SQL or NoSQL databases such as MySQL, Cassandra,
MongoDB, HBase, and Amazon Redshift

·      Experience in resolving complex analytical problems using
quantitative approaches

·      Experience in dealing with TBs of data is a plus

·      Passion for data insights; goal-oriented and highly self-motivated,
with good collaboration inside and outside the team

·      Enjoys challenges and solving complex problems on a daily basis

·      Excellent communication skills, including the ability to identify
and communicate data-driven insights

·      Minimum of a Bachelor's degree in a related technical field with an
outstanding academic record



*Nice to have*

·      Experience in dealing with TBs of daily events and PBs of data in
Hadoop is a big plus

·      Experience with newer big data technologies (Spark, Impala, Presto,
Parquet, GraphX) is a big plus

·      Experience with Machine Learning, recommendation engines, R, or other
advanced analytics is a big plus

·      Experience with analytics tools (Tableau, Qlik, Jasper, Cognos)



-----------------------
Thanks & Regards,
Praveeth Chava,
Accounts Manager
Ph: 248-473-0720 EXT 163
Email: [email protected]

Gtalk: [email protected]

Please consider the environment before printing this e-mail

-- 
You received this message because you are subscribed to the Google Groups "SAP 
Workflow" group.