Hi,


Hope you are doing well!



I have an urgent requirement for a *Senior Big Data Engineer*. Please let
me know if you have any available consultants and get back to me with
their updated resume in .doc format as soon as possible. The job
description is given below.



Job Title: *Senior Big Data Engineer*

Location: Irvine, CA

Duration: 6+ Months

Total Experience: 10+ Years (Mandatory)

Interview Mode: Phone / F2F



*Position Summary:*

·         The Senior Big Data Engineer will work within the Enterprise Data
Services team in an Agile environment to install, update, maintain, monitor
and support the Hadoop and Enterprise Data Warehouse/Enterprise Business
Intelligence tools, applications and databases that serve all Kelley Blue
Book business units. The individual will work closely in a team of other
big data and data warehouse engineers, business analysts, business
intelligence engineers, system analysts, quality assurance engineers and
database administrators to develop data warehouse solutions that meet
stated requirements, and to produce detailed specifications and unit test
plans for implementation. Since EDW works with multiple business units to
create cross-functional solutions, the Senior Big Data Engineer must be
able to interface with various business units to understand their
requirements and prepare documentation to support development.

·         The individual must be capable of understanding the Hadoop
ecosystem, the complex object design, and the underlying data model of the
system. The individual must be comfortable developing data-centric
applications using Hadoop tools, Netezza, Informatica, Informatica BDE (Big
Data Edition), Hive, MapReduce and Spark; able to develop ETL packages; and
expected to develop queries and stored procedures involving complex
database structures and the Hadoop Distributed File System (HDFS). The
individual must have excellent communication skills, work well in a team
environment, enjoy solving complex problems and be able to work in a
fast-paced environment.



*The Responsibilities:*

·         Define technical scope and objectives through research and
participation in requirements-gathering and definition of processes

·         Gather and process raw, structured, semi-structured, and
unstructured data at scale, including writing scripts, developing
programmatic interfaces against web APIs, Web logs, processing real time
feeds, etc.

·         Design, review, implement and optimize data transformation
processes in the Hadoop (primary) and Informatica ecosystems

·         Test and prototype new data integration tools, techniques and
methodologies

·         Adhere to all applicable development policies, procedures and
standards

·         Participate in functional test planning and testing for the
assigned application integrations, functional areas and projects

·         Work with the team in an Agile/SCRUM environment to ensure a
quality product is delivered

·         Respond rapidly and work cross-functionally to deliver
appropriate resolutions of technical, procedural, and operational issues





*Minimum of three years’ experience with the following:*

·         Experience architecting and integrating the Hadoop platform with
traditional RDBMS data warehouses.

·         Experience with major Hadoop distributions like Cloudera
(preferred), HortonWorks, MapR, BigInsights, or Amazon EMR is essential.

·         Experience with ETL tools such as Informatica

·         Experience developing within the Hadoop platform including Java
MapReduce, Hive, Pig, and Pig UDF development.

·         Excellent oral and written communication skills

·         Excellent customer service skills

·         Excellent analytical and problem-solving skills

·         Working knowledge of Linux O/S environments



*Preferred Skills & Experience:*

·         Experience with logical, 3NF or Dimensional data models.

·         Experience with NoSQL databases like HBase, Cassandra, Redis and
MongoDB.

·         Experience with Hadoop ecosystem technologies like Flume, Kafka
and Spark.

·         Experience with Netezza and Oracle.

·         Experience with Informatica Big Data Edition.

·         Certifications from Cloudera, HortonWorks and/or MapR.

·         Knowledge of Java SE, Java EE, JMS, XML, XSL, Web Services and
other application integration related technologies

·         Familiarity with Business Intelligence tools and platforms like
Datameer, Platfora, Tableau and Microstrategy a plus.

·         Experience in working in an Agile/SCRUM model.





*Sadiq Shaik*

ASAP Solutions Group LLC,

678-221-4992 (ext) 217

ssh...@myasap.com <tdamoda...@myasap.com>
