Hi,


I have some urgent requirements with my direct client. Please send me your
updated resume along with your hourly rate / yearly salary expectations if
you are interested. If you are not interested, it would be great if you
could share this position with friends who may be a potential fit.



*Big Data Engineer (Hadoop, Talend)*

*Location: Portland, OR*



*The Skype interview drive will be held from 5-8 PM PST on Wednesday the 27th*



*Required Technical Expertise*

·         Participate in technical planning & requirements-gathering phases,
including design, coding, testing, troubleshooting, and documenting *big
data-oriented software applications*. Responsible for the ingestion,
maintenance, improvement, cleaning, and manipulation of data in the
business’s operational and analytics databases, and for troubleshooting any
existing issues.

·         Implement, troubleshoot, and optimize distributed solutions
based on modern big data technologies like *Hive, Hadoop, Spark,
Elasticsearch, Storm, Kafka*, etc., in both on-premises and cloud
deployment models to solve large-scale processing problems

·         Experience with Big Data & Analytics solutions such as *Hadoop,
Pig, Hive, Spark, Spark SQL, Storm, AWS (EMR, Redshift, S3, etc.)/Azure
(HDInsight, Data Lake design)* and other technologies

·         Exposure to the *MS Azure platform, Healthcare, and Analytics*;
technical leadership skills to drive the development team and business in
the right direction

·         Design, enhance, and implement an *ETL/data ingestion* platform
on the cloud.

·         Strong data warehousing skills, including data cleanup, ETL, ELT,
and handling scalability issues for an enterprise-level data warehouse

·         Create ETLs/ELTs to take data from various operational systems
and create a unified/enterprise data model for analytics and reporting.

·         Create and maintain ETL specifications and process documentation
to produce the required data deliverables *(data profiling, source-to-target
maps, ETL flows)*

·         Strong data modeling/design experience. Experience with a data
modeling tool *(ER/Studio)*

·         Capable of investigating, familiarizing with, and mastering new
data sets quickly

·         Strong troubleshooting and problem-solving skills in large data
environments

·         Experience with building data platforms on the cloud (AWS or Azure)

·         Experience in using *Python, Java*, or any other language to
solve data problems

·         Experience in implementing SDLC best practices and Agile methods

·         Knowledge of Big Data concepts and technologies like MDM, Hadoop,
Data Virtualization, and *Reference Data/Metadata Management* preferred

·         Experience working with Team Foundation Server/JIRA/GitHub and
other code management toolsets

·         Strong hands-on knowledge of solutioning languages like: *Java,
Scala, Python*

·         Healthcare domain knowledge is a plus



*Technology Stack*

·         big data-oriented software applications

·         Hive, Hadoop, Spark, Elasticsearch, Storm, Kafka

·         Pig, Spark SQL, Storm, AWS (EMR, Redshift, S3, etc.)/Azure
(HDInsight, Data Lake design)

·         MS Azure platform, Healthcare and Analytics

·         ETL/data ingestion platform on the cloud.

·         Java, Scala, Python

·         MDM, Hadoop, Data Virtualization, Reference Data/Metadata
Management preferred

·         SDLC best practices and Agile methods



*Years of Experience*

·         Bachelor’s degree with a minimum of 10+ years’ relevant
experience or equivalent.

·         10+ years of industry experience in data architecture / Big Data /
ETL environments.

·         10+ years of experience in designing and operating very large
data platforms

·         6+ years of experience with ETL design using tools such as *Informatica,
Talend, Oracle Data Integrator (ODI), Dell Boomi* or equivalent.

·         4+ years of experience with Big Data & Analytics solutions such as *Hadoop,
Pig, Hive, Spark, Spark SQL, Storm*, AWS (EMR, Redshift, S3, etc.)/Azure
(HDInsight, Data Lake design) and other technologies

·         3+ years of experience in building and managing hosted big data
architectures; toolkit familiarity with: *Hadoop with Oozie, Sqoop, Pig, Hive,
HBase, Avro, Parquet, Spark, NiFi*









Thanks and Regards,




 Nityo Infotech Corp.
 666 Plainsboro Road,
 Suite 1285, Plainsboro, NJ  08536



 *Dev Chauhan *

* Sr. IT Recruiter*
* Ph:* *609-853-0818 Ext: 2290*

 *[email protected] <[email protected]>*

 *www.nityo.com* <http://www.nityo.com/>




USA | Canada | India | Singapore | Malaysia | Indonesia | Philippines |
Thailand | UK | Australia | New Zealand
*Nityo Infotech has been rated as one of the top 500 fastest-growing
companies by INC 500*

*Disclaimer:* http://www.nityo.com/Email_Disclaimer.html
