*This is Anuradha from Coco Soft; I am working on the requirement below. Please
help me with an updated resume as soon as possible.*

*Reply to [email protected]*


*Role: Big Data Developer (Spark Streaming, Kafka, Storm)*

*Location: New Jersey (1 position) and Wilmington, DE (3 positions)*

*Duration: Contract*


We need very strong profiles; I-94 status verification and a photo ID are
mandatory.



• Minimum of 8 years of experience, with 3+ years of development experience in
Big Data technologies such as Hadoop and Scala/Python.

• Ability to own and establish the physical architecture for a Big Data platform.

• Ability to design and support development of a data platform for data
processing (data ingestion and transformation) and a data repository using
Big Data technologies from the Hadoop stack, including HDFS, MapReduce,
Spark, Scala, Hive, and Impala.

• Past experience building proofs of concept with Big Data technologies to
test various use cases.

• Ability to support logical data model design and convert it into a physical
data model.

• Ability to design and support development of a data mart using Oracle and
the Ab Initio ETL platform.

• Ability to design and support RESTful API-based web services for data
distribution to downstream applications.

• Past experience working with best practices and standards for Big Data
platforms and web services.

• Past experience translating functional and technical requirements into
detailed designs.



 Best Regards

Anu

Coco Soft Inc.,

#3909 Washington Blvd, Ste 202, Fremont, CA 94538, U.S.A.

Direct: 510-936-6956 | Email: [email protected]

Fax: 510-338-9819

Web: http://coco-soft.com

An E-Verified Company
