*Hi All,*

*Please check and let me know if you are interested.*


*Kindly mail me at [email protected]*


*Position    : Data Engineer*

*Location   : San Francisco CA*

*Duration   : 6 Months*

*Interview : In person*


*ONLY GC/USC*


Looking for a senior profile. Our team is looking for an experienced Data
Engineer with a passion for building data products and data systems.

As a key member of the Data team, you will be responsible for designing and
developing major components of big data stream and batch processing
applications. As a Data Engineer you should be familiar with, and have
hands-on experience with, all aspects of big data engineering, from
ingesting data from various types of sources to common data cleansing and
transformation techniques. *You should have proven expertise in developing
a publish-subscribe distributed logging system using Kafka and the Camus
data ingestion framework with Avro serialization.* You should be able to
write scalable MapReduce jobs to extract data from HDFS into Hive, HBase
and Amazon Redshift as necessary. You should be proficient in Python to
build a scheduler pipeline using Airflow or similar technologies. Agility
and innovativeness are the keys to success in this role.
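For context, a scheduler pipeline of the kind described above might look like the following Airflow DAG. This is a minimal sketch assuming Apache Airflow 2.x; the DAG id, schedule, and shell scripts are hypothetical placeholders, not part of the actual role:

```python
# Minimal sketch of a daily Kafka -> HDFS/Hive -> Redshift pipeline in Airflow.
# Assumes Apache Airflow 2.4+ (the `schedule` argument; older versions use
# `schedule_interval`). All task ids and scripts below are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="kafka_to_redshift_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    # Ingest the day's Avro-serialized log events from Kafka into HDFS
    ingest = BashOperator(
        task_id="ingest_logs",
        bash_command="run-camus-ingest.sh {{ ds }}",  # hypothetical script
    )

    # Cleanse and transform the raw events into Hive tables
    transform = BashOperator(
        task_id="transform_to_hive",
        bash_command="run-hive-transform.sh {{ ds }}",  # hypothetical script
    )

    # Load curated tables into Amazon Redshift for downstream consumers
    load = BashOperator(
        task_id="load_redshift",
        bash_command="run-redshift-load.sh {{ ds }}",  # hypothetical script
    )

    ingest >> transform >> load  # task ordering: ingest, then transform, then load
```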

*Key Responsibilities:*
Build and maintain code to populate HDFS/Hadoop with log events from Kafka
or other production SQL systems
Design, build and support pipelines for data transformation, conversion and
validation
Design and support effective storage and retrieval for a 2-petabyte big
data ecosystem
Design and support an Avro serialized schema repository, using Hive or
Spark as necessary for different use cases

*Support and tune a big data pipeline running from Kafka to HDFS, HBase and
Amazon Redshift*
Lead the effort of *building a unified Kafka cluster* to support multiple
consumers
Participate in upgrading Kafka to the latest version
Lead the company initiative to migrate from the Hortonworks Data Platform
(Hadoop) to Amazon Elastic MapReduce

*Qualifications:*
Experience with the Hadoop stack (Hive, Spark, HBase, Hadoop streaming) and
MapReduce
Familiarity with different data formats and serialization, such as JSON and
Avro
Strong grasp of algorithms and data structures
Database experience with MySQL and Postgres
Proficiency in these languages: Java, Python
Experience with test-driven development and SCM tools such as Git
Good familiarity with Linux/Unix scripting
MS in Computer Science/Engineering is required
Strong communication skills



Thanks

Nikhil Prasad

[email protected]

201-620-9700 x130

Apetan Consulting LLC
