Dear all,

Please let me know if you would be interested in the position described below.
If so, please send your resume to bramhprakash.mis...@infogium.com.



Position – Big Data/Spark Developer

Location – New York, NY

Duration – 6+ months




Must be Spark certified and strong in SQL, data structures, and algorithms in
addition to Spark.

Job Description:

   - 1-2 years of Spark Streaming experience with Scala/Python and Java in a
     real-time production environment is required.

Specific responsibilities include:

   - Develop and maintain our Big Data pipeline that transfers and processes
     several terabytes of data using Apache Spark, Python, Apache Kafka, Hive,
     and Impala.
   - Design and build reports and dashboards using Tableau or other reporting
     tools.
   - Perform ad hoc data analysis, data processing, and data visualization
     using SQL and other scripts.
   - Work directly with product stakeholders at the company to help them solve
     their most difficult problems.

Required Skills:

   - Excellent programming skills in Python, Scala, Perl, shell scripting, etc.
   - Familiarity with SQL and NoSQL technologies.
   - Experience with the Hadoop ecosystem in production.
   - Proficiency with relational databases and SQL.
   - Experience with Tableau or other reporting tools.
   - Well versed in software development principles and processes, including
     analysis, design, and continuous delivery.
   - Good communication skills for working with diverse groups, including Data
     Science, Marketing, Finance, Product Management, and QA.



Regards,

Bramha
Sr. Technical Recruiter
bramhprakash.mis...@infogium.com
+16786488258
www.infogium.com

