*Title: Hadoop Engineer*
*Location: Pleasanton, CA*
*Position Type: Contract 12 Months*
*Local candidates required; must be available for a face-to-face interview*
The Hadoop Engineer shall support the State Fund enterprise architecture
team. The consultant will provide professional services to support
long-term IT strategy and planning, including high-level analysis,
professional reports and presentations, and mentoring and support.

State Fund, at its discretion, can utilize Consultant resources in any
capacity to meet the needs of State Fund. Consultant resources may be
assigned by State Fund to lead teams, work independently, work on teams
led by State Fund or Consultant resources, or work on teams led by other
consulting firms.
The tasks for the Hadoop Engineer include, but are not limited to, the
following:
· Translate client user requirements into a technical architecture
vision and implementation plan.
· Design and implement an integrated Big Data platform.
· Design and implement data collectors to collect and transport
data to the Big Data Platform.
· Implement monitoring solution(s) for the Big Data platform to
monitor the health of the infrastructure.
*Technical Knowledge and Skills:*
Consultant resources shall possess most of the following technical
knowledge and experience:
· 2+ years of hands-on development, deployment, and production
support experience in a Hadoop environment.
· 4-5 years of programming experience in Java, Scala, Spark, and Python.
· Proficient in SQL and relational database design.
· Knowledge of NoSQL systems such as HBase or Cassandra.
· Hands-on experience with Cloudera Distribution 5.x, Java, and Solr.
· Hands-on experience creating and indexing Solr collections in a
SolrCloud environment.
· Hands-on experience building data pipelines using Hadoop
components such as Sqoop, Hive, Pig, Solr, MapReduce, Spark, and Spark SQL.
· Must have experience developing HiveQL and UDFs for analyzing
semi-structured and structured datasets.
· Must have experience with the Spring framework.
· Hands-on experience ingesting and processing various file formats
such as Avro, Parquet, SequenceFile, and text files.
· Must have working experience in data warehousing and Business
Intelligence.
· Expertise in writing scripts in a Unix/Linux environment.
· Successful track record of building automation scripts/code using
Java, Bash, Python, etc., and experience in production support issue
resolution.
· Hands-on experience working with real-time analytics.
· Working knowledge of the Aspire Content Processing Platform.
· Experience working with users on requirements gathering.
· Experience with Agile development methodologies.
*Thanks & Regards,*
*Saicon Consultants, Inc.*
(408) 216-2646 Ext 149 (W)
(913) 273-0058 (F)
*SBA 8(a) Certified /WBE/MBE/DBE/SDB*
*Inc. 500 Company – 2006, 2007, 2008, 2009 & 2010*
*Ranked #1 "Fastest-Growing Area Businesses" - Kansas City Business
Journal - 2006*
*Ranked in Top 10 of Corporate 100 - Ingram's - 2006 & 2007*
*CMMI Level 3 *
*ISO 9001:2008 Certified*