>
> Hi,
>
> Hope you are doing great!
>
> I have an urgent requirement below. Please go through it and share your
> updated profile ASAP at *[email protected]*
>
> *Location:  Raleigh, NC*
>
> *Duration:  10-12 months+*
>
>
>
> *Responsibilities:*
>
>    - Work across technology teams to understand and define system and
>    technical requirements for Big Data, especially in the context of data
>    preparation for predictive modeling
>    - Execute the design and development of data ingestion and data
>    transformation for big data applications
>    - Serve as the development team's point of contact for the project
>    manager, lead systems analyst, and QA lead
>    - Design and deliver complex architectures for customers
>    - Participate in requirements gathering and scope estimation meetings
>    - Deliver the Technical Systems Design (TSD) document and review all
>    project artifacts, including requirements (SRA) and functional
>    specifications (SDS)
>    - Manage development of content pipeline and processing systems for
>    real-time and predictive analytics
>    - Performance-tune Hadoop jobs (Hive, Pig, MapReduce, etc.)
>    - Rapidly prototype and validate big data applications
>    - Perform on-call support duties on a rotational schedule
>
> *Education and Experience*
>
>    - Bachelor's degree in Computer Science or another technical discipline
>    preferred
>    - *7+* years of software development experience
>    - *5+* years of Oracle PL/SQL (or other RDBMS) development experience
>    - *2+* years of big data development experience
>    - *5+* years of experience with scripting languages (Shell, Perl, and
>    Hive)
>    - Control-M experience a plus
>    - Waterfall and Agile software lifecycle methodology experience
>    - Experience delivering enterprise Java applications or web services
>
> *Skills and Knowledge*
>
>    - Proven intermediate-to-advanced knowledge of the Cloudera Distribution
>    of Hadoop (CDH 5.x)
>    - Expertise with database and big data technologies is required
>    - Real project experience implementing data transformation and
>    processing solutions with tools such as Hive, Pig, Sqoop, or MapReduce
>    - Strong understanding of MapReduce internals, parameter tuning, and
>    monitoring
>    - Working experience with messaging systems and data pipelines
>    preferred
>    - Fundamental understanding of HDFS: file formats, compression codecs,
>    block splits
>    - Understanding of NoSQL databases such as HBase, Cassandra, or MongoDB
>    a plus
>    - Knowledge of open-source tools such as Maven, Ant, Git, and Java unit
>    testing a plus
>    - Strong scripting knowledge using Perl, Korn Shell, Python, or other
>    scripting languages
>
> *Languages and Applications Utilized*
>
>
>    - Java, Perl, Python, Korn Shell
>    - Big Data Languages: Hive, Pig, MapReduce
>    - Big Data Ingestion: Sqoop
>    - Platforms and Operating Systems: Hadoop (HDFS and MapReduce), YARN,
>    Linux/Unix/AIX
>
>

> *Amith*
>
> *Sr. Recruitment Manager*
>
> *Zenith Tech Solutions*
> *Desk: 518-621-0046*
> *Fax: 518-244-4977*
> *3 Park Hill*
>
> *Albany, NY 12204*
> *zenithtechsolutions.com*
>

-- 
You received this message because you are subscribed to the Google Groups "Hot 
List" group.