*Hi,*
*Please review the position below and, if you are comfortable with it,
please send me your updated resume.*

*Position:* Hadoop Developer
*Location:* Dallas, TX

*Duration:* 6+ Months

*Interview:* Phone, then Face to Face

*GC, USC Only*

*Job Description*

·         The *Senior Hadoop Developer* is responsible for designing,
developing, testing, tuning, and building a large-scale data processing
system for data ingestion and data products that allow the client to
improve the quality, velocity, and monetization of its data assets for
both operational applications and analytical needs.

·         This position supports that goal through strong experience in
software engineering and in developing solutions within the Hadoop
ecosystem.
*Job Responsibilities:*

·         Responsible for the design, development, and delivery of data
from operational systems and files into operational data stores (ODSs),
downstream data marts, and files.

·         Troubleshoot and develop on Hadoop technologies, including HDFS,
Hive, Pig, Flume, HBase, Spark, and Impala, as well as Hadoop ETL
development via tools such as Informatica.

·         Translate, load, and present disparate data sets in multiple
formats from multiple sources, including JSON, Avro, text files, Kafka
queues, and log data.

·         Implement quality logical and physical ETL designs optimized to
meet the operational performance requirements of our multiple solutions
and products; this includes implementing sound architecture, design, and
development standards.

·         Designs the optimal performance strategy and manages the
technical metadata across all ETL jobs.

·         Responsible for building solutions involving large data sets
using SQL methodologies and data integration tools such as Informatica in
any database, preferably on an MPP platform.

·         Has strong Core Java programming experience to apply in data
integration work.

·         Works with BAs, end users, and architects to define and process
requirements, build code efficiently, and collaborate with the rest of
the team on effective solutions.

·         Deliver projects on time and to specification, with quality.

*Job Requirements*

·         5 years' experience in ETL tool development, with senior-level
Hadoop ecosystem experience.

·         Experience working in Data Management projects.

·         Experience working in Hive or related tools on Hadoop:
performance tuning, file formats, designing and executing complex Hive
HQL queries, and data migration/conversion.

·         Experience working with Spark for data manipulation,
preparation, and cleansing.

·         Experience working with ETL tools (Informatica/DS/SSIS) for
data integration.

·         Experience designing and developing automated analytic software,
techniques, and algorithms.

*Additional Essential Functions:*

·         Ability to apply mastery-level knowledge of at least one
relational database (DB2, MS SQL Server, Teradata, Oracle 8i/9i/10g/11g).

·         Ability to apply mastery-level knowledge of at least one data
integration tool (Informatica, SSIS).

·         Expert ability and hands-on experience in SQL and Core Java are
a must.

·         Experience with Unix/Linux and shell scripting.

·         Ability to demonstrate experience in distributed UNIX
environments.

·         Ability to work both independently and in a collaborative
environment.
·         Excellent problem-solving, communication, and interpersonal
skills.

·         Ability to analyze information and use logic to address
work-related issues and problems.

·         Ability to demonstrate proficiency in Microsoft Access, Excel,
Word, PowerPoint and Visio.

·         Ability to present to a group.

·         Experience working in an Agile/DevOps environment is a plus.

·         Experience with or knowledge of web architecture (JavaScript,
SOAP/XML, WebLogic, Tomcat) is a plus.

·         Experience with an ORM framework, SOA architecture, or
microservices is a plus.

·         Experience with middleware components (ESB, API Gateway) is a
plus.

*Minimum Education:*

·         BS in Computer Science, Information Systems, or a related field
preferred, and/or equivalent experience; Master's preferred.

Regards,

Kailash Negi

VSG Business Solutions

221 Cornwell Dr Bear, DE 19701

Email ID: *kail...@vsgbusinesssolutions.com*

Phone: 302-261-3207 x 102
