*Hi,*

*Hope you are doing well.*

*Please find the requirement below and send matching resumes to
[email protected], or call 732-343-7688 x 3049.*



*POSITION: Teradata ETL Developer*

*LOCATION: Temple Terrace, FL*

*DURATION: 6 Months*



MUST HAVE EXPERIENCE WITH HADOOP



*Rate:* $55/hr



Minimum of 7 years of relevant work experience in the information
technology field. Participation in the full Software Development Life Cycle
(SDLC) of data warehousing projects: project planning, business requirement
analysis, data analysis, logical and physical database design, setting up
the warehouse physical schema and architecture, developing reports,
security, and deployment to end users. Good communication skills and
business knowledge.

Minimum of 5 years of strong Teradata and SQL, UNIX, data warehouse
modeling, Teradata utilities, aggregates, and building efficient views;
other SQL and UNIX experience plus Teradata certification may be
substituted for years of Teradata experience.

Installed and configured the Apache Hadoop, Hive, and Pig environment on
the prototype server.
Configured a MySQL database to store Hive metadata.
Responsible for loading unstructured data into the Hadoop Distributed File
System (HDFS).



- Designs and codes from specifications; analyzes, evaluates, tests,
debugs, documents, and implements complex software applications.
- Uses coding methods in specific programming languages to initiate or
enhance program execution and functionality.
Applicant must be able to demonstrate the following:

Design and write system specifications.
Use Teradata utilities (FastLoad, MultiLoad, TPump, and TPT) to load data.
Write BTEQ scripts to transform data.
Write FastExport scripts to export data.
Write, test, and implement Teradata FastLoad, MultiLoad, and BTEQ scripts,
DML, and DDL.
Construct Korn shell driver routines (write, test, and implement UNIX
scripts).
Write views based on user and/or reporting requirements.
Must be available for on-call response in order to provide 24x7 emergency
response capabilities.
Proficient in SQL.
Over 2 years of experience with Big Data technologies such as Hadoop,
MapReduce, Hive, Pig, and Sqoop.
Experience with NoSQL databases such as MongoDB and HBase.
Experience in installation and configuration of Hadoop clusters.
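For candidates unfamiliar with the "Korn shell driver routine" task listed above, a minimal sketch of that kind of wrapper is shown below: it feeds a BTEQ script to the bteq client, captures the session log, and propagates the return code. The function name, the .btq/.log naming convention, and the BTEQ_CMD override are illustrative assumptions, not details from this posting.

```shell
# run_bteq: sketch of a driver routine that executes one BTEQ script.
# Assumes the Teradata "bteq" client is on PATH; BTEQ_CMD may be
# overridden (e.g. with a stub) for testing outside a Teradata host.
run_bteq() {
    script="$1"
    log="${script%.btq}.log"

    # BTEQ reads its commands from stdin; stdout and stderr go to the log.
    "${BTEQ_CMD:-bteq}" < "$script" > "$log" 2>&1
    rc=$?

    if [ "$rc" -ne 0 ]; then
        echo "BTEQ failed (rc=$rc); see $log" >&2
        return "$rc"
    fi
    echo "BTEQ completed; log in $log"
}
```

In practice such a driver is where scheduling, locking, and notification logic live, so that the BTEQ scripts themselves stay limited to SQL and transformation work.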

*Thanks for your Time and Support*

*Rajesh | Technical Recruiter*

*Software Programming Group*

*15 Corporate Place South, Suite #421, Piscataway, NJ - 08854*

*Tel: 732-343-7688 x 3049 || Fax: 732-343-7689*

*Email: [email protected] | URL: www.spgamerica.com*

*Hangout: bvrajeshgowd*

*IT Services ||  Consulting*

*USA || INDIA*

-- 
You received this message because you are subscribed to the Google Groups 
"US_IT_ Jobs" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/chandrakants.
For more options, visit https://groups.google.com/d/optout.
