For immediate response please reply on kum...@selectsourceintl.com

Need U.S. Citizens or Green Card holders only (for H-1B, a visa copy is required at the time of submission)


*Role:     Hadoop Developer*

*Duration: 6 Months*

*Client:   AT&T*

*Location: San Francisco, CA 94108*



*Qualifications:*

Ratings are on a scale of 0 (No Familiarity) to 5 (Expert); "3+ Yrs." indicates minimum years of experience.

- Developer: build programs that leverage the parallel capabilities of Hadoop and MPP platforms: 0 of 5, 3+ Yrs.
- Collaboration: 4 (Very Strong) of 5
- Columnar DB solutions (Vertica, Cassandra, Greenplum) for data management: 0 of 5, 3+ Yrs.
- Hortonworks Hadoop distribution components and custom packages: 0 of 5, 3+ Yrs.
- Loading external data to Hadoop environments using Flume: 0 of 5, 3+ Yrs.
- Loading external data to Hadoop environments using MapReduce: 0 of 5, 3+ Yrs.
- Loading external data to Hadoop environments using Sqoop: 0 of 5, 3+ Yrs.
- Pig scripting to manage data: 0 of 5, 3+ Yrs.
- Self-motivated: 4 (Very Strong) of 5
- Teamwork: 4 (Very Strong) of 5
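As an illustration of the "loading external data to Hadoop environments using MapReduce" qualification above, a minimal job of this kind can be written against the Hadoop Streaming interface, which pipes input splits to a mapper on stdin and the sorted mapper output to a reducer on stdin, both exchanging tab-separated key/value lines. The word-count sketch below is purely illustrative and is not part of the posting; the file name and job shape are assumptions, not requirements from the client.

```python
#!/usr/bin/env python3
"""Hypothetical Hadoop Streaming word-count job (illustrative sketch only).

Hadoop Streaming runs this script twice: once as the mapper, once as the
reducer. Mapper and reducer communicate via tab-separated key/value lines;
Hadoop sorts the mapper output by key before the reducer sees it.
"""
import sys
from itertools import groupby


def mapper(lines):
    # Emit (word, 1) for every whitespace-separated token in the input.
    for line in lines:
        for word in line.split():
            yield word, 1


def reducer(pairs):
    # Pairs arrive sorted by key, so groupby can sum counts per word.
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)


if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    if mode == "map":
        for key, value in mapper(sys.stdin):
            print(f"{key}\t{value}")
    else:
        raw = (line.rstrip("\n").split("\t") for line in sys.stdin)
        for key, value in reducer((k, int(v)) for k, v in raw):
            print(f"{key}\t{value}")
```

Such a script would be submitted with the standard `hadoop jar hadoop-streaming.jar -mapper ... -reducer ...` invocation; the exact jar path depends on the Hortonworks distribution in use.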



*Responsibilities:*

*Overall Purpose:* Responsible for developing distributed computing tasks, including MapReduce, NoSQL, and other distributed-environment technologies, chosen as needed to solve the problems that arise. Writes code, completes programming and documentation, and performs testing and debugging of applications. Analyzes, designs, programs, debugs, and modifies software enhancements and/or new products used in local, networked, or Internet-related computer programs. May interact with users to define system requirements and/or necessary modifications.

*Roles & Responsibilities:*

1) 8-10 years of experience developing software applications, including analysis, design, coding, testing, deployment, and support.
2) Proficiency in application/software architecture (definition, business process modeling, etc.).
3) Project management experience.
4) Experience building Big Data solutions using Hadoop technology.
5) Extensive experience with software development and the complete software lifecycle (analysis, design, implementation, testing, quality assurance).
6) Ability to work with non-technical resources on the team to translate data needs into Big Data solutions using the appropriate tools.
7) Extensive experience developing complex MapReduce programs against structured and unstructured data.
8) Experience loading data into Hive and writing software that accesses Hive data.

*Project Name:* BDaaS

*Additional Job Posting Description Details:*

The candidate will share ownership of Hadoop cluster configuration, deployment, and maintenance; troubleshoot component and/or application failures; and work with the Operations and Development teams to resolve them. The ideal candidate will help make our Hadoop stack more robust and reliable.

- Operational management of Hadoop clusters offered as a service
- Deploy and manage Hadoop using Savanna on OpenStack infrastructure
- Optimize Hadoop configurations
- Expert-level systems knowledge of Linux
- Troubleshoot Hadoop installations

Is the candidate required to be a U.S. Citizen or U.S. National? No
