Send your resumes to

*HADOOP DEVELOPER and ETL background*

*Phoenix, AZ (Onsite only)*

*Multiyear long-term*

Our client (American Express) is embarking on an exciting transformation
driven by an energetic new team of high performers. This group is nimble
and creative, with the power to shape our technology and product roadmap. If
you have the talent and desire to deliver innovative payment products and
services at a rapid pace, serving our customers seamlessly across physical,
digital, mobile, and social media, join our transformation team, "Big Data
Labs." You will be part of a fast-paced, entrepreneurial team responsible
for delivering world-class platform capabilities on our Big Data platform.

You will be challenged with identifying innovative ideas and proof of
concept to deliver against the existing and future needs of our customers.

Experience with the MapR distribution is a big plus.


• Core Java

• HDFS/MapReduce fundamentals

• Native Java MapReduce (writing/tuning/customizing)

• HBase (working knowledge of writing batch/real-time Java applications)
Must also have ETL experience.


·         Design and build robust solutions for Big Data problems.

·         Guide the full lifecycle of the Big Data solution, including
technical architecture design, solution design, solution development,
testing, and deployment.

·         Define and document design and build standards.

·         Provide subject matter expertise and stay abreast of the latest
technologies to solve problems associated with business intelligence and
analytics in the Big Data world.

·         Provide technical leadership & coaching to junior developers.

·         Research new technologies for the team.


·         Bachelor's degree in Engineering or Computer Science (or
equivalent), or Master's in Computer Applications (or equivalent).

·         Minimum of 3-6 years of relevant experience in Big Data
development; should possess a bachelor's or master's degree in Computer Science.

·         Hands-on development experience with the Big Data technology stack
(HDFS, MapReduce, Hive, Pig, message queues, Solr/Elasticsearch, Talend, etc.).

·         Expertise in Java/C++ and Python.

·         Experience with high-scale or distributed RDBMSs.

·         Expert in Hadoop architecture, with knowledge of MapReduce,
HBase, Pig, Hive, YARN, ZooKeeper, Solr, Sqoop, and Flume.

·         Must have 1+ years of recent hands-on web application development
experience in J2EE (Struts/JSP/Servlets).

·         Must be proficient with SQL, PL/SQL, and stored procedures;
Teradata experience would be a plus.

·         Should have recent experience using WSAD and deploying to
WebSphere 6.x.

·         Experience in UNIX shell scripting is a plus.


·         Provide advanced technical and functional knowledge.
·         Effectively leverage general knowledge of the relationships among
all relevant functional groups within American Express.

Thanks & Regards,


*Unicom Technologies Inc*

Phone: 309-740-1565 | Fax: 866-291-2541

Email: URL

Gtalk: unicomaditya
