Greetings from *Diverse Lynx, LLC*.

We have an urgent need for a *Hadoop Developer* for one of our clients in
*Denver, CO*. Please review the requirement below. If you or your consultants
are open to projects and interested, please respond with your latest resume
and details ASAP.

*Title:* Hadoop Developer

*Location:* Denver, CO

*Duration:* 12+ Months

*Req Details:*

*Core Responsibilities:*

·         Develop Big Data solutions on a Hadoop platform, leveraging
current ecosystem tools

·         Develop solutions for real-time and batch-mode event/log
collection from various data sources

·         Analyze massive amounts of data and help drive prototype ideas
for new tools and products

·         Develop enterprise-grade integration solutions, leveraging 3rd
party and custom integration frameworks

·         Build and support APIs and services that are exposed to other
internal teams

·         Actively participate in team Agile planning and sprint execution

*Qualifications:*

·         Bachelor's or Master's in Computer Science or equivalent

·         Proven track record of delivering backend systems that
participate in a complex ecosystem.

·         8+ years designing and developing Enterprise-level data,
integration, and reporting solutions

·         3+ years' experience developing applications on Hadoop, utilizing
Pig, Hive, Sqoop, or Spark - *MUST*

·         Experience with Hadoop 2.0 and YARN applications

·         Proven experience with data modeling, complex data structures,
data processing, data quality, and data lifecycle

·         Current knowledge of Unix/Linux scripting, as well as solid
experience in code optimization and high-performance computing

·         Good communicator, able to analyze and clearly articulate complex
issues and technologies in an understandable and engaging way

·         Great design and problem-solving skills, with a strong bias for
architecting at scale

·         Good understanding of any of the following: advanced mathematics,
statistics
·         Adaptable, proactive and willing to take ownership.

·         Keen attention to detail and high level of commitment.

·         Experience in messaging and collection frameworks like Kafka,
Flume, or Storm.

·         3+ years of distributed database experience (HBase, Accumulo,
Cassandra, or equivalent).

·         Knowledge in Big Data related technologies and open source
frameworks preferred.

·         Experience in software development of large-scale distributed
systems
·         Experience with integration tools such as Pentaho or Informatica
Big Data Edition

·         Experience using Enterprise scheduling tools such as UC4, Tidal,
or Autosys

*Experience:* 9-12 years.

*Thanks & Regards*

*Ankur Gulati*

Diverse Lynx

300 Alexander Park, Suite 200, Princeton, NJ 08540

P: 732-452-1006 Ext 238
