Good Morning,
Please let me know if you have any contract opportunities for a Hadoop/Java
Developer. I have an excellent senior resource, Durga; his profile is below
and attached for your review.

----------------------------------------------------------------------------------------------

*Durga*

*(Certified Developer)*


[email protected]

425 209 1434



*Professional Summary*

·         Over nine years of IT experience in roles including Developer,
Senior Developer, Technology Lead, and *Big Data Hadoop Engineer*.

·         More than 2 years of working experience with *HDFS* and the
*MapReduce programming model using Java and Scala*.

·         Expertise in *Hadoop ecosystem tools such as Hive, Pig, Sqoop,
Kafka, Oozie & ZooKeeper*.

·         Hands-on experience with *NoSQL databases* such as *HBase with
Apache Phoenix*.

·         Hands-on experience working with large volumes of *structured
and unstructured data*.

·         Working experience in a 100-node *Cloudera Hadoop cluster*
production environment.

·         Working experience in *Hadoop batch job configuration and
scheduling*.

·         4+ years of experience in *Core Java and J2EE technologies such
as Spring MVC*.

·         Hands-on experience using *Agile methodology for software
development*.

·         Strong analytical and presentation skills, communicating
complex quantitative analysis in a clear, precise, and actionable manner.



*Education*

·         *Bachelor of Engineering* in Computer Science (B.E), Anna
University, INDIA.



*Certifications & Trainings*

v  *Cloudera Certified Developer* for Apache Hadoop.

v  Oracle Certified Professional, *Java SE Programmer*.



*Major Achievements*

v  Received “On the Spot” award for cost savings suggestion in AEGON.

v  Received appreciation from client for successful implementation on NDM
project in Nielsen.



*Software/Technical Skills*



*Big Data Skills/Tools:* MapReduce, HDFS, Hadoop Ecosystem (Sqoop,
Hive, Pig, Oozie, ZooKeeper), Kafka, Apache Spark, Scala, HBase, YARN.



*Java Skills/Tools:* Spring, Servlets, JSP, JDBC, JMS



*Web Technologies:* HTML, JavaScript, CSS, jQuery.



*Database Skills/Tools:* SQL, MySQL, HBase (NoSQL).





*Project Experience*



*Nielsen, Schaumburg, IL                                      May 2014 to Present*

*Big Data Hadoop / Java Consultant*

Project Name #1: Nielsen Data Manager



*Project Description:* Nielsen Data Manager (NDM) is a web application
developed to replace an existing desktop application built in .NET. The
use case is to support a daily data volume in the range of 50 GB to 1 TB,
ultimately handling about 10-15 TB of data. The project is divided into
three modules: Element Recognition, Fact Calculation, and Publishing &
Visualization.



*Responsibilities:   *

   - Involved in solution design discussions with architects.
   - Involved in various POCs to choose the right big data tools for
   business use cases.
   - Translated complex functional and technical requirements into
   detailed designs.
   - Migrated data from SQL Server into HDFS using the Sqoop tool.
   - Developed chained MapReduce jobs from data ingestion through fact
   calculation.
   - Defined workflows for the jobs and scheduled them using Oozie.
   - Used the Spring-JDBC pattern for handling and servicing web
   application REST requests.
   - Proposed best practices and standards.
   - Used MRUnit for MapReduce code testing.
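
For context, the map/reduce logic behind the chained jobs described above
can be sketched in plain Java. This is a dependency-free, word-count-style
illustration of the pattern only, not the project's actual code; the Hadoop
API (Mapper, Reducer, job chaining, Oozie wiring) is deliberately omitted
so the sketch runs standalone.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Dependency-free sketch of the map -> reduce pattern behind a chained
// Hadoop MapReduce job. Illustrative only: real jobs use the Hadoop API,
// with Oozie handling the chaining between stages.
public class MapReduceSketch {

    // Map phase: emit (token, 1) per word; reduce phase: sum per key.
    // Here both phases are folded into a single in-memory pass.
    public static Map<String, Integer> wordCount(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            for (String token : line.toLowerCase().split("\\s+")) {
                if (!token.isEmpty()) {
                    counts.merge(token, 1, Integer::sum); // reduce step
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> input = Arrays.asList("fact data fact", "data");
        System.out.println(wordCount(input).get("fact")); // 2
    }
}
```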


Environment: HDFS, MapReduce, Hive, Sqoop, HBase, CDH5, YARN, Shell
Script, Oozie, Apache Phoenix



*USAA, San Antonio, TX                                     Jun 2012 to March 2014*

*Java Developer  *

Project Name #4: PAS – Auto (enhancement)



*Project Description:* The PAS – Auto (enhancement) project adds enhanced
functionality to the existing application and modifies front-end systems to
suit the requirements. The project was executed using Agile methodology and
served as a pilot project.



*Responsibilities:   *

   - Applied J2EE design patterns including Singleton, DAO, DTO, and DI.
   - Involved in business meetings to understand business use cases and
   created functional and technical design documents.
   - Provided the best solutions to business problems.
   - Designed the system architecture by analyzing different tools and
   components.
   - Designed and implemented the DAO layer using Spring and Hibernate for
   online processing.
   - Designed and implemented batch processing components using Spring
   Batch.
   - Involved in the development of RESTful web services.
   - Designed UI components using JSP, JavaScript, and Ajax.
   - Used Agile methodology for project development.
   - Used the JUnit framework for unit testing of all Java components.
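
As context for the DAO/DTO patterns listed above, here is a minimal sketch
of the pattern with an in-memory store standing in for Spring and
Hibernate. The Policy type and its fields are invented for illustration
and are not the actual USAA model.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// DTO: plain data carrier with no behavior. Field names are assumptions.
class Policy {
    final String id;
    final String holder;
    Policy(String id, String holder) { this.id = id; this.holder = holder; }
}

// DAO: the persistence contract callers depend on, independent of the
// backing store (Hibernate, JDBC, or in-memory as below).
interface PolicyDao {
    void save(Policy p);
    Optional<Policy> findById(String id);
}

public class InMemoryPolicyDao implements PolicyDao {
    private final Map<String, Policy> store = new HashMap<>();

    public void save(Policy p) { store.put(p.id, p); }

    public Optional<Policy> findById(String id) {
        return Optional.ofNullable(store.get(id));
    }

    public static void main(String[] args) {
        PolicyDao dao = new InMemoryPolicyDao();
        dao.save(new Policy("P-100", "Jane Roe"));
        System.out.println(dao.findById("P-100").get().holder); // Jane Roe
    }
}
```

Because callers program against the PolicyDao interface, the in-memory
implementation can later be swapped for a Hibernate-backed one without
touching the calling code.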


Environment: Spring 2.5, JavaScript, Ajax, Spring Batch, JDBC, JBoss,
Oracle, SQL Server 2005



*AEGON, Edinburgh, UK                                January 2011 to May 2012*

*Java Developer  *

Project Name #6: Group Pension Plan enhancement (GPP)



*Project Description:* GPP is a migration project that converts the
existing GPP product from mainframe and Unisure to a Java-based system. The
system consists of three major modules: quotes, policy documentation, and
policy administration, along with other sub-modules that constitute the
main modules.

*Responsibilities:   *

   - Involved in understanding business needs and functional and
   non-functional requirements.
   - Interacted with business analysts on business queries.
   - Coordinated with the offshore team.
   - Performed impact analysis and gap analysis.
   - Developed functional and technical design documents.
   - Performed code changes and testing.
   - Involved in the design and coding of logic to generate questions in
   random order for an online test.
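
The random question ordering mentioned in the last bullet can be sketched
in a few lines of plain Java. This is illustrative only; the method name
and question identifiers are invented, not the project's actual code.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// Sketch of randomizing question order for an online test.
// Collections.shuffle performs an unbiased Fisher-Yates permutation.
public class QuestionShuffler {

    // Return a shuffled copy so the original question bank is untouched.
    public static List<String> randomOrder(List<String> questions) {
        List<String> copy = new ArrayList<>(questions);
        Collections.shuffle(copy);
        return copy;
    }

    public static void main(String[] args) {
        List<String> bank = Arrays.asList("Q1", "Q2", "Q3", "Q4");
        System.out.println(randomOrder(bank));
    }
}
```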



*Environment:* DAO, XML, JavaScript, Ajax, J2EE API, Spring, DB2.







*Bank of America, US                                 May 2007 to December 2010*

*IT Analyst  *

Project Name #7: Bank Online Support Systems.


*Project Description:* Bank Online Support Systems is an application at
Bank of America in which all customer-related information is stored and
processed. Operations include both online and batch impacts. Online
programming comes under the HOGAN umbrella, a utility that eases the
development of mainframe programs through functions such as Activity, DG,
PDF, and CDMF. Bank Online Support Systems equally deals with batch
operations, sending and receiving files to and from various financial and
other legal systems.



*Responsibilities:   *

   - Captured and analyzed system requirements and performed impact
   analysis.
   - Performed effort estimation for work requests and detailed schedule
   planning.
   - Prepared high-level and low-level designs for initiatives.
   - Prepared test plans and scripts, and executed them for the
   initiatives.
   - Involved in the conversion of customers, accounts, and addresses of
   three different banks (US Trust, LaSalle, Countrywide Financial)
   acquired by Bank of America.
   - Led a team of four in creating new products under the relationship
   pricing and combined statements project, one of the client's focus
   projects.

*Environment:* HOGAN, COBOL, JCL, DB2, IMS DB, Easytrieve, REXX,
File-AID, CA-7 Scheduler, MQ, SAR





I look forward to your reply!



Best Regards,

*Kailash Sahoo*

Phone: 425.209.1434

Email: [email protected]

