Immediate role for Senior Hadoop Developer

2020-02-18 Thread deepak kannoji
*Job Details:*

*Job Title: Sr. Hadoop Developer*

*Location: Edison, New Jersey*

*Duration: Long Term*



*Primary Skills: HBase, Hive, Hadoop, HDFS, MapReduce*

*# of Positions: 3*



*Kindly share resumes with dee...@kanisol.com or
reach me at 609-651-4663*


*Role Overview:*

*JOB DESCRIPTION*

· 9+ years of IT experience in all phases of Software Development
Life Cycle (SDLC) with skills in data analysis, design, development,
testing and deployment of software systems.

· Strong team player with the ability to work independently as well as
in a team, the ability to adapt to a rapidly changing environment, and a
commitment to learning; possesses excellent communication, project management,
documentation, and interpersonal skills.



*CHARACTERISTIC DUTIES AND RESPONSIBILITIES:*

· At least 3+ years of strong experience working with components
such as MapReduce, HDFS, HBase, Hive, Pig, Oozie, ZooKeeper, Flume, Spark, and
Python on CDH4 and CDH5 distributions, and with EC2 cloud computing on AWS.

· Well versed in and hands-on experience with UNIX, Linux, and NoSQL.

· Key participant in all phases of the software development life cycle,
including analysis, design, development, integration, implementation, debugging,
and testing of software applications in client-server, object-oriented, and
web-based environments.

· Strong in developing MapReduce applications, configuring the
development environment, tuning jobs, and creating MapReduce workflows (a
minimal MapReduce sketch follows this list).

· Experience in performing data enrichment, cleansing, analytics,
aggregations using Hive and Pig.

· Knowledge of the Cloudera Hadoop distribution and other major
distributions such as Hortonworks and MapR.

· Hands on experience in working with Cloudera CDH3 and CDH4
platforms.

· Proficient in big data ingestion and streaming tools like Flume,
Sqoop, Kafka, and Storm.

· Experience with different data formats such as JSON, Avro, Parquet,
RC, and ORC, and compression codecs such as Snappy and bzip2.

· Experienced in analyzing data using HiveQL and Pig Latin, and in
extending Hive and Pig core functionality with custom UDFs (a UDF sketch
follows this list).

· Good knowledge and understanding of NoSQL databases, and hands-on
work experience writing applications on NoSQL databases such as Cassandra
and MongoDB.

· Good knowledge of various scripting languages, including
Linux/Unix shell scripting and Python.

· Good knowledge of data warehousing concepts and ETL processes.

· Involved in importing streaming data into HDFS using Flume and
analyzing it using Pig and Hive.

· Experience importing streaming data into HDFS using Flume sources
and sinks, and transforming the data using Flume interceptors.

· Configured ZooKeeper to coordinate the servers in clusters and
maintain data consistency.

· Used the Oozie and Control-M workflow engines for managing and
scheduling Hadoop jobs.

· Diverse experience working with a variety of databases, including
Teradata, Oracle, MySQL, IBM DB2, and Netezza.
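
For illustration only (not part of the original posting), a minimal MapReduce sketch in Scala against the standard org.apache.hadoop.mapreduce API, matching the MapReduce item above; class and token names are hypothetical.

    import org.apache.hadoop.io.{IntWritable, LongWritable, Text}
    import org.apache.hadoop.mapreduce.{Mapper, Reducer}
    import scala.collection.JavaConverters._

    // Mapper: emit (token, 1) for every whitespace-separated token in a line.
    class TokenMapper extends Mapper[LongWritable, Text, Text, IntWritable] {
      private val one  = new IntWritable(1)
      private val word = new Text()
      override def map(key: LongWritable, value: Text,
                       ctx: Mapper[LongWritable, Text, Text, IntWritable]#Context): Unit =
        value.toString.split("\\s+").filter(_.nonEmpty).foreach { t =>
          word.set(t); ctx.write(word, one)
        }
    }

    // Reducer: sum the counts emitted for each token.
    class SumReducer extends Reducer[Text, IntWritable, Text, IntWritable] {
      override def reduce(key: Text, values: java.lang.Iterable[IntWritable],
                          ctx: Reducer[Text, IntWritable, Text, IntWritable]#Context): Unit =
        ctx.write(key, new IntWritable(values.asScala.map(_.get).sum))
    }

And a minimal Hive UDF sketch, using the classic org.apache.hadoop.hive.ql.exec.UDF base class; the function name and behavior are illustrative assumptions, not from the posting.

    import org.apache.hadoop.hive.ql.exec.UDF
    import org.apache.hadoop.io.Text

    // Normalize a string column: trim whitespace and lower-case it.
    // Register in Hive after packaging the jar, e.g.:
    //   ADD JAR /path/to/udfs.jar;
    //   CREATE TEMPORARY FUNCTION normalize_str AS 'NormalizeStr';
    class NormalizeStr extends UDF {
      def evaluate(input: Text): Text =
        if (input == null) null else new Text(input.toString.trim.toLowerCase)
    }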



*NICE TO HAVE*

· Good knowledge of Core Java and J2EE technologies
such as Hibernate, JDBC, EJB, Servlets, JSP, JavaScript, Struts, and Spring.

· Experienced in using IDEs and tools such as Eclipse, NetBeans,
GitHub, Jenkins, Maven, and IntelliJ.

· Implemented a POC to migrate MapReduce programs to Spark
transformations using Spark and Scala.

· Ability to spin up different AWS instances, including EC2-Classic
and EC2-VPC, using CloudFormation templates.

-- 



*Thanks and regards,*
*Deepak Kannoji*
*Kani Solutions Inc*
*Phone: 609-651-4663*
*Email: dee...@kanisol.com  *
*Skype: kannoji.deepak*
*https://www.linkedin.com/in/deepakkannoji/
<https://www.linkedin.com/in/deepakkannoji/>*



Hadoop Developer and Lead - Charlotte, NC - Immediate Position

2020-01-29 Thread Vineet Mishra
*Position: Hadoop Developer and Lead*

*Location: **Charlotte, NC*

*Duration: Long Term *



*No OPT EAD; if the candidate is on an H1B or any EAD, a passport number is needed*


*Developer Positions: 4 | Lead Positions: 2*

*Rate: (Depending on the experience)*

Hadoop, Map Reduce, Talend, Hive, Pig, Oozie, HBase, Sqoop, Spark, Kafka,
Cassandra, TPT, ETL, BigQuery, Connect Direct, TWS, Tableau



*Regards,*



*Vineet Mishra*

*Sr. Technical Recruiter*

*Quantum World Technologies Inc.*

*199 W Hillcrest Dr, Suite#0112, Thousand Oaks, CA 91360 *

*250 Yonge street  , Suite #2201, Unit #10, Toronto, ON, M5B 2L7, Canada*

*160 City Road , London EC1V 2NX , United Kingdom  *

*Office: 805-222-0532 Ext-339*

*Fax : 805-834-0532*

*E: **vin...@quantumworld.us *



Urgent Position : Hadoop Developer / Big data Developer | Reston, VA

2020-01-28 Thread Anchit Bajpai
Reply to - Anchit@quantumworld.us


No OPT EAD; if the candidate is on an H1B or any EAD, a passport number is needed



*Hadoop Developer/Big data Developer *

*Reston, VA*

*Long Term Contract *



*Required –*



   - AWS
   - Spark
   - Java







*Cordially,*



*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 W Hillcrest Dr, Suite#0112, Thousand Oaks, CA 91360 *

*250 Yonge street  , Suite #2201, Unit #10, Toronto, ON, M5B 2L7, Canada*

*160 City Road , London EC1V 2NX , United Kingdom  *

*Office: 805-222-0532 Ext-338*

*Fax : 805-834-0532*

*E: *anc...@quantumworld.us

*LinkedIn: *https://www.linkedin.com/in/anchit-bajpai-ab4458129/



INTERVIEW TOMORROW : HADOOP DEVELOPER

2019-12-19 Thread Deepak Gulia
*Please send me the profiles at **deepak.gu...@simplion.com*




*Position : Hadoop Developer *

*Location : Sunnyvale /CA*

*Duration : Long term*

*Positions : 4*





*Professional Qualifications: *

• Bachelor’s Degree

• 3+ years managing and troubleshooting production Hadoop jobs

• Scala experience




*Deepak Gulia* | Simplion – cloud. made simple
Fax: 408-935-8696 | Email: deepak.gu...@simplion.com
*GTALK :-  **deepakgulia.rgtal...@gmail.com*




INTERVIEW TOMORROW : HADOOP DEVELOPER

2019-12-19 Thread Deepak Gulia
*Please send me the profiles at **deepak.gu...@simplion.com*




*Position : Hadoop Developer *

*Location : Hartford, CT*

*Duration : Long term*

*Positions : 4 ( 2 in the below shift and 2 in regular)*



*TITLE: Production Support Specialist  *



*JOB DESCRIPTION:*

We are seeking a Production Support Specialist to provide complex
technical support and assistance to our product’s stakeholders.  You will
work within a Big Data platform on a cutting-edge recommendation engine.
You will provide stakeholders with preventive maintenance and configuration
recommendations to improve product usability, performance, and customer
satisfaction.  You will use deep knowledge of the product to execute
(and standardize) efficient and prompt resolution methods for all issues.

*Fundamental Components/Job Description: *

• Daily technical support for a production recommendation
system on a Big Data Platform

• Facilitate Root Cause Analysis (RCA) ensuring implementation
of corrective actions

• Identify opportunities to improve day-to-day operations,
processes, and procedures, and implement them

• Promote a high-quality customer centric focus when
interacting with customers.

• Maintain Standard Operating Procedures and other production
support material used by on-call staff; grow and maintain system
knowledge through documentation

• Work with management teams to ensure all support guidelines
are met – to include business continuity planning and disaster recovery.

• Provide quality production support and ensure application
service level commitments (SLA’s) are met or exceeded

• Continually improving the health and performance of
applications by identifying problems

• Work with Engineering to implement automated recovery steps
or preventative measures

• Possess experience in streamlining and improving existing
support processes thereby improving incidents’ turnaround time

• Work with onshore/offshore engineers to analyze, develop and
improve the Job Schedules.

• Create production system health alerting and reporting.

• Independently review technical design documents including
data mappings to debug production issues.



*Professional Qualifications: *

• Bachelor’s Degree

• 4+ years of tier 2 IT production support experience

• 3+ years managing and troubleshooting production Hadoop jobs

• 2+ years working with production job scheduling technologies
(TWS, Oozie, Airflow, etc.)

• 2+ years with Linux via CLI and Shell Scripting

• 2+ years writing SQL queries and understanding logic of
existing queries

• Experience with advanced analytics (R, Python, H2O, etc.)
enabled products is desired




*Deepak Gulia* | Simplion – cloud. made simple
Fax: 408-935-8696 | Email: deepak.gu...@simplion.com
*GTALK :-  **deepakgulia.rgtal...@gmail.com*




IMMEDIATE NEED : Hadoop Developer

2019-12-19 Thread Deepak Gulia
*Please send me the profiles at **deepak.gu...@simplion.com*




*Position : Hadoop Developer *

*Location : Hartford, CT*

*Duration : Long term*

*Positions : 4 ( 2 in the below shift and 2 in regular)*



*TITLE: Production Support Specialist  *



*JOB DESCRIPTION:*

We are seeking a Production Support Specialist to provide complex
technical support and assistance to our product’s stakeholders.  You will
work within a Big Data platform on a cutting-edge recommendation engine.
You will provide stakeholders with preventive maintenance and configuration
recommendations to improve product usability, performance, and customer
satisfaction.  You will use deep knowledge of the product to execute
(and standardize) efficient and prompt resolution methods for all issues.

*Fundamental Components/Job Description: *

• Daily technical support for a production recommendation
system on a Big Data Platform

• Facilitate Root Cause Analysis (RCA) ensuring implementation
of corrective actions

• Identify opportunities to improve day-to-day operations,
processes, and procedures, and implement them

• Promote a high-quality customer centric focus when
interacting with customers.

• Maintain Standard Operating Procedures and other production
support material used by on-call staff; grow and maintain system
knowledge through documentation

• Work with management teams to ensure all support guidelines
are met – to include business continuity planning and disaster recovery.

• Provide quality production support and ensure application
service level commitments (SLA’s) are met or exceeded

• Continually improving the health and performance of
applications by identifying problems

• Work with Engineering to implement automated recovery steps
or preventative measures

• Possess experience in streamlining and improving existing
support processes thereby improving incidents’ turnaround time

• Work with onshore/offshore engineers to analyze, develop and
improve the Job Schedules.

• Create production system health alerting and reporting.

• Independently review technical design documents including
data mappings to debug production issues.



*Professional Qualifications: *

• Bachelor’s Degree

• 4+ years of tier 2 IT production support experience

• 3+ years managing and troubleshooting production Hadoop jobs

• 2+ years working with production job scheduling technologies
(TWS, Oozie, Airflow, etc.)

• 2+ years with Linux via CLI and Shell Scripting

• 2+ years writing SQL queries and understanding logic of
existing queries

• Experience with advanced analytics (R, Python, H2O, etc.)
enabled products is desired




*Deepak Gulia* | Simplion – cloud. made simple
Fax: 408-935-8696 | Email: deepak.gu...@simplion.com
*GTALK :-  **deepakgulia.rgtal...@gmail.com*




Only C2C : Bigdata/Hadoop developer OR Admin | Tampa, FL & Irving TX

2019-11-12 Thread Anchit Bajpai
Hi,

Hope you are doing great!!

We currently have a full-time job opening for a *Big Data/Hadoop
Developer or Admin* at *Tampa, FL & Irving, TX* with our client. If you
are interested, please reply to me with your updated resume, or you can
call me at *805-222-0532 Ext-338.*

 Reply to - anc...@quantumworld.us

*Title :* *Bigdata/Hadoop developer  OR Admin *
*Long Term Contract*
*Location: Tampa, FL  & Irving TX*

*Visa -- Only OPT, GC EAD, H4 EAD, L2 EAD, TN, GC , USC*

*Description --*

   - Candidate should have a minimum of 6 years of IT experience and must
   have the following key skill sets.
   - Experience in Java and HBase.
   - Should have good experience in providing technical designs using HBase
   and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark/HBase to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods (a minimal sketch follows this list).
   - Should have strong experience working with HBase.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.
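
For illustration only (not part of the original posting), a minimal sketch in Scala of a unit-testable Spark aggregation job; the column, table, and path names are hypothetical assumptions.

    import org.apache.spark.sql.{DataFrame, SparkSession}
    import org.apache.spark.sql.functions._

    object DailyTotals {
      // Pure transformation kept separate from I/O so it can be unit tested
      // against a small in-memory DataFrame built with a local SparkSession.
      def dailyTotals(trades: DataFrame): DataFrame =
        trades
          .filter(col("amount") > 0)
          .groupBy(col("account_id"), col("trade_date"))
          .agg(sum("amount").as("total_amount"), count("*").as("trade_count"))

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("DailyTotals").getOrCreate()
        val trades = spark.read.parquet(args(0))   // input path supplied by the scheduler
        dailyTotals(trades).write.mode("overwrite").parquet(args(1))
        spark.stop()
      }
    }

A unit test would build a local SparkSession, create a few input rows with spark.createDataFrame, call dailyTotals, and assert on the collected results.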

*Cordially,*



*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*

*E: *anc...@quantumworld.us



Reg: Sr. Hadoop Developer

2019-11-04 Thread kumar bis
Hi





Position: Sr. Hadoop Developer

Location: Newark, NJ

Duration: 12 Months



•  Working with database architects, developers, business analysts and
domain experts to understand our complex system.

•  Developing and building a data strategy to improve data quality and
operational efficiencies.

•  Improving reporting capabilities including business intelligence
dashboards and data visualization software.

•  Instituting a data architecture and governance discipline to advance
rapid delivery of products and data insights.

•  Evolving a future state application architecture that supports business
priorities and objectives while being flexible.

•  Handling the local architecture governance process that reviews and
monitors the design of new solutions and ensures technology solutions
follow the enterprise technology policies and standards.

•  Strategizing and leading the integration and monetization of Data.

•  Proposing and presenting technical solutions and advising the business
on the technical and business value of the proposition.

•  Providing architecture and technology support across a range of
technologies including but not limited to Cloudera Hadoop stack, SQL,
Tableau, Informatica, and Collaboration platforms.

•  Keeping up-to-date on new technology, standards, protocols and tools in
areas related to the rapidly changing digital environment.

•  Financial Services experience with a consistent record of delivering
projects on-time, on budget and within scope.



Technical

•  Information Services data solutions implementation

•  Enterprise Data warehouse and Data Lake design and implementation

•  Data architecture

•  Data management or data integration/conversion

•  Data modeling, using industry standard tools (e.g., ER Studio, Enterprise
Architect, CA ERwin)

•  Enterprise Architecture and Data Governance on Hadoop clusters.

•  Current industry knowledge of forward-looking data technologies like Big
Data, Hadoop, MongoDB, in-memory DBs, and streaming data.

•  Expert knowledge of Cloudera Hadoop technology stack including Hive,
Pig, Spark, HBase, Kafka, Streaming technologies.

•  Experience in performance tuning of Hadoop applications.

•  Experience of using third party frameworks for data ingestion and data
management.

•  Broad knowledge of Agile project methodology (SAFe) and ITIL standard
methodologies



Please share with me below information for  submittal.



Consultant full name:

Email ID:

Contact Number:

Current Location:

Skype id:

Visa status:

Education details:

Availability to join the project:

LinkedIn ID:



Thanks and Regards,



*Kumara Swamy*
*Teksoft Systems Inc.*

*IT Consulting | Offsite Projects Development in USA*
*Custom Development | eBusiness / eCommerce | CMS | Portal Dev.*
Ph: 469-405-6363
k...@teksoftsystems.com
www.teksoftsystems.com
Specialized in: ATG Dynamo / Commerce | TIBCO | Documentum | BizTalk /
SharePoint


  Note: This email is not intended to be a solicitation. It is targeted at
recruiting & consulting professionals. If you have received this email in
error reply to k...@teksoftsystems.com with the word REMOVE in the SUBJECT
line. We regret any inconvenience caused to you and thank you for your
valuable time.



Reg: Sr. Hadoop Developer

2019-11-04 Thread kumar bis
Hi



Position: Sr. Hadoop Developer

Location: Newark, NJ

Duration: 12 Months



·  Working with database architects, developers, business analysts and
domain experts to understand our complex system.

·  Developing and building a data strategy to improve data quality and
operational efficiencies.

·  Improving reporting capabilities including business intelligence
dashboards and data visualization software.

·  Instituting a data architecture and governance discipline to advance
rapid delivery of products and data insights.

·  Evolving a future state application architecture that supports business
priorities and objectives while being flexible.

·  Handling the local architecture governance process that reviews and
monitors the design of new solutions and ensures technology solutions
follow the enterprise technology policies and standards.

·  Strategizing and leading the integration and monetization of Data.

·  Proposing and presenting technical solutions and advising the business
on the technical and business value of the proposition.

·  Providing architecture and technology support across a range of
technologies including but not limited to Cloudera Hadoop stack, SQL,
Tableau, Informatica, and Collaboration platforms.

·  Keeping up-to-date on new technology, standards, protocols and tools in
areas related to the rapidly changing digital environment.

·  Financial Services experience with a consistent record of delivering
projects on-time, on budget and within scope.



*Technical*

·  Information Services data solutions implementation

·  Enterprise Data warehouse and Data Lake design and implementation

·  Data architecture

·  Data management or data integration/conversion

·  Data modeling, using industry standard tools (e.g., ER Studio, Enterprise
Architect, CA ERwin)

·  Enterprise Architecture and Data Governance on Hadoop clusters.

·  Current industry knowledge of forward-looking data technologies like Big
Data, Hadoop, MongoDB, in-memory DBs, and streaming data.

·  Expert knowledge of Cloudera Hadoop technology stack including Hive,
Pig, Spark, HBase, Kafka, Streaming technologies.

·  Experience in performance tuning of Hadoop applications.

·  Experience of using third party frameworks for data ingestion and data
management.

·  Broad knowledge of Agile project methodology (SAFe) and ITIL standard
methodologies









Thanks and Regards,



*Kumara Swamy*
*Teksoft Systems Inc.*
  *IT Consulting | Offsite Projects Development in USA*
*Custom Development | eBusiness / eCommerce | CMS | Portal Dev.*
Ph: 469-405-6363
k...@teksoftsystems.com
www.teksoftsystems.com
Specialized in: ATG Dynamo / Commerce | TIBCO | Documentum | BizTalk /
SharePoint


  Note: This email is not intended to be a solicitation. It is targeted at
recruiting & consulting professionals. If you have received this email in
error reply to *k...@teksoftsystems.com * with the
word REMOVE in the SUBJECT line. We regret any inconvenience caused to you
and thank you for your valuable time.



BigData Hadoop Developer @ Charlotte, NC

2019-11-01 Thread ANUDEEP
*Job Title: BigData Hadoop Developer *

*Location: **Charlotte**, NC*

*Rate: DOE | Duration: Long-term*


*Primary Skills: Big Data, Hadoop, ETL, DBA, MapReduce*


*End Client: Wells Fargo*



Big Data, Hadoop, Python, Scala, Apache Spark, Hive, Spark SQL, File
formats, Sqoop, Hbase, MapReduce, Kafka/spark streaming, Java, Azure Cloud,
Greenplum



*Description*:

Work closely with Enterprise Risk CTO vertical Equity Aggregation team to
understand the data requirements.

Analyze incoming data requests for Equity Aggregation and determine
appropriate target solutions.

Develop data service design

Lead the data service build, test and deployment

Partner with Enterprise data teams such as Data Management & Insights and
Enterprise Data Environment (Data Lake) and identify the best place to
source the data

Work with business analysts, development teams and project managers for
requirements and business rules.

Collaborate with source system and approved provisioning point (APP) teams,
Architects, Data Analysts and Modelers to build scalable and performant
data solutions.

Effectively work in a hybrid environment where legacy ETL and Data
Warehouse applications and new big-data applications co-exist

Work with Infrastructure Engineers and System Administrators as appropriate
in designing the big-data infrastructure.

Work with DBAs in Enterprise Database Management group to troubleshoot
problems and optimize performance

Support ongoing data management efforts for Development, QA and Production
environments

Utilizes a thorough understanding of available technology, tools, and
existing designs.
Acts as expert technical resource to programming staff in the program
development, testing, and implementation process

*Thanks*

*Anudeep*

Email: anudeep.vanam...@oakridgeit.com



Hadoop Developer(Backfill Position)

2019-10-01 Thread Bhushan Rao
Hello,

This is a backfill position. My consultant backed out of the project today, and
I need to fill this position ASAP.

I can close this week


*Position – Hadoop Developer*

*Location – Edison, NJ*

*Duration-  12+ Months*

*Interview: Telephonic, Skype.*



*Client:  Infoscept*



*Need visa copy along with passport number*







*Requirements*

· Extensive Experience as a Hadoop Developer.

· Experience with Sqoop, Kafka, Spark, Pig, and Hive.

· Experience with Shell Scripts

· Strong in SQL - Hive




Position : BigData/Hadoop Developer with Spark Exp | Tampa, FL & Irving TX

2019-09-04 Thread Anchit Bajpai
Mail to -- anc...@quantumworld.us


*Job Title : BigData/Hadoop Developer with Spark Exp*

*Duration: Long Term Contract*

*Location: Tampa, FL  &   Irving TX *


*Need Passport Number For H1B Candidates*



*Description --*



   - Candidate should have a minimum of 7 years of experience and must have
   the following key skill sets.
   - Experience in Spark with Java and HBase/Phoenix.
   - Should have good experience in providing technical designs using
   Spark, HBase/Phoenix, and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods.
   - Should have strong experience working with HBase/Phoenix.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.





*Cordially,*


*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*

*Office: 805-222-0532 Ext-338*



Hadoop Developer @ Buffalo or Schenectady, NY

2019-09-04 Thread ANUDEEP
   *Job title: Hadoop Developer*

   *Location: Buffalo or Schenectady, NY*

   *Duration: 6-12 months+ (with extensions)*



   *Job Description*

   • Responsible for designing the data intake into Hadoop and
   making it available to the business in a queryable format

   • Responsible for implementation, ongoing technical support of
   Hadoop eco-system (including access, incident and problem management)-
   Provide technical leadership and collaborate with developers and architects
   for implementations on the Hadoop Platform.

   • Design and Configuration of the Hadoop Platform and various
   associated components (including 3rd party tools) for data ingestion,
   transformation, migration, processing, and reporting.

   • Responsible to work with the infrastructure, network,
   database, application, and business intelligence teams to achieve high data
   quality, performance, availability and security of the platform.

   • Mentor other Hadoop developers and administrators

   • Responsible for documenting the design and processes in
   implementation for ongoing support

   • Assist to optimize and integrate new infrastructure via
   continuous integration methodologies.

   • Setup and maintain CI/CD application server environments and
   pipelines with tools and technologies like Docker, Jenkins and Kubernetes.

   • Follow company policies, procedures, controls, and processes
   for the job.

   • In addition to the above key responsibilities, you may be
   required to undertake other duties from time to time as the Company may
   reasonably require.



   *Required Skills:*

   • A minimum of bachelor's degree in computer science or
   equivalent.

   • Experience working on any Hadoop distribution, such as
   Cloudera or Hortonworks, and at least 6 years of coding in Apache Hadoop,
   Spark, Kafka, Hive, Pig, and Drill

   • Experience with Data lineage, Data Tagging following data
   driven security model

   • Experience in NiFi, Spark streaming, Elastic Search,
   Tensorflow, Pytorch

   • Strong knowledge of relational databases (Oracle, SQL Server ,
   Postgres) and Expert in SQL language

   • Experience with languages such as Python, Go and Java is
   required.

   • Experience with Agile, DevOps and GITOps automation.

   • Proficient in utilizing cloud computing virtualization
   technologies, storage architecture & AWS/Azure technologies

   • Knowledge of working with various Hadoop connectors

   • Healthcare knowledge is an advantage

   • Must have experience with source control tools

   • Must have strong problem-solving and analytical skills

   • Must have the ability to identify complex problems and review
   related information to develop and evaluate options and implement solutions.

   • Kubernetes Containerization experience

*Regards*

*Anudeep | Sr. Manager - Recruitment | 770-838-3849 |
   Email: anud...@cysphere.net*
   *CYBER SPHERE LLC*
   *An E-Verify Company!!*
   *Website: www.cysphere.net*





Job Title : BigData/Hadoop Developer with Spark Exp | Tampa, FL & Irving TX

2019-09-04 Thread Anchit Bajpai
 Mail to -- anc...@quantumworld.us


*Job Title : BigData/Hadoop Developer with Spark Exp*

*Duration: Long Term Contract*

*Location: Tampa, FL  &   Irving TX *


*Need Passport Number For H1B Candidates*



*Description --*



   - Candidate should have a minimum of 7 years of experience and must have
   the following key skill sets.
   - Experience in Spark with Java and HBase/Phoenix.
   - Should have good experience in providing technical designs using
   Spark, HBase/Phoenix, and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods.
   - Should have strong experience working with HBase/Phoenix.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.





*Cordially,*


*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*

*Office: 805-222-0532 Ext-338*



Position : BigData/Hadoop Developer with Spark Exp | Tampa, FL

2019-09-03 Thread Anchit Bajpai
Mail to -- anc...@quantumworld.us


*Job Title : BigData/Hadoop Developer with Spark Exp*

*Duration: Long Term Contract*

*Location: Tampa, FL*



*Need Passport Number For H1B Candidates*



*Description --*



   - Candidate should have a minimum of 7 years of experience and must have
   the following key skill sets.
   - Experience in Spark with Java and HBase/Phoenix.
   - Should have good experience in providing technical designs using
   Spark, HBase/Phoenix, and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods.
   - Should have strong experience working with HBase/Phoenix.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.





*Cordially,*


*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*

*Office: 805-222-0532 Ext-338*



Position : BigData/Hadoop Developer with Spark Exp | Tampa, FL

2019-08-26 Thread Anchit Bajpai
Mail to -- anc...@quantumworld.us


*Job Title : BigData/Hadoop Developer with Spark Exp*

*Duration: Long Term Contract*

*Location: Tampa, FL*



*Need Passport Number For H1B Candidates*



*Description --*



   - Candidate should have a minimum of 7 years of experience and must have
   the following key skill sets.
   - Experience in Spark with Java and HBase/Phoenix.
   - Should have good experience in providing technical designs using
   Spark, HBase/Phoenix, and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods.
   - Should have strong experience working with HBase/Phoenix.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.





*Cordially,*


*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*

*Office: 805-222-0532 Ext-338*



Position : Hadoop Developer with Spark Exp | Tampa, FL

2019-08-22 Thread Anchit Bajpai
 Mail to -- anc...@quantumworld.us


*Job Title : Hadoop Developer with Spark Exp*

*Duration: Long Term Contract*

*Location: Tampa, FL*



*Need Passport Number For H1B Candidates*



*Description --*



   - Candidate should have a minimum of 7 years of experience and must have
   the following key skill sets.
   - Experience in Spark with Java and HBase/Phoenix.
   - Should have good experience in providing technical designs using
   Spark, HBase/Phoenix, and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods.
   - Should have strong experience working with HBase/Phoenix.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.





*Cordially,*


*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*

*Office: 805-222-0532 Ext-338*



Position : Hadoop Developer with Spark Exp | Tampa, FL

2019-08-21 Thread Anchit Bajpai
 Mail to -- anc...@quantumworld.us


*Job Title : Hadoop Developer with Spark Exp*

*Duration: Long Term Contract*

*Location: Tampa, FL*



*Need Passport Number For H1B Candidates*



*Description --*



   - Candidate should have a minimum of 7 years of experience and must have
   the following key skill sets.
   - Experience in Spark with Java and HBase/Phoenix.
   - Should have good experience in providing technical designs using
   Spark, HBase/Phoenix, and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods.
   - Should have strong experience working with HBase/Phoenix.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.





*Cordially,*


*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*

*Office: 805-222-0532 Ext-338*



IMMEDIATE INTERVIEW : HADOOP DEVELOPER

2019-08-19 Thread Deepak Gulia
*Please send me the profiles at **deepak.gu...@simplion.com*


*Position: Hadoop Developer*

*Location : Sunnyvale, CA*

*Duration : Long Term *



*Must Have Skillset*: Java/J2EE, Spring Boot, Kafka

*Job Description:* Senior Software Engineer - Backend

Position Description:

·Hands-on development building n-tier applications using
RESTful services, Java/J2EE, Oracle, and related technologies.

·Participate in managing code and configurations for multiple
environments and the release management process, creating and maintaining
environment configuration and controls, ensuring code integrity, and working
closely with the platform team.

·Analyzing business requirements, story-boards and similar
artifacts of the scrum process, work in an agile development
environment with a quick turnaround time and iterative builds.

·Leads the discovery phase of medium to large projects to come
up with high level design

·Leads the work of other small groups of three to five engineers

·Troubleshoots business and production issues

·Ensures inclusion of business vision and industry trends to
enable results that drive business

·Problem solving and troubleshooting design and development
issues and provide appropriate solutions

·Ability to communicate effectively, both written and verbal,
with technical and non-technical cross-functional teams

·Provide guidance and mentorship to the junior engineers

·Knowledge of standard tools for optimizing and testing code

·A desire to work in a fast-paced and challenging work environment



*Minimum Qualifications*:

   - 4+ years of experience programming in Java/J2EE, REST Services, and
   related technologies.
   - Hands on Experience with Hibernate, Spring, CXF. Hands on experience
   with RDBMS (Oracle), PL/SQL.
   - Experience with NoSQL technologies like Couchbase or Cassandra.
   - Experience with a distributed publish-subscribe messaging system like Kafka.
   - Experience configuring and deploying applications on a J2EE application
   server (Apache Tomcat, TomEE).
   - Expertise in SaaS application development.
   - Experience with UNIX shell and scripting.

*Preferred Qualifications*:



   - Experience with Frontend technologies (React/Redux).
   - Experience with cryptography/ key management



*Deepak Gulia* | Simplion – cloud. made simple
Fax: 408-935-8696 | Email: deepak.gu...@simplion.com
*GTALK :-  **deepakgulia.rgtal...@gmail.com*




Position : Java Developer OR Hadoop developer with Spark Experience | Tampa, FL

2019-08-06 Thread Anchit Bajpai
 Mail to -- anc...@quantumworld.us



*Job Title:* *Java Developer OR Hadoop Developer* with *Spark*
experience.

*Duration:* Long Term Contract

*Location:* Tampa, FL



*Need Passport Number for H1B Candidates.*


*Note -- Must Have Spark Exp *



   - Candidate should have a minimum of 7 years of experience and must have
   the following key skill sets.
   - Strong experience in Spark with Java and HBase/Phoenix.
   - Should have good experience in providing technical designs using
   Spark, HBase/Phoenix, and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods.
   - Should have strong experience working with HBase/Phoenix.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.





*Cordially,*



*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*

*Office: 805-222-0532 Ext-338*



Position : Java/Spark Developer OR Java/Hadoop Developer | Tampa, FL

2019-07-30 Thread Anchit Bajpai
 Mail to -- anc...@quantumworld.us

*Job Title : Java/Spark Developer  OR   Java/Hadoop Developer*

*Duration: Long Term Contract*

*Location: Tampa, FL*



*Need Passport Number For h1B Candidates*



*Description --*



   - Candidate should have a minimum of 10 years of experience and must have
   the following key skill sets.
   - Strong experience in Spark with Java and HBase/Phoenix.
   - Should have good experience in providing technical designs using
   Spark, HBase/Phoenix, and other related Hadoop big data technologies.
   - Should have prior experience in sourcing, transforming, and analyzing
   vast amounts of raw data from various systems using Spark to provide
   ready-to-use data to the business team.
   - Should have sound knowledge of creating Spark jobs for data
   transformation and aggregation and producing unit tests for Spark
   transformations and helper methods.
   - Should have strong experience working with HBase/Phoenix.
   - Should possess a deep understanding of distributed systems.
   - Should have strong communication skills to work with business end
   users and gather requirements.





*Cordially,*



*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*www.quantumworld.us <http://www.quantumworld.us/>*
*Office: 805-222-0532 Ext-338*



Immediate role for Hadoop Developer

2019-07-10 Thread deepak kannoji
Open to all visas, but a passport number is needed for submission

*Title - SQL + Hadoop Data Engineers*

*Location - San Jose, CA*

*Duration: 6 months*



*Kindly share resumes or your hotlist to dee...@kanisol.com*


*Job Description & Project over view:*

· Excellent SQL and advanced SQL skills

· Experience in Spark and possibly Scala (Hadoop ecosystem)

· Knowledge of enterprise Data Warehouses

· Basics of Python (Good to have)

· Good communication skills

-- 



*Thanks and regards,*
*Deepak Kannoji*
*Kani Solutions Inc*
*Phone: 609-651-4663*
*Email: dee...@kanisol.com  *
*Skype: kannoji.deepak*
*https://www.linkedin.com/in/deepakkannoji/
*



Position: Java Hadoop Developer Location: San Jose, CA Duration: 6+ Months

2019-05-01 Thread VEER PARTAP
*Position: Java Hadoop Developer*
*Location: San Jose, CA*
*Duration: 6+ Months*





*Required:*



· Strong on Core Java, Multi-Threading, OOPS Concept, writing
parsers in Core Java

· Should have strong knowledge on Hadoop ecosystem such as
Hive/Pig/MapReduce

· Strong in SQL, NoSQL, RDBMS and Data warehousing concepts

· Writing complex MapReduce programs

· Should have strong experience in pipeline building such as Spark
or Storm or Cassandra or Scala.

· Gather and process raw data at scale (including writing scripts,
web scraping, calling APIs, write SQL queries, etc.).

-- 
-- 
*Thanks & Regards,*
*Veer Partap*
*IT Recruiter*
*Direct: 732-645-1917*
*veerpartaphmgamer...@gmail.com *
*www.hmgamerica.com <http://www.hmgamerica.com/>*



Java Hadoop Developer in San Jose, CA for Contract role.

2019-04-19 Thread Prasad Rao
Hi,


Reply to *naren...@keyphaseinc.com* for further processing.




*Job Title: Java Hadoop Developer (3-5 years will work)*
*Job Location: San Jose, CA*
*Duration: 6+ Months*



*Required:*



1. Strong on Core Java, Multi-Threading, OOPS Concept, writing parsers in
Core Java

2. Should have strong knowledge on Hadoop ecosystem such as
Hive/Pig/MapReduce

3. Strong in SQL, NoSQL, RDBMS and Data warehousing concepts

4. Writing complex MapReduce programs

5. Should have strong experience in pipeline building, such as Spark, Storm,
Cassandra, or Scala.

6. Gather and process raw data at scale (including writing scripts, web
scraping, calling APIs, write SQL queries, etc.).



Urgent Client Need- Hadoop developer- Phoenix , AZ- 12+ months

2019-04-16 Thread xperttech niranjan
*Please send your resumes at **harish_rok...@bioinfosystems.com
*



*Hi*



*This is Harish from BioinfoSystems; hope you are doing well.*

*Please go through the description below and reply with your resume, contact
details, and current location if you feel comfortable.*



*Title : Sr. Hadoop developer*

*Location : Phoenix , AZ*

*Duration:12+ months*

*Visa Status: H1B,GC,USC.*

*5+* years of experience. Key skills:

   - Expertise and hands-on experience in *Java/Scala* - Must Have
   - Good knowledge of HiveQL & *SparkSQL* - Must Have (see the sketch after
   this list)
   - Good knowledge of shell scripting - Must Have
   - Good knowledge of a workflow engine such as Oozie or Autosys - Good to Have
   - Good knowledge of Agile development - Good to Have
   - Passionate about exploring new technologies - Good to Have
   - Automation approach - Good to Have
   - Good communication skills - Good to Have
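
For illustration only (not part of the original posting), a minimal sketch of running a HiveQL-style query through Spark SQL in Scala; the database, table, and column names are hypothetical assumptions.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("ClaimsByState")
      .enableHiveSupport()          // read tables registered in the Hive metastore
      .getOrCreate()

    val byState = spark.sql(
      """SELECT state, COUNT(*) AS claims, SUM(amount) AS total_amount
        |FROM   claims_db.claims
        |WHERE  claim_date >= '2019-01-01'
        |GROUP  BY state
        |ORDER  BY total_amount DESC""".stripMargin)

    byState.show(20, truncate = false)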



Candidate name:

Last 4 digits of SSN:

Visa status: (Validity)

Source – Direct/ Vendor:

Rate :

Availability (Notice period):

Current employment status:

Willing to relocate

Current Location

Contact Number:

Email ID

Skype ID

DOB (MM/DD)

Best time to Skype interview

Any offers or interview in Pipeline



*Please do follow on our LinkedIn page:-
https://www.linkedin.com/company/bioinfo-systems-llc/
<https://www.linkedin.com/company/bioinfo-systems-llc/> *



*Thanks & Regards*

*Harish Rokkam*

*Technical Recruiter*

*harish_rok...@bioinfosystems.com  *

Desk: 860-967-0614|Fax: 860-722-9692




BigData and Hadoop Developer

2019-03-27 Thread Kumar
Hi,



Please send me your updated resume at *ku...@idctechnologies.com*



Location: Wilmington, DE



*Role and Responsibilities*   6123087



Big data background with experience designing and implementing large-scale
systems. Requirements:

   - Solid working experience with Hadoop, Hive-Tez, enterprise Java
   development, NoSQL data platforms (Cassandra), pub/sub messaging (Kafka,
   ActiveMQ, JMS, etc.), and stream processing (Storm, HBase, NiFi, Spark
   Streaming, etc.).
   - At least 2 years of experience with Kafka/Spark-based ingestion and NiFi
   integration (see the sketch after this list).
   - Solid experience working in Big Data environments (Hortonworks and
   Cloudera).
   - Thorough understanding of the Spark framework and tuning of Spark
   applications.
   - Extensive experience with horizontally scalable and highly available
   system design and implementation, with a focus on performance and resiliency.
   - Extensive experience profiling and debugging data on complex distributed
   systems.
   - Willingness to commit extra effort to meet deadlines as required on a
   high-profile and business-critical project.
   - Solid experience with Python and Java.
   - Strong performance tuning experience with Hive/Spark/Kafka/HBase.
   - Extensive knowledge of moving data between various platforms like
   Hadoop/Hive.
   - Big Data engineer development/operational experience in a multi-platform,
   multi-location, 7x24x365 global environment.
   - Experience working with cross-geographic, cross-functional, and cross-LOB
   teams.
   - Knowledge of best practices in change, problem, incident, configuration,
   and system health management.
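
For illustration only (not part of the original posting), a minimal sketch of Kafka-based ingestion with Spark Structured Streaming in Scala (assumes the spark-sql-kafka connector is on the classpath); broker, topic, and path names are hypothetical.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("KafkaIngest").getOrCreate()

    // Read a Kafka topic as a streaming DataFrame; key/value arrive as binary.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "events")
      .option("startingOffsets", "latest")
      .load()

    val events = raw.selectExpr("CAST(key AS STRING) AS key",
                                "CAST(value AS STRING) AS value")

    // Land the raw events as Parquet; the checkpoint makes the job restartable.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/data/landing/events")
      .option("checkpointLocation", "/data/checkpoints/events")
      .outputMode("append")
      .start()

    query.awaitTermination()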







*Thanks & Regards*



*Kumar *

IDC Technologies, Inc
*416-900-5332*

ku...@idctechnologies.com



Walk in Drive on Friday - Hadoop Developer - Java Developer - Bellevue, WA

2019-03-26 Thread vamshinaga999


Hi ,

Hope you are doing great !!

 

*Any Visa is Fine*

*Passport Number is Needed for submission purpose*

*Need Locals who can do in person interview*

*Walk in Drive on Friday*

 

*Job Title:* *Hadoop Developer*

*Location: Bellevue, WA*

*Duration: Long Term Contract*

 

*Job Title:* *Java Developer*

*Location: Bellevue, WA*

*Duration: Long Term Contract*

 

*Thanks & Regards,*

*Vamshi*

*Email: vamsh...@itechus.net *

*Direct: 802-227-0236 / Main: 802 383 1500 Ext. 109*

 

 

 

 

 



Looking for Hadoop Developer at Atlanta, GA (No OPT Please)

2019-03-20 Thread Surendra T
Dear Partner,

Here is our Direct client requirement which can be filled immediately.
Kindly respond to this requirement with *your consultant resume, contact
and current location* info to *speed up the interview process*. Please
forward your resumes to *suren...@softhq.com *

*Job Title: Hadoop Developer*
*Primary Skill: Hadoop Developer*
*Work Location & Reporting Address: Atlanta, GA *
*Contract Duration: 6+ Months*
*Minimum years of experience: 7+*
*Education: BS in Computer Science or Information Systems*
*Certifications Needed: No - 31117*

*JOB DESCRIPTION :*

   - Sr. Developer/Engineer with HBase/Hadoop/Spark/Scala skill set
   - Should have experience working on Big Data projects and Kafka queue setup.
   - Minimum 5+ years of work experience in the Big Data space.
   - Must be able to write SQL queries against relational DBs.
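
For illustration - a small sketch of the kind of HBase client code the HBase/Hadoop/Spark/Scala skill set above implies, written in Scala against the standard HBase Java client API; the table name ("orders"), column family ("d") and ZooKeeper host are hypothetical and the table is assumed to already exist:

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Get, Put}
import org.apache.hadoop.hbase.util.Bytes

object HBaseRoundTrip {
  def main(args: Array[String]): Unit = {
    // Standard HBase client bootstrap; the ZooKeeper quorum host is a placeholder.
    val conf = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", "zk-host")

    val connection = ConnectionFactory.createConnection(conf)
    val table = connection.getTable(TableName.valueOf("orders"))
    try {
      // Write one cell: rowkey -> column family "d", qualifier "status".
      val put = new Put(Bytes.toBytes("order#1001"))
      put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("status"), Bytes.toBytes("SHIPPED"))
      table.put(put)

      // Read the same cell back.
      val result = table.get(new Get(Bytes.toBytes("order#1001")))
      val status = Bytes.toString(result.getValue(Bytes.toBytes("d"), Bytes.toBytes("status")))
      println(s"status = $status")
    } finally {
      table.close()
      connection.close()
    }
  }
}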

*Thanks,*
*Surendra Tummalacherla*
*SoftHQ Inc*
*Consulting – Development – Staffing *
*Phone: 858-658-9200 X 618*
*Direct: 858-295-4387 | Fax: 858-225-6834 | *
*E-mail: suren...@softhq.com  *
*G-Talk: surendra.usitrecrui...@gmail.com
*
*Linkedin: https://www.linkedin.com/in/surendra-t-990a77a5
<https://www.linkedin.com/in/surendra-t-990a77a5>*



Here is a HOT Senior Hadoop Developer requirement (we are a prime vendor) for your consultants at NYC, NY

2019-03-19 Thread vivek nagare
Here is a HOT *Senior Hadoop Developer* *requirement (we are a prime
vendor)* for your consultants at *NYC, NY*



*Title:  Senior Hadoop Developer*

*Location: NYC,NY*

*Duration:  Long term  *

*Interview:  Telephonic + Skype *





*Job Qualifications :*

12+ years of relevant experience in supply chain management, quality, and
project management in the aerospace manufacturing domain. Leadership experience
handling a 15+ member team.

*Primary Skills:*

·  Strong leadership and collaborative skills

·  Ability to interface with all internal and external stakeholders

·  Good interpersonal skills

·  Strong mentoring skills

·  Fluent in English

·  Continuous improvement mindset

·  Experience in the Aero segment is an advantage

·  PMP or Supply chain certification will be an added advantage

*Educational Requirements:*

·  Graduate/Post Graduate in Mechanical/Production/Aerospace or equivalent
engineering



Vivek Nagare

*Team Lead *

*ApTask is a global, diversity certified staffing and recruiting company
that specializes in IT, finance and accounting, and blockchain developer
placements.*

(7322430147) | viv...@aptask.com



Hadoop Developer

2019-03-19 Thread Sansi
* Direct Client Requirement*



*Hadoop Developer  with Snowflake migration experience*

*Location: San Jose ,CA *

*Duration: Long Term*



*Description:*

*Spark, Hive, Hadoop, Linux with Snowflake migration experience.*







Regards,

*Sansi * | *SPK Consultants INC*

*Office:  (408) 933-9546*

Email : sa...@spkconsultantsinc.com



Hadoop Developer

2019-03-19 Thread Sansi
* Direct Client Requirement*



*Hadoop Developer  with Snowflake migration experience*

*Location: San Jose ,CA *

*Duration: Long Term*



*Description:*

*Spark, Hive, Hadoop, Linux with Snowflake migration experience.*







Regards,

*Sansi * | *SPK Consultants INC*

Email : sa...@spkconsultantsinc.com



Hadoop Developer

2019-03-18 Thread Sansi
* Direct Client Requirement*



*Hadoop Developer with Snowflake migration experience*

*Location: San Jose ,CA *

*Duration: Long Term*



*Description:*

*Spark, Hive, Hadoop, Linux with Snowflake migration experience.*



 Please send resumes to sa...@spkconsultantsinc.com



Regards,

*Sansi * | *SPK Consultants INC*

Email : sa...@spkconsultantsinc.com



Hadoop developer role in Weston, FL !!

2019-03-18 Thread Gautam Pareek
*Please share resumes with me at*: gau...@holistic-partners.com

Hello All,



Hope you are doing well.

Please let me know if you have any consultant available for the position
given below.

*Please make sure the consultant has all the required experience*.



*Title*: Hadoop Developer

*Location*: Weston, FL

*Duration*: 6 months

Phone and Skype



*JD*:



o   HCM experience is highly desired

o   Microsoft SQL Server 2012 and/or greater

o   RabbitMQ

o   MongoDB

o   Cloud Experience


Regards,



*Gautam Sharma* *| **IT Technical Recruiter*



(D): 408-400-3343

(O): 408-400-3356 Ext: 103

(E): gau...@holistic-partners.com

(A): 11 Bowles Ave West Boylston, MA 01583

*“**Believe you can and you’re halfway there**.**”*



Hadoop Developer with Snowflake migration experience

2019-03-18 Thread Sansi
*Hadoop Developer **with Snowflake migration *

*Location: San Jose ,CA *

*Duration: Long Term*



*Description:*

*Spark, Hive, Hadoop, Linux with Snowflake migration experience.*


 --

Regards,

*Sansi * | *SPK Consultants INC*

Email : sa...@spkconsultantsinc.com



IMMEDIATE INTERVIEW : HADOOP DEVELOPER

2019-03-07 Thread Deepak Gulia
*Please send me the profiles at **deepak.gu...@simplion.com*




*Role: Hadoop Developer with StreamSets experience*

*Location: Northbrook, IL*

*Duration Long Term *

 1. Understanding and exposure to DataOps, where data operations meets data
integration.

2. Hands-on experience building & monitoring hybrid cloud dataflows, and
experience building StreamSets pipelines in 1/10th the time.
3. Exposure to developing StreamSets pipelines using SDC Edge, SDC and
Control Hub.
4. Experience in developing custom Java modules and using them in
StreamSets pipelines.
5. Experience in creating real-time data flow, data drift and schema
evolution using StreamSets pipelines.
6. Understanding of StreamSets Data Protector; hands-on experience will be
a plus.
7. Experience in writing custom Groovy modules for custom transformations
in StreamSets pipelines.
8. Experience in tuning pipelines by evaluating the tradeoff of
standalone processing vs. cluster processing.
9. Efficiently build, test, deploy and maintain any-to-any StreamSets
dataflow pipelines.
10. Experience in using Control Hub for code management.




*Deepak Gulia *| Simplion – cloud*.* made simple
Fax: 408-935-8696 | Email: deepak.gu...@simplion.com
*GTALK :-  **deepakgulia.rgtal...@gmail.com*




Urgent Requirement for Hadoop Developer --- { Only USC and Contract and Full-time } ----

2019-02-26 Thread harish karnala


Hi

Hope you are doing well !!   

My name is *Harish Karnala* and I am a *technical recruiter* with *United
Software Group*. I reviewed your information online and I am impressed with
your qualifications. One of my clients in the Bellevue, WA and Charleston, SC
(South Carolina) area currently has a *Hadoop Developer* opening that they are
looking to fill. Here are the details on the role.

 

*Note: *It is only US Citizens and Open for Contract and Full-time.

 

*JD:*

 

*Job Title*: *Hadoop Developer*

*Relevant Experience (Yrs):*

· 5+ Years of experience in design and development on 
*Hadoop/Hortonworks* platform.

· Develop highly scalable and extensible Big Data platform which 
enables collection, storage, modeling, and analysis of massive data sets 
from numerous channels.

· Hands-on experience in Big Data Components/Frameworks such as 
*Hadoop, 
Spark*.

· Experience in architecture and implementation of large and highly 
complex projects.

· Deep understanding of cloud computing infrastructure and 
platforms.

· History of working successfully with multinational, 
cross-functional engineering teams.

*Technical/Functional Skills *

· Experience with Big Data and Machine Learning Frameworks

· Knowledge of *Teradata, SOLR, SPARK, Zeppelin, Accumulo* will be 
an added advantage

· Ability to work in challenging situations

· Excellent communication & team management skills

· Good written and verbal skills & team skills

· Strong problem solving skills & proactive attitude

*Roles & Responsibilities*: *Hadoop Developer*

*Work Location*: Bellevue, WA and Charleston, SC (South Carolina)

 

 

 

Harish Karnala

United Software Group Inc.. 

565 Metro Place South. Suite # 110 

Dublin, OH 43017 
 

Direct Number : +1 614-408 1549

Board Number : 614-495-9222 EXT. 622

Fax: 1-866-764-1148

 

karnal...@usgrpinc.com 

Hangouts: harirecruite...@gmail.com

www.usgrpinc.com

 

 



Opening for Big Data Hadoop Developer - Bloomfield, CT

2019-02-25 Thread vamshinaga999


Hi ,

Hope you are doing great !!

 

We have an opening for a Big Data Hadoop Developer

 

*Title:  Big Data Hadoop Developer*

*Location: Bloomfield, CT*

*Duration: Long Term Contract*

 

*10+ Years of experience needed*

 

*REQUIREMENTS *

· 5-7+ years with the Apache Big Data suite: Hadoop, Spark, Storm, 
Kafka

· 10+ years with object-oriented languages: C, C++, Java, Scala, or 
other OO compiled language

· 5+ years with scripting languages: JavaScript, Python, R, Ruby, 
Perl, etc.

· 2+ years in Web development frameworks: React, Angular (or other 
Node.js frameworks), Django, React-native, etc.

· Experience in noSQL document-oriented databases models is a plus 
- (i.e. MongoDB, Cassandra, Couchbase or similar)

· RDBMS knowledge (MySQL, PostgreSQL, etc.) and experience working 
with large data sets

· Experience with Git/SVN

· Excellent troubleshooting skills

· Bachelor's Degree in Engineering or Computer Science preferred or 
the equivalent in education and experience

 

 

*Thanks & Regards,*

*Vamshi*

*Email: **vamsh...@itechus.net*  

*Direct: 802-227-0236 / Main: 802 383 1500 Ext. 109*

*iTech US, Inc. |  **www.iTechUS.net* <http://www.itechus.net/>

 



Big Data Hadoop developer with AWS @ Atlanta, GA / Dallas, TX

2019-02-20 Thread Sansi
*Big Data Hadoop developer with AWS *

*Location: Atlanta, GA / Dallas, TX*

*Duration: 6 Months*






* Job Description:*



*We are looking for a Big Data Hadoop developer with AWS expertise and 6+
years of experience.*



*The resource should have in-depth knowledge of and experience with the full
Hadoop ecosystem - HDP, HDF, NiFi, M/R, Hive, Pig, Spark/Scala, Kafka and
HBase - and an ETL background. The resource should be able to develop the
framework for data ingestion into the data lake, with utilities around this
platform (a brief sketch follows below).*
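
For illustration - a minimal Scala/Spark sketch of the data-lake ingestion described above: read a raw landing feed, cleanse it lightly, and write it to the lake as date-partitioned Parquet. The paths, column names (transaction_id, event_time) and CSV format are placeholder assumptions, not client specifics:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object LandingToLake {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("landing-to-lake-sketch")
      .getOrCreate()

    // Raw feed dropped by an upstream source; path and schema inference are placeholders.
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///data/landing/transactions/")

    // Light cleansing/enrichment: drop malformed rows, derive a partition column.
    val curated = raw
      .na.drop(Seq("transaction_id", "event_time"))
      .withColumn("event_date", to_date(col("event_time")))

    // Write to the lake as Parquet, partitioned by date so downstream Hive/Spark
    // queries can prune partitions.
    curated.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("hdfs:///data/lake/transactions/")

    spark.stop()
  }
}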

· 6+ years total experience in development mainly around Java and
all related technologies in the Java stack (e.g. Spring).

· 6+ year in depth knowledge & experience in Hadoop around all the
Hadoop ecosystem (HDP, HDF, M/R, Hive, pig, Spark/scala, kafka, Hbase,
Elastic search and log stash a plus) .

· 4+ years of experience working in Linux/Unix .

· Good understanding & experience with Performance and Performance
tuning for complex S/W projects mainly around large scale and low latency.

· Experience with leading Design & Architecture. Hadoop/Java
certifications is a plus.  Ability to work in a fast-paced, team oriented
environment.


 --

Regards,

*Sansi * | *SPK Consultants INC*

Email : sa...@spkconsultantsinc.com



Urgent Requirement for Hadoop Developer --- [ Passport No. Required } ---

2019-02-04 Thread harish karnala


Hi

Hope you are doing well !!  

My name is *Harish Karnala* and I am a *technical recruiter* with *United
Software Group*. I reviewed your information online and I am impressed with
your qualifications. One of my clients in the *Dallas, TX* area currently has
a *Hadoop Developer* opening that they are looking to fill. Here are the
details on the role.

 

JD:

 

*Job Title*: *Hadoop Developer*

*Relevant Experience (Yrs):*

· 5+ Years of experience in design and development on 
Hadoop/Hortonworks platform.

· Develop highly scalable and extensible Big Data platform which 
enables collection, storage, modeling, and analysis of massive data sets 
from numerous channels.

· Hands-on experience in Big Data Components/Frameworks such as 
Hadoop, Spark.

· Experience in architecture and implementation of large and highly 
complex projects.

· Deep understanding of cloud computing infrastructure and 
platforms.

· History of working successfully with multinational, 
cross-functional engineering teams.

*Technical/Functional Skills *

· Experience with Big Data and Machine Learning Frameworks

· Knowledge of Teradata, SOLR, SPARK, Zeppelin, Accumulo will be an 
added advantage

· Ability to work in challenging situations

· Excellent communication & team management skills

· Good written and verbal skills & team skills

· Strong problem solving skills & proactive attitude

 

 

 

Harish Karnala

United Software Group Inc.. 

565 Metro Place South. Suite # 110 

Dublin, OH 43017 
 

Direct Number : +1 614-408 1549

Board Number : 614-495-9222 EXT. 622

Fax: 1-866-764-1148

 

karnal...@usgrpinc.com 

Hangouts: harirecruite...@gmail.com

www.usgrpinc.com

 

 



Re: Position : Big Data or Hadoop Developer with Java & AWS experience..| Phoenix, AZ

2019-01-17 Thread sandeep v
Hi, please respond to me at *sand...@solwareittech.com*.
Attached is the resume for the below job role; review it and let me know
your interest. H1/PP: yes.

On Thu, Jan 17, 2019 at 9:14 PM Anchit Bajpai  wrote:

> Hi,
> Hope you are doing great !
> I'm Sharing you our client requirement details below.
>
>
>
>
>
> *Position : Big Data or Hadoop Developer with Java & AWS experience..*
>
> *Location: Phoenix, AZ*
>
> *Contract: Long term*
>
>
>
> *Note - NO OPT EAD if H1B need Passport Number.*
>
>
>
> Skill –
>
>
>
> Java & AWS experience
>
>
>
>
>
> *Cordially,*
>
>
>
> *Anchit Bajpai*
>
> *Technical Recruiter*
>
> *Quantum World Technologies Inc.*
>
> *199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*
>
> *Direct – 917-781-6463*
>
> *Office: 805-222-0532 Ext-310*
>
> *Fax : 805-834-0532*
>
> *E: *anc...@quantumworld.us
>


-- 





*Thanks & Regards,*


*Sandeep | Solware IT Technologies*


*Phone: 469-415-0053*
*Email: sand...@solwareittech.com *

*G.Hangouts – sandeepstaffing3*

*Address: 520 Central Parkway East, Suite 105, Plano, TX,75074*



Rahul_Hadoop H1.docx
Description: MS-Word 2007 document


Position : Big Data or Hadoop Developer with Java & AWS experience..| Phoenix, AZ

2019-01-17 Thread Anchit Bajpai
Hi,
Hope you are doing great !
I'm Sharing you our client requirement details below.





*Position : Big Data or Hadoop Developer with Java & AWS experience..*

*Location: Phoenix, AZ*

*Contract: Long term*



*Note - NO OPT EAD if H1B need Passport Number.*



Skill –



Java & AWS experience





*Cordially,*



*Anchit Bajpai*

*Technical Recruiter*

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*Direct – 917-781-6463*

*Office: 805-222-0532 Ext-310*

*Fax : 805-834-0532*

*E: *anc...@quantumworld.us



Hiring for Hadoop Developer | New York , NY - F2F

2019-01-10 Thread praveensinghrecruiting


Hi

 

Please share the updated resume for below role


*Role :Hadoop Developer*

*Location: New York , NY*

*Duration: Long Term Contract *

*Interview : Face to face  *

 
* Experience :* 6+ Years
* Skills *: 

   - Experienced  as a lead in developing Spark applications and dealing 
   with multiple stakeholders.   
   - Experienced as  Hadoop Developer for application development in 
   Hive/Impala, Sqoop, Spark or any real-time applications. 

 

Thanks & Regards

Praveen Singh

Technical Recruiter

Nityo Infotech Corp

666 Plainsboro Road,  

Suite 1285

Plainsboro, NJ 08536

pravee...@nityo.com

Phone: 609-853-0818 Ext: 2391

www.nityo.com



Hadoop Developer/Admin @ San Jose, CA

2018-12-21 Thread DILEEP k
*Please find the below requirements and forward suitable resumes
to recruit...@aptivacorp.com  *



*Note: PASSPORT & Visa copy are a MUST for submission. No H1 transfer
candidates.*



*Position: **Hadoop Developer/Admin*

*Location: **San Jose, CA*

*Duration: **12 months*



*Job description:*

·  Design, develop & tune data products, applications and integrations on
large-scale data platforms (SQL Server, HANA, Hadoop, Kafka Streaming, etc.)
with an emphasis on performance, reliability, scalability and, above all, quality.

·  Analyze the business needs, profile large data sets and build custom
data models and applications to drive Adobe business decision making
and the customer experience.

·  Develop and extend design patterns, processes, standards, frameworks and
reusable components for various data engineering functions/areas.

·  Collaborate with key stakeholders including the business team, engineering
leads, architects, BSAs & program managers.

·  Demonstrated skill working with SQL and PL/SQL programming.

·  Demonstrated skill designing, developing and supporting database
applications.

·  Performance tuning of database schemas, databases, SQL, ETL jobs, and
related scripts.

·  SQL Server ETL development experience using SSIS. Strong ETL, data
warehouse, T-SQL skills.

·  Extensive experience implementing large-scale data warehouse and data
mart architecture.

·  *Extensive experience with the *MSBI stack: SQL Server DBMS, SQL Server
Analysis Services (SSAS) and SQL Server Integration Services (SSIS).



Opening for Hadoop Developer - Bentonville, AR

2018-12-20 Thread vamshinaga999


Hi ,

Hope you are doing great !!

 

We have an opening for a Hadoop Developer

 

*Job Title: Hadoop  Developer*

*Location:* *Bentonville, AR*

*Duration: Long Term Contract*

 

*Any Visa is Fine*

*Passport Number needed for submission purpose.*

 

*Job Description:*

 

   - Minimum 6+ Years of experience needed. 
   - Design, develop, document Hadoop applications 
   - Excellent communication skills are a must for this position. 

 

 

*Thanks & Regards,*

*Vamshi*

*Email: **vamsh...@itechus.net*  

*Direct: 802-227-0236 / Main: 802 383 1500 Ext. 205*

 

 



Hadoop Developer/Admin @ San Jose, CA

2018-12-20 Thread DILEEP k
*Please find the below requirements and forward suitable resumes
to recruit...@aptivacorp.com  *



*Note: PASSPORT & Visa copy are a MUST for submission. No H1 transfer
candidates.*



*Position: **Hadoop Developer/Admin*

*Location: **San Jose, CA*

*Duration: **12 months*



*Job description:*

·  Design, develop & tune data products, applications and integrations on
large-scale data platforms (SQL Server, HANA, Hadoop, Kafka Streaming, etc.)
with an emphasis on performance, reliability, scalability and, above all, quality.

·  Analyze the business needs, profile large data sets and build custom
data models and applications to drive Adobe business decision making
and the customer experience.

·  Develop and extend design patterns, processes, standards, frameworks and
reusable components for various data engineering functions/areas.

·  Collaborate with key stakeholders including the business team, engineering
leads, architects, BSAs & program managers.

·  Demonstrated skill working with SQL and PL/SQL programming.

·  Demonstrated skill designing, developing and supporting database
applications.

·  Performance tuning of database schemas, databases, SQL, ETL jobs, and
related scripts.

·  SQL Server ETL development experience using SSIS. Strong ETL, data
warehouse, T-SQL skills.

·  Extensive experience implementing large-scale data warehouse and data
mart architecture.

·  *Extensive experience with the *MSBI stack: SQL Server DBMS, SQL Server
Analysis Services (SSAS) and SQL Server Integration Services (SSIS).



Direct client req: Hadoop developer at Atlanta, GA and Java developer at CA

2018-11-30 Thread Amar NaReddy
Hi,

Hope You Are Doing Great



Here is an urgent requirement from one of my clients. Kindly share matching
profiles with me ASAP:





*Java developer at CA*



Experience in working with Java

· Experience in working with Building Rest API and micro services

· Experience in working with CI/CD

· Experience in working with Agile

· Experience in working with react or angular is a plus





*Hadoop developer at Atlanta, GA*

*Job Description:-*



· Knowledge of the Hadoop ecosystem and its components – HBase, Pig,
Hive, Sqoop, Flume, Oozie, etc.

· Know-how of the Java essentials for Hadoop.

· Know-how of basic Linux administration

· Analytical and problem solving skills.

· Business acumen and domain knowledge

· Knowledge of scripting languages like Python or Perl.

· Data modelling experience with OLTP and OLAP

· Good knowledge of concurrency and multi-threading concepts.

· Understanding the usage of various data visualizations tools like
Tableau, Qlikview, etc.





*Thanks & Regards,*

Amar NaReddy

Email: a...@vstconsulting.com   Desk: 732-491-8687

*Celebrating 15 Years 2002-2017** | *www.vstconsulting.com

200 Middlesex Essex Tpke, Suite 102, Iselin, NJ 08830

http://www.inc.com/inc5000/profile/vst-consulting

<http://www.inc.com/inc5000/index.html> *Company**
2008/2009/2010/2011/2012/2013*

*Ranked #24 in NJ Finest for 2008.*



Senior Hadoop Developer (10+ Experience) , NYC

2018-11-27 Thread Naveen
We are looking for a senior Hadoop developer (10+ years of experience).
Location : NYC
Duration : 12+
Rate : DOE
recrui...@digixforminc.com

Prefer: USC, GC, H4 EAD.



Opening for Big Data Hadoop developer - Phoenix, AZ

2018-11-16 Thread vamshinaga999


Hi ,

 

We have an opening for a Big Data Hadoop developer

 

*Job Title: Big Data Hadoop developer*

*Location: Phoenix, AZ*

*Duration: 6 + Months Contract*

 

*Passport Number Needed for H1B*

 

*Job Description:*

 

*Big Data, Hadoop Platforms, Java, Web Services*

 

   - Responsible for creating and maintaining large data sets. 
   - Streaming data in hadoop environment. 
   - Developing and maintaining webservices. 
   - Big data development (Spark, Scala, Java) 
   - Hadoop Platform (Solr, HBase, HDFS, Kafka, Flume) 
   - Restful webservice (JAX-RS, Spring) 
   - Full stack web development and deployment on a RHEL platform 
   Websphere, Tomcat serverTest Driven Design. 

 

*Thanks & Regards,*

*Vamshi*

*Email: **vamsh...@itechus.net*  

*Direct: 802-227-0236 / Main: 802 383 1500 Ext. 205*



Urgent Role :: Hadoop (Developer/Admin) Engineer | Phoenix, AZ

2018-11-02 Thread Anchit Bajpai
Hi,
Hope you are doing great !
I'm Sharing you our client requirement details below.


*Client –*
*IMPETUS/AMEX*





Hadoop (Developer/Admin) Engineer

Job Loc :: Phoenix, AZ
Long Term

All Visa ( Please No OPT )






*Required –*



Skills ::: Hadoop, Hive, Spark, Java.







*Cordially,*



*Anchit Bajpai*

*Technical Recruiter *

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*

*Direct – 917-781-6463*

*Office: 805-222-0532 Ext-310*

*Fax : 805-834-0532*

*E: *anc...@quantumworld.us



Hadoop Developer || Bentonville, AR/ Sunnyvale, CA

2018-10-22 Thread neha nityo
!!Max Rate $55-58/hr on c2c ||Must i-94 Copy !!

Kindly share suitable resumes ASAP to @neh...@nityo.com



Role : Hadoop Developer

Location : Bentonville, AR/ Sunnyvale, CA

Duration : Contract for 1+ yr

Client : TCS/ Wal-Mart

Client: LNT/Apple





Required/Desired Skills:

• 5+ years of software development experience in multiple languages and
technologies (Cassandra, Kafka, Spark and Hadoop) - see the sketch after this list.
• Must have expertise in Cassandra
• Ability to perform data related benchmarking, performance analysis and
tuning
• Strong skills in In-memory applications, Database Design, Data
Integration.
• Experience in cloud environment
• Experience writing shell scripts using (ksh, bash, perl or python).
• Excellent written and oral communication skills.

• Excellent communication skills (both written and verbal) with strong
presentation and facilitation skills
• Demonstrated ability to influence and consult (providing options with
pros, cons and risks) while providing thought leadership to
sponsors/stakeholders in solving business process and/or technical problems
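
For illustration - a brief sketch of reading a Cassandra table from Spark for the kind of data-related benchmarking and performance analysis mentioned above, assuming the DataStax Spark Cassandra connector is on the classpath; the host, keyspace and table names are hypothetical:

import org.apache.spark.sql.SparkSession

object CassandraReadSketch {
  def main(args: Array[String]): Unit = {
    // Assumes spark-cassandra-connector is on the classpath and the host is reachable.
    val spark = SparkSession.builder()
      .appName("cassandra-read-sketch")
      .config("spark.cassandra.connection.host", "cassandra-host")
      .getOrCreate()

    // Hypothetical keyspace/table: store_sales.daily_metrics.
    val metrics = spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "store_sales", "table" -> "daily_metrics"))
      .load()

    // A simple aggregation to sanity-check read throughput and data shape.
    metrics.groupBy("store_id").count().show()

    spark.stop()
  }
}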









Thanks & Regards,

Neha Gupta

Desk no : 609-853-0818 Ext-2105

Email id : neh...@nityo.com

LinkedIN: www.linkedin.com/in/nehag6
(www.nityo.com)



Urgent Req : Big Data Developer OR Hadoop Developer | Multiple Location

2018-10-22 Thread Anchit Bajpai
Hi,
Hope you are doing great !
I'm Sharing you our client requirement details below.



*Please Reply at - anc...@quantumworld.us  *


*Who We Are..*

QUANTUM WORLD TECHNOLOGIES INC is a venture of the “QUANTUM GROUP OF
COMPANIES”. We have interests in the IT and ITES, Real Estate, Hospitality,
Pharmaceuticals, and Entertainment industries across the globe, with our
headquarters located at Thousand Oaks, CA and a presence in Dartmoor, CT,
Moorpark CA, Parsippany – NJ, Seattle – WA, India, UAE and New Zealand. As
one of the leading service providers, we provide effective business
solutions in IT Consulting, Staff Augmentation, and Business Process
Outsourcing Services. We are an E-Verified employer.



*Client..*

*Virtusa/JPMC*






*Req : Big Data Developer  OR  Hadoop Developer Location:* *Jersey City,
NJ/ Wilmington, DE/ Columbus OH*

*Long Term *

*USC, GC*

*Need Passport number (Validation Process) – H1B, GC EAD, TN*



*Required Skills:*

*10+ years in Total IT*

*3+ years’ experience in **BigData tools – Scala, Spark, Hadoop, Hive,
Hbase**, NOSQL*

*3+ years experience in **building applications in Java*



*Required Skills and Qualifications*



· Masters or Bachelors in *Computer Science* with 3+ years
experience in building applications in Java.

· Strong CS fundamentals, data structures, algorithms with good
understanding of Object-Oriented Design Principles, architecture
and prevalent design patterns

· Strong in Object Oriented Development and Java platform.

· Hands on experience in big data technologies including Storm or
Spark, Hadoop, Kafka to name a few.

· Experience with big data technologies is a must.

· Excellent communication skills are a must for this position





*Cordially,*



*Anchit Bajpai*

*Technical Recruiter *

*Quantum World Technologies Inc.*

*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA - 91360*

*Office: 805-222-0532 Ext-310*

*Fax : 805-834-0532*

*E: *anc...@quantumworld.us



Urgent Requirement For Hadoop Developer in Jacksonville, FL

2018-10-22 Thread sean patrick
*NEED VISA COPY AND DL COPY FOR SUBMISSION*



Hello Partner,



Hope you are doing well..!!!



*Role*: Hadoop Developer

*Location:* Jacksonville, FL

*Duration:* Long Term



*Job Description**:*



*Desired Skills:*

· Knowledge of Scala/Spark technology

· Knowledge of Java technology

· Knowledge of Hive/HBase/Phoenix

· Knowledge of SQL technology



*Job Responsibilities:*

· Develop an understanding of the business and share this business
knowledge with the decision makers.

· Collaborating with clients and other stakeholders to effectively
integrate and communicate analysis finding

· Identify client organization's strengths and weaknesses.

· Understand the business goals, objectives and strategies of the
client and to use that knowledge to help design and implement new business
systems that align with the business vision.

· Use a variety of communication skills (e.g. interviews, meetings,
and facilitated sessions) to analyse business processes, data and systems

· Take a “bottom-up” approach to analysis to better understand problem areas
and areas of particular complexity



Thanks & Best regards,

 *Nelson White*

Talent Acquisition Team

*Conquest Tech Solutions, Inc. *| www.conq-tech.com

19 C Trolley Square, Wilmington, DE – 19806

P: 302-286-9010 EXT: 109 | Fax: 302-288-6485

E: nel...@conq-tech.com | *Hangout*:



Coming together is a Beginning... Keeping together is Progress...Working
together is a Success!!!



Require : Hadoop Developer Tomorrow and Monday slots are available..............

2018-10-18 Thread akshithkalvakota098



Require : Hadoop Developer
Location: Plano,TX
Duration: Long term

Any visa is OK for us ...

Only H1B and GC-EAD must share their passport number.

DOB must be 1991 or earlier, please.

Tomorrow and Monday slots are available.

*Job Description:*
• Should be from a Java background.
• Manage individual projects and work as an individual contributor
• Broad and extensive knowledge of the software development process 
and its technologies
• 3 years of full-stack development using Java and JavaScript; 
AngularJS preferred.
• 5+ years of data modeling and database design experience with 
relational and NoSQL databases
• Scripting experience using shell and Python
• Experience with REST APIs, microservices and Docker
• Experience with the Hadoop environment: Hive, Pig and Scala.
• A data-driven problem solver
• Self-motivated and highly productive
• Communicates effectively



Require : Hadoop Developer,Location: Plano,TX,Duration: Long term

2018-10-17 Thread Amar


Require : Hadoop Developer
Location: Plano,TX
Duration: Long term

*_Job Description: _*

• Should be from a Java background.
• Manage individual projects and work as an individual contributor
• Broad and extensive knowledge of the software development 
process and its technologies
• 3 years of full-stack development using Java and JavaScript; 
AngularJS preferred.
• 5+ years of data modeling and database design experience with 
relational and NoSQL databases

• Scripting experience using shell and Python
• Experience with REST APIs, microservices and Docker
• Experience with the Hadoop environment: Hive, Pig and Scala.
• A data-driven problem solver
• Self-motivated and highly productive
• Communicates effectively

*Thanks and Regards*
Amarinder Singh (Sr. IT Associate)
Kalven Technologies Inc. | 2300 E Higgins Rd, Suite 211, Elk Grove Village, IL-60007
1701 E. Wood Field Rd, Suite 300, Schaumburg, IL-60173
Phone: 312-667-0211 | Email id: amarin...@kalventech.com | LinkedIn: Amar Singh | Skype id: Amarinderkalven

http://www.kalventech.com
Product Engineering | Systems Integration | Professional Services.

*Note: Under Bill s.1618 Title III passed by the 105th U.S. Congress 
this mail cannot be considered Spam as long as we include contact 
information and a remove link for removal from our mailing list. To be 
removed from our mailing list reply with "remove" and include your 
"original email address/addresses" in the subject heading. Include 
complete address/addresses and/or domain to be removed. We will 
immediately update it accordingly. We apologize for the inconvenience 
caused..



--
You received this message because you are subscribed to the Google Groups 
"CorptoCorp" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to corptocorp+unsubscr...@googlegroups.com.
To post to this group, send email to corptocorp@googlegroups.com.
Visit this group at https://groups.google.com/group/corptocorp.
For more options, visit https://groups.google.com/d/optout.


Require : Hadoop Developer,Location: Plano,TX,Duration: Long term

2018-10-17 Thread Amar


Require : Hadoop Developer
Location: Plano,TX
Duration: Long term

*_Job Description: _*

• Should be from Java Back ground.
• Manage individual projects and work as an individual contributor
• Broad and extensive knowledge of the software development 
process and its technologies
• 3 years full stack development using Java and Java Script. 
AngularJS preferred.
•         5+ years data modeling and database design experience with 
relational and NoSQL databases

• Scripting experience using shell and Python
• Experience with Rest APIs, Micro Services and Docker
• Experience with the Hadoop environment. Hive, Pig and Scala.
• A data-driven problem solver
• Self-motivate and highly productive
• Communicates effectively

*Thanks and Regards**
Amarinder Singh (Sr. IT Associate)
Kalven Technologies Inc. 2300, E Higgins Rd, Suite 211, ELK Grove 
Village, IL-60007
   1701, E.Wood Field Rd, Suite 
300, Schaumburg, IL-60173
Phone: 312-667-0211 | Email id : amarin...@kalventech.com 
<mailto:amarin...@kalventech.com> | LinkedIn : Amar Singh | Skype id : 
Amarinderkalven**

**http://www.kalventech.com**
Product Engineering | Systems Integration | Professional Services.***
**
*
**

*Note: Under Bill s.1618 Title III passed by the 105th U.S. Congress 
this mail cannot be considered Spam as long as we include contact 
information and a remove link for removal from our mailing list. To be 
removed from our mailing list reply with "remove" and include your 
"original email address/addresses" in the subject heading. Include 
complete address/addresses and/or domain to be removed. We will 
immediately update it accordingly. We apologize for the inconvenience 
caused..



--
You received this message because you are subscribed to the Google Groups 
"CorptoCorp" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to corptocorp+unsubscr...@googlegroups.com.
To post to this group, send email to corptocorp@googlegroups.com.
Visit this group at https://groups.google.com/group/corptocorp.
For more options, visit https://groups.google.com/d/optout.


Hadoop Developer || Bentonville, AR

2018-10-12 Thread neha nityo
!!Max Rate $60/hr on c2c ||Must i-94 Copy !!

Kindly share suitable resumes ASAP to @neh...@nityo.com



Role : Hadoop Developer

Location : Bentonville, AR

Duration : Contract for 1+ yr

Client : TCS(Tata)/ Wal-Mart



Required/Desired Skills:

• 5+ years of software development experience in multiple languages and
technologies (Cassandra, Kafka, Spark and Hadoop).
• Must have expertise in Cassandra
• Ability to perform data related benchmarking, performance analysis and
tuning
• Strong skills in In-memory applications, Database Design, Data
Integration.
• Experience in cloud environment
• Experience writing shell scripts using (ksh, bash, perl or python).
• Excellent written and oral communication skills.

• Excellent communication skills (both written and verbal) with strong
presentation and facilitation skills
• Demonstrated ability to influence and consult (providing options with
pros, cons and risks) while providing thought leadership to
sponsors/stakeholders in solving business process and/or technical problems









Thanks & Regards,

Neha Gupta

Desk no : 609-853-0818 Ext-2105

Email id : neh...@nityo.com

LinkedIN: www.linkedin.com/in/nehag6
(www.nityo.com)



Immediate hiring for Hadoop Developer with Spark, Hive

2018-10-11 Thread Raj
*Hello,*

*Greetings from ICS Global Soft INC.!*

We have a requirement for you and the details are as follows!!!



*Position: Hadoop Developer with Spark, Hive*

*Location: Plano, Texas*

*Duration: Long Term*



Hadoop Developer, specifically on Big Data; the client is looking for
candidates with experience in the technologies below (a brief
Spark-with-Hive sketch follows the list):



1. Spark

2. Hive

3. Pig

4. Python and NoSQL databases
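
For illustration - a brief Scala sketch of using Spark with Hive support to run a Hive query, in line with items 1 and 2 above; the sales table and its columns are hypothetical:

import org.apache.spark.sql.SparkSession

object HiveFromSpark {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() lets Spark read tables registered in the Hive metastore.
    val spark = SparkSession.builder()
      .appName("hive-from-spark-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical Hive table: sales(category STRING, amount DOUBLE, sale_date DATE).
    val topCategories = spark.sql(
      """
        |SELECT category, SUM(amount) AS total_amount
        |FROM sales
        |WHERE sale_date >= '2018-01-01'
        |GROUP BY category
        |ORDER BY total_amount DESC
        |LIMIT 10
      """.stripMargin)

    topCategories.show(truncate = false)
    spark.stop()
  }
}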



*Thanks & Regards*



*Rajender*

*Senior Technical Recruiter*

*1231 Greenway Drive | Ste 375 | Irving, TX 75038*

E-mail: rajen...@icsglobalsoftinc.com



Spark Hadoop Developer || Sunnyvale, CA

2018-10-09 Thread neha nityo
!! Genuine OPT Will Work with i-94 !! Rate Max $55/hr on c2c!! 100%
interview



Role : Spark Hadoop Developer

Location : Sunnyvale, CA

Duration : Contract for 1+ Year

Client : L(Larsen & Toubro Infotech Ltd) / Apple

Interview Date : within 4 working days



Required/Desired Skills:

•  5+ years of IT experience

•  Strong background in Java programming.

•  2+ years of hands-on Spark expertise.

•  Experience with Hadoop – deep understanding of internals of
Hadoop, Hive, Oozie, MapReduce, Sqoop.

•  Should be proficient in writing Advanced SQLs and expertise in
performance tuning of SQLs / Hive queries.

•  Kafka experience is a plus.

•  Hands-on experience with RESTful Web Services and Spring
Framework.

•  Good understanding of object oriented and micro services design.

•  Should have strong verbal & written communication skills.

•  Experience in developing automated test scripts to help with
regression testing sharp troubleshooting skills to identify and fix issues
quickly.

•  Should be able to think outside the box, drive for excellence and
be self-motivated.
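
For illustration - a short Scala sketch of the Hive/SQL performance-tuning idea in the bullets above: filter on the partition column so Spark/Hive can prune partitions, and broadcast the small dimension table to avoid a shuffle-heavy join. The table and column names (clicks, pages, dt, page_id, page_category) are hypothetical:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object TuningSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("query-tuning-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Large fact table partitioned by dt: the partition filter enables pruning.
    val clicks = spark.table("clicks").where("dt = '2018-10-01'")
    // Small dimension table.
    val pages = spark.table("pages")

    // Broadcasting the small side replaces a shuffle-heavy sort-merge join.
    val joined = clicks.join(broadcast(pages), Seq("page_id"))

    // Inspect the physical plan to confirm partition pruning and the broadcast join.
    joined.groupBy("page_category").count().explain()

    spark.stop()
  }
}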







Thanks & Regards,

Neha Gupta

Desk no : 609-853-0818 Ext-2105

Email id : neh...@nityo.com

LinkedIN: www.linkedin.com/in/nehag6
(www.nityo.com)



Urgent Requirement For Hadoop Developer in Jacksonville, FL

2018-10-03 Thread sean patrick
*NEED PASSPORT NUMBER AND VISA COPY FOR SUBMISSION*


Hello Partner,

Hope you are doing great!!!

Immediate Need!!



*Job Title:* *Hadoop Developer *

*Duration: **Long Term*

*Location: Jacksonville, FL*



*Job Description*



*Desired Skills:*

· Knowledge of Scala/Spark technology

· Knowledge of Java technology

· Knowledge of Hive/HBase/Phoenix

· Knowledge of SQL technology



*Job Responsibilities:*



· Develop an understanding of the business and share this business
knowledge with the decision makers.

· Collaborating with clients and other stakeholders to effectively
integrate and communicate analysis finding

· Identify client organization's strengths and weaknesses.

· Understand the business goals, objectives and strategies of the
client and to use that knowledge to help design and implement new business
systems that align with the business vision.

· Use a variety of communication skills (e.g. interviews, meetings,
and facilitated sessions) to analyse business processes, data and systems

· Take a “bottom-up” approach to analysis to better understand problem areas
and areas of particular complexity





Thanks & Best regards,

 *Nelson White*

Talent Acquisition Team

*Conquest Tech Solutions, Inc. *| www.conq-tech.com

19 C Trolley Square, Wilmington, DE – 19806

P: 302-286-9010 EXT. 111| Fax: 302-288-6485

E: nel...@conq-tech.com | *Hangout*:



Coming together is a Beginning... Keeping together is Progress...Working
together is a Success!!!



Urgent Requirement for Spark/Hadoop Developer -- { Need Locals Only & No OPT'S } ---

2018-10-01 Thread harish karnala


Hi,

Hope you are doing well !!   

My name is  *Harish Karnala* and I am a 
technical recruiter with *United Software group*. I reviewed your 
information online and I am impressed with your qualifications. One of my 
clients in the  area  *Richardson TX* currently has *Spark/Hadoop 
Developers* role that they are looking to fill. Here are the details on the 
role.

 

*JD:*

 

*Role:*Spark/Hadoop Developers 

*Location:* Richardson TX

*Duration:* 12 Months

   - As a Big Data Engineer, you will provide technical expertise and 
   aptitude to Hadoop technologies as they relate to the development of 
   analytics.  
   - Responsible for the planning and execution of big data analytics, 
   predictive analytics and machine learning initiatives.  
   - Assist in leading the plan, building, and running states within the 
   Enterprise Analytics Team and act in a lead role driving user story 
   analysis.  
   - By creating optimization and stability to the platforms, you will play 
   a key role in the architecture design and data modeling of the platform and 
   analytic applications.  
   - Engage in solving and supporting real business issues with your Hadoop 
   distributed systems and Open Source framework knowledge.  
   - Perform detailed analysis of business problems and technical 
   environments and use this data in designing the solution and maintaining 
   data architecture.  
   - Focus will be in creating strategy, researching emerging technology, 
   and applying technology to enable business solutions within the 
   organization.  
   - Design and develop software applications, testing, and building 
   automation tools.  
   - Design efficient and robust Hadoop solutions for performance 
   improvement and end-user experiences.  
   - Work in a Hadoop ecosystem implementation/administration, installing 
   software patches along with system upgrades and configuration.  
   - Conduct performance tuning of Hadoop clusters while monitoring and 
   managing Hadoop cluster job performance, capacity forecasting, and 
   security.  
   - Define compute (Storage & CPU) estimations formula for ELT & Data 
   consumption workloads from reporting tools and Ad-hoc users.  
   - Analyze Big Data Analytic technologies and applications in both 
   business intelligence analysis and new service offerings, adopting and 
   implementing these insights and standard methodologies.  

*REQUIREMENTS*

   - 8+ years of experience supporting various enterprise platforms, 
   performance tuning and application performance optimizations.  
   - 5+ years of experience working in Linux servers and platform 
   optimization.  
   - 3+ years of experience in architecture and implementation of large and 
   highly complex projects using Hortonworks (Hadoop Distributed File System) 
   with Isilon commodity hardware.  
   - 4+ years of experience with Big Data platforms and tools and Hadoop 
   implementation experience including the following:  
   - Hands on experience in the platform operations.  
   - Performance and delivery of Hadoop ecosystem (Hadoop, Hive, Spark, 
   Hbase, Ambari, Kafka, Pyspark & R). 
   - Application performance tuning.  
   - Experience as a DBA on any RDBMS or Hive Database.  
   - Experience managing Big Data platform operations or any other large 
   platform. 

 

Harish Karnala

United Software Group Inc.. 

565 Metro Place South. Suite # 110 


Dublin, OH 43017 

 

Direct Number : +1 614-408 1549

Board Number : 614-495-9222 EXT. 622

Fax: 1-866-764-1148

 

karnal...@usgrpinc.com 

Hangouts: harirecruite...@gmail.com

www.usgrpinc.com

 



Hadoop developer (4-8year Exp req) || Portland, OR

2018-09-27 Thread neha nityo
*Please share genuine resumes of candidates who will take the video call ||
I-94 is mandatory for every submission*



Let me know if you have any candidate available for the below job position
we have, Thanks!! Kindly do share your updated resume for further process (
neh...@nityo.com).



Role : Hadoop developer (4-8year Exp req)

Location : Portland, OR

Job Type : Contract for 12 month

Client : TCS

Interview Date : ASAP(100%)

*For client submission we need your full details + your work authorization
copy + Your Id Proof + Your I-94 Copy/Passport No *



Job Details:- SQL/AWS/Python/Spark mandatory

- Hands-on experience with Spark SQL
- Running Hive queries with the Spark engine on Hortonworks Data Platform
- Overall experience writing complex SQL queries to manipulate large
data sets and gather analytics

Detailed JD for suitable profiles:
· Worked with Hadoop Hive (HQL) for 2-3 years, plus other associated Hadoop
technologies
· Hadoop 2.x is preferred
· 1+ years of Spark SQL experience; must have hands-on experience (a small sketch follows below)
· Experience handling large volumes of data and semantic processes
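
For context on the Spark SQL point above, running a Hive query through the Spark engine
typically looks like the PySpark sketch below. The table and column names (web_events,
event_date, page_views) are placeholders, not details from this posting.

from pyspark.sql import SparkSession

# Hive-enabled Spark session: queries resolve against the Hive metastore but execute on Spark.
spark = (SparkSession.builder
         .appName("hive-on-spark-sketch")
         .enableHiveSupport()
         .getOrCreate())

# A complex aggregation expressed in Spark SQL against an existing Hive table.
daily_views = spark.sql("""
    SELECT event_date, COUNT(*) AS page_views
    FROM web_events
    WHERE event_date >= '2018-01-01'
    GROUP BY event_date
""")
daily_views.show()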







Thanks & Regards,

Neha Gupta

Talent Acquisition (Team Lead)

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

URL : www.nityo.com



Big data Hadoop Developer || Sunnyvale, CA/ Woonsocket , RI

2018-09-18 Thread neha nityo
!! Genuine OPT Will Work with i-94 !!



Let me know if you have any candidate available for the below job position
we have, Thanks!!

Kindly do share your updated resume for further process (neh...@nityo.com).



Role : Big data Hadoop Developer

Location : Sunnyvale, CA/ Woonsocket , RI

Duration : Contract for 1+ Year

Client : TCS/ CVS

Client : L/ Apple

Interview Date : ASAP



Required/Desired Skills: Need candidates very strong in Java, Hive & Spark

•  5+ years of IT experience

•  Strong background in Java programming.

•  2+ years of hands-on Spark expertise.

•  Experience with Hadoop – deep understanding of internals of
Hadoop, Hive, Oozie, MapReduce, Sqoop.

•  Should be proficient in writing Advanced SQLs and expertise in
performance tuning of SQLs / Hive queries.

•  Kafka experience is a plus.

•  Hands-on experience with RESTful Web Services and Spring
Framework.

•  Good understanding of object oriented and micro services design.

•  Should have strong verbal & written communication skills.

•  Experience in developing automated test scripts to help with
regression testing; sharp troubleshooting skills to identify and fix issues
quickly.

•  Should be able to think out of the box, drive for excellence, and
be self-motivated.





Thanks & Regards,

Neha Gupta

Talent Acquisition (Team Lead)

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

URL : www.nityo.com



L Opening !! Hadoop Developer !! Sunnyvale, CA

2018-09-11 Thread neha nityo
Kindly share suitable resumes ASAP to @neh...@nityo.com || Desk No :
609-853-0818 * 2105



*I need ONLY genuine profiles with valid supporting docs (Visa Copy, Photo
of DL Copy & I-94 Copy).*

*Visa Preferences: OPT (workable if applied for H1B cap-gap), USC, GC,
GC EAD, H4 EAD, L2 EAD*



Role : Spark Hadoop Developer

Location : Sunnyvale, CA

Duration : Contract for 1+ Year

Client : L(Larsen & Toubro Infotech Ltd) / Apple

Interview Date : within 4 working days



Required/Desired Skills:

•  5+ years of IT experience

•  Strong background in Java programming.

•  2+ years of hands-on Spark expertise.

•  Experience with Hadoop – deep understanding of internals of
Hadoop, Hive, Oozie, MapReduce, Sqoop.

•  Should be proficient in writing Advanced SQLs and expertise in
performance tuning of SQLs / Hive queries.

•  Kafka experience is a plus.

•  Hands-on experience with RESTful Web Services and Spring
Framework.

•  Good understanding of object oriented and micro services design.

•  Should have strong verbal & written communication skills.

•  Experience in developing automated test scripts to help with
regression testing; sharp troubleshooting skills to identify and fix issues
quickly.

•  Should be able to think out of the box, drive for excellence, and
be self-motivated.







Thanks & Regards,

Neha Gupta

Desk no : 609-853-0818 Ext-2105

Email id : neh...@nityo.com

LinkedIN: www.linkedin.com/in/nehag6
(www.nityo.com)



Hadoop Developer(Kafka, Cassandra) || Bentonville,AR

2018-09-07 Thread neha nityo
Kindly share suitable resumes ASAP to @neh...@nityo.com || Desk No :
609-853-0818 * 2105



Role : Hadoop Developer(Kafka, Cassandra)

Location : Bentonville,AR

Duration : Contract for 1+ yr

Client : TCS(Tata)/ Wal-Mart



Required/Desired Skills:

• 5+ years of software development experience across multiple technologies
(Cassandra, Kafka, Spark and Hadoop).
• Must have expertise in Cassandra
• Ability to perform data related benchmarking, performance analysis and
tuning
• Strong skills in In-memory applications, Database Design, Data
Integration.
• Experience in cloud environment
• Experience writing shell scripts (ksh, bash, Perl or Python).
• Excellent written and oral communication skills.

• Excellent communication skills (both written and verbal) with strong
presentation and facilitation skills
• Demonstrated ability to influence and consult (providing options with
pros, cons and risks) while providing thought leadership to
sponsors/stakeholders in solving business process and/or technical problems









Thanks & Regards,

Neha Gupta

Desk no : 609-853-0818 Ext-2105

Email id : neh...@nityo.com

LinkedIN: www.linkedin.com/in/nehag6
(www.nityo.com)



Backfill opening !! Wipro opening || Hadoop Developer (Rate $60) || Mount Laurel-NJ

2018-09-06 Thread neha nityo
*Visa Applicable : Only USC/GC/L2/TN and E3*



Role : Hadoop Developer

Location : Greensboro, NC

Duration: 12+ Months

Client : Wipro/ TD Bank



Skills and Responsibilities: *Should be strong in Hadoop and DWH*

·  Hadoop development and implementation.

·  Loading from disparate data sets.

·  Pre-processing using Hive and Pig.

·  Designing, building, installing, configuring and supporting Hadoop.

·  Translate complex functional and technical requirements into detailed
design.

·  Perform analysis of vast data stores and uncover insights.

·  Maintain security and data privacy.

·  Create scalable and high-performance web services for data tracking.

·  High-speed querying.

·  Managing and deploying HBase.









Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



Immediate interview for Big Data - Hadoop Developer || Hadoop Architect || Bellevue,WA

2018-09-04 Thread recruiter logan
Hi ,

Greetings from HumacInc.

Hope you are doing good

Please go through the JD and let me know your interest

Please reach me at :

*Email: **sant...@humacinc.com* 

*Call :   +1 623-242-2594 *



*Job Title: Technology Lead | Big Data - Hadoop | Hadoop*

*Location:  Bellevue,WA*

 *Visa: **US Citizens/ Green card holders only. *

*Rate: DOE*



*Job Details:*

*Must Have Skills*

2+ years of development experience in object oriented design, development,
and implementation using Java technologies

2+ years of development experience in Hadoop HDFS, Spark, Hive, Pig, HBase,
Map Reduce, Sqoop

Expertise in overall Hadoop architecture, understanding of standard inbound
and outbound data transfers techniques

Nice to have skills

Reporting

Agile

*Detailed Job Description*

Expertise in overall Hadoop architecture and understanding of standard inbound
and outbound data transfer techniques. 2 years of development experience in
object-oriented design, development, and implementation using Java
technologies. 2 years of development experience in Hadoop HDFS, Spark, Hive,
Pig, HBase, Map Reduce, Sqoop. 2 years of development experience in Java or
Scala, Python, SQL, shell scripting. Experience in developing REST services
is a plus. Experience in Cassandra and ElasticSearch is a plus.



*Top 3 responsibilities you would expect the Subcon to shoulder and execute*

Requirement Clarification

Develop Solution and Demo the same

Coordinate with offshore



===



*Job Title:  Technology Architect | Big Data - Hadoop | Hadoop*

*Location: Bellevue, WA*

*Visa: **US Citizens/ Green card holders only. *

*Rate: DOE*

*Job Details:*

*Must Have Skills*

Strong working knowledge of Teradata, Hive and HBase

Experience in data architecture for handling streaming and batch data
within a data warehouse landscape

Understanding of Big Data technologies, Hadoop and modern data
architectures like Spark and NoSQL structures



*Nice to have skills*

Experience in distributed data processing framework such as Spark and
MapReduce and data streaming such as Kafka, Spark streaming preferred

Strong programming skill with at least one of the following Python, Java,
Scala, etc

Detailed Job Description

Strong working knowledge of Teradata, Hive and HBase. Experience in data
architecture for handling streaming and batch data within a data warehouse
landscape. Understanding of Big Data technologies, Hadoop and modern data
architectures like Spark and NoSQL structures. Experience in distributed data
processing frameworks such as Spark and MapReduce, and data streaming such as
Kafka and Spark Streaming, preferred. Experience with API management
solutions. Strong programming skill with at least one of the following: Python, Java, Scala, etc.



*Top 3 responsibilities you would expect the Subcon to shoulder and execute*

Experience with data profiling tools such as Ataccama, Trifacta, etc
preferred

Excellent SQL tuning knowledge and experience with Teradata Query Grid
preferred

Experience working with vendors to evaluate, select, and implement 3rd
party solutions


-- 

*Best Regards,*

*Santosh Kumar*

*IT ANALYST *

*Mail:  **sant...@humacinc.com *

*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b
*



Hadoop Developer with Data Warehousing exp @ San francisco, CA

2018-08-31 Thread DILEEP k
*Note: PASSPORT & Visa copy are a MUST for submission. No H1 transfer
candidates. No OPT.*


*Title: Hadoop Developer*

*Location: San Francisco, CA *

*Duration: 6+ Months*



*Job description:*

· 8-10 years of strong Hadoop and Big Data experience, with enterprise data
warehousing experience

·Well versed with Hadoop challenges related to scaling and self-service
analytics

· Well versed with Cloudera and Hortonworks distributions, hands on
experience required for both distributions.

·Well versed with hive, spark, hbase and latest developments in
the Hadoop eco system

·Excellent knowledge of Hadoop integration points with enterprise BI and
EDW tools

·Strong Experience with Hadoop cluster management, Administrative,
operations using Oozie, Yarn, Ambari, Zookeeper, Tez, Slider

·Strong Experience with Hadoop ETL, Data Ingestion: Sqoop, Flume, Hive,
Spark, Hbase

· Strong experience in SQL and PL/SQL. Good to have: experience in real-time
data ingestion using Kafka, Storm, Spark or Complex Event Processing

· Experience in Hadoop Data Consumption and Other Components: Hive, Hue
HBase, Phoenix, Spark, Mahout, Pig, Impala, Presto

·Well versed with Enterprises with Multi-Functional Data Sources-
Marketing, Sales, Order/Invoice, Finance data, e-commerce and product usage
data.

·Good to have working experience in subscriptions business area.

·Experience monitoring, troubleshooting and tuning services and
applications and operational expertise such as good troubleshooting skills,
understanding of systems capacity, bottlenecks, and basics of memory, CPU,
OS, storage, and networks.

· Should be able to take the lead and interact with US team members with
minimal supervision. Bachelor's degree in Computer Science, Information
Science, Information Technology, Engineering or a related field. Good
communication skills across a distributed team environment. Should be aware of
order-to-cash workflows and subscriptions business KPIs. Solid
understanding of general business models, concepts and strategies. Must be
self-motivated, responsive, professional and dedicated to customer success.

·Experience in developing API frameworks using Python/Java/Scala

·Must be self-motivated, responsive, professional and dedicated to customer
success

· Hands-on programming experience in Java, Scala, Python, or shell
scripting, to name a few

· Experience in design & development of API frameworks using Python/Java is a plus



PASSPORT MUST for Submission - Need HADOOP Developer NOT Admin - CA

2018-08-30 Thread Corp to Corp Requirements
*PASSPORT & H1 MUST for submission – **no exceptions*



*Job Title: **HADOOP Developer *

*Location: SFO, CA*

*Duration: 6+ Months*


8-10 years of strong Hadoop and Big Data experience with enterprise data
warehousing experience

Well versed with Hadoop challenges related to scaling and self-service
analytics

Well versed with Cloudera and Hortonworks distributions, hands on
experience required for both distributions.

Well versed with hive, spark, hbase and latest developments in
the Hadoop eco system

Excellent knowledge of Hadoop integration points with enterprise BI and EDW
tools

Strong Experience with Hadoop cluster management, Administrative,
operations using Oozie, Yarn, Ambari, Zookeeper, Tez, Slider

Strong Experience with Hadoop ETL, Data Ingestion: Sqoop, Flume, Hive,
Spark, Hbase

Strong experience in SQL and PL/SQL. Good to have: experience in real-time
data ingestion using Kafka, Storm, Spark or Complex Event Processing

Experience in Hadoop Data Consumption and Other Components: Hive, Hue
HBase, Phoenix, Spark, Mahout, Pig, Impala, Presto

Well versed with Enterprises with Multi-Functional Data Sources- Marketing,
Sales, Order/Invoice, Finance data, e-commerce and product usage data.

Good to have working experience in subscriptions business area.

Experience monitoring, troubleshooting and tuning services and applications
and operational expertise such as good troubleshooting skills,
understanding of systems capacity, bottlenecks, and basics of memory, CPU,
OS, storage, and networks.

Should be able to take the lead and interact with US team members with
minimal supervision. Bachelor's degree in Computer Science, Information
Science, Information Technology, Engineering or a related field. Good
communication skills across a distributed team environment. Should be aware of
order-to-cash workflows and subscriptions business KPIs. Solid
understanding of general business models, concepts and strategies. Must be
self-motivated, responsive, professional and dedicated to customer success.

Experience in developing API frameworks using Python/Java/Scala

 Must be self-motivated, responsive, professional and dedicated to customer
success

Hands-on programming experience in Java, Scala, Python, or shell
scripting, to name a few

Experience in Design & Development of API framework using Python/Java is a
Plus

Experience in developing BI Dash boards and Reports is a plus



C2C Vendors Note: Please join the link to receive C2C requirements on a
daily basis: *https://groups.google.com/forum/?hl=en-GB#!forum/c2c-requirementss*



Thanks & Regards

Email: *vineeth...@aptivacorp.com *



Urgent Requirement for Java/Hadoop Developer -- { Passport No. Required } ---

2018-08-29 Thread harish karnala


Hi,

Hope you are doing well !!   

  This is Harish from USG. I have an urgent Java/Hadoop Developer requirement. I hope your resume fits the 
position below. Kindly let me know your interest and also share your 
updated resume. Please feel free to reach me at any 
time.   

   

*Role:  Java/Hadoop Developer*

*Location:*   *   Denver, CO*

*Job-Type: CTH/CTC/FTE* 

 

*JD*

 

*Description*

Experience with Java, Hadoop, Big data, working in AWS, working with other 
AWS components is a plus but not a requirement. 

Not looking for junior level: *at least 5 years of experience with 
development and 1-2 years of experience with big data*.

 

*Responsibilities:*

Design and development on Hadoop software ecosystem and development on Map 
Reduce, HBase, Hive, Pig

Programming in Spark/Storm, Kafka or other distributed messaging system, 
PIG/Python

 

*Qualifications*

· 5+ years in application development experience in Java.

· 6+ years in strong development experience in core Java and Linux 
environment.

· 1+ years in Hadoop – HDFS, Hive, Map Reduce, HBase

· Prior experience working with Cloudera Hadoop distribution (Nice 
to have)

· Prior experience working with AWS cloud Infrastructure and 
services (S3, EC2 etc.)

· Working knowledge of Git, Bit Bucket, Stash as source repository

 

 

*Note:* 

· Need a consultant with 70% in Java and 30% in Big data 
technologies

· The consultant must be using Java on a daily basis for the coding 

 

 

Harish Karnala

United Software Group Inc. 

565 Metro Place South. Suite # 110 

Dublin, OH 43017 
 

Direct Number : +1 614-408 1549

Board Number : 614-495-9222 EXT. 622

Fax: 1-866-764-1148

 

karnal...@usgrpinc.com 

www.usgrpinc.com

 



Urgent Requirement for Hadoop Developer with Cloudera Experience -- { Passport No. Required }

2018-08-17 Thread harish karnala


Hi,

Hope you are doing well !!



  This is Harish from USG. 
I have an urgent requirement for a Cloudera application support engineer. I 
hope your resume fits the position below. Kindly let me know your 
interest and also share your updated resume. Please feel free to reach 
me at any time.

  

  

*Role :* Cloudera application support engineer   

   


*Location : * Marlborough, MA 

*Duration :*  6 months+ 

*Experience:*  10 Years



 


*JD*

 

*Technical/Functional Skills  *

•Excellent communication skills in English (written and verbal)

•Hands-on experience with Unix bash scripting; MapReduce understanding 
and working knowledge; the Sqoop framework, Hive, SQL, the Spark framework, and 
Cloudera Manager

*Roles & Responsibilities   *

•Should have Cloudera application and production support experience 
of minimum 6-8 years.

•Very good at an onshore/offshore team-driven approach, supporting 
service delivery of the best value to clients

•Manage day-to-day interactions with our customers with clear and 
honest expectation setting, friendly and collaborative disposition, aiming 
for the complete satisfaction of each customer

•Triage, diagnose and potentially escalate customer inquiries 
during their engineering and operations efforts

•Participate in occasional weekend on-call roster for critical 
support needs

•Strong problem-solving and analytical skills

•Must have exposure to debugging problems and driving to the root 
cause of incidents related to the Cloudera environment.

•Exceptional focus on application support with Incident management 
and problem management focus

•Review changes for impact; implement changes with 100% success.

•Should work as active team member and collaborative.

•Should be able to write shell /perl scripts, decode existing 
scripts and modify as per requirements.

•Focus on system stability and deliver SIP's.

•May not be in shifts; however, provides 24x7 support on a need basis.

•Actively measure system status pre and post changes.

•Ability to drive changes independently and take decisions.

•Performance review for changes.

•In-depth knowledge of system / subsystems and its integrations.

•Should be able to mentor team members and build team under you.

Status reporting to Onshore Delivery Management.

*Generic Managerial Skills*

•Must have excellent written and verbal communication skill

•Must be self-motivated and pro-active

•Able to work with demanding customer

•Excellent stakeholders management

•Capable of working in an onsite-offshore model with the delegation 
to offshore teams.

 

 

Harish Karnala

United Software Group Inc. 

565 Metro Place South. Suite # 110 


Dublin, OH 43017 

 

Direct Number : +1 614-408 1549

Board Number : 614-495-9222 EXT. 622

Fax: 1-866-764-1148

 

karnal...@usgrpinc.com 

www.usgrpinc.com

 

 



Java/Hadoop Developer(Team Lead) || Alpharetta, GA

2018-08-16 Thread neha nityo
I have a contract-to-contract position, so if you have anyone available for
the following position, please send a suitable resume along with contact
details.
Kindly share suitable resumes ASAP to @neh...@nityo.com



Role : Java/Hadoop Developer(Team Lead)

Location : Alpharetta, GA

Duration: 12+ Months

-:Require Documents for submission :-

Visa Copy , Id Proof & I-94(Passport no)



 Job Description:-

Experience Required

Minimum 8-10 yrs. with a track record of team leadership in Hadoop and JAVA
development.



Roles & Responsibilities

Work as technical Lead for a Hadoop based application. The successful
candidate will have experience with initiating interactions with Agile
Development and Solution Architects, Business and Support (L2-L4)

Good communication skills and a proven track record for remediating issues
during deployment and team leadership are critical

Root cause analysis and coordination with multiple teams on daily basis



Generic Managerial Skills

Team Lead

Education

Degree in Computer Science or related discipline or equivalent work
experience

Fluent in English language – written and verbal







Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



Lead Hadoop Developer(9+yr) || Alpharetta, GA

2018-08-16 Thread neha nityo
I have a contract-to-contract position, so if you have anyone available for
the following position, please send a suitable resume along with contact
details.
Kindly share suitable resumes ASAP to @neh...@nityo.com



Role : Lead Hadoop Developer(9+yr)

Location : Alpharetta, GA

Duration: 12+ Months



Job Description:
Big Data Hadoop, Hive and Spark, with hands-on expertise in the design and
implementation of high-data-volume solutions

• Strong in Spark Scala, Hive, Cassandra and Spark pipelines

• Proficient in Spark architecture

• At least 1 year of experience in migrating MapReduce processes
to the Spark platform





Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



Sr. NET developer AND Hadoop developer positions form INFOSYS

2018-08-14 Thread recruiter logan
Hi Folks,

Hope you are doing good

Please go through the JD and share me the suitable profiles.

Please reach me at :

*Email: **sant...@humacinc.com* 

*Call :   +1 623-242-2594 *

*Required Passport Number/Copy for submission*

 *Note: No OPT, CPT, GC-EAD*

*__*

*Role: Sr. NET developer*

*Location :- Alpharetta GA*





*Must have:*

1.   VB scripting

2. MS Access

3. Experience: 8+ years





*Job Description –*

 *VB scripting/MS Access/Excel/scripting work. *This is mainly a Sr.
Developer role requiring a VB Script skill set; someone who knows MS Access
databases, can write Excel macros, and is very comfortable working with Excel
and scripting will be a good fit. Should be a self-starter; good
communication is essential, and the role is customer facing.





*__*







*Job Title: Hadoop developer *
* Location: Mason OH *

*Location: Hartford CT*
* Location: Madison WI*

*Location:  Richardson TX*





*Job Details: Must Have Skills (Top 3 technical skills only)*
1. Spark
2. Hadoop
3. Java

*Detailed Job Description:*
At least 7 years of experience with Big Data technologies. Spark skill is a must.
Able to handle a large team.

*Minimum years of experience*: 8+

*Top 3 responsibilities you would expect the Subcon to shoulder and
execute*:

1. Able to handle big data projects development, design and implementation
by using Spark

2. Must have good communication skills

3. Able to handle large team








-- 

*Best Regards,*

*Santosh Kumar*

*IT ANALYST *

*Mail:  **sant...@humacinc.com *

*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b
<http://www.linkedin.com/in/santosh-kumar-98b6ba15b/>*



Requirement for Java- Hadoop Developer in Connecticut or NY City.....

2018-08-14 Thread Vamshi Krishna
Hi,

If you are interested in and available for the job, please reply with your
latest resume and the other details required for submission to
vam...@techorbit.com



*Role: Java- Hadoop Developer*

*Location: Connecticut or NY City.*

*Duration:  1+year*



*NO OPT’s and GC EAD’s Please.*



1. ( C++ or Java or Scala or Python) and (Hadoop or SQL) and Good
Communication

2. Experience is design, development, testing and deployment

3. Good Problem solving skills.

4. 5+ years of experience



Candidate Name:
Tel No:
E-mail ID:
Skype ID:
Present location:
Last 4 Digit SSN:
Highest Degree of Education with Year of Passing:
Work Authorization & Validity:
LinkedIn ID:
Rate on C2C/W2:
DOB:
Onsite availability (post-selection):
Total onsite experience, working in US:
Overall relevant experience of candidate:









*Regards,*



*Vamshi*

*vam...@techorbit.com* 

*972-646-2158*


*1300 W Walnut Hill Ln. #260, Irving, TX 75038*

*www.techorbit.com*



Urgent need of a Hadoop Developer

2018-08-09 Thread Deepak Gulia
*Please send me the profiles at deepak.gu...@simplion.com
*



*Position : Hadoop Developer*

*Location : Reston VA*

*Duration : 6+ Months *





*Job Description for Hadoop Profiles*

· Bachelor's Degree in Computer Science or related field and 6
years’ experience building scalable e-commerce applications

· At least 2-3 years’ experience in big data methodologies
involving Hive/Hadoop/ Spark

· Work experience in Sqoop to import the data from RDBMS

· Should have work experience in Shell scripting

· Should have experience in Hive query optimization (a brief example follows this list)

· Should have knowledge in Hadoop Storage system

· Experience with Source Code Management Tools (Github).

· Should have knowledge in ETL tools

· Should have data lake experience.

· Knowledge of standard tools for optimizing and testing code

·  Ability to operate effectively and independently in a dynamic,
fluid environment

• Experience with Continuous Integration and related tools (e.g.
Jenkins, Hudson, Maven)
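
On the Hive query optimization point above, a common first step is partitioning large
tables and filtering on the partition column so only the needed partitions are scanned.
The sketch below is illustrative; the table and column names (orders_part, order_date,
region, amount) are placeholders, not details from this posting.

from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Partitioning by order_date lets Hive/Spark prune partitions instead of scanning the whole table.
spark.sql("""
    CREATE TABLE IF NOT EXISTS orders_part (order_id BIGINT, amount DOUBLE, region STRING)
    PARTITIONED BY (order_date STRING)
    STORED AS PARQUET
""")

# Filtering on the partition column means only the July 2018 partitions are read.
monthly = spark.sql("""
    SELECT region, SUM(amount) AS revenue
    FROM orders_part
    WHERE order_date BETWEEN '2018-07-01' AND '2018-07-31'
    GROUP BY region
""")
monthly.explain()   # the physical plan should show partition filters being applied
monthly.show()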



*Deepak Gulia *| Simplion – cloud*.* made simple

Fax: 408-935-8696 | Email: deepak.gu...@simplion.com

*GTALK **:-  **deepakgulia.rgtal...@gmail.com*


*LinkedIn* *https://in.linkedin.com/in/deepak-gulia-308a2b9b*
<https://in.linkedin.com/in/deepak-gulia-308a2b9b>



Big Data Architect – /// Hadoop Developer // ETL Testing --- Immediate interviews

2018-08-09 Thread recruiter logan
*Hi Folks,*
*Hope you are doing good. Please go through the JD and share the suitable profiles with me.*
*Please reach me at: Email: sant...@humacinc.com | Call: +1 623-242-2594*
*Required Passport Number/Copy for submission*
*Note: Any visa with valid passport number and genuine exp.*
*Please check the requirements available for the day *


*1.   Big Data Architect –  *
*Experience: 10+*
*Location: Hillsboro,OR*
*Note: NO OPT,CPT,GC-EAD*
*Must have skills:*

1. Expert-level design and development experience on Cloud
2. AWS
3. Hadoop
4. Spark
5. Scala
6. Sqoop
7. SQL
8. Python
_


*2.   Hadoop Developer*
*Experience:8+*
*Location: Phoenix,AZ*


* Note: OPT,CPT,GC, USC with valid passport number is acceptable  *

*Must Have Skills (Top 3 technical skills only) * *
1. Hadoop
2. Hive
3. Pig

Detailed Job Description:
Bachelors degree or foreign equivalent required. Will also consider one
year of relevant work experience in lieu of every year of education. At
least 4-5 years of experience in Big Data space.Strong Hadoop MAP REDUCE
Hive Pig SQOOPO OZIE MUST Candidate should have hands on experience with
Java, APIs, spring MUST Good exposure to columnar NoSQL DBs like HBase.
Complex High Volume High Velocity projects end to end delivery experience
Good experience with at least one of the scripting language l



*3. Job Title: ETL Testing *

*Work Location: Phoenix, AZ - 85054*

*Note: OPT,CPT,GC, USC with valid passport number is acceptable *

***2 position available to fill***



*Job Details:*

*Must Have Skills (Top 3 technical skills only) [MUST be YOUR W2 employee
and NO Third Party Candidates or C2C candidates]*

1. ETL Testing

2. Big Data

3. Java automation

*Nice to have skills*

Selenium

*Detailed Job Description:*

Good knowledge of Big Data and ETL testing, extensive ETL testing knowledge
with test automation experience using Java and Selenium. Should be able to
manage end to end testing along with client communication and onsite
offshore coordination.



-- 

*Best Regards,*

*Santosh Kumar*

*IT ANALYST *

*Mail:  **sant...@humacinc.com *

*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b
<http://www.linkedin.com/in/santosh-kumar-98b6ba15b/>*



Hadoop Developer(Spark Expert) || Onsite (Bradenton, FL)

2018-08-08 Thread neha nityo
Rate $60/hr on c2c || Please send me Genuine Resume @neh...@nityo.com



Let me know if you have any candidate available for the below job position
we have, Thanks!!

Kindly do share your updated resume for further process (neh...@nityo.com).



*Role : Hadoop Developer(Spark Expert)*

*Location : Onsite (Bradenton, FL)*

*Duration: 12+ Months*

*Implementation partner : TCS*



Mandatory Technical :-

o   Expertise working with large scale distributed systems (Hadoop, Spark).

o   Strong understanding of the big data cluster, and its architecture

o   Experience building and optimizing big data ETL pipelines (a brief sketch follows this list).

o   Advanced programming skills with Python, Java, Scala

o   Good knowledge of Spark internals and performance tuning of Spark jobs.

o   Strong SQL skills and is comfortable operating with relational data
models and structure.
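
To make the ETL-pipeline and tuning points above concrete, a minimal PySpark batch
pipeline might look like the sketch below. The input path, field names, and shuffle
partition setting are illustrative assumptions, not details from this posting.

from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("orders-etl-sketch")
         .config("spark.sql.shuffle.partitions", "200")   # tune to cluster size and data volume
         .getOrCreate())

# Extract: read raw JSON events (placeholder path).
raw = spark.read.json("hdfs:///data/raw/orders/2018-08-08/")

# Transform: basic cleansing plus a derived partition column.
clean = (raw
         .dropDuplicates(["order_id"])
         .filter(F.col("amount") > 0)
         .withColumn("order_date", F.to_date("order_ts")))

# Load: write partitioned Parquet so downstream Hive/Spark queries can prune partitions.
(clean
 .repartition("order_date")                 # fewer, larger files per partition
 .write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("hdfs:///data/curated/orders/"))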







Thanks & Regards,

Neha Gupta

Team Lead

Nityo Infotech Corp.

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

LinkedIN: www.linkedin.com/in/nehag6



Lead Hadoop Developer || Alpharetta, GA

2018-08-06 Thread neha nityo
Kindly do share your updated resume for further process (neh...@nityo.com).



Role : Lead Hadoop Developer

Location : Alpharetta, GA

Duration: 12+ Months



Job Description:
3+ years' experience in Big Data Hadoop, Hive and Spark, with hands-on
expertise in the design and implementation of high-data-volume solutions

• Strong in Spark Scala pipelines (both ETL & Streaming)

• Proficient in Spark architecture

• At least 1 year of experience migrating MapReduce processes
to the Spark platform (a minimal migration sketch follows below)
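
As an illustration of the MapReduce-to-Spark migration mentioned above, the classic
word-count job collapses from a mapper/reducer pair into a few lines of Spark code.
The sketch is in PySpark for brevity (this role emphasizes Scala), and the input and
output paths are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mapreduce-to-spark-sketch").getOrCreate()
sc = spark.sparkContext

# MapReduce word count rewritten: map phase -> flatMap/map, reduce phase -> reduceByKey.
counts = (sc.textFile("hdfs:///data/input/docs/")       # placeholder input path
            .flatMap(lambda line: line.split())
            .map(lambda word: (word, 1))
            .reduceByKey(lambda a, b: a + b))

counts.saveAsTextFile("hdfs:///data/output/word_counts")  # placeholder output path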





Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



URGENT REQUIREMENT FOR HADOOP DEVELOPER

2018-08-02 Thread VEER PARTAP
*Position: Hadoop Developer with Java, Kafka*
*Location: Johns Creek, GA*
*USC/GC/GENUINE H1B CANDIDATES ONLY*

*Job Description: *

   - *Hadoop, Hive, Impala, HBase and related technologies*
   - *Spark/Spark2*
   - *MPP, shared-nothing database systems, NoSQL systems*
   - *Object-Oriented and Functional Programming Experience*
   - *Data Warehousing design and concepts*
   - *Exposure to Infrastructure as Code (Ansible, Terraform)*
   - *Kafka*
   - *Java*
   - *Shell Scripting*
   - *SQL*

*===*


*Position:  Java Automation*

*Location: Johnscreek GA*

*Duration: 6+ Months*




*USC/GC/GENUINE H1B CANDIDATES ONLY*


* Job Description: *


   - *BDD – JBehave/Cucumber*
   - *REST services automation using Rest Assured API*
   - *SQL*
   - *UI automation using Selenium*

===

-- 
-- 
*Thanks & Regards,*
*Veer Partap*
*IT Recruiter*
*Direct: 732-645-2086*
*veerpartaphmgamer...@gmail.com *
*www.hmgamerica.com <http://www.hmgamerica.com/>*



URGENT REQUIREMENT FOR HADOOP DEVELOPER

2018-08-01 Thread VEER PARTAP
*Position: Hadoop Developer with Java, Kafka*

*Location: Johns Creek, GA*

* GC/US Citizens Only*


*Job Description is simple. Skills Required:*
1) Hadoop
2) Hive
3) Spark
4) Kafka
5) Java
6) Shell Scripting
7) SQL

-- 
-- 
*Thanks & Regards,*
*Veer Partap*
*IT Recruiter*
*Direct: 732-645-2086*
*veerpartaphmgamer...@gmail.com *
*www.hmgamerica.com <http://www.hmgamerica.com/>*



Require: Hadoop Developer,Location : Tampa,FL,Duration : Long Term,Rate :$55/hr

2018-07-27 Thread Amar

Role:Hadoop Developer
Location : Tampa,FL
Duration : Long Term
Rate :$55/hr

*For -- H1b, H4 EAD, GC EAD, L2 EAD --- > Need Visa copy and passport 
no. for the submission.*


*Responsibilities*

•Develop logical and physical data models for big data platforms.
•Write data pipelines using Apache Hive and Apache Spark / Hadoop (see the sketch after this list)
•Create solutions on AWS using services such as Lambda and API Gateway.
•Assist our team in building an Apache Spark infrastructure using 
existing business rules

•Transition legacy systems on-prem to AWS
•Learn our business domain and technology infrastructure quickly and 
share your knowledge freely and proactively with others in the team.

•Test prototypes and propose accepted methodologies for Big Data.
•Participate in daily standup meetings and all meetings of the agile 
SDLC: planning, estimation, retrospectives, demos, etc.
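
On the data-pipeline responsibility above, a minimal PySpark job that lands curated data
in an S3-backed data lake might look like the sketch below. The database, table, bucket
and path names are placeholders, and writing to s3:// assumes the cluster (e.g. EMR) is
configured with an S3 filesystem connector.

from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("claims-lake-sketch")
         .enableHiveSupport()
         .getOrCreate())

# Read a raw Hive table, apply a simple business rule, and derive a partition column.
raw = spark.table("raw_db.claims")                       # placeholder source table
curated = (raw
           .filter(F.col("status") == "APPROVED")
           .withColumn("claim_month", F.date_format("claim_date", "yyyy-MM")))

# Land curated Parquet in the data lake, partitioned for cheap pruning.
(curated.write
        .mode("overwrite")
        .partitionBy("claim_month")
        .parquet("s3://example-bucket/lake/claims/"))    # placeholder bucket/path

# Register the location so SQL engines can query it by name.
spark.sql("""
    CREATE TABLE IF NOT EXISTS claims_curated
    USING PARQUET
    LOCATION 's3://example-bucket/lake/claims/'
""")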


*Must Have*

•In depth understanding of big data processing/analysis as well as 
Apache Hadoop and Apache Spark ecosystem.

•Strong development skills in JAVA/Scala

*Key Qualifications*

•5+ years of hands-on experience with developing data warehouse 
solutions and data products.
•2-3 years of hands-on experience in modeling and designing schema for 
data lakes
•2+ years of hands-on experience developing a distributed data 
processing platform with Hadoop, Hive, Spark, etc.

•Practice working with, processing, and managing large data sets
•Exposure to test driven development and automated testing frameworks.
•Background in Scrum/Agile development methodologies.
•Capable of delivering on multiple competing priorities with little 
supervision.
•Excellent verbal and written communication skills as well as ability to 
work in a globally distributed team

•Knowledge of and contribution to Hadoop ecosystem

*Nice To Have*

•Experience building data product pipelines
•Familiarity with AWS technologies, particularly EMR, S3, Athena and 
RedShift
•Familiarity, better if hands-on, with tools/methods/framework for 
unit/system/functional testing of AWS components in Big Data/Spark ecosystem



*Thanks and Regards
Amarinder Singh (Sr. IT Associate)
Kalven Technologies Inc. 2300, E Higgins Rd, Suite 211, ELK Grove 
Village, IL-60007

1701, E.Wood Field Rd, Suite 300, Schaumburg, IL-60173
Phone :312-667-0211 | Email id : 
amarin...@kalventech.com | LinkedIn : 
Amar Singh | Skype id : Amarinderkalven**

**http://www.kalventech.com**
Product Engineering | Systems Integration | Professional Services.

**

*Note: Under Bill s.1618 Title III passed by the 105th U.S. Congress 
this mail cannot be considered Spam as long as we include contact 
information and a remove link for removal from our mailing list. To be 
removed from our mailing list reply with "remove" and include your 
"original email address/addresses" in the subject heading. Include 
complete address/addresses and/or domain to be removed. We will 
immediately update it accordingly. We apologize for the inconvenience 
caused.





Infosys requirements!!!!----- Data Engineer// UI (React JS)Developer // Hadoop developer

2018-07-23 Thread recruiter logan
*Hi Partner,*
*Hope you are doing good! Please go through the requirement mentioned below and share the suitable profiles.*
*Mail: sant...@humacinc.com | Feel free to call +1 623-242-2594*
*Required Passport Number/Copy for submission*

___

*Position: Data Engineer*
*Location: Boston, Massachusetts  *
*Client: NIKE *
*Duration: 1 year*
*VISA: H1B, H4-AED, GC,USC*



*Required Skills:*


8-10 years of total experience; can go for higher-experience candidates only if
they are interested in working in a junior, hands-on developer/engineer role.

5+ years of development experience in data management/integration/BI tools
such as Informatica, Business Objects/Tableau etc.,

Role will take direction from Tech Lead and/or Data Architect and will help
in developing the data management solutions.

Preferred to have knowledge in the data modeling, ETL patterns and/or
standards in the data management space.




*__*

*Role:UI (React JS)Developer*
*Location:Phoenix,AZ*

*Client: AMEX *
*Contract:long term *
*VISA: OPT-EAD*

*Skills:*
React JS
Angular JS
HTML5,CSS3

*Detailed Description:*
Need a strong UI/Front end Developer for client Amex at Phoenix.



*Job Title: Hadoop developer*
*Location: Phoenix AZ*
*Client: AMEX*
*Contract: long term *

* VISA: OPT-EAD *



*Job Details:*

*Must Have Skills (Top 3 technical skills only):*

1. Hadoop ecosystem,

2. Java full stack

3. Hive, Pig, Spark, Python, Kafka, MapR and JavaSpring (advanced)



*Detailed Job Description:*

Candidate should have Big Data / Hadoop hands-on experience. Proficient in
Hive, Pig, Spark, Python, Kafka, MapR and Java Spring (advanced)


*Top 3 responsibilities you would expect the Subcon to shoulder and
execute: *

1. Should engage with client to understand business requirement

2. Convert the requirement into design and work products

3. Engage with all stakeholders in necessary technical and business
communications





-- 

*Best Regards,*

*Santosh Kumar*

*IT ANALYST *

*Mail:  **sant...@humacinc.com *

*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b
<http://www.linkedin.com/in/santosh-kumar-98b6ba15b/>*



Urgent requirements for OPT consultants-- UI developer// Hadoop developer for AMEX

2018-07-19 Thread recruiter logan
 *Hi Partner,*

*Hope you are doing good!*
*Please go through the requirement mentioned below and share the suitable
profiles.*

*Mail:  sant...@humacinc.com  *

*Feel free to call  +1 623-242-2594   *

*Required Passport Number/Copy for submission*
*ONLY: OPT required  *


*__*

*Role:UI (React JS)Developer*
*Location:Phoenix,AZ*

*Client: AMEX *
*Contract:long term *

*Skills:*
React JS
Angular JS
HTML5,CSS3

*Detailed Description:*
Need a strong UI/Front end Developer for client Amex at Phoenix.



*Job Title: Hadoop developer*
*Location: Phoenix AZ*
*Client: AMEX*
*Contract: long term *



*Job Details:*

*Must Have Skills (Top 3 technical skills only):*

1. Hadoop ecosystem,

2. Java full stack

3. Hive, Pig, Spark, Python, Kafka, MapR and JavaSpring (advanced)



*Detailed Job Description:*

Candidate should have Big Data / Hadoop hands-on experience. Proficient in
Hive, Pig, Spark, Python, Kafka, MapR and Java Spring (advanced)


*Top 3 responsibilities you would expect the Subcon to shoulder and
execute: *

1. Should engage with client to understand business requirement

2. Convert the requirement into design and work products

3. Engage with all stakeholders in necessary technical and business
communications

-- 


*Thanks & Regards...*




*Santosh*

*IT Analyst, Humac Inc.*
*2730 W Agua Fria, Freeway, Suite#204*
*Phoenix, AZ 85027*

*Search: www.humacinc.com
<http://www.humacinc.com/> E: sant...@humacinc.com *

*Ph: +1 623-242-2594   Hangouts: recruiter.lo...@gmail.com
*

*Linkedin: https://www.linkedin.com/in/santosh-kumar-98b6ba15b/
<https://www.linkedin.com/in/santosh-kumar-98b6ba15b/>*



Immediate Need Hadoop Developer & Architect @ Philadelphia, PA 19103

2018-07-17 Thread Alok RMG
Please send me resume ASAP at *a...@lorventech.com*






*Hadoop Developer & Architect*
*6-12 Months*
*Philadelphia, PA 19103*
*Face to face after telephonic/Skype Round*
*Job Description:*

*For Architect:*
The Big Data Software Developer will develop (code/program), test, and debug
ETL (Extract/Transform/Load) of data to answer technically challenging
business requirements
(complex transformations, high data volume).

*For Developer:*
1.   Big Data development experience: candidates should have 3-5 years
of professional big data development experience using the Spark Streaming API,
Scala, Elasticsearch, MongoDB, Kinesis streams, and AWS
2.   3+ years working in Java and the Spring framework; should have the
interest in and be comfortable with server-side coding


Best Regards,
Alok
Lorven Technologies, Inc
Tel:609-799-4202 X 220
E-Mail: a...@lorventech.com

*WHATSAPP GROUP - RMG -US STAFFING =>*

https://chat.whatsapp.com/4wVkj3EPK8G8hASfTlsOvL



*TELEGRAM GROUP - RMG - US STAFFING =>*
https://t.me/joinchat/H-TDDRA9UdP6RQGFPIZWCQ

*NOTE:* This mail has been sent through Google Groups. If you do not want
to receive emails in future then please remove your Email ID from that
specific Google Group.

*Disclaimer:*
This is not an unsolicited mail. If you have received this message by
mistake or are not interested in receiving our e-mails, please reply with a
"REMOVE" in the subject line and delete this message from your system. We
are sorry for the inconvenience caused to you.



Urgent requirements!!!!!!! Data Analyst / Business system Analyst // Hadoop developer

2018-07-17 Thread recruiter logan
*Hi Folks,*
*Please go through the job description mentioned below and share the suitable profiles with me.*
*Please reach me at: Email: sant...@humacinc.com | Call: +1 623-242-2594*
*Required Passport Number/Copy for submission*
*Note: Any VISA with passport number is acceptable*
___

*Role: Data Analyst / Business system Analyst *
*Location: Hilisboro,OR*
*Client: Nike*

*Must have skills: *
1. DATA ACQUISITION
2. BUSINESS REQUIREMENTS
3. DATA ANALYSIS
4. DATA MANAGEMENT

*Job Description:*

Positions for DATA ACQUISITION: candidates who can perform BUSINESS REQUIREMENTS work, with
Subcon experience in DATA ANALYSIS, DATA MANAGEMENT and BSA. Experience
required: 8+


___






*Job Title: Hadoop developer*
*Location: Phoenix AZ*
*Client: AMEX*

*Job Details:*
*Must Have Skills (Top 3 technical skills only):*
1. Hadoop ecosystem
2. Java full stack
3. Hive, Pig, Spark, Python, Kafka, MapR and Java Spring (advanced)

*Detailed Job Description:*
Candidate should have Big Data / Hadoop hands-on experience. Proficient in Hive, Pig,
Spark, Python, Kafka, MapR and Java Spring (advanced).

*Minimum years of experience: 8+*

*Top 3 responsibilities you would expect the Subcon to shoulder and execute:*
1. Should engage with client to understand business requirement
2. Convert the requirement into design and work products
3. Engage with all stakeholders in necessary technical and business communications


-- 

*Best Regards,*

*Santosh Kumar*

*IT ANALYST *

*Mail:  **sant...@humacinc.com *

*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b
*



*** Hadoop Developer @ Hillsboro, OR/Portland OR

2018-07-16 Thread Dhiraj Kumar
Please share profiles with me --  dhiraj.ku...@q1tech.com




*Note: *

   1. *I Need passport number for H1B Consultant *
   2. *No OPT and Transfer consultant for this position*



*Role: Hadoop Developer*

*Job Location: Hillsboro, OR/Portland OR*

*Duration: Long Term Contract*

*Interview: Telephonic/Skype*



*Must Have:*

   - *Hadoop, Spark, Hive, Python, RESTful APIs, NoSQL technologies such
   as HBase, DynamoDB, Cassandra, Apache Spark, Flink, Kafka Streams*
  - Design and implement distributed data processing pipelines using
  Spark, Hive, Python, and other tools and languages prevalent in
the Hadoop
  ecosystem. Ability to design and implement end to end solution.
  - Experience publishing RESTful API’s to enable real-time data
  consumption using OpenAPI specifications
  - Experience with open source NOSQL technologies such as HBase,
  DynamoDB, Cassandra
  - Familiar with distributed stream processing frameworks for fast &
  big data, like Apache Spark, Flink, Kafka Streams (a small streaming sketch follows this list)
  - Build utilities, user defined functions, and frameworks to better
  enable data flow patterns.
  - Work with architecture/engineering leads and other teams to ensure
  quality solutions are implements, and engineering best practices are
  defined and adhered to.
  - Experience in Business Rule management systems like Drools
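
To make the streaming item above concrete, a minimal Spark Structured Streaming job
reading from Kafka might look like the sketch below. The broker address and topic name
are placeholders, and it assumes the spark-sql-kafka connector is on the classpath
(e.g. via --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0).

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Read a Kafka topic as a streaming DataFrame (placeholder broker and topic).
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "clickstream")
          .load())

# Kafka values arrive as bytes; cast to string and count events per 1-minute window.
counts = (events
          .selectExpr("CAST(value AS STRING) AS value", "timestamp")
          .groupBy(F.window("timestamp", "1 minute"))
          .count())

# Print running counts to the console; a real pipeline would write to a durable sink instead.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()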



Regards,



*Dhiraj Kumar*

*Account Manager, Resourcing*

*Q1 Tech.*

*Desk:* 630-536-8202 *Ext:* 5517

*Mailto: *dhiraj.ku...@q1tech.com 



Hadoop developer//Phoenix AZ @Client: AMEX &&& Hilsboro, OR @Client: Nike

2018-07-16 Thread recruiter logan
*Hi Partner,*
*Hope you are doing good! Please go through the requirement
mentioned below and share the suitable profiles.*

*Mail:  sant...@humacinc.com  *

*Feel free to call  +1 623-242-2594   *

*Required Passport Number/Copy for submission*

*Note: Any VISA with passport number are acceptable *


*Multiple positions for Hadoop developer *



___


*Job Title: Hadoop developer*
*Location: Phoenix AZ*
*Client: AMEX*
*Location: Hilsboro, OR*
*Client: Nike*

*Job Details:*
*Must Have Skills (Top 3 technical skills only):*
1. Hadoop ecosystem,
2. Java full stack
3. Hive, Pig, Spark, Python, Kafka, MapR and JavaSpring (advanced)

*Detailed Job Description:*
Candidate should have Big Data / Hadoop hands-on experience. Proficient in
Hive, Pig, Spark, Python, Kafka, MapR and Java Spring (advanced)

*Minimum years of experience: 8+*

Top 3 responsibilities you would expect the Subcon to shoulder and execute:
1. Should engage with client to understand business requirement
2. Convert the requirement into design and work products
3. Engage with all stakeholders in necessary technical and business
communications


-- 

*Best Regards,*

*Santosh Kumar*

*IT ANALYST *

*Mail:  **sant...@humacinc.com *

*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b
<http://www.linkedin.com/in/santosh-kumar-98b6ba15b/>*



Urgent One || Hadoop Developer || Austin, TX and Sunnyvale, CA

2018-07-06 Thread neha nityo
!! Genuine OPT Will Work (not below 1990 date of birth) !! Must Need :
Passport Copy, Number !! Client : TCS/ Apple



Let me know if you have any candidate available for the below job position
we have, Thanks!!

Kindly do share your updated resume for further process (neh...@nityo.com).



Role :  Hadoop Developer

Location : Austin, TX and Sunnyvale, CA

Duration: 12+ Months

Skype interview for next 3 days



Job Description:
3+ years’ experience in Big Data (Hadoop, Hive and Spark) with hands-on
expertise in the design and implementation of high-data-volume solutions

• Strong in Spark Scala pipelines (both ETL & Streaming); a brief batch-ETL
sketch follows this list

• Proficient in Spark architecture

• At least 1 year of experience migrating MapReduce processes
to the Spark platform

• 3 years of experience in design and implementation using Hadoop and
Hive

• Should be able to optimize and performance-tune Hive queries

• Experience in one coding language is a must - Java/Python

• Worked on designing ETL & Streaming pipelines in Spark Scala.
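
As a rough sketch of the "Spark Scala pipelines (ETL)" requirement above, a
minimal batch pipeline might read raw data, transform it, and write curated
output as below; the paths and schema are hypothetical, not from this posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, sum, to_date}

    object BatchEtlSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("batch-etl-sketch").getOrCreate()

        // Extract: hypothetical raw JSON landed on HDFS
        val raw = spark.read.json("hdfs:///data/raw/transactions/")

        // Transform: type the date, drop bad rows, aggregate per customer per day
        val daily = raw
          .withColumn("txn_date", to_date(col("txn_ts")))
          .filter(col("amount") > 0)
          .groupBy(col("customer_id"), col("txn_date"))
          .agg(sum("amount").as("daily_amount"))

        // Load: write partitioned Parquet for downstream Hive/Spark access
        daily.write.mode("overwrite")
          .partitionBy("txn_date")
          .parquet("hdfs:///data/curated/daily_spend/")

        spark.stop()
      }
    }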





Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



Urgent need of a Hadoop Developer

2018-07-05 Thread Deepak Gulia
*Please send me the profiles at deepak.gu...@simplion.com
*


*Position : Hadoop Developer *

*Location : San Jose, CA*

*Duration : Long Term *



   - Build back-end ETL components and solutions using Hive and Spark SQL
   (a brief sketch follows this list).
   - Working proficiency and expertise in Hive is required.
   - Working experience with, and understanding of, Hadoop architecture.
   - Strong experience with data warehousing concepts and standards.
   - Work with the team to build, manage, optimize and customize ETL
   products and solutions, applying best practices.
   - Proficiency in SQL
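
A hedged sketch of a back-end ETL step "using Hive and Spark SQL" as described
in the first item above; the database and table names are assumptions, not
taken from this posting.

    import org.apache.spark.sql.SparkSession

    object HiveSparkSqlSketch {
      def main(args: Array[String]): Unit = {
        // enableHiveSupport lets Spark SQL read and write Hive metastore tables
        val spark = SparkSession.builder()
          .appName("hive-sparksql-sketch")
          .enableHiveSupport()
          .getOrCreate()

        // Needed for the dynamic-partition insert below
        spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

        // Hypothetical staging and warehouse tables
        spark.sql(
          """
            |INSERT OVERWRITE TABLE dw.orders_daily PARTITION (order_date)
            |SELECT customer_id,
            |       SUM(order_total) AS total_spend,
            |       order_date
            |FROM   staging.orders
            |WHERE  order_status = 'COMPLETE'
            |GROUP  BY customer_id, order_date
          """.stripMargin)

        spark.stop()
      }
    }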





*Deepak Gulia *| Simplion – cloud*.* made simple

Direct: 408-717-4886 | Fax: 408-935-8696 | Email: deepak.gu...@simplion.com

*GTALK **:-  **deepakgulia.rgtal...@gmail.com*


*LinkedIn* *https://in.linkedin.com/in/deepak-gulia-308a2b9b*
<https://in.linkedin.com/in/deepak-gulia-308a2b9b>



IMMEDIATE NEED || Hadoop Developer @ AZ with AMEX

2018-06-28 Thread sunil kumar kalwala
*$$$  Need Visa copy, DL copy and Passport Number $$$ *
Dear Professional,
Hope you are doing great

Please share the consultants' profiles to *sunilhumaci...@gmail.com
* or reach me on *623-399-4930*

*Role : Hadoop Developer*
*Location : Phoenix, AZ*
*Duration : 6+ months*
*Client : AMEX*

Job Description  :
Must Have Skills
BigData (Hive, Pig) + Scripting (UNIX, Python) + Core Java (Java/J2EE,
Spring, Microservices/APIs).
BigData (Hive, Pig, MapR) + Spark
Minimum years of experience : 8+

*-- *
*K Sunil *
*US IT Recruiter,*
*Humac, Inc.*
*sunilhumaci...@gmail.com . *
*W:  623-399-4930,*
*2730 W Agua Fria Freeway, Suite# 204*
*Phoenix, AZ 85027.*



Spark/Hadoop Developer || Sunnyvale, CA

2018-06-26 Thread neha nityo
Rate $60/hr on c2c//100% interview !!



!! Genuine OPT Will Work (not below 1992 date of birth) !! Must Need :
Passport Copy, Number /Highest Education Document !! Client : TCS/ Apple



Let me know if you have any candidate available for the below job position
we have, Thanks!!

Kindly do share your updated resume for further process (neh...@nityo.com).



Role :  Hadoop Developer/Spark Developer

Location : Sunnyvale, CA

Duration: 12+ Months



Job Description:
3+ years’ experience in Big Data (Hadoop, Hive and Spark) with hands-on
expertise in the design and implementation of high-data-volume solutions

• Strong in Spark Scala pipelines (both ETL & Streaming)

• Proficient in Spark architecture

• At least 1 year of experience migrating MapReduce processes
to the Spark platform (a brief migration sketch follows this list)

• 3 years of experience in design and implementation using Hadoop and
Hive

• Should be able to optimize and performance-tune Hive queries

• Experience in one coding language is a must - Java/Python

• Worked on designing ETL & Streaming pipelines in Spark Scala.

• Good experience in requirements gathering, design &
development

• Working with cross-functional teams to meet strategic goals

• Experience in high-volume data environments

• Critical thinking and excellent verbal and written
communication skills

• Strong problem-solving and analytical abilities

• Good knowledge of data warehousing concepts
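
For the "migration of MapReduce processes to the Spark platform" item above, a
classic word-count MapReduce job collapses to a few lines of Spark Scala; the
input and output paths below are hypothetical.

    import org.apache.spark.sql.SparkSession

    object WordCountMigrationSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("wordcount-migration").getOrCreate()
        val sc = spark.sparkContext

        // The mapper's tokenize-and-emit and the reducer's sum become two RDD steps
        val counts = sc.textFile("hdfs:///data/input/")
          .flatMap(_.split("\\s+"))
          .filter(_.nonEmpty)
          .map(word => (word, 1))
          .reduceByKey(_ + _)

        counts.saveAsTextFile("hdfs:///data/output/wordcount")
        spark.stop()
      }
    }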





Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



Hadoop Developer with KAFKA//Bigdata-AWS //Hadoop - CLOUDERA CERTIFIED ADMIN.

2018-06-25 Thread recruiter logan
 *Hi Partner,*

*Hope you are doing good.*

*Tried reaching you regarding the multiple positions available below.*
*Please go through the JD and share the suitable profiles.*
*Feel free to call me: 623-242-2594*
*Share resumes to **sant...@humacinc.com *

*No OPT/CPT*
*Required: Passport number for submission*
*LinkedIn is mandatory*

___


*Role : Hadoop Developer with KAFKA *
*Location : Seattle, WA*
*Duration : 8+ months*
*Visa status : H1B, GC, USC*
*Job Description  :*
*Must Have Skills (Top 3 technical skills only):*
1. Spark
2. Kafka
3. Java
*Detailed Job Description:*
Should have worked on Apache Spark and/or Kafka.
He/she should be able to contribute to fast data platform development using
Spark Streaming.
Should also have 5 years of hands-on experience in core Java.
*Minimum years of experience: 9+*
*Top 3 responsibilities you would expect the Subcon to shoulder and
execute: *
1. Should have worked on Apache Spark and/or Kafka
2. He/she should be able to contribute to fast data platform development
using Spark Streaming (a brief streaming sketch follows this list)
3. Should also have 5 years of hands-on experience in core Java.
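
For illustration of item 2 above, a minimal Spark Structured Streaming sketch
in Scala that consumes from Kafka; broker addresses, topic name, and paths are
hypothetical, and the spark-sql-kafka connector is assumed to be on the
classpath.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object KafkaStreamingSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("kafka-streaming-sketch").getOrCreate()

        // Hypothetical Kafka brokers and topic
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
          .option("subscribe", "clickstream")
          .load()
          .select(col("key").cast("string"), col("value").cast("string"))

        // Land the raw events as Parquet; a checkpoint location is mandatory
        val query = events.writeStream
          .format("parquet")
          .option("path", "hdfs:///data/stream/clickstream/")
          .option("checkpointLocation", "hdfs:///checkpoints/clickstream/")
          .start()

        query.awaitTermination()
      }
    }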


*Role : Bigdata-AWS*
*Location : Seattle, WA*
*Duration : 8+ months*
*Visa status : H1B, GC, USC*
*Job Description  :*
*Must Have Skills : *
1) AWS certified

1) 5+ years of experience in Big Data: Hadoop, Spark, Hive, Pig and MapReduce
2) Strong understanding of distributed systems and high availability
3) Ability to provide solutions with the latest big data technologies
4) Experience in implementing big data projects for more than 5 years
5) Experience in AWS or GCP for big data projects
6) Expertise in implementing data security in a big data platform
7) Experience in Scala, Python or Java programming, or any scripting
language
8) Knowledge and experience in stream analytics
9) Very strong understanding of multiple file formats like Parquet, Avro,
etc., and which is best suited to which scenario (a short sketch follows
this list)
10) Strong understanding of compression techniques and how they are
used in Hadoop
11) Experience in performance tuning and benchmarking exercises
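
As a short sketch of items 9 and 10 above (file formats and compression), the
snippet below writes the same data as snappy-compressed Parquet and as Avro;
bucket names and paths are hypothetical, and the spark-avro module is assumed
to be on the classpath.

    import org.apache.spark.sql.SparkSession

    object FileFormatSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("file-format-sketch").getOrCreate()

        // Hypothetical CSV source landed on S3
        val raw = spark.read
          .option("header", "true")
          .csv("s3a://example-bucket/raw/sales/")

        // Columnar Parquet with snappy compression suits analytic scans
        raw.write
          .mode("overwrite")
          .option("compression", "snappy")
          .parquet("s3a://example-bucket/curated/sales_parquet/")

        // Row-oriented Avro suits record-at-a-time interchange
        raw.write
          .mode("overwrite")
          .format("avro")
          .save("s3a://example-bucket/curated/sales_avro/")

        spark.stop()
      }
    }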


_
*Role : Hadoop - CLOUDERA CERTIFIED ADMIN *
*Location : Seattle, WA*
*Duration : 8+ months*
*Visa status : H1B, GC, USC*

Job Details:
Must Have Skills
1. *Cloudera certified Admin *
2. Upgrades
3. Security Implementation

Nice to have skills
1. UNIX & Linux

Detailed Job Description:
You will interface with key stakeholders and apply your technical
proficiency across different stages of the Hadoop Administration including
Requirements Elicitation, Cluster Architecture definition, Design
Development, Testing, Implementation, warranty and transition. You will
play an important role in creation of design artefacts; deliver high
quality deliverables, support activities related to implementation and
transition; interface with internal team and key stakeholders; analyse and
resolve issues to ensure high quality deliverables at each stage of Hadoop
Admin Life Cycle. You will also deliver high quality deliverables for each
module, lead validation for all types of testing and support activities
related to implementation, transition and warranty. You will be part of a
learning culture, where teamwork and collaboration are encouraged,
excellence is rewarded, and diversity is respected and valued.

Required

Hadoop Big Data Lead (Hadoop Administration):
* 8+ years of strong experience and knowledge in Hadoop Administration using
 the Cloudera distribution.
* Strong hands-on experience with Linux and Hadoop commands.
* Strong knowledge of copying data from cluster to cluster.
* Strong knowledge of security implementation such as Kerberos and Sentry.
* Strong hands-on experience with major and minor upgrades of CM and CDH.
* Good exposure to performance tuning of Hadoop services like Impala, Hive,
YARN and Spark.
* Strong knowledge of dynamic resource pooling.
* Good client communication skills.
* Good knowledge of producing platform standards & best-practices
documents.
* Good domain knowledge of the retail industry.

Preferred
* Exposure to Hadoop Administration.
* Exposure to Linux and Hadoop environment

Personal
Besides the professional qualifications of the candidates, we place great
importance on various aspects of the personality profile. These include:
* High analytical skills
* A high degree of initiative and flexibility
* High customer orientation
* High quality awareness
* Excellent verbal and written communication skills


-- 

*Best Regards,*

*Santosh Kumar*

*IT ANALYST *

*Mail:  **sant...@humacinc.com *

*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b
<http://www.linkedin.com/in/santosh-kumar-98b6ba15b/>*


Re: Hadoop Developer # Eagan, MN - 12 Months

2018-06-22 Thread Manikanta Bhattiprolu
*HOT-LIST:*

Dear Professional,



Greetings from Vintech Solutions Inc,

I am writing this email to introduce myself, *MANIKANTA*, as a point of
contact from *Vintech Solutions Inc.* I am glad to reach out to you today and
am wondering if you can help me in placing my candidates in your contract
roles.



I have excellent potential candidates for your prestigious clients'
requirements, and I'm happy to associate with you. I look forward to
working with you on daily requirements, submissions and interview
follow-ups.



Currently, I am working on the below technologies:



· Java/J2EE Developers

· Full Stack Developers

· Hadoop/Spark/Scala/Big Data Developers

· UI Developers

· AEM Developer

· Android Developer

· Data Engineer



*If you need additional information, please feel free to call me at
314-989-9000 x734 / email: *manika...@vintech.com  *



*Thanks & Regards,*

*Naga Manikanta* | *IT Recruiter*

Vintech Solutions, Inc | *Sales Team*

ERP & IT SERVICES | Consulting - Development - Staffing

Email: manika...@vintech.com | O: 314-989-9000 X 734 | F: 314-989-9009

   <https://www.linkedin.com/in/mani-b-468a3313b/>
<https://www.facebook.com/Vintechsolutions> <http://www.vintech.com/>




Hadoop Developer # Eagan, MN - 12 Months

2018-06-22 Thread roheeth . ta


Dear *Partner,*


Greetings of the day!

 

Please find the below requirement. If you are interested send me resume at 
*roh...@intellisofttech.com 
 or 972-756-1212 Ext: 128* 

 

*Job Title: Hadoop Developer*

*Position Type: 12 months Contract*

*City: Eagan*

*State: MN*

 

*Required Skills :*

 

• Scala experience would be great if possible
• Bug fixes and admin support, but leaning more on the development side
• Support role - needing ETL, Datastage, Hadoop, Cloudera, Scala
• Data warehouse knowledge
• Experience using applications and troubleshooting
• Excellent communication skills, as this individual would be training others
• Opportunity for the individual to shape the role

 

*Job Description :*

 

Position responsible for operational support of the Hadoop cluster used at 
Client. Technologies include Cloudera Navigator, Cloudera Manager, HDFS, 
Spark, MapReduce, Hive, Pig, HBase, YARN, Flume, Sqoop, HiveQL, Scala, 
Java and REST API technologies.  The candidate will support these 
technologies under the direction of the infrastructure operation team.

The candidate will work closely with the infrastructure engineering build 
team.

 

*Responsibilities:*

 

· Install, upgrade, configure, and apply patches for Cloudera 
Manager

· Setup of Hadoop cluster and maintenance support of the cluster

· Keeping track of Hadoop cluster connectivity and security

· Capacity planning and monitoring of Hadoop cluster job 
performances

· HDFS maintenance and support

· Resource manager configuration and trouble shooting

· Setting up Hadoop users

· Testing HDFS, Hive, Pig, and MapReduce access for the new users

· Backup, recovery and maintenance

· Consult with business users to manage tasks, incidents and focus 
on 

· Create/Update and manage reports for metrics and performance

· Work with Shell Scripts, Python Scripts, and Ansible

· Prepare file system / mount points

· Install required services such as LDAP, DNS, etc.

· Collaborate with engineering, development, and operation teams to 
troubleshoot and resolve their issues

· Manage Hadoop jobs using scheduler

· Cluster coordination Services

· Point of contact for vendor escalation

· Executes and provides feedback for operational policies, 
procedure, processes, and standards

· Automate manual tasks

· Develop infrastructure documents

· Troubleshoot production problems within assigned software 
applications.

 

*Other duties as assigned*

*Technical Qualifications:*

· Administrator background – Hadoop cluster administration, 
Install, and upgrades

· Experience with Cloudera software support and Cloudera Manager

· HDFS file system experience

· Experience with disaster recovery and business continuity 
practice with Hadoop clusters

· Minimum 3 years broad-based experience with Cloudera and Hadoop 
technologies

· Scripting skills in Linux environment

· Hands on experience in Cloudera, Hadoop cluster, HCatalog, and 
Hive

· Hadoop authentication protocol (Kerberos) knowledge

· Development skills in Java, REST API, Python, Scala

· Experience with the design of data integration using ETL or ELT 
patterns.

· MapReduce, Pig, Hive, Spark, Hadoop

 

 

Best Regards,

*Rohith *

*Office:* *(**972)-756-1212* *Ext**. 128* | *Email:* 
roh...@intellisofttech.com

*Intellisoft Technologies Inc* <http://www.intellisofttech.com/> – 1320 
Greenway Drive, #460. Irving, TX 75038



Urgent Requirement for Big data and Hadoop Developer only US and GC

2018-06-07 Thread harirecruiter21


Hi,

Hope you are doing well !!



This is Harish from USG. 
I have an urgent requirement for a Big Data and Hadoop Developer. I hope your 
resume fits the below position. Kindly let me know your interest and 
also share your updated resume. Please feel free to reach me any time.






*Role*: Bigdata and Hadoop Developer  

*Location*: New York, NY

*Duration*: 18 months 

*Experience*:4-6 Years


 


*JD*

 

Competencies Digital: BigData and Hadoop Ecosystems

   

*Essential Skills*

• Hadoop; strong PL/SQL knowledge and development experience

• Experience working as a team lead, which involves teamwork, coordination with 
onshore/offshore teams, requirement-gathering documentation for SRS and BRD, 
testing, and status reporting.  



*Desirable Skills  *

• Any other experience with reporting technologies or tools will be an added 
advantage.

 

 

Harish Karnala

United Software Group Inc.. 

565 Metro Place South. Suite # 110 

Dublin, OH 43017 
 

Direct Number : +1 614-408 1549

Board Number : 614-495-9222 EXT. 622

Fax: 1-866-764-1148

 

karnal...@usgrpinc.com 

www.usgrpinc.com

 



Hadoop Developer - new contract role at Sunnyvale, CA

2018-06-06 Thread RAVI KRISHNA
Hi Folks,



Hope you are doing good!!!



Please go through the below requirement and let me know whether you have
any resource.



You can reach me at 847-440-2436 x 365.



*Hadoop Developer*

*Sunnyvale, CA*

*6+ Months Contract*



*Job Description:*

Candidate should have an overall 5+ years of total IT experience, in which

   - the candidate should have 2-3 years of Spark + Scala experience in a big
   data environment,
   - should have good oral and written communication skills,
   - should have a good confidence level and should be customer facing





Best regards,

Nihal Singh | Sr. Technical Recruiter








*DeegitTM Inc* | Technology Consulting



1900 E Golf Rd., Suite 925 | Schaumburg, IL 60173



*Phone* 847-440-2436 ext:365



*Email* ni...@deegit.com



*Skype* nihalsinghaarakash


www.deegit.com

<https://in.linkedin.com/in/nihal-singh-32854133>

The information transmitted is intended only for the person or entity to
which it is addressed and may contain confidential and/or privileged
material. Any review, retransmission, dissemination or other use of, or
taking of any action in reliance upon this information by persons or
entities other than the intended recipient is prohibited. If you received
this in error, please contact the sender and delete the material from any
computer.



Spark/Hadoop Developer || Sunnyvale, CA

2018-06-06 Thread neha nityo
Greetings!!!

Please find an urgent requirement on W2 Contract & please respond ASAP with
your profile @ neh...@nityo.com

*Rate $60-65/hr on c2c*

*Interview date Saturday (9june)*



Role :  Hadoop Developer/Spark Developer

Location : Sunnyvale, CA

Duration: 12+ Months

Skype Call



Job Description:
3+ years’ experience in Big Data (Hadoop, Hive and Spark) with hands-on
expertise in the design and implementation of high-data-volume solutions

• Strong in Spark Scala pipelines (both ETL & Streaming)

• Proficient in Spark architecture

• At least 1 year of experience migrating MapReduce processes
to the Spark platform

• 3 years of experience in design and implementation using Hadoop and
Hive

• Should be able to optimize and performance-tune Hive queries (a brief
partition-pruning sketch follows this list)

• Experience in one coding language is a must - Java/Python

• Worked on designing ETL & Streaming pipelines in Spark Scala.

• Good experience in requirements gathering, design &
development

• Working with cross-functional teams to meet strategic goals

• Experience in high-volume data environments

• Critical thinking and excellent verbal and written
communication skills

• Strong problem-solving and analytical abilities

• Good knowledge of data warehousing concepts
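
For the Hive performance-tuning item above, one common technique is to
partition the table on the filter column so that queries prune partitions
instead of scanning everything. A hedged Spark SQL sketch, with hypothetical
database, table, and column names:

    import org.apache.spark.sql.SparkSession

    object HiveTuningSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-tuning-sketch")
          .enableHiveSupport()
          .getOrCreate()

        // Store the fact table partitioned by date, in Parquet
        spark.sql(
          """
            |CREATE TABLE IF NOT EXISTS dw.page_views_by_day (
            |  user_id STRING,
            |  url     STRING
            |)
            |PARTITIONED BY (view_date DATE)
            |STORED AS PARQUET
          """.stripMargin)

        // The filter on the partition column lets only one partition be read;
        // EXPLAIN shows the pruned plan.
        spark.sql(
          """
            |EXPLAIN
            |SELECT url, COUNT(*) AS views
            |FROM dw.page_views_by_day
            |WHERE view_date = '2018-06-01'
            |GROUP BY url
          """.stripMargin).show(truncate = false)

        spark.stop()
      }
    }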





Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



Hadoop Developer/Spark Developer || Sunnyvale, CA || Interview date Saturday (9june)

2018-06-06 Thread neha nityo
Greeting !!!

Please find an urgent requirement on W2 Contract & please respond ASAP with
your profile @ neh...@nityo.com



Role :  Hadoop Developer/Spark Developer

Location : Sunnyvale, CA

Duration: 12+ Months

*Only local profiles needed in Sunnyvale, CA || F2F interview; interview date
Saturday (9 June)*





Job Description:
3+ years’ experience in Big Data (Hadoop, Hive and Spark) with hands-on
expertise in the design and implementation of high-data-volume solutions

• Strong in Spark Scala pipelines (both ETL & Streaming)

• Proficient in Spark architecture

• At least 1 year of experience migrating MapReduce processes
to the Spark platform

• 3 years of experience in design and implementation using Hadoop and
Hive

• Should be able to optimize and performance-tune Hive queries

• Experience in one coding language is a must - Java/Python

• Worked on designing ETL & Streaming pipelines in Spark Scala.

• Good experience in requirements gathering, design &
development

• Working with cross-functional teams to meet strategic goals

• Experience in high-volume data environments

• Critical thinking and excellent verbal and written
communication skills

• Strong problem-solving and analytical abilities

• Good knowledge of data warehousing concepts





Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



Hadoop Developer || Hillsboro, OR || Rate $60/Hr

2018-06-05 Thread neha nityo
Greetings!!!

We have a 12-month contract position with our client TCS. If you feel that
your profile suits this job description, kindly share your updated
resume for further processing (neh...@nityo.com).

Must Need : Passport Copy, Number



Max Rate $60/Hr on c2c



Role : Hadoop Developer

Location : Hillsboro, OR

Duration: 12+ Months



Mandatory Technical Skills: Hadoop peripherals, AWS, MapReduce

Digital : Python, Digital : Amazon Web Service(AWS) Cloud Computing,
Digital : Apache Spark,

Digital : Hadoop and its related products, Digital : HBase, Web
Technologies : CSV/JSON

Work with a variety of talented Nike teammates and be a driving force for
building solutions for Nike Digital.

You will be working on development projects related to consumer behavior,
commerce, and web analytics.
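
As a rough illustration of the kind of web-analytics rollup described above, a
Spark Scala sketch computing daily unique visitors per page from JSON
clickstream data; bucket names, paths, and field names are hypothetical, not
from this posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{approx_count_distinct, col, to_date}

    object WebAnalyticsSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("web-analytics-sketch").getOrCreate()

        // Hypothetical JSON clickstream on S3 (fields: visitor_id, page, event_ts)
        val clicks = spark.read.json("s3a://example-bucket/clickstream/2018/06/")

        // Daily unique visitors per page, a typical web-analytics aggregate
        val dailyUniques = clicks
          .withColumn("event_date", to_date(col("event_ts")))
          .groupBy(col("event_date"), col("page"))
          .agg(approx_count_distinct("visitor_id").as("unique_visitors"))

        dailyUniques.write.mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3a://example-bucket/analytics/daily_page_uniques/")

        spark.stop()
      }
    }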







Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com



Hadoop Developer || Hillsboro, OR

2018-06-05 Thread neha nityo
Greetings!!!

We have a 12-month contract position with our client TCS. If you feel that
your profile suits this job description, kindly share your updated
resume for further processing (neh...@nityo.com).

Must Need : Passport Copy, Number



Max Rate $55/Hr on c2c



Role : Hadoop Developer

Location : Hillsboro, OR

Duration: 12+ Months



Mandatory Technical Skills: Hadoop peripherals, AWS, MapReduce

Digital : Python, Digital : Amazon Web Service(AWS) Cloud Computing,
Digital : Apache Spark,

Digital : Hadoop and its related products, Digital : HBase, Web
Technologies : CSV/JSON

Work with a variety of talented Nike teammates and be a driving force for
building solutions for Nike Digital.

You will be working on development projects related to consumer behavior,
commerce, and web analytics.







Thanks & Regards,

Neha Gupta

Team Lead

Desk : 609-853-0818 * 2105

neh...@nityo.com

neha.gupta1...@gmail.com

www.nityo.com


