Azure Data Engineer / Data Engineer @ Quincy, MA / Salisbury, NC
*1) Azure Data Engineer*
*Location:* Quincy, MA or Salisbury, NC
*Duration:* 6 Months – 1 Year
*Skills:*
- Hadoop is a plus (big data)
- Azure experience
- Primary focus on Spark SQL with Databricks as an engineer; C# development alone won't be enough
*Priority of skill sets:*
1. Spark SQL with Databricks (critical skill set)
2. Hive SQL
3. Java
4. Data modeling
5. Data Factory/techniques

*2) Azure Systems Analyst*
*Location:* Quincy, MA or Salisbury, NC
*Duration:* 6 Months – 1 Year
*Description:*
- Excellent problem-solving, critical, and analytical thinking skills
- Strong SQL skills with working knowledge of advanced analytics
- Create technical design documents, including source-to-target data mapping, process overview, data dictionary, etc.
- Experience as a Scrum Master or working in Agile teams
- Experience in data and analytics cloud technologies, especially Azure Data Factory and Azure Data Lake, to understand the code and work with business and data engineers

*3) Data Engineer*
*Location:* Santa Monica, CA
*Duration:* 1 Year
*Primary Skills:* SQL, Spark, and Redshift/Aurora

Please send resumes to sa...@spkconsultantsinc.com

Regards,
*Sansi* | *SPK Consultants INC*
Email: sa...@spkconsultantsinc.com

--
You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To view this discussion on the web visit https://groups.google.com/d/msgid/corptocorp/CAAh4dhnm8-Qez%3D3P0Tgt5P%2BDNkjUWFoNv38KtwRZ23B5xPnEcA%40mail.gmail.com.
Azure Data Engineer @ Quincy, MA / Salisbury, NC
*Azure Data Engineer*
*Location:* Quincy, MA or Salisbury, NC
*Duration:* 6 Months – 1 Year
*Azure Data Engineer positions*
Strong technical expertise in data and analytics cloud technologies, especially in Azure: Azure Databricks, Azure SQL DW, Azure Data Factory, Azure Data Lake (Gen 1 and Gen 2), Azure HDInsight, Azure DevOps, Azure Data Lake Analytics
· OSS
· Java / Python
· Mongo / Cassandra
· Kafka / Sqoop
· Hadoop / Scala / Spark
· Agile Scrum, including development responsibilities
· Retail

Regards,
*Sansi* | *SPK Consultants INC*
Email: sa...@spkconsultantsinc.com
*** Two Urgent Requirements :: Data Engineer & Data Architect
Please share profiles with dhira...@idctechnologies.com

*Role: Data Engineer*
*Job Location:* Tampa, FL
*Duration:* 12+ Months
*Interview:* Telephonic/Skype
*Must Have:*
· 4–5 years of experience in Data Engineering

*Role: Data Architect*
*Job Location:* Tampa, FL
*Duration:* 12+ Months
*Interview:* Telephonic/Skype
*Must Have:*
· 8–10+ years of experience as a Data Architect

Regards,
*Dhiraj Kumar*
*Account Manager*
*IDC Technologies*
*Direct:* 408-819-2770
*Mailto:* dhira...@idctechnologies.com
Position – Big Data Engineer
*Position – Big Data Engineer*
*Location –* San Francisco, CA
*Duration –* 6+ Months
*Interview:* Telephonic, Skype
*Client:* Persistent/Cisco
*Main responsibilities:*
- Design and develop applications utilizing the Hadoop or Spark frameworks
- Read, extract, transform, stage, and load data to multiple targets, including Hadoop, Hive, and BigQuery
- Migrate existing data processing from standalone or legacy technology scripts to Hadoop framework processing
- Should have experience working with gigabytes/terabytes of data and must understand the challenges of transforming and enriching such large datasets
*Skills required:*
- Expertise in core Java
- Expertise in big data skills like Hadoop and Hive
- Strong shell scripting
- Experience with Spark and Python a plus
IMMEDIATE INTERVIEW : DATA ENGINEER
*Please send profiles to deepak.gu...@simplion.com*

*Position:* Data Engineer
*Location:* Sunnyvale, CA
*Duration:* Long term
*Positions:* 4
*Professional Qualifications:*
• Bachelor's degree
• 3+ years managing and troubleshooting production Hadoop jobs
• Scala experience

*Deepak Gulia* | Simplion – cloud. made simple
Fax: 408-935-8696 | Email: deepak.gu...@simplion.com
*GTALK:* deepakgulia.rgtal...@gmail.com
Data Engineer @ Livermore, CA
*Data Engineer*
*Location:* Livermore, CA (need local candidates)
*Duration:* Long Term
Need a Data Engineer with experience in Spark.

Regards,
*Sansi* | *SPK Consultants INC*
Email: sa...@spkconsultantsinc.com
Open Requirements MicroStrategy Developer | OBIEE Architect / Lead | MuleSoft Developer | Data Engineer
…logy
· Complete project-related deliverables, including future-state design documentation and requirements documentation
· Knowledge of Oracle Fusion Finance or HCM Cloud is a plus

*Title: MuleSoft Developer*
*Location:* Atlanta, GA
*Pay Rate:* $70/hr on C2C
*Required:*
· At least 7+ years of middleware/integration experience (any middleware platform)
· At least 3 full-life-cycle MuleSoft projects (analysis, design, development, testing, deployment)
· Hands-on experience with Mule 4 and/or Mule 3.7+ with both on-premises servers and CloudHub
· Demonstrated experience implementing Mule ESB architecture, rules, caching, etc.
· Experienced troubleshooting Mule ESB, including working with debuggers, flow analyzers, and configuration tools
· SAP and Salesforce integration experience with best practices using MuleSoft is a big plus
· Understanding of MuleSoft deployment/physical architecture (on-prem, on cloud, and hybrid) is a big plus
· MuleSoft certification is desirable
· Candidates should be flexible and willing to work across this delivery landscape, which includes but is not limited to Agile application development, support, and deployment

*Title: Data Engineer*
*Location:* Playa Vista, CA
*Pay Rate:* $80/hr on C2C
*Required:*
· Experience on AWS and its web service offerings: S3, Redshift, EC2, EMR, Lambda, CloudWatch, RDS, Step Functions, Spark Streaming, etc.
· Good knowledge of configuring and working on multi-node clusters and the distributed data processing framework Spark
· 3 years of hands-on experience with EMR, Apache Spark, and Hadoop technologies
· Must have experience with Linux, Python, PySpark, and Spark SQL
· Experience working with large volumes of data (terabytes) and analyzing the data structures
· Experience in designing scalable data pipelines, complex event processing, and analytics components using big data technologies: Spark, Python, Scala, PySpark
· Expert in SQL/PLSQL, Redshift, and NoSQL databases
· Experience with process orchestration tools such as Apache Airflow, Apache NiFi, etc.
· Hands-on knowledge of design, development, and enhancement of data lakes; constantly evolve with emerging tools and technologies

--
*Thanks and regards,*
*Deepak Kannoji*
*Kani Solutions Inc*
*Phone:* 609-651-4663
*Email:* dee...@kanisol.com
*Skype:* kannoji.deepak
*https://www.linkedin.com/in/deepakkannoji/*
Need - Power BI | .Net Developer | Front End | mOS Developer | Big Data Engineer | UI/UX | Sr. AWS Consultant |
*Note: Kindly share profiles with gan...@gang-board.com*

*Title: Power BI Developer*
*Location:* Redmond, WA
*Duration:* 6 Months
*Skills Required:* Power BI, DAX. Secondary: T-SQL. Basic programming knowledge/experience in .Net.

*Title: Front End Developer*
*Location:* Dallas, TX
*Duration:* Long Term
*Qualifications and Skills:*
· 10+ years of coding experience in the Java technology stack
· 3+ years of relevant experience in Web 2.0 technologies, NodeJS, TypeScript, Git, UI frameworks like Angular 7, NgRx, etc., and CSS3 (Flexbox) syntax and usage
· Knowledge of mobile technologies like NativeScript, Ionic, Cordova, etc., progressive web applications (PWA), and service workers would be preferable
· Working knowledge of UI automation testing frameworks including Protractor, Selenium, Jasmine, Karma, etc., ensuring testing at all levels of the application
· Design, shift-left thinking
· Working knowledge of web service creation and consumption for REST and SOAP
· Good understanding of localization and internationalization changes
· Basic understanding of ADA and related standards, and experience building applications to these standards
· Experience working in the SAFe methodology
· Experience using DevOps pipelines for automation
· Good logical skills and problem-solving ability
· Excellent communication skills; must have exposure to a client-facing role

*Title: Big Data Engineer*
*Location:* Austin, TX
*Duration:* Long Term
• Experience developing large-scale distributed computing systems
• In-depth knowledge and experience in one or more of the following technologies: Hadoop ecosystem, Kafka, Samza, Flume, HBase, Cassandra, Redshift, Vertica, Spark
• Deep understanding of key algorithms and tools for developing high-efficiency data processing systems
• Validated software engineering experience and discipline in design, test, source code management, and CI/CD practices
• Experience in data modeling and developing SQL database solutions
• Proficient in working with Linux or other POSIX operating systems, shell scripting, and networking technologies
• Strong software development, problem-solving, and debugging skills with experience in one or more of the following languages: Java, Python, Scala, or Ruby
• Ambitious, passionate about software development, especially data technologies; you love working in a fast-paced and dynamic environment
• Deeply organized, detail-oriented, and thorough in every undertaking; able to multi-task and change focus quickly
• Excellent interpersonal skills

*Title: UI/UX Visual Designer*
*Location:* Sunnyvale, CA
*Duration:* Long Term
*Skills:*
- 10+ years of experience in UX design (enterprise software UX design preferred)
- Designing web UI for an enterprise platform (web front-end; mobile UX experience is not required)
- Advanced knowledge of wireframing and/or prototyping tools and methodologies
- Experience with Balsamiq or any other similar tool
- Should have worked with HTML5, CSS3, JavaScript, and similar development tools
- Experience in a fast-paced software environment and an ability to execute against aggressive timelines
- Proven ability to influence cross-functional teams without formal authority
- Bring expertise to recommend better solutions/tools for better, intuitive designs for end users
- High energy level, enthusiastic, and eager to do what is necessary to be successful
- Highly creative and inquisitive
- Strong verbal and written communication skills
- Ability to work closely with the development team and other stakeholders to produce a holistic, full-fledged design

*Title: Sr. .Net Developer*
*Location:* Agoura Hills, CA
*Duration:* 12 months
*Skills Required:* MVC, .Net C#, SQL Server
Detailed JD / Responsibilities:
• Minimum 8+ years of development experience
• Strong in MVC 5.0, jQuery, Web API, WCF, AngularJS, Bootstrap
• Strong in C#
• Good in ASP.Net
• Good in SQL Server 2012
• Knowledge of Windows-based applications using .Net is a plus
• Knowledge of design patterns
• Good analytical skills
• Strong communication skills

*Title: AWS Senior Consultant*
*Location:* Houston, TX
*Duration:* Long Term
No. of Openings: 8
Experience level: 11+ years
*Required:*
- AWS as IaaS at the enterprise level (500+ instances)
- AMI build mechanisms and tools (building client-specific images on Windows, RHEL, SUSE)
- Managing the following at the enterprise account level:
  § Back-up
  § Data retention policy implementation
  § Configuration management using tools like Chef, and detailed explanation on the policies
Senior Data Engineer interview on Monday
*Position: Senior Data Engineer*
*Location:* Open to multiple locations (Concord, CA; Minneapolis, MN; Charlotte, NC)
*Duration:* Long term
*Positions:* 10 (in any location mentioned above)
*Job Description:*
• Data model development and model scoring
• Work with data scientists and build scripts to meet their data needs
*Required Qualifications:*
• 7+ years of overall experience
• 3+ years of experience with big data platforms – Hive, Spark
• 5+ years of ETL (Extract, Transform, Load) – Sqoop, INFA; RDBMS – Teradata, Oracle
• Experience with automation – Autosys or relevant tools, Python
• Reproduce issues faced by data scientists
Need Data Engineer - Kansas City, KS
*Hello,*
*Greetings from ICS Global Soft INC.!*
We have a requirement for you; the details are as follows.

*Position:* Data Engineer
*Location:* Kansas City, KS
*Duration:* 12+ Months

*Data Engineer Job Description*
As a Data Engineer, you will be working with stakeholders throughout the company to ensure we have high-quality data to power our business in all departments. Your challenge will be to make sure we can scale data even more effectively to support business decisions and improve our products. You will join a cross-functional team of data scientists and a product owner.

*Responsibilities*
- Rapidly developing proofs of concept using cutting-edge technology
- Experimenting with new tools and technologies to meet business requirements regarding performance, scaling, and data quality
- Providing tools that enhance data quality company-wide
- Developing integrations between multiple applications and services, both on premise and in the cloud
- Contributing to self-organizing tools that help the analytics community discover data, assess quality, explore usage, and find peers with relevant expertise
- Solving issues with data and data pipelines, prioritizing based on customer impact, and building solutions that prevent them from happening again (root cause)
- End-to-end ownership of data quality in our core datasets and data pipelines

We are looking for driven Data Engineers who enjoy solving problems, who initiate solutions and discussions, and who believe that any challenge can be scaled with the right mindset and tools.
- Minimum of 3 years of experience in the field, working with systems and data infrastructure at scale
- Proficiency in one or more server-side programming languages, preferably Python or Java
- Experience working with large-scale data pipelines in distributed environments with AWS technologies, including S3, Glue, Athena, and QuickSight
- Demonstrable experience with NoSQL, SQL, etc.
- Nice to have: experience in at least one reporting/visualization tool (e.g., Tableau)
- Nice to have: good understanding of basic analytics and machine learning concepts
- Preferably a university degree in Software Engineering or a similar field
- Excellent communication, written and spoken

*Best Regards!*
*Rajender Reddy*
ICS Globalsoft Inc
E-mail: rajen...@icsglobalsoftinc.com
Desk: +1 972-737-8734
1231 Greenway Drive | Ste 375 | Irving, TX 75038
*NMSDC Certified | Certified Minority Business Enterprise (MBE), SBE*
According to Bill S.1618 Title III passed by the 105th US Congress, this message is not considered "Spam" as we have included the contact information. If you wish to be removed from our mailing list, please respond with "remove" in the subject field. We apologize for any inconvenience caused.
Immediate need for Senior Data Engineer interview today
*Position: Senior Data Engineer*
*Location:* Open to multiple locations (Concord, CA; Minneapolis, MN; Charlotte, NC)
*Duration:* Long term
*Positions:* 10 (in any location mentioned above)
*Job Description:*
• Data model development and model scoring
• Work with data scientists and build scripts to meet their data needs
*Required Qualifications:*
• 7+ years of overall experience
• 3+ years of experience with big data platforms – Hive, Spark
• 5+ years of ETL (Extract, Transform, Load) – Sqoop, INFA; RDBMS – Teradata, Oracle
• Experience with automation – Autosys or relevant tools, Python
• Reproduce issues faced by data scientists
• Knowledge of Agile is a must

Thanks,
Venkat
ven...@headwaytek.com
Requirement for Data Engineer in Sunnyvale, CA & Bentonville, Arkansas
Hi,

If you are interested and available for the job, please reply with your latest resume and the other details required for submission to vam...@techorbit.com.

*Job Title:* Data Engineer
*Location:* Sunnyvale, CA & Bentonville, Arkansas
*Duration:* 12+ Months
*Skills:* Hadoop, Teradata, Google Cloud Platform, NoSQL (preferably Cosmos DB), programming (Java/Python), SQL skills, Kafka, API development

Candidate details required for submission:
- First Name
- Last Name
- Immigration Status
- Current Location
- Date of Birth
- Phone Number
- Email Id
- Total IT Exp
- Exp in Required Skills
- Name of the Degree
- Highest Education
- Highest Degree Start Date
- Highest Degree End Date
- University
- Education Type (Full time / Part time)
- PP Number
- Ex-TCS employee; if yes, full time or contract
- Skype ID
- Rate

Please share three interview slots for three consecutive days: Date, From (Time), To (Time), Time Zone.

Vamshi
vam...@techorbit.com
972-646-2158
1300 W Walnut Hill Ln. #260, Irving, TX 75038.
Role: Big Data Engineer, Location: Bentonville, AR, Duration: 1+ year, Terms: Corp to Corp
*Role:* Big Data Engineer
*Location:* Bentonville, AR
*Duration:* 1+ year
*Terms:* Corp to Corp
*Mode of Interview:* Zoom Video Call

*Job Description:*
* Minimum 6+ years in big data and GCP.
* Experience with the following tools and technologies: Hadoop, Spark, Kafka.
* Relational SQL and NoSQL databases.
* Data pipeline/workflow management tools such as Azkaban and Airflow.
* AWS cloud services such as EC2, EMR, RDS, and Redshift.
* Stream-processing systems such as Storm and Spark Streaming.
* Object-oriented/object-function scripting languages such as Python, Java, C++, etc.

Thanks and Regards,
Amarinder Singh (Sr. IT Associate)
Kalven Technologies Inc.
2300 E Higgins Rd, Suite 211, Elk Grove Village, IL 60007
1701 E. Wood Field Rd, Suite 300, Schaumburg, IL 60173
Work: 312-667-0211 | Email: amarin...@kalventech.com | LinkedIn: Amar Singh | Skype: Amarinderkalven
http://www.kalventech.com
Product Engineering | Systems Integration | Professional Services

*Note:* Under Bill S.1618 Title III passed by the 105th U.S. Congress, this mail cannot be considered spam as long as we include contact information and a remove link for removal from our mailing list. To be removed from our mailing list, reply with "remove" and include your original email address/addresses in the subject heading. Include complete address/addresses and/or domain to be removed. We will immediately update it accordingly. We apologize for the inconvenience caused.
Urgent Position: SQL/BI Developer or Data Engineer | NYC, NY
Reply to anc...@quantumworld.us

*Position:* SQL/BI Developer or Data Engineer
*Duration:* Long Term Contract
*Location:* NYC, NY
*Need passport number for H1B candidates*

The Data Developer role is responsible for supporting business intelligence and reporting initiatives. The main responsibilities are creating data structures for reporting systems and developing data exchanges and integrations between various platforms. An important aspect of this role is optimizing existing and new data processes.

*Basic Requirements:*
• Expert knowledge of SQL
• Extensive hands-on experience on the MS SQL Server platform
• Deep understanding of RDBMS concepts
• Hands-on experience developing stored procedures, functions, and scripts
• Expertise in database/query performance optimization (indexing, query tuning, troubleshooting performance problems)
• Expertise in developing ETL and batch processes to support data movement
• Experience performing source-to-target data mapping exercises
• Strong knowledge of data warehousing concepts
• Exposure to data modeling/database design
• Experience working with big data and unstructured data sources (AWS)
• Familiarity with AWS Redshift is a huge plus
• Curious and relentless, with excellent written and verbal communication and interpersonal skills

*Desired Skills*
• Experience developing/deploying reports on the SSRS/Cognos/Power BI platforms is nice to have
• Financial services industry experience
• Familiarity with Salesforce

*Cordially,*
*Anchit Bajpai*
*Technical Recruiter*
*Quantum World Technologies Inc.*
*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*
*www.quantumworld.us*
*Office: 805-222-0532 Ext-338*
*Fax: 805-834-0532*
*E:* anc...@quantumworld.us
Interviews today and tomorrow: Big Data Engineer
*Title:* Big Data Engineer
*Experience:* 6+ Years
*Location:* Bentonville, AR
*Mode of Interview:* Zoom Video Call
*Duration:* 12+ Months

*Job Description:*
- Minimum 6+ years in big data and GCP.
- Experience with the following tools and technologies: Hadoop, Spark, Kafka.
- Relational SQL and NoSQL databases.
- Data pipeline/workflow management tools such as Azkaban and Airflow.
- AWS cloud services such as EC2, EMR, RDS, and Redshift.
- Stream-processing systems such as Storm and Spark Streaming.
- Object-oriented/object-function scripting languages such as Python, Java, C++, etc.
Position :: (Bigdata Engineer OR Data Engineer OR Data Scientist) with Azure Exp | NYC, NY
Mail to anc...@quantumworld.us

*Job Title:* (Big Data Engineer OR Data Engineer OR Data Scientist) with Azure Exp
*Duration:* Long Term Contract
*Location:* NYC, NY
*Need passport number for H1B candidates*

*Description*
• Hands-on experience in Azure; must have an understanding of data
• Design, construct, install, test, and maintain highly scalable data management systems
• Build automated data delivery pipelines and services to integrate data
• Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models
• Develop solutions in an agile environment for the overall data domain
• Deep experience with developing SQL
• Must have deep experience developing with MS SQL
• Must be able to do ETL (SSIS, Azure, Informatica)
• Understand data security

*Cordially,*
*Anchit Bajpai*
*Technical Recruiter*
*Quantum World Technologies Inc.*
*199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360*
*www.quantumworld.us*
*Office: 805-222-0532 Ext-338*
*Fax: 805-834-0532*
*E:* anc...@quantumworld.us
Contract position: Looking for Data Engineer in Redmond WA
*Title:* Data Engineer
*Location:* Redmond, WA
*Client:* Microsoft
*Visa:* H1B, H4-EAD, L2-EAD, GC, and USC
*Note:* As per client request, a passport number is required for H1B candidates.

*Requirements:*
- This position requires a bachelor's degree in Computer Science or a related technical field.
- 5+ years of experience in data warehouse technologies and/or a back-end reporting system.
- Strong background in data warehousing principles, architecture, and its implementation in large environments.
- Strong scripting skills to perform data/file manipulation.
- Experience with various testing methodologies, processes and artifact creation, and user acceptance testing is a plus.
- Experience with ETL, data modeling, and working with business intelligence systems.
- Expert in writing SQL scripts.
- Experience with processing large, multi-dimensional datasets from multiple sources.
- Experience in monitoring and automated reporting.
- Ability to operate effectively and independently in a dynamic, fluid environment.
- Familiarity with Agile development practices is highly desirable.
- Must have strong customer service skills and excellent verbal and written communication skills.

Thanks & Regards,
Babjee | Talent Acquisition Specialist
E-mail: bab...@quadrantresource.com
Direct: 425-939-0173
Office: 425-996-8484 (Ext - 422)
Immediate Requirement for AWS Data Engineer
*Job Details:*
*Title: AWS Data Engineer (SQL/Python and AWS; experience with any ETL tool is a plus)*
*Location: Playa Vista, CA*
*Kindly share resumes or your hotlist to dee...@kanisol.com*

*Skills Required:*
a) Good experience with any ETL tool such as Informatica/Ab Initio/DataStage
b) Extensive experience and knowledge of SQL and query-tuning skills
c) Exposure to or hands-on experience with AWS services such as Lambda functions and Redshift
d) Hands-on experience with Python scripts
e) Good analytical and communication skills
f) Ability to analyze data variances and provide solutions for the same

*Job Responsibilities:*
a) Production job monitoring, failure analysis, and restarting jobs
b) Analyze data discrepancies, identify root causes, and communicate to business users
c) Coordinate with the offshore team and mentor them as required
d) Status reporting

--
*Thanks and regards,*
*Deepak Kannoji*
*Kani Solutions Inc*
*Phone: 609-651-4663*
*Email: dee...@kanisol.com*
*Skype: kannoji.deepak*
*https://www.linkedin.com/in/deepakkannoji/*
Very Urgent Requirement :: Big Data Engineer with MicroStrategy experience | San Francisco, CA | 6+ Months
*Job Title :: Big Data Engineer with MicroStrategy experience*
*Location :: San Francisco, CA*
*Mode of hire: Contract (C2C/W2)*
*Duration :: 6+ month contract*
*Client :: Cognizant*

*Must-Have Skills:*
- 8 or more years of technical background in MSTR and Big Data solutions.
- 8 or more years of hands-on MSTR development experience in an Oracle/Teradata data warehouse environment required.
- Strong experience in SQL programming, Unix scripting, Python, and performance tuning is required.
- Strong written and oral communication skills are essential.
- Will require working with business users and understanding requirements.
- Ability to work independently and multi-task to meet critical deadlines in a rapidly changing environment.
- JIRA experience is a plus.

*Atulit Tripathi*
US: 678-496-7809
atulit.tripa...@acsicorp.com
Azure Data Engineer @ San Ramon, CA
*Azure Data Engineer*
*Location: San Ramon, CA*
*Duration: 12+ Months*
*Interview Process: Phone + in-person interviews*

*Job Description:*

*Required Qualifications:*
- 2+ years' experience consulting on Data Factory pipelines and transformation services
- Experience with Azure services: Data Factory, Databricks, Functions, Logic Apps

*Preferred Qualifications:*
- Local candidates highly preferred

*Responsibilities:*
- Engineer Azure data flows for CPF Plant Turnaround Analytics using Azure services: Data Factory, Databricks, Functions, Logic Apps
- Implement row-level security in Azure SQL DB and Azure Analysis Services
- Implement CI/CD via Azure DevOps
- Ingest data into a data lake from an Azure Service Bus
- Create Databricks jobs to operationalize data flows
- Schedule automatic refresh of AAS models
- Write Ansible playbooks to deploy Azure resources

--
Regards,
*Sansi* | *SPK Consultants INC*
Email : sa...@spkconsultantsinc.com
Data Engineer (NO H1B)
Job Title: Data Engineer
Location: Denver, CO
Duration: 4-12 month contract + extensions (long-term need)

*Key Skills and Experience Desired:*
· Work with disparate sources of data (Redshift, PowerBI, CSV, etc.)
· Proficient analytical skills; experience performing ad-hoc analysis using Redshift, Looker, SQL Server, MySQL (or PostgreSQL preferred)
· Experience building report visualizations with tools such as Looker, PowerBI, Tableau, etc.
· Master's or Ph.D. is preferred (not required)
· Experience with Matlab for modeling
· Experience developing code in Python or R

*Job Responsibilities:*
· Understand business requirements for data analysis and create appropriate data models
· Work with disparate sources of data (Redshift, PowerBI, CSV, etc.)
· Strong writing skills to communicate requirements, strategy, and documentation
· Understand and maintain data definitions (metadata) with associated source mappings and related business rules
· Able to examine data workflow processes to identify and resolve problems with correctness or performance
· Ensure data policies, standards, and practices are defined and adhered to throughout the company
· Ensure a strong focus on data quality
· Build and support reporting tools and jobs
· Be on call on a rotating basis for the services owned by the team
· Bring forward ideas to experiment with, and work in teams to turn ideas into reality
· Prioritize tasks with the scrum master who leads the team to be successful

*Basic Qualifications:*
· Bachelor's in Mathematics, Data Science, or Economics
· 3+ years' experience in data-related efforts/analysis required
· Proficient analytical skills and experience performing ad-hoc analysis using Redshift, Looker, SQL Server, MySQL (or PostgreSQL preferred)
· Experience building report visualizations with tools such as Looker, PowerBI, Tableau, etc.
· Analytically minded, critical thinker, problem-solver a must
· Understanding of experimentation, predictive analytics, and/or machine learning
· Excellent communication skills
· Comfortable training and dealing with a wide range of audiences

*Preferred Qualifications:*
· Master's or Ph.D. in the above fields
· Experience with Matlab for modeling
· Strong experience with linear regression analysis
· Hands-on work on classification problems
· Experience developing code in Python or R

--
*Naveen Tripathi*
*Technical Recruiter*
*Zenith Tech Solutions*
*Desk: 518 621 0048*
*Fax: 518-244-4977*
*3 Computer Dr West, Suite #107, Albany, NY 12205*
*naveen.tripa...@zenithtechsolutions.com*
*Hangout id: tripathi3...@gmail.com*

*DISCLAIMER:* Note: This is not an unsolicited mail. Under Bill 1618 Title III passed by the 105th US Congress, this email cannot be considered spam as long as we include our contact information and an option to be removed from our emailing list. If you have received this message in error or are not interested in receiving our emails, please accept our apologies. To be removed from our mailing list, please reply with the subject line. All removal requests will be honored ASAP. We sincerely apologize for any inconvenience caused to you.
Position: Big Data Engineer OR Data Scientist OR Data Analyst | NYC, NY
Mail to -- anc...@quantumworld.us

*Job Title: Big Data Engineer OR Data Scientist OR Data Analyst*
*Duration: Long-Term Contract*
*Location: NYC, NY*
*No OPT – Need Passport Number for H1B candidates*

*Description --*
• Design, construct, install, test, and maintain highly scalable data management systems
• Build automated data delivery pipelines and services to integrate data
• Develop solutions in an agile environment for the overall data domain
• Deep experience developing SQL; must have deep experience developing with MS SQL
• Must be able to do ETL (SSIS, Informatica)
• Understand data security
• Experience with Azure

*Cordially,*
*Anchit Bajpai*
*Technical Recruiter*
*Quantum World Technologies Inc.*
*199 West Hillcrest Dr, Suite 0112, Thousand Oaks, CA 91360*
*www.quantumworld.us*
*Office: 805-222-0532 Ext 338*
Immediate role for Hadoop Data Engineer
*Job Details:*
*Role: SQL + Hadoop Data Engineer*
*Location: San Jose, CA*
*Duration: Contract*
*Any visa status is fine*

*Required Skills*
· Excellent SQL and advanced SQL skills
· Experience in Spark and possibly Scala (Hadoop ecosystem)
· Knowledge of enterprise data warehouses
· Basics of Python (good to have)
· Good communication skills

--
*Thanks and regards,*
*Deepak Kannoji*
*Kani Solutions Inc*
*Phone: 609-651-4663*
*Email: dee...@kanisol.com*
*Skype: kannoji.deepak*
*https://www.linkedin.com/in/deepakkannoji/*
Immediate Opportunity for SQL + Hadoop Data Engineer
*Job Details:*
*Role: SQL + Hadoop Data Engineer*
*Location: San Jose, CA*
*Duration: Contract*
*Kindly share resumes or your hotlist to dee...@kanisol.com*
*Open to all visas*

*Required Skills*
· Excellent SQL and advanced SQL skills
· Experience in Spark and possibly Scala (Hadoop ecosystem)
· Knowledge of enterprise data warehouses
· Basics of Python (good to have)
· Good communication skills

--
*Thanks and regards,*
*Deepak Kannoji*
*Kani Solutions Inc*
*Phone: 609-651-4663*
*Email: dee...@kanisol.com*
*Skype: kannoji.deepak*
*https://www.linkedin.com/in/deepakkannoji/*
Data Engineer (NO H1B)
*Data Engineer*
6-month contract
Rhode Island - Healthcare Client
Interview: Phone -> WebEx/Skype

You will partner with business partners to identify opportunities to leverage big data technologies in support of Pharmacy Operations, with a common set of tools and infrastructure to make analytics faster, more insightful, and more efficient. You will build and architect a next-generation Big Data machine learning framework developed on a group of core Hadoop technologies. You will design highly scalable and extensible Big Data platforms which enable collection, storage, modeling, and analysis of massive data sets from numerous channels. You will define and maintain data architecture, focusing on applying technology to enable business solutions. You will assess and provide recommendations on business relevance, with appropriate timing and deployment. You will perform architecture design and data modeling, and implement CVS Big Data platforms and analytic applications. You will bring a DevOps mindset to enable big data and batch/real-time analytical solutions that leverage emerging technologies. You will develop prototypes and proofs of concept for the selected solutions, and implement complex big data projects. You will apply a creative mindset to collecting, parsing, managing, and automating data feedback loops in support of business innovation.

*Job Details*
3-5 years of professional experience including the following:
- Hands-on experience with "big data" platforms including Hadoop and Spark, as well as experience with traditional RDBMSs (e.g., Teradata, Oracle).
- Proficiency in "big data" technologies including MapReduce, Spark, Airflow, Kafka, HBase, Pig, NoSQL databases, etc.
- Proficiency in the following programming languages: Python, shell scripting, SQL (preferably Teradata and PL/SQL syntax), and Hive
- Ability to design and build a framework to orchestrate data pipelines and ML models
- Familiarity with data modeling, data architecture, and governance concepts
- Should be able to aggregate huge amounts of data and information from large numbers of sources to discover the patterns and features necessary to build machine learning models.
- Design and implement end-to-end solutions using Machine Learning, Optimization, and other advanced computer science technologies, and own live deployments.
- Familiarity with specialized areas such as Optimization, NLP, Reinforcement Learning, Probabilistic Inference, Machine Learning, Information Retrieval, and Recommendation Systems.
- Familiarity with frameworks for either Machine Learning or NLP (Scikit-Learn, SpaCy, PyTorch, Spark NLP)
- Knowledge of Conda, H2O, Airflow/Oozie/Jenkins, Git
- Platform knowledge: Hadoop, Spark, Kafka, Kinesis, Oracle, Teradata
- Build continuous integration/continuous delivery, test-driven development, and production deployment frameworks
- Lead conversations with infrastructure teams (on-prem & cloud) on analytics application requirements (e.g., configuration, access, tools, services, compute capacity, etc.)

*Preferred Qualifications*
- Exposure to healthcare domain knowledge
- Proficiency in Python
- Experience with a cloud computing environment (ideally Microsoft Azure) and the organizational risks of transitioning from on-prem to cloud infrastructure
- Experience with automation tools, e.g., Jenkins, Airflow, Control-M
- Experience operating in distributed environments including cloud (Azure, GCP, AWS, etc.)

*Education*
Bachelor's degree required; B.S. in Computer Science, Engineering, Astronomy/Physics, Economics, Math, or related fields preferred

--
*Naveen Tripathi*
*Technical Recruiter*
*Zenith Tech Solutions*
*Desk: 518 621 0048*
*Fax: 518-244-4977*
*3 Computer Dr West, Suite #107, Albany, NY 12205*
*naveen.tripa...@zenithtechsolutions.com*
*Hangout id: tripathi3...@gmail.com*
Need - .Net || Machine Learning || SQL PM || Talend || Big Data || Java || UI Architect || Sr. BI || Dynamic AX || Data Engineer
Title: Data Engineer
Location: Framingham, MA
Duration: Long Term
Rate: $DOE

Below is the detailed JD for the Data Engineer role in Framingham, MA. Please go through the JD and send me relevant resumes ASAP. I am attaching below the qualifications for this role, but talking through the type of team member I am looking for is important. I have an existing staff that covers the below skill set, and we of course have more work than we can keep up with. I'm looking for someone who can work with the internal customers and the team to do more data ETL work with our existing tool set. The individual must have business acumen to be successful; working with the customers is key to being successful on this team.

Skills:
• Delivered the full lifecycle of a solution using Hadoop, AWS S3, AWS EMR
• Understand data and data quality from many sources; help design the data with producers and consumers
• Expert knowledge of Python
• Ingested data using Big Data ETL tools (Apache Spark)
• JSON data structures
• Implemented data security and privacy in a cloud environment
• Delivered solutions using Agile methodology

Extended Skills:
• Cloudera tool set
• Apache Airflow
• Support Data Science and ML tools like AWS SageMaker, Cloudera CDSW

*Title: UI Architect (React)*
*Location: Waltham, MA*
*Duration: 6 Months*

*Job Description:*
*RESPONSIBILITIES*
· Minimum 10 years of experience in UI development
· Minimum 2+ years designing and developing components in React.js supporting various web application efforts
· Should have led a UI team of 5
· Experience with JavaScript supporting React.js development
· Adhere to design guidelines and standards for all performed work
· Participate in the development of advanced UX features, working closely with web designers as appropriate
· Provide assistance and guidance during the QA & UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues

*Client: Centene*
*Position: Talend Lead Developer*
*Location: St. Louis, MO (need to join immediately)*

*Job Description*
*EXPERIENCE*
- Talend Data Certified Developer
- A minimum of 10-11+ years of developing ETL processes using Talend, including processing of NoSQL and JSON formats.
- Solid understanding of data warehouse technologies and concepts (e.g., star and snowflake schemas).
- Data modeling (structured and unstructured), including slowly changing dimensions and other data warehouse concepts.
- SQL programming.
- Linux, scripting.
- Be on top of your game. Be honest and a humble team player.
- Own your tasks; be able to take notes and follow through.
- Pay attention to details. If you decide to apply to this role, include the name "Hera" in a creative way in your resume or cover letter.
- Excellent verbal and written communication skills, including presentations to peers and senior management.
- Working knowledge of word processing, spreadsheets, diagramming, etc.
- Be precise in your communication.
- Bachelor's degree in Computer Science, Computer Engineering, or Information Technology from an accredited university.
- We appreciate an advanced degree in your area of specialty.

Thanks and Regards!
Ganesh C | Staffing Manager
Gangboard LLC
Desk: 302 703 7764
Cell: 302 570 2866
gan...@gang-board.com
www.gang-board.com

Disclaimer: Under Bill S.1618 Title III passed by the 105th U.S. Congress, this mail cannot be considered spam as long as we include contact information and a REMOVE link for removal from our mailing list. To be removed from our mailing list, reply with "remove" and include your original email address/addresses in the subject heading. We will immediately update it accordingly. We apologize for any inconvenience caused.
Urgent need: Big Data Engineer in Pleasanton, CA | Only GC, USC, or TN
*Please share resume to:* gau...@holistic-partners.com Hello All, Hope you are doing well, Please let me know if you have any consultant available for the given below position. *Title: Big Data Engineer* *Location: Pleasanton, CA* *Duration: 12+ months* *Phone and Skype* *USC/GC* JD: As a Big Data Developer, you will be working with clients to implement leading edge data analytics and cloud solutions across a range of industries. We are looking for software engineers with a strong understanding of the full data development lifecycle, including requirements gathering, solution design, development, and production deployment. The ideal candidate will have solid development experience with the desire and passion to learn big data technologies. Will provide training for the ideal candidate to learn Spark development. * Responsibilities* · Participate in technical planning & requirements gathering phases including design, coding, testing, troubleshooting, and documenting big data-oriented software applications. Responsible for the ingestion, maintenance, improvement, cleaning, and manipulation of data in the business’s operational and analytics databases, and troubleshoots any existent issues. · Design, build and launch extremely efficient & reliable data pipelines to move data (both large and small amounts) to our data warehouses. · Design, build and launch new data extraction, transformation and loading processes in production. · Create new systems and tools to enable the customer to consume and understand data faster. · Build, implement and support the data infrastructure; ingest and transform data (ETL/ELT process). · Implements, troubleshoots, and optimizes distributed solutions based on modern big data technologies like Hive, Hadoop, Spark, Elastic Search, Storm, Kafka, etc. in both an on premise and cloud deployment model to solve large scale processing problems. 
· Define and build large-scale near real-time streaming data processing pipelines that will enable faster, better, data-informed decision making within the business. · Work inside the team of industry experts on the cutting edge Big Data technologies to develop solutions for deployment at massive scale. · Study data, identify patterns, make sense out of it and convert it to algorithms. · Designs and plans BI, and other Visualization Tools capturing and analyzing data from multiple sources to make data-driven decisions, as well as debugs, monitors, and troubleshoots solutions. Keep up with industry trends and best practices, advising senior management on new and improved data engineering strategies that will drive departmental performance leading to improvement in overall improvement in data governance across the business, promoting informed decision-making, and ultimately improving overall business performance. *Required Qualifications:* · Bachelor’s Degree with a minimum of 3+ year’s relevant experience or equivalent. · *Minimum of 2 years Java/J2EE* *software development experience* in a complex enterprise system environment, building software from conception to production deployment · *1+ years of experience in Java and Spring frame work is mandatory*. · Experience in Building data pipeline with Java and Spark is plus. 
· 2+ years' experience with traditional relational databases such as Oracle, SQL Server, PostgreSQL, MySQL.
· 1+ years' hands-on experience with various messaging systems such as Kafka, plus Spark data manipulation and pipeline creation.
· Experience working in an Agile/Scrum environment.
· Need someone who is a self-starter and team player, capable of working with a team of architects, developers, business/data analysts, QA, and client stakeholders.
· Proficient understanding of distributed computing principles.
· Strong written and verbal communications.

Preferred Qualifications:
· 2+ years' experience working in Big Data (Hadoop, Hive, Pig, Storm, NoSQL, HBase, Cassandra, Druid), preferably in Azure HDInsight.
· 1+ years' hands-on experience working with Business Intelligence and reporting.
· 1+ years' hands-on experience working within AWS, Azure, Google, or another cloud platform based on IaaS and PaaS solutions.
· Hands-on experience with DevOps solutions like Puppet, AWS CloudFormation, Docker, and microservices.
· Experience with integration of data from multiple data sources in the Business Intelligence & Visualization space (Tableau, Power BI, Qlik, Cognos, SAP BOBJ, etc.).

Regards,
Gautam Sharma | IT Technical Recruiter
(D): 408-400-3343 (O): 408-400-3356 Ext: 103
(E): gau...@holistic-partners.com
(A): 11 Bowles Ave, West Boylston, MA 01583
"Believe you can and you're halfway there."

-- You received this message because you are subscribed to the Google Groups "CorptoCorp" group.
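The responsibilities above center on extract-transform-load pipelines into a warehouse. As a rough sketch of that pattern (not this client's actual stack): sqlite3 stands in for the real warehouse, and the table, column, and record names are invented for illustration.

```python
import sqlite3

# Minimal extract-transform-load sketch. sqlite3 is a stand-in for a
# real warehouse; all names here are hypothetical.
def run_etl(raw_rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE warehouse_events (user_id TEXT, amount_usd REAL)")
    # Transform: drop malformed rows, normalize cents to dollars.
    cleaned = [
        (r["user"], r["amount_cents"] / 100.0)
        for r in raw_rows
        if r.get("user") and isinstance(r.get("amount_cents"), int)
    ]
    # Load in a single batched, transactional insert.
    with conn:
        conn.executemany("INSERT INTO warehouse_events VALUES (?, ?)", cleaned)
    return conn

raw = [
    {"user": "a", "amount_cents": 1250},
    {"user": None, "amount_cents": 99},   # dropped: missing user
    {"user": "b", "amount_cents": 300},
]
conn = run_etl(raw)
total = conn.execute("SELECT SUM(amount_usd) FROM warehouse_events").fetchone()[0]
print(total)  # 15.5
```

The same extract/clean/batch-load shape carries over to Spark or Kafka-fed pipelines; only the connectors change.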
Need Data Engineer at Seattle, WA.
Hi, hope you are doing well. We have an urgent requirement for the position of Data Engineer at Seattle, WA.

Role: Data Engineer
Location: Seattle, WA
Start Date: ASAP
Contract: Long Term

Please share resume at m...@nytpartners.com

Required Skills:
- Data Engineer
- AWS
- Python
- ETL
- Elastic Search

Thanks and Regards,
Matt Irfan
New York Technology Partners - Rochester
332 Jefferson Rd. Rochester, NY 14623
T1: (201) 680-0200 x 7025 Fax: (201) 474-8533
m...@nytpartners.com
www.nytp.com
This email message and any files transmitted with it may contain confidential and proprietary data. It is intended solely for the use of the intended recipient, noted in the SMTP header. Unauthorized use of the data contained in this message is prohibited. If you have received this message in error, please reply to this message and delete the original material. Thank you for your cooperation.
Data Engineer @ Menlo Park, CA / Seattle , WA
Hi,

Location: Menlo Park, CA / Seattle, WA

Position 1: Data Engineer - 1
Openings: 2
Experience: 7-10 yrs
JD: Experience in advanced SQL, Python, ETL, data modelling, and Tableau or any BI tool. Should be well versed in creating data pipelines using Python. Should be very strong in writing advanced SQL queries. Should be a data engineer, not a data analyst or data scientist. Python (must have).

Position 2: Data Engineer - 2
Openings: 3
Experience: 7-10+ yrs
JD: Experience in advanced SQL, Python, ETL, data modelling, and Tableau or any BI tool. Should be well versed in creating data pipelines using Python. Should be very strong in writing advanced SQL queries. Should be a BI engineer. Python (nice to have).

Thanks and Regards,
Shawn Brown
New York Technology Partners
Mobile: 408-767-7353 Office: 201-680-0200 Ext 7023 / 7028
sh...@nytpartners.com
www.nytp.com
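"Advanced SQL" in postings like this usually means analytic constructs such as window functions. A small hedged illustration (invented schema; sqlite3 used here only because it ships with Python and supports window functions in modern releases):

```python
import sqlite3

# Rank each user's orders by amount with a window function -- a typical
# "advanced SQL" interview topic. Table and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (user_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("a", 30.0), ("b", 20.0)])
rows = conn.execute("""
    SELECT user_id, amount,
           RANK() OVER (PARTITION BY user_id ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY user_id, rnk
""").fetchall()
print(rows)  # [('a', 30.0, 1), ('a', 10.0, 2), ('b', 20.0, 1)]
```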
Urgent Need : Big Data/Data Engineer with Azure Exp | Los Angeles, CA And NYC, NY
Reply to -- anc...@quantumworld.us

Big Data/Data Engineer with Azure Exp
Los Angeles, CA and NYC, NY
Long Term Contract
All visas (no OPT)
Need passport number if the candidate is H1B, GC EAD, or L2 EAD

• Hands-on experience in Azure Data Factory and Informatica, and a good understanding of data obfuscation or data masking techniques
• Design, construct, install, test, and maintain highly scalable data management systems
• Build automated data delivery pipelines and services to integrate data
• Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models
• Develop solutions in an agile environment for the overall data domain
• Deep experience with developing SQL
• Must have deep experience developing with MS SQL
• Must be able to do ETL (SSIS, Azure Data Factory, Informatica)
• Understand data security

Cordially,
Anchit Bajpai
Technical Recruiter
Quantum World Technologies Inc.
199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360
www.quantumworld.us
Office: 805-222-0532 Ext-338
Urgent requirement || Big Data Engineer || SFO CA || INFOSYS
Hello Professionals,

Greetings from Humac Inc. I am currently looking for a Big Data Engineer to fill a position in SFO, CA. If you are interested in this position, please reply to me with your updated resume.

Reach me at:
Email: sant...@humacinc.com
Call: +1 623-242-2594

Note: Passport number required for submission. No OPT, CPT, or GC-EAD. Corp-to-corp position.

Big Data Engineer
SFO, CA

JD:
Must have:
· 5+ years of hands-on experience with Spark and the big data ecosystem, including building data pipeline solutions involving Spark, Kafka, and Hadoop / Cassandra / MongoDB / HBase
· Big data in the Hortonworks stack
· Proficiency in Java server-side frameworks (Spring, Spring Boot), and 2 to 4 years' experience on the Hadoop platform (Hive, HBase, Spark, Oozie, Impala, etc.)
· Must be hands-on and strong in Spark, Spark SQL, Spark Streaming, and ETL using Spark technology
· Data integration using NiFi
· Scala Spark programming a must
· Database administration or development with a NoSQL database
· Strong analytic skills related to working with unstructured data sets

Best Regards,
Santosh Kumar
IT Analyst
Mail: sant...@humacinc.com
Hangouts: recruiter.lo...@gmail.com
"Hire Character, Train Skill"
LinkedIn: www.linkedin.com/in/santosh-kumar-98b6ba15b
Data Engineer - San Jose, CA - 6+ Months contract
Hi Folks,

Hope you're doing great! I am wondering if you have a consultant available for the position below; if so, please respond to me with their most updated resume and the best contact number to reach them to discuss the role ASAP. Please respond back to aj...@tranzeal.com

NOTE: Need a consultant ASAP!

Job Title: Data Engineer
Location: San Jose, CA
Duration: 6+ Months Contract

Position Summary:
• Design, develop, and tune data products, applications, and integrations on large-scale data platforms (SQL Server, HANA, Hadoop, Kafka Streaming, etc.) with an emphasis on performance, reliability, scalability, and most of all quality.
• Analyze business needs, profile large data sets, and build custom data models and applications to drive Adobe business decision making and customer experience.
• Develop and extend design patterns, processes, standards, frameworks, and reusable components for various data engineering functions/areas.
• Collaborate with key stakeholders including the business team, engineering leads, architects, BSAs, and program managers.

The ideal candidate will have:
• MS in Computer Science or a related technical field with 5+ (level 3) or 10+ (level 5) years of strong hands-on experience in enterprise data warehousing / big data implementations and complex data solutions and frameworks.
• Strong SQL, ETL, scripting, and/or programming skills, with a preference towards Python, Java, Scala, and shell scripting.
• Demonstrated ability to clearly form and communicate ideas to both technical and non-technical audiences.
• Strong problem-solving skills with an ability to isolate, deconstruct, and resolve complex data/engineering challenges.
• Results-driven with attention to detail, a strong sense of ownership, and a commitment to up-leveling the broader IDS engineering team through mentoring, innovation, and thought leadership.
• SQL Server ETL development experience using SSIS. Strong ETL, data warehouse, and T-SQL skills.
• Extensive experience implementing large-scale data warehouse and data mart architectures.
• Extensive experience with the MSBI stack (SQL Server DBMS, SQL Server Analysis Services (SSAS), and SQL Server Integration Services (SSIS)) is preferred.

Nice to have:
• Demonstrated skill working with SQL and PL/SQL programming.
• Demonstrated skill designing, developing, and supporting database applications.
• Performance tuning of database schemas, databases, SQL, ETL jobs, and related scripts.
• Marketing, sales, bookings, and finance domain expertise is a plus.
• Agile Scrum work experience for executing day-to-day activities and reporting to the Scrum team.
• Good communication skills across a distributed team environment.
• Must be self-motivated, responsive, professional, and dedicated to customer success.
• Familiarity with streaming applications.

Thanks & Regards,
Ajat Singh
aj...@tranzeal.com
www.tranzeal.com
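The "data warehouse and data mart architecture" experience above typically means dimensional (star-schema) modeling: a fact table joined to dimension tables, queried by rollups. A minimal hedged sketch, with sqlite3 standing in for SQL Server and all names invented:

```python
import sqlite3

# Star-schema sketch: one dimension, one fact table, one rollup query.
# Tables, columns, and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER REFERENCES dim_product, qty INTEGER);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "widgets"), (2, "gadgets")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 5), (1, 2), (2, 7)])
rollup = conn.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rollup)  # [('gadgets', 7), ('widgets', 7)]
```

In SSAS/SSIS work the same fact/dimension split drives cube design and ETL load order (dimensions first, then facts).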
Need Big Data Engineer - Austin, TX
Hello,

Greetings from ICS Global Soft INC! We have a requirement for you; the details are as follows.

Position: Data Engineer
Location: Austin, TX
Duration: 8-12 months

The position:
Client is seeking a Data Engineer to build out fully functional data pipelines and manage the data flow in and out of their CRM platform. This is a CI/CD, highly scalable environment in AWS. This position will require both ETL experience and real-time data integration.

Responsibilities:
- Extract, load, and transform data from WeddingWire/XO and other sources to our data warehouse and data marts for the consumption of our end users.
- Design and develop solutions for integration between disparate systems, including cloud-based sources such as Salesforce and ExactTarget.
- Design and develop data models used by relational reporting tools such as Tableau, Birst, Domo, and Qlik.
- Document data lineage and the data model.

Requirements:
- Programming experience and a demonstrated interest in statistical analysis, informatics, analytics, or business intelligence.
- Minimum of 3+ years' experience with SQL.
- Hands-on experience with at least one of the following databases: MySQL, PostgreSQL, MSSQL, Redshift, Snowflake.
- 3+ years of scripting skills using shell, Python, Ruby, or another language.
- Experience with at least one Business Intelligence tool such as Tableau, Birst, Domo, or Qlik.
- Experience with at least one ETL tool such as Matillion, Talend, SSIS, or Informatica.
- Excellent communication skills, both verbal and written.

Best Regards,
Rajender Reddy
E-mail: rajen...@icsglobalsoftinc.com
Office: 972-737-8734 | Cell: 614-664-3283
1231 Greenway Drive | Ste 375 | Irving, TX 75038
NMSDC Certified | Certified Minority Business Enterprise (MBE), SBE
Position: Data Engineer / Big Data Developer / Big Data Engineer | Westlake, TX
Mail to -- anc...@quantumworld.us

Position: Data Engineer / Big Data Developer / Big Data Engineer
Location: Westlake, TX
Contract (6 to 12 Months)
All visas (no CPT) – Need passport number for H1B, OPT, GC EAD, H4 EAD

The Purpose of This Role:
Data Engineers will play a pivotal role in the team, which builds the enterprise analytical platform for Fidelity. This team interacts with various stakeholders across geographic locations to understand customer needs, identify the data sources, integrate the data, and prepare it to be consumed by the end user. Data Engineers may also participate in building models that help in forecasting or getting insights on various parameters of interest.

Technical / Behavioral:
• You should have experience with Enterprise Data Warehouse applications.
• You are expected to have hands-on experience with the Hadoop ecosystem, Hive, Spark, and Python.
• You should have experience with ETL using Spark SQL, including Spark SQL performance tuning.
• You should have experience with the data ingestion tools Sqoop and NiFi (Niagara Files).
• You should have experience with Oracle/Netezza databases, ETL, and dimensional modeling.
• You should have experience with shell scripting and Control-M.
• AWS EMR, Jupyter, and AI/ML experience is a plus.

The Skills that are Good to Have for this role:
• You should possess good interpersonal skills with the ability to work with cross-functional teams located across geographies.
• You should be detail oriented and exhibit sound judgment.
• You should be able to deal effectively with ambiguity.
• You should be self-motivated with a high degree of intellectual curiosity.
• You should promote a positive and professional work environment.
• A statistics background or prior data science experience is a plus.

Cordially,
Anchit Bajpai
Technical Recruiter
Quantum World Technologies Inc.
199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360
www.quantumworld.us
Office: 805-222-0532 Ext-338
Urgent Role : Data Scientist / Data Engineer / Data Modeler / Data Architect ( Min 7+ Years Exp )
Mail to -- anc...@quantumworld.us

Role: Data Scientist / Data Engineer / Data Modeler / Data Architect (Min 7+ Years Exp)
Location: Newark, DE
Duration: 9 to 12 Months
No OPT – Need passport number for H1B and GC EAD candidates

Required:
- Python
- Scala or Spark
- Hadoop or Big Data

Cordially,
Anchit Bajpai
Technical Recruiter
Quantum World Technologies Inc.
199 West Hillcrest Dr Suite # 0112, Thousand Oaks CA – 91360
www.quantumworld.us
Office: 805-222-0532 Ext-338
Urgent Requirements-Full Stack Developer/Data Engineer/DevOps Engineer/Software QA Engineer
We need resumes for the four requirements below. Please help us.

Location: Virginia
Contract to Hire & Corp to Corp
USC/GC/H1B

Requirement #1 - Full Stack Developer - Applications and Integration
Multiple locations in the US
Long term contract

The ideal candidate is an excellent software engineer/architect with an exceptional understanding of the Java stack, Angular/React, JavaScript, SOA, and middle-tier technologies such as ESBs, web services, data management, APIs and API management, API security, cloud technologies (AWS), and DevOps.

Primary Duties and Responsibilities:
· Collaborate with product, data, infrastructure, and operations teams, and be responsible for defining, designing, and delivering the technical architectures, patterns, business alignment, and maintainability of technical solutions
· Work closely with the Project Manager/Scrum lead to define project milestones and acceptance criteria, and manage engineering deliverables within the SDLC framework
· Gather requirements; analyze source and destination systems, interfaces, security, and other operational needs to architect and design secure, robust systems
· Translate business processes into technical design
· Build enterprise-grade software for integrating systems, including APIs and web services
· Establish a best-practices-based strategy for implementing cloud enterprise solutions, DevOps, open source technologies, automated testing, and technology stack rationalization
· Promote the integration of new technologies and solutions that will improve the organization's capabilities while leading to lower total cost of operation

Essential Skills:
· Passion for understanding data and for leveraging tools that extract and convey actionable insights from data
· Consultant mindset – identify, communicate, and act on issues and initiatives
· Takes initiative on improvements and testing results
· Ability to handle multiple tasks and projects simultaneously in an organized and timely manner
· Detail oriented, with the ability to plan, prioritize, and meet deadlines in a fast-paced environment
· Ability to communicate professionally and effectively, both written and verbally, particularly when under pressure
· Ability to work independently, as well as part of a team

Qualifications:
· Bachelor's degree in Information Technology or a related field, or an equivalent combination of education and experience sufficient to successfully perform the key accountabilities of the job

Requirement #2 - Data Engineer
Multiple locations in the US
Long term contract

The ideal candidate is an excellent software programmer with an exceptional understanding of data, databases, data modeling techniques, BI and analytics, distributed processing, and SOA.

Primary Duties and Responsibilities:
· Gather requirements, analyze source data systems, perform data modeling, and implement ETL processes to populate the enterprise data warehouse
· Build enterprise-grade software for ingesting and processing real-time and batch data, as well as analytics/BI tools to enable and support BI end users
· Support and enhance existing platforms while upholding high standards for data quality
· Work with various teams across the organization to understand how to interpret the data that will be incorporated into systems and reporting
· Help perform ad-hoc analytics requests, analyze test results, draw actionable insights, generate reports, and present findings to internal customers
· Promote the integration of new technologies and solutions to improve the organization's data processing, analysis, and validation capabilities

Essential Skills:
· Passion for understanding data and for leveraging tools that extract and convey actionable insights from data
· Consultant mindset – identify, communicate, and act on issues and initiatives
· Takes initiative on improvements and testing results
· Ability to handle multiple tasks and projects simultaneously in an organized and timely manner
· Detail oriented, with the ability to plan, prioritize, and meet deadlines in a fast-paced environment
· Ability to communicate professionally and effectively, both written and verbally, particularly when under pressure
· Ability to work independently, as well as part of a team

Qualifications:
· Bachelor's degree in Information Technology or a related field, or an equivalent combination of education and experience sufficient to successfully perform the key accountabilities of the job
· Solid SQL and NoSQL skills; strong database design and development capabilities (schemas, indexing, profiling, query plan optimization)
· Familiarity with data architecture and deep understanding …
Immediate Interview: Data Engineer in Renton, WA
Hello Partner,

Hope you are doing well.

Job Title: Data Engineer (local profiles or nearby locations only)
Location: Renton, WA
Duration: Long Term
Experience Level: 8+ Years

Job Description:
• Master's or Bachelor's degree in Computer Science or a related field
• 5+ years of experience in large-scale software development with emphasis on data analytics and high-volume data processing
• 3+ years of experience in data engineering development
• 2+ years of experience implementing scalable data architectures
• 2+ years of experience with AWS and related services (e.g., EC2, S3, DynamoDB, Elasticsearch, SQS, SNS, Lambda, Airflow, Snowflake)
• Experience in data-centric programming languages (e.g., Python, Go, Ruby, JavaScript, Scala)
• Proficiency with ETL tools and techniques
• Knowledge of and experience with RDBMS platforms, such as MS SQL Server, Oracle, DB2, IMS, IDMS, MySQL, Postgres, SAP HANA, and Teradata
• Experience participating in projects in a highly collaborative, multi-discipline team environment

Thanks & Best regards,
Nelson White
Sr. Recruiter | Talent Acquisition Team
Conquest Tech Solutions, Inc.
Phone: (302) 286-9010 EXT 109 Fax: (302) 357-9305
Email: nel...@conq-tech.com
Hangouts: nelsonsconqu...@gmail.com

Disclaimer: This email and any files transmitted with it are confidential and intended solely for the use of the individual or entity to whom they are addressed. Please note that any views or opinions presented in this email are solely those of the author and do not necessarily represent those of the organization. If you are not the intended recipient of this email, you must neither take any action based upon its contents, nor copy or show it to anyone. Please contact the sender if you believe you have received this email in error.
Need Big Data Engineer - Plano, TX
Hi,

Please find the JD below and let me know if you have any suitable profiles. This is a very urgent requirement.

Position: Big Data Engineer
Location: Plano, TX
Duration: 6-12+ months

Job Description:

Responsibilities:
- Make architectural decisions to build efficient, adaptable, and scalable data pipelines in the cloud to process unstructured big data
- Work closely with product owners and other team members to deliver next-generation connected-car data services
- Rapidly architect, design, prototype, and implement architectures to tackle big data needs
- Research, experiment with, and utilize leading big data technologies such as Spark, Kafka, and Kinesis on Microsoft Azure, AWS, etc.
- Operate in a highly interactive Agile development environment
- Maintain and train junior team members on the latest technologies and software engineering best practices

Requirements:
- Bachelor's degree in Computer Science, Computer Engineering, or a related field
- 7+ years of experience with multiple programming languages and technologies
- 3+ years of experience in the big data space
- Fluency in languages such as Python, Scala, or Java
- Ability to pick up new languages and technologies quickly
- Solid understanding of cloud and distributed system principles, including load balancing, networks, scaling, and in-memory vs. disk
- Experience with large-scale big data methods such as MapReduce, Hadoop, Spark, Hive, or Storm
- Ability to work efficiently in Unix/Linux environments
- Experience with source code management systems such as Git

Best Regards,
Rajender Reddy
E-mail: rajen...@icsglobalsoftinc.com
Office: 972-737-8734 | Cell: 614-664-3283
1231 Greenway Drive | Ste 375 | Irving, TX 75038
Interview within 2 days : Data Scientist / Data Engineer / Data Modeler / Data Architect ( Min 7+ Years Exp )
*Reply: anc...@quantumworld.us*

*Role: Data Scientist / Data Engineer / Data Modeler / Data Architect (min. 7+ years' experience)*
*Location: Newark, DE*
*Duration: 9 to 12 months*
*No OPT – need passport number for H1B and GC EAD candidates*

*Required:*
- Python
- Scala or Spark
- Hadoop or Big Data

*Cordially,*
*Anchit Bajpai*
*Technical Recruiter*
*Quantum World Technologies Inc.*
*199 West Hillcrest Dr, Suite # 0112, Thousand Oaks, CA – 91360*
*www.quantumworld.us <http://www.quantumworld.us/>*
*Office: 805-222-0532 Ext-338*
Urgent Client Need: Data Engineer; Philadelphia PA; 12+ Months Contract
*Please send your resumes to niran...@bioinfosystems.com*

*Hi,*
*This is Niranjan from BioinfoSystems; hope you are doing well.*
*Please go through the description below and reply with your resume, contact details, and current location if you are comfortable with it.*

*Title: Data Engineer*
*Location: Philadelphia, PA*
*Duration: 12+ Months Contract*

*Responsibilities:*
· Hands-on experience developing big data pipelines end to end
· 5-8 years of Java and Python experience
· Experience in software development of large-scale distributed systems, including a proven track record of delivering backend systems that participate in a complex ecosystem
· Experience working with imperfect data sets that, at times, will require improvements to process, definition, and collection
· Experience with real-time data pipelines and components, including Kafka and Spark Streaming
· Proficient in Unix/Linux environments
· AWS experience developing data streaming pipelines
· Deep understanding of Spark
· Must-have skills: Spark, SQL, Kinesis/Kafka, Python for scripting on AWS, and Java for APIs

*Please do follow our LinkedIn page: https://www.linkedin.com/company/bioinfo-systems-llc/*

*Thanks & Regards*
*Niranjan Kumar*
*Sr. Technical Recruiter*
Desk: 860-207-9466 | Fax: 860-722-9692
Email: niran...@bioinfosystems.com
Data Engineer
*Hadoop / Data Engineer / Data Migration with Snowflake migration experience*
*Location: San Jose, CA*
*Duration: Long Term*

*Description:*
*Spark, Hive, Hadoop, Linux with Snowflake migration experience.*

Regards,
*Sansi* | *SPK Consultants INC*
Email: sa...@spkconsultantsinc.com
Contract opportunities for Data Engineer at NYC, NY
*Job Details:*
*Position: Data Engineer with AWS Redshift and ETL*
*Location: New York, NY*
*Duration: 6 Months*
*Kindly share resumes to dee...@kanisol.com, or you can reach me at 609-651-4663*

*Job Description*
· Experienced data engineer
· Building ETL and data pipelines; previous extensive experience with Informatica ETL tools is a must
· Knowledge of other ETL tools like Snowflake, SSIS, AWS Glue, etc. is a plus
· Experience designing ETL data pipelines in a DW environment: maintaining dimensions, facts, and staging areas
· Understanding of data modeling, specifically dimensional modeling (slowly changing dimensions, fact tables)
· Extensive experience with relational databases; Redshift is strongly preferred, but strong knowledge of a relational DB like Oracle or SQL Server, etc. is acceptable
· Understanding of database performance tuning is a must
· Experience with AWS, S3, and Redshift is preferred
· Knowledge of analytical and reporting tools like Looker or Tableau is a big plus
· Knowledge of Python is a plus
*Title: Data Engineer*
*Location: New York, NY*
*Duration: Contract*
*Kindly share resumes to dee...@kanisol.com, or you can reach me at 609-651-4663*

*Job Description:*
· Collaborate with product teams, data analysts, and data scientists to design and build data-forward solutions
· Build and deploy streaming and batch data pipelines capable of processing and storing petabytes of data quickly and reliably
· Integrate with a variety of data providers, ranging from marketing and web analytics to consumer-device metrics
· Build and maintain dimensional data warehouses in support of business intelligence tools
· Develop data catalogs and data validations to ensure clarity and correctness of key business metrics
· Drive and maintain a culture of quality, innovation, and experimentation

*Required skills*
· *Any of the following tools: Kafka, Spark, or Flink*
· AWS-based data solutions with tools such as CloudFormation, IAM, and Kinesis
· Big-data solutions using technologies like S3 and Spark, and an in-depth understanding of data partitioning and sharding techniques
· Experience loading and querying cloud-hosted databases such as Redshift; building streaming data pipelines using Kafka, Spark, or Flink

*Thanks and regards,*
*Deepak Kannoji*
*Kani Solutions Inc*
*Phone: 609-651-4663*
*Email: dee...@kanisol.com*
*Skype: kannoji.deepak*
*https://www.linkedin.com/in/deepakkannoji/*
Need - Automation Tester || Microstrategy || WebSphere Admin || Devops Lead || Java Fullstack Developer || Big Data Engineer/Lead Architect || ETL Lead
*Note: Kindly share profiles to gan...@gang-board.com*

*Title: WebSphere Administrator*
*Location: Tampa, FL*
*Duration: Long Term*
*Client: NYL*

Required:
This resource will act as a primary administrator and support for middleware technologies, including WebSphere ND Application Server and WebSphere MQ, with secondary responsibilities as a backup for the Pega PRPC application administrator.
- Responsible for installation, configuration, maintenance, and troubleshooting of complex, business-critical middleware installations
- Coordinate and direct troubleshooting efforts
- Assist with the discovery of modifications and enhancements to existing configurations and procedures
- Manage middleware release and patch levels, technology versioning, and product roadmaps
- Conduct evaluations of overall system performance to develop and implement recommendations to address and resolve issues
- Familiarity with automation of tasks: ksh, bsh, Python, Splunk
- Experience working in an Agile IT environment
- Ability to work on multiple projects as a shared service
- Specialized skill set with distributed applications running in an IBM WebSphere environment
- Collaborates with others to solve complex problems; uses sophisticated analytical thought to exercise judgment and identify innovative solutions

Duties:
- Responsible for the configuration, maintenance, monitoring, and troubleshooting of high-availability middleware and web services application infrastructure technologies (WebSphere, MQ, IIS, Apache, Pega PRPC)
- Work with application development groups to tune and troubleshoot their applications within a fault-tolerant WebSphere/WAS environment
- Advises on and resolves issues that may occur within middleware services and interoperability with RDBMS and other integrated back-end systems
- Create and update system documentation
- Act as a liaison with application teams and infrastructure teams to define requirements and/or determine problems
- Evaluate overall system performance to develop and implement recommendations to address and resolve issues
- Advise teams on best practices; provide assistance in the resolution of technical issues
- Off-hours support and on-call rotation, with the flexibility to work overtime when required
- Capacity planning and analysis; performance tuning
- Assist in defining, documenting, and socializing clear and effective support and escalation processes
- Software package review, installation, configuration, and maintenance
- Research, planning, and coordination of the implementation of new technology
- Disaster recovery planning, documentation, and troubleshooting
- Performs other assignments to comply with business strategies and initiatives

*Title: Automation Tester*
*Location: Gainesville, FL*
*Duration: Long Term*
• Experience in insurance domain testing is mandatory
• Knowledge of automation testing tools: Java/Selenium
• Experience leading a testing team of 4-6 members
• Hands-on with testing tools such as VersionOne and HP ALM
• Should have worked on Agile projects
• Expert in creating test scenarios/cases that meet the business requirements/mapping documents
• Experience creating an automation framework and enhancing it per business needs
• Good communication skills
• Able to manage the customer and Mindtree stakeholders

*Title: DevOps Lead*
*Location: Clinton, NJ*
*Duration: 6 Months*
*Client: NYL*
JD:
• Hands-on DevOps implementation; lead a team of DevOps engineers as and when required
• Maintain a DevOps framework covering process and tools
• Conduct DevOps maturity assessments, create a roadmap for DevOps transformation, and lead the transformation
• An environment where you get to work with some of the best engineers/developers

*Title: Java Fullstack Developer*
*Location: Charlotte, NC (locals preferred – in-person interview)*
*Duration: 6 Months+*
Relevant experience: 9+ years
Job Description:
Technical skills:
1. Proficiency with Angular 4 and above
2. Working knowledge of Core Java; has worked on web and mobile apps
3. Good understanding of APIs; has worked on API integrations between Java and Angular
4. Ability to work with offshore teams and liaise with the client for offshore deliverables
Soft skills:
1. Assertive
2. Good communication – written and oral
3. Results-oriented
4. Ability to work with stringent deadlines

*Title: Big Data Engineer*
*Location: San Francisco, CA*
*Duration: 6 Months*
*Interview: Phone + F2F*
Roles and responsibilities:
- Architect and implement big data pipelines for TokBox by understanding our core platform
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Build tools to allow internal and external teams to visualize and extract insights
Urgent Need :: Sr. Big data Engineer | Cedar Rapids, IA
Hi,

Hope you are doing great! I'm sharing our client's requirement details below.

*Sr. Big Data Engineer*
*Location: Cedar Rapids, IA*
*Duration: Long Term*
*No OPT – need passport number for H1B and GC EAD candidates*
*Client: SYNTEL*

*Required:*
- 8 years of experience with Hadoop, Caml, and big data
- Experience in data warehouse architecture and ETL tools
- Experience in Agile
- Excellent communication skills

*Cordially,*
*Anchit Bajpai*
*Technical Recruiter*
*Quantum World Technologies Inc.*
*199 West Hillcrest Dr, Suite # 0112, Thousand Oaks, CA – 91360*
*Direct: 917-781-6463*
*Office: 805-222-0532 Ext-310*
*Fax: 805-834-0532*
*E:* anc...@quantumworld.us
Opening for Data Engineer - Richardson, Texas
Hi,

Hope you are doing great!! We have an opening for a Data Engineer.

*Job Title: Data Engineer*
*Location: Richardson, Texas*
*Duration: Long Term Contract*
*Passport number is mandatory for H1B, OPT*

*Job Description:*
*Must-have skill set: SQL, Oracle database work experience.*

*Skills Required:*
- Extensive experience querying web-based applications and client-server applications using SQL and Oracle queries
- Hands-on experience working on data supply projects
- Hands-on experience querying various databases and analyzing DB reports like AWR, SQL trace, and tkprof on Oracle/SQL Server/DB2, etc.
- Basic knowledge of handling performance attributes on MQ, Message Broker, IIB, WebSphere, WebLogic, Mainframe, AIX, iPlanet, and IBM DMZ load balancers

*Good-to-have skill set: QA experience in the PBM domain.*

*Thanks & Regards,*
*Vamshi*
*Email: vamsh...@itechus.net*
*Direct: 802-227-0236 / Main: 802 383 1500 Ext. 109*
JPMC Looking for Data Engineer
*Data Engineer Resources*
*Location: Dallas, TX*
*Visa: H1B, GC, and USC (as per client request, I need the passport number for H1B consultants)*
*Contract – need 4 positions*

*Job Description:*
• 10+ years of IT experience with a proven track record of success operating in a complex, global, fast-paced environment is required
• 5+ years of experience in various Big Data technologies and utilities (Hadoop, Spark SQL, Hive, Hue, Impala, NiFi, Kafka) and strong knowledge of analytics and consumption tools
• 8+ years of experience utilizing and extending ETL/ELT solutions (e.g., Informatica, Hortonworks, Pentaho, Ab Initio) in a complex, high-volume data environment
• 8+ years of hands-on coding experience in two or more of C/C++, Python, Scala, Java, and R
• 5+ years of experience building data visualization solutions using QlikView, Tableau, etc.
• Strong exposure to Data Management, Governance, and Controls functions
• Proficiency across the full range of database and business intelligence tools
• Exposure to Machine Learning, including feature extraction, statistical approaches, linear and non-linear classifiers, and deep learning
• Financial services background or experience preferred, across different LOBs
• Excellent customer service attitude, communication skills (written and verbal), interpersonal skills, and skills in dealing with a diverse population

Thanks & Regards,
*Babjee*
Ph: 940-229-5980
bab...@qcentrio.com
www.qcentrio.com
# 405 State Hwy 121, Suite A250, Lewisville, Texas, 75067
Required Big Data Engineer in Phoenix, AZ / Raleigh, NC / Minneapolis, MN
Hi,

Hope you're doing great. There is a contract available as a *Big Data Engineer* in *Phoenix, AZ / Raleigh, NC / Minneapolis, MN*. Please contact me by sending an email to *chandrak...@tanishasystems.com* or call me at *(212) 729-6543 Ext 340* with your consultant's most updated resume in Word format as an attachment.

*Job Title: Big Data Engineer*
*Location: Phoenix, AZ / Raleigh, NC / Minneapolis, MN*
*Duration: Long-term*
*Job type: C2C*

*Job Description:*
- AWS: must have experience with EMR, Lambda, and S3; good understanding of Step Functions
- Big Data: must have experience with Hive and Spark; good to have experience with Ranger and Hue
- Programming experience in Java
- Nice to have experience in Python/Scala
- Nice to have experience with Datadog

*Best Regards,*
*Chandrakant*
*Tanisha Systems Inc*
*Tel: (212) 729-6543 Ext 340*
*chandrak...@tanishasystems.com*
*An Information Technology Company*
Opening for Data Engineer - Manhattan, NY
Hi,

Hope you are doing great!! We have an opening for a Data Engineer.

*Job Title: Data Engineer*
*Location: Manhattan, NY*
*Duration: 12+ Months Contract*
*Any visa is fine*

*Role Description:*
As a Cloud Data Engineer, you will be involved in pilot implementations and strategic engagements for customers who are adopting the cloud to set up their data platforms.

*Responsibilities:*
- Be part of the Data & Analytics team and play the role of a Cloud Data Engineer.
- Work on customer engagements and be responsible for delivering the scope identified and assigned as part of those engagements, primarily creating solutions that require building data pipelines from ingestion to consumption using cloud-native components.
- Experienced with S3, Redshift, Glue, EMR, Lambda, and RDS databases.
- Highly proficient in scripting in Python and Scala.
- Highly experienced in Spark and Hive.
- Desirable to have experience with data quality tools/procedures for cloud-native data stores.
- Strong experience in data modeling for traditional and non-traditional data sources.
- Expert-level AWS and cloud-native data experience is required.
- A minimum of 1 year of cloud-native database design, migration, and implementation experience is required.
- Experience with Enterprise Information Management (EIM) is required.
- Experience with CDC tools to move data from on-premise data sources.
- Experience with SQL/ETL.

*Thanks & Regards,*
*Vamshi*
*Email: vamsh...@itechus.net*
*Direct: 802-227-0236 / Main: 802 383 1500 Ext. 205*
*iTech US, Inc. | www.iTechUS.net <http://www.itechus.net/>*
Need AWS Data Engineer
Title: AWS Data Engineer
Location: Plano, Texas
Expected Duration: 12-month contract with possible extension or conversion to permanent, depending on candidate performance and client needs

Client Project Description:
This project involves building real-time and batch data pipelines in the AWS cloud to provide data to the online retirement application and to migrate the legacy DB2-based data system to the AWS cloud.

Skillset Requirements:
More than 5 years of overall experience as a data engineer, plus AWS.

Must have:
- Strong knowledge of object-oriented design and programming, data structures, algorithms, SQL databases, and relational design
- Demonstrable expertise with Python, Elasticsearch, and Spark; wrangling of various data formats: CSV, XML, JSON, Parquet

Experience with the following is highly desirable:
- R, AWS cloud computing, Apache Airflow, Apache Kafka, Kibana, Node.js, Java, Python, AWS Lambda, and Step Functions
- Both relational and NoSQL database design paradigms
- Indexing and querying data in Elasticsearch
- Large-scale data warehousing and analytics projects, including AWS technologies: Redshift, S3, and EC2
- Working with various storage backends, possibly including Postgres, Redshift, DynamoDB, and Snowflake
- Contributing to Docker services in Node.js, Spring Boot, and Python
- Agile methodology, using test-driven development
- Excellent command of written and spoken English
- Self-driven problem solver

Additional skills which add value:
- Awareness of different build/source-code configuration management tools: Maven/Git
- Experience setting up CI/CD pipelines for large development teams: Jenkins + plugins, CI/CD pipeline, Maven
- Distributed computing technologies, in particular Hadoop MapReduce, Spark/Spark SQL, and YARN/MR2

Thanks,
Uday Varanganti
Recruiting Specialist
uday.varanga...@performancesoftech.com
Opening for a Cloud Data Engineer - Manhattan, NY
Hi,

Hope you are doing great!! We have an opening for a Cloud Data Engineer.

*Job Title: Cloud Data Engineer*
*Location: Manhattan, NY*
*Duration: 12+ Months Contract*
*Passport needed for H1B*

*Role Description:*
As a Cloud Data Engineer, you will be involved in pilot implementations and strategic engagements for customers who are adopting the cloud to set up their data platforms.

*Responsibilities:*
- Be part of the Data & Analytics team and play the role of a Cloud Data Engineer.
- Work on customer engagements and be responsible for delivering the scope identified and assigned as part of those engagements, primarily creating solutions that require building data pipelines from ingestion to consumption using cloud-native components.
- Experienced with S3, Redshift, Glue, EMR, Lambda, and RDS databases.
- Highly proficient in scripting in Python and Scala.
- Highly experienced in Spark and Hive.
- Desirable to have experience with data quality tools/procedures for cloud-native data stores.
- Strong experience in data modeling for traditional and non-traditional data sources.
- Expert-level AWS and cloud-native data experience is required.
- A minimum of 1 year of cloud-native database design, migration, and implementation experience is required.
- Experience with Enterprise Information Management (EIM) is required.
- Experience with CDC tools to move data from on-premise data sources.
- Experience with SQL/ETL.

*Thanks & Regards,*
*Vamshi*
*Email: vamsh...@itechus.net*
*Direct: 802-227-0236 / Main: 802 383 1500 Ext. 205*
*iTech US, Inc. | www.iTechUS.net <http://www.itechus.net/>*
Data Engineer (AWS & GCP)
*Hello All,*

*Hope you are all doing well!*

*I am attaching a job description for your review. If you are interested, please reply with an updated resume to my e-mail: khush...@technocraftsol.com*

*Position: Data Engineer (AWS & GCP)*
*Location: Palo Alto, CA*
*Duration: Long term*

*RESPONSIBILITIES*
· Design, build, and maintain data pipelines in multi-cloud infrastructure (AWS and GCP)
· Design and develop big data processing systems optimized for scaling (Apache Spark)
· Develop and maintain real-time data pipelines
· Build software libraries, tools, serverless applications, and workflows (Java and Python)
· Design mission-critical dashboards and reports using BI tools
· Internal process improvements, such as automating manual processes, alerting systems, tooling, and DevOps
· Collaborate closely with product teams to build tools, frameworks, and reports to run experiments and analyze A/B test results
· Work with analysts and data scientists to extract actionable insights from data that shape the direction of the company
· Actively engage in design and code reviews: learn from your peers and teach your peers
· Lead initiatives to research, analyze, and propose new technologies and tooling for our stack

*REQUIREMENTS*
· 5+ years of software development or data engineering experience
· Experience with *Big Data, ETL, and data modeling*
· Experience developing and operating high-volume, high-availability environments
· Previous experience with Linux, AWS, Docker, and Kubernetes
· Solid coder in Java, Python, and Bash

Thanks and Regards,
*Khushboo Singh*
IT Recruiter
*Technocraft Solutions LLC | Partner with Xpedantic LLC*
*Email*: *khush...@technocraftsol.com*
www.technocraftsol.com | www.xpedantic.com | Connect with me on LinkedIn <https://www.linkedin.com/in/khushboo-singh-b67750119>
Opening for a Senior Data Engineer/Designer
Hi,

Hope you are doing great!! We have an opening for a Senior Data Engineer/Designer.

*Job Title: Senior Data Engineer/Designer*
*Location: Waltham, MA*
*Duration: 12 Months Contract*
*Passport number needed for H1B*
*Essential Skills: Big Data and the Hadoop ecosystem (MapR)*

*Job Description:*
- Experience building data pipelines, data marts, and analytical DBs on Hadoop MapR and S3
- Ability to design the solution and document technical details for programming
- Good knowledge of data modeling, database structures, theories, principles, and practices
- Experience creating ETL data mapping documents to load data into a Hadoop environment
- Creating technical documents like HLD/LLD and reviewing them with SMEs
- Knowledge of Hadoop development and implementation using MapR distribution components
- Conduct data profiling on data domains; design and implement data quality rules, data lineage, and data governance
- Configuring and loading data from disparate data sets into HDFS
- Pre-processing using Pig and Sqoop
- Maintain security and data privacy
- Technologies: Ab Initio, Linux, and Big Data components: HDFS, Pig, Sqoop, HiveQL

*Thanks & Regards,*
*Vamshi*
*Email: vamsh...@itechus.net*
*Direct: 802-227-0236 / Main: 802 383 1500 Ext. 205*
Data Engineer with Healthcare Exp
*Job Title: Data Engineer with Healthcare Exp*
*Job Location: NJ*
*Job Duration: Long Term*
*Job Description:*
*Technical/Non-Technical Skills*
- 5+ years of SQL knowledge
- Interpret data, analyze results using statistical techniques, and provide ongoing reports/insights
- Must have healthcare domain knowledge (payer knowledge will be an added advantage)
- Acquire data; identify, analyze, and interpret trends or patterns; filter and “clean” data
- Experience maintaining and enhancing Big Data pipelines; hands-on work with large and complex data sets that meet functional/non-functional business requirements
- Experience in Big Data technologies like Kafka, Spark, and Data Lake
- Experience in Reference Data, Master Data Management, and Data Governance
- Experience with analytical tools like R, SAS, and Tableau, and creation/migration of analytical models
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues
- Work with data and analytics experts to strive for greater functionality in our data systems
Thanks...
Uday Varanganti
Recruiting Specialist
uday.varanga...@performancesoftech.com
O: (785)380 8559 Ext:109
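The "filter and clean data, then interpret trends" duty above is the day-to-day core of the role. A toy illustration with made-up claim records (the fields and filtering rules are assumptions, not from the posting):

```python
# Made-up payer claim records; 'paid' may be missing or invalid.
from statistics import mean

claims = [
    {"member": "A", "paid": 120.0},
    {"member": "B", "paid": None},   # missing value -> filtered out
    {"member": "C", "paid": 340.0},
    {"member": "D", "paid": -15.0},  # negative amount -> filtered out
]

# "Clean" step: drop records that fail basic validity rules.
clean = [c for c in claims if c["paid"] is not None and c["paid"] >= 0]

# "Interpret" step: a simple summary statistic over the clean data.
avg_paid = mean(c["paid"] for c in clean)
print(len(clean), avg_paid)  # 2 230.0
```

In practice the same cleaning would be expressed as SQL WHERE clauses or Spark filters over far larger data sets, per the skills listed above.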
Need Data Engineer @ Irvine,CA For a long term contract with End Customer Ingram Micro
*Data Engineer*
*Irvine, CA*
*Ingram Micro*
*7-10 Years*
*Best Rate on C2C*
*Very Long-term Contract*
*Need H1B/H1 Transfer, USC/GC, OPT & CPT (1990 & below year of birth)*
*===*
*sek...@chabeztech.com *
*sekhar.salesrecrui...@gmail.com *
*(No Phone Calls Please, Message me on Hangouts)*
*434-322-084*
Detailed Requirement: Data Engineer
· Data flow design and implementation:
· Create and maintain optimal data pipeline architecture.
· Gather and process large, complex, raw data sets at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.) that meet functional/non-functional business requirements.
· Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
· Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
· Work with stakeholders including the Business, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
· Create data tools for analytics and engineering team members that assist them in building and optimizing our product into an innovative industry leader.
· Be a data expert striving for greater functionality in our data systems.
Data solutions development:
· Employ your skills in designing, developing and delivering world-class data algorithmic artifacts, including documentation and coding; coordinate data algorithmic development with infrastructure development.
· Work closely with our engineering team to integrate your innovations and algorithms into our products.
· Research and apply advanced algorithms and methods involving data mining, statistical analysis and machine learning techniques.
· Process unstructured data into a form suitable for analysis, and then do the analysis.
· Support business decisions with ad hoc analysis as needed.
· Master third-party systems and interfaces, including: data available from the parties, APIs to be used for obtaining the data, and limitations related to these interfaces.
· Excellent subject matter expertise in designing algorithms and business logic to automate commerce process flows.
· Apply your broad-based data development expertise to create practical and innovative solutions.
· Efficiently implement clean, maintainable, and testable data solutions that are highly available, fast, and fault tolerant.
· Participate in agile project execution and provide accurate work effort estimates.
· Apply excellent communication skills, creativity and practical knowledge to benefit our customers.
What you bring to the role:
· Bachelor's degree in Computer Science, Engineering, Science and Math or related technical discipline is required.
· Preferred: an MBA (or equivalent) from a top-tier institution, or equivalent business experience.
· 7-10 years of technical experience, with at least 5+ years of experience with web services development and middleware applications, or a Master’s degree plus 5-7 years of technical experience.
· Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
· Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
· Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
· Strong analytic skills related to working with unstructured datasets.
· Build processes supporting data transformation, data structures, metadata, dependency and workload management.
· A successful history of manipulating, processing and extracting value from large disconnected datasets.
· Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
· Experience with big data tools: Hadoop, Spark, Kafka, etc.
· Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
· Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
· Experience with stream-processing systems: Storm, Spark Streaming, etc.
· Experience with object-oriented/object-function scripting languages: Java, C#, etc.
--
*Respectfully Regards,*
*N. Sekhar*
*Sr. Technical Recruiter*
*Hangouts : sekhar.salesrecrui...@gmail.com *
*Linkedin : linkedin.com/in/sekhar-nallamelli <http://linkedin.com/in/sekhar-nallamelli>*
*Skype : sekhar.nallamelli*
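The posting's central requirement is building extract-transform-load infrastructure with SQL. A toy end-to-end sketch follows, using sqlite3 as a stand-in for a real warehouse; the table schema and sample events are invented assumptions.

```python
# A toy extract-transform-load pipeline: raw records in, SQL aggregation out.
# sqlite3 stands in for a production database; the schema is assumed.
import sqlite3

# "Extract": raw events as they might arrive from an API or log scrape.
raw_events = [("2020-01-01", "click", 3), ("2020-01-01", "view", 10),
              ("2020-01-02", "click", 5)]

# "Load": land the raw data in a relational store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, kind TEXT, n INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", raw_events)

# "Transform": aggregate with plain SQL, as the role emphasizes.
rows = conn.execute(
    "SELECT day, SUM(n) FROM events GROUP BY day ORDER BY day").fetchall()
print(rows)  # [('2020-01-01', 13), ('2020-01-02', 5)]
```

At the scale the posting describes, the same pattern moves to Spark SQL or a warehouse plus a workflow manager (Airflow, Luigi, Azkaban) for scheduling, but the extract/load/transform shape is identical.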
Urgent required || Informatica ETL Lead || Big Data Engineer || Immediate interview
Hi, Greetings from Humacinc. Please go through the JD and share a suitable profile. Reach me at:
*Email: **sant...@humacinc.com*
*Call : +1 623-242-2594*
*Required Passport Number/Copy for submission*
*Required : H1B, H4-EAD, GC, USC*
*No OPT, CPT*
*Corp to Corp positions*
*Job Title: Informatica ETL Lead*
*Work Location: Irving, TX*
*Job Details:*
*Must Have Skills (Top 3 technical skills only) :*
1. Informatica
2. ETL
*Detailed Job Description:*
Proven working knowledge of ETL Informatica development. Experience with relational database design, PL/SQL and stored procedure programming; database applications, data warehousing and administration; Microsoft SQL Server, MySQL, and Oracle. Experience designing and implementing ETL solutions for various data sources such as APIs and flat files preferred. The candidate must have at least 5 years of hands-on development or maintenance experience with Informatica PowerCenter. Understanding of best pract
*Top 3 responsibilities you would expect the Subcon to shoulder and execute:*
1. Experience with Control-M is nice to have
2. Very strong troubleshooting skills
3. Ability to multitask and effectively prioritize in a highly dynamic work environment.
*Title: Big Data Engineer*
*Location: N. Phoenix, AZ*
*Duration: 1 yr + ext.*
*Overall Role Summary:*
This position designs, develops, tests and maintains software product applications within the Big Data space.
*Essential Functions*
· Participates in designing, developing, documenting, testing, and debugging new and existing software systems and applications.
· Contributes to the development life cycle, including requirements analysis and design.
· Writes technical specifications based on conceptual design and stated business requirements.
· Supports and documents software functionality.
· Identifies and evaluates new technologies for implementation.
· Analyzes code to find causes of errors and revises programs as needed.
· Provides critical input in software design meetings and analyzes user needs to determine technical requirements.
· Consults with the end user to prototype, refine, test, and debug programs to meet needs.
· Complies with all security policies and procedures to ensure that the highest level of system and data confidentiality, integrity and availability is maintained.
*Skill Requirements:*
*EXPERIENCE REQUIREMENT*: With a Master’s degree: three (3) years of experience in the job offered or in the field of software engineering or program analysis. With a Bachelor’s degree: five (5) years of experience in the job offered or of progressively responsible experience in the field of software engineering or program analysis.
*SKILLS REQUIREMENT:*
· *Strong object-oriented design and development skills*
· Good understanding of data structures and algorithms.
· Solid hands-on *programming/scripting experience in Java or Scala*
· *Experience with Big Data technologies such as Hadoop, Apache Spark, Kafka, Hive/HiveQL and HBase.*
· Experience with Unix/Linux and shell scripting
--
*Best Regards,*
*Santosh Kumar*
*IT ANALYST *
*Mail: **sant...@humacinc.com *
*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b <http://www.linkedin.com/in/santosh-kumar-98b6ba15b/>*
Urgent required || Big Data Engineer || Phoenix,AZ ||
Hi, Greetings from Humacinc. Please go through the JD and share a suitable profile. Reach me at:
*Email: **sant...@humacinc.com*
*Call : +1 623-242-2594*
*Required Passport Number/Copy for submission*
*Required : H1B, H4-EAD, GC, USC*
*No OPT, CPT*
*Corp to Corp positions*
*Title: Big Data Engineer*
*Location: N. Phoenix, AZ*
*Duration: 1 yr + ext.*
*Overall Role Summary:*
This position designs, develops, tests and maintains software product applications within the Big Data space.
*Essential Functions*
· Participates in designing, developing, documenting, testing, and debugging new and existing software systems and applications.
· Contributes to the development life cycle, including requirements analysis and design.
· Writes technical specifications based on conceptual design and stated business requirements.
· Supports and documents software functionality.
· Identifies and evaluates new technologies for implementation.
· Analyzes code to find causes of errors and revises programs as needed.
· Provides critical input in software design meetings and analyzes user needs to determine technical requirements.
· Consults with the end user to prototype, refine, test, and debug programs to meet needs.
· Complies with all security policies and procedures to ensure that the highest level of system and data confidentiality, integrity and availability is maintained.
*Skill Requirements:*
*EXPERIENCE REQUIREMENT*: With a Master’s degree: three (3) years of experience in the job offered or in the field of software engineering or program analysis. With a Bachelor’s degree: five (5) years of experience in the job offered or of progressively responsible experience in the field of software engineering or program analysis.
*SKILLS REQUIREMENT:*
· *Strong object-oriented design and development skills*
· Good understanding of data structures and algorithms.
· Solid hands-on *programming/scripting experience in Java or Scala*
· *Experience with Big Data technologies such as Hadoop, Apache Spark, Kafka, Hive/HiveQL and HBase.*
· Experience with Unix/Linux and shell scripting
--
*Best Regards,*
*Santosh Kumar*
*IT ANALYST *
*Mail: **sant...@humacinc.com *
*Hangouts:* *recruiter.lo...@gmail.com *
*" Hire Character, Train Skill "*
LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b <http://www.linkedin.com/in/santosh-kumar-98b6ba15b/>*
Required : Data Engineer with finance - Malvern, PA
Share resumes at brijes...@kenfill.co
Job Title: Data Engineer with finance
Location: Malvern, PA
JD :
* Work as a data engineer (Hadoop and AWS) on the migration of code and data for applications residing in an on-prem Hadoop cluster to an AWS cluster, using Hive, Python, and shell for data processing and the NGA toolset for code migration
* Build data pipelines between AWS Cloud and on-prem
* Create a data ingestion framework in the cloud using Sqoop, with S3 for storage
* Configure EMR and schedule batch jobs using Control-M
* Integrate Tableau with EMR for reporting
*Best Regards,*
*Brijesh Kumar*
*USA : 219-209-4155*
Required : Senior Data Engineer - Malvern, PA
Share resumes at brijes...@kenfill.co
Job Title: Senior Data Engineer
Location: Malvern, PA
Experience: 7+ years
JD :
* Work as a data engineer (Hadoop and AWS) on the migration of code and data for applications residing in an on-prem Hadoop cluster to an AWS cluster, using Hive, Python, and shell for data processing and the NGA toolset for code migration
* Build data pipelines between AWS Cloud and on-prem
* Create a data ingestion framework in the cloud using Sqoop, with S3 for storage
* Configure EMR and schedule batch jobs using Control-M
* Integrate Tableau with EMR for reporting
*Best Regards,*
*Brijesh Kumar*
*USA : 219-209-4155*
Big Data Engineer :: Richmond, VA and Phoenix, AZ
Hello Employers,
Let me know if you have ready consultants for the below-mentioned role. Drop me the suitable candidate's updated resume and contact details.
*Position: **Big Data Engineer*
*Location: **Richmond, VA and Phoenix, AZ*
*Type: Contract*
*Visa: OPT**/GC EAD/GC/**Citizen*
Must Skills: *Big Data, Java, Hadoop, Hive, Spark, MapReduce, BI, ETL*
*Skills Required: *
- Strong previous professional experience building and managing complex products/solutions
- Strong previous professional experience building distributed solutions dealing with high volumes of data
- Candidate should be involved in design
- Involved in writing MapReduce jobs
- Developing Hive scripts
- Hands-on experience with HDFS, Hive, Pig, Sqoop and NoSQL
- Experience/knowledge of working with batch-processing/real-time systems using various open-source technologies like Solr, Spark, Storm, Kafka, etc. would be a plus
Thanks,
*Rajnish Mishra*
*Social Media Recruiter*
*Email: rajnish.mish...@rangtech.com *
*Phone: *(732) 947-4119 Ext 301
Need - UI/UX Architect || Sr Big Data Engineer // New York, NY & Sunnyvale, CA
*Note: Kindly share the profile to gan...@gang-board.com*
*Title: UI/UX Architect*
*Location: New York, NY*
*Duration: 12 months*
VISUAL DESIGNERS
Skillsets: Adobe Photoshop, Illustrator, Visual Style Guide, Branding, HTML5, CSS3, AngularJS, JS Frameworks
Roles and Responsibilities:
1) Good understanding of responsive design systems
2) Craft graphic elements, assets and visual treatments that adapt and flow with responsive design systems.
3) Produce style guides from the provided wireframes to create interactive and aesthetically appealing designs
4) Enforce the produced visual treatments and ensure all creative guidelines live up to the highest quality of the visual standards
5) Work closely with front-end developers to create HTML- and CSS-friendly designs
*Title : Sr Big Data Engineer*
*Location: Sunnyvale, CA*
*Duration: 12 months+*
• Minimum of 10 years of experience in a professional software development environment.
• Lead large-scale initiatives implemented by cross-functional teams
• 3+ years of Cassandra or related NoSQL experience.
• 2+ years of Spark or Hadoop experience
• 4+ years working with message-oriented architecture like JMS, AMQP, Kafka, or equivalent
• In-depth knowledge of designing server-side web applications.
• Experience in object-oriented programming (Java, J2EE, Spring or other MVC framework), Service Oriented Architectures, etc. Strong problem solving and debugging skills are required.
• Knowledge of or experience with Solr search and Spring Batch is a definite plus.
• Strong technical leadership skills
• Demonstrated experience designing, developing and deploying in a fast-paced environment embracing agile practices like SCRUM/LEAN and SCRUM-BAN.
Thanks and Regards!
Ganesh C | Staffing Manager
Gangboard LLC
Direct: 302 703 7764
gan...@gang-board.com
www.gang-board.com
Disclaimer: Under Bill S.1618 Title III passed by the 105th U.S. Congress this mail cannot be considered spam as long as we include contact information and a REMOVE link for removal from our mailing list. To be removed from our mailing list reply with "remove" and include your "original email address/addresses" in the subject heading. We will immediately update it accordingly. We apologize for the inconvenience if any caused.
Hadoop Data Engineer (8+ years) || Malvern, PA
Let me know if you have any candidates available for the below job position. Thanks!! Kindly share your updated resume for further processing ( neh...@nityo.com).
*Role : Hadoop Data Engineer (8+ years)*
*Location : Malvern, PA*
*Job Type : Contract *
*Client : TCS / Vanguard*
*Interview : Phone + Video*
· Minimum 8+ years of relevant work experience.
· Highly experienced with Big Data, Glue, EMR, and Data Quality tools/procedures for cloud-native data stores
· Strong database and cloud expertise, along with experience with data replication technology, required
· Strong knowledge of micro-services and DDAAS required
· Expert-level AWS and cloud-native data experience is required
· Minimum five years of big data and advanced analytics experience required
· Minimum 2 years of cloud-native database design, migration, and implementation experience required
· Experience with Enterprise Information Management (EIM) is required
· Experience with CDC tools (Attunity preferred) to move data from on-prem sources
Duties and Responsibilities
1. Provides the architectural leadership in shaping strategic technology programs, which focus on both Business of IT (e.g., Unified Communication, SOI) and LOB-specific strategic technology programs.
2. Continuously pursues advanced-level technical acumen:
· Attends conferences and engages in associated activities (e.g., conducting presentations, leading workshops, etc.).
· Consumes and contributes content from and to open-source communities.
3. Provides architecture thought leadership and expertise, including cost optimization.
4. Defines reference and implementation architectures.
5. Produces technology roadmaps in support of IT's vision and strategy.
6. Develops proof-of-concept prototypes and initial implementation models.
7. Monitors implementation activity to ensure architecture and design principles are upheld.
8. Ensures implementation solutions support architecture objectives (availability, scalability, performance, security, etc.), as appropriate.
9. Rolls up sleeves and does deep dives on financial services data sets to understand intricacies and come up with optimal data models.
10. Is adept at handling multiple data types from multiple data sources and strategizes on the optimal way to analyze/store such data sets.
Thanks & Regards,
Neha Gupta
Talent Acquisition (Team Lead)
Desk : 609-853-0818 * 2105
neh...@nityo.com
neha.gupta1...@gmail.com
URL : www.nityo.com
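The CDC (change data capture) requirement above means consuming a stream of insert/update/delete change records from an on-prem source and replaying them against a cloud target. A minimal sketch of that replay logic, with invented data (tools like Attunity do this at table scale with log mining, not in application code):

```python
# Hypothetical CDC replay: apply change records to a target-table snapshot,
# keyed by primary key. Data and record shapes are invented for illustration.
target = {1: {"name": "Ann"}, 2: {"name": "Bob"}}

# Each change record: (operation, primary key, new row or None).
changes = [
    ("U", 2, {"name": "Bobby"}),  # update key 2
    ("I", 3, {"name": "Cara"}),   # insert key 3
    ("D", 1, None),               # delete key 1
]

for op, key, row in changes:
    if op == "D":
        target.pop(key, None)     # delete is idempotent
    else:
        target[key] = row         # insert and update are both upserts

print(target)  # {2: {'name': 'Bobby'}, 3: {'name': 'Cara'}}
```

Treating inserts and updates uniformly as upserts is what makes the replay safe to re-run after a failure, which is why CDC targets are usually designed around merge/upsert semantics.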
Urgent Need - Pega Senior System Architect || Sr Big Data Engineer // Sunnyvale, CA & Louisville, Kentucky & Blue Bell, PA
*Note: Kindly share the profile to gan...@gang-board.com *
*Title: Pega Senior System Architect*
*Location: Louisville, Kentucky & Blue Bell, PA*
*Duration: 6 Months*
*Job Description:*
· The Pega Senior System Architect (SSA) is responsible for architecting and designing business applications using Pega PRPC and other Pega frameworks. Provides architecture and design guidance to project teams developing BPM/BRE solutions using Pega.
· Provide thought leadership to the client across business and technical project dimensions, solving complex business requirements
· Develops and demonstrates an advanced knowledge of the PRPC architecture and all PRPC design and implementation features
· Works in conjunction with the Program Manager to size and manage scope and risk
· Accountable for ensuring the business and technical architecture of the delivered solution matches customer technical and functional requirements, and commits to Customer Success (realization of business benefit)
· Participates in the development of additional consulting opportunities within the customer base.
*Required experience / skills:*
· 8+ years of IT experience with n-tier, database and client-server design/development
· 4+ years’ experience in design and implementation of PRPC-based solutions, including a leadership role in design to develop shared/reusable enterprise rules and workflow components within Pega Process Rules Commander.
· 2+ years’ experience with PRPC v5.x (preferably v5.3 or above), with experience in new Pega capabilities
· Experience with Pega CPM and Smart Investigate frameworks preferred
· Strong experience scoping, planning and delivering projects using iterative software development lifecycles over multiple release cycles
· Previous experience as a lead architect on multiple large-scale product or enterprise designs
· Expertise and good understanding of Relational Database Management Systems, including architecting and designing for performance and scalability and working with object-to-relational mapping schemes for distributed data access
· Solid fundamentals in core Java & object-oriented concepts
· Experience in web technologies including Servlets, JSP, and XML. Should have hands-on experience in developing web applications.
· Excellent skills in any of the RDBMS tools like Oracle, MSSQL, DB2 or Sybase. Should be proficient in SQL commands
· Experience in client-side technologies like HTML, JavaScript, CSS
· Experience working with any of the Servlet containers or enterprise containers like Apache Tomcat, BEA WebLogic, IBM WebSphere, JBoss, including deployment, configuration & troubleshooting
· Knowledge of web services, JMS, middleware tools (like webMethods, Vitria), web frameworks (like Struts, Spring), design patterns (like MVC I, MVC II, etc.)
· Knowledge of components for enterprise architecture like enterprise beans is a plus
*Title : Sr Big Data Engineer*
*Location: Sunnyvale, CA*
*Duration: 12 months+*
*Prefer local and GC or USC*
• Minimum of 10 years of experience in a professional software development environment.
• Lead large-scale initiatives implemented by cross-functional teams
• 3+ years of Cassandra or related NoSQL experience.
• 2+ years of Spark or Hadoop experience
• 4+ years working with message-oriented architecture like JMS, AMQP, Kafka, or equivalent
• In-depth knowledge of designing server-side web applications.
• Experience in object-oriented programming (Java, J2EE, Spring or other MVC framework), Service Oriented Architectures, etc. Strong problem solving and debugging skills are required.
• Knowledge of or experience with Solr search and Spring Batch is a definite plus.
• Strong technical leadership skills
• Demonstrated experience designing, developing and deploying in a fast-paced environment embracing agile practices like SCRUM/LEAN and SCRUM-BAN.
Thanks and Regards!
Ganesh C | Staffing Manager
Gangboard LLC
Direct: 302 703 7764
gan...@gang-board.com
www.gang-board.com
Urgent Need - Pega Senior System Architect || Sr Big Data Engineer // Sunnyvale, CA & Louisville, Kentucky & Blue Bell, PA
*Note: Kindly share the profile to gan...@gang-board.com * *Title: Pega Senior System Architect* *Location: Louisville, Kentucky & Blue Bell, PA* *Duration: 6 Months* *Job Description:* · Pega Senior System Architect SSA is responsible for architecting and designing business applications using Pega PRPC and other Pega frameworks. Provides architecture and design guidance to project teams developing BPM/BRE solutions using Pega. · Provide thought-leadership to client across business and technical project dimensions solving complex business requirements · Develops and demonstrates an advanced knowledge of the PRPC Architecture and all PRPC design and implementation features · Works in conjunction with Program Manager to size, manage scope and risk · Accountable for ensuring the business and technical architecture of the delivered solution matches customer technical and functional requirements, and commits to Customer Success (realization of business benefit) · Participates in the development of additional consulting opportunities within the customer base. *Required experience / skills:* · 8+ years of IT experience with n-tier, database and client server design/development · 4+ years’ experience in design and implementation of PRPC-based solutions, including leadership role in design to develop shared/reusable enterprise rules and workflow components within Pega Process Rules Commander. 
· 2+ years’ experience with PRPC v5.x (preferably V5.3 or above) with experience in new Pega capabilities · Experience on Pega CPM, Smart Investigate frameworks preferred · Strong experience scoping, planning and delivering projects using iterative software development lifecycles over multiple release cycles · Previous experience as a lead architect on multiple large-scale product or enterprise designs · Expertise and good understanding of Relational Database Management Systems including architecting and designing for performance and scalability and working with Object to Relational Mapping schemes for distributed data access · Solid fundamentals in Core Java & Object Oriented concepts · Experience in Web technologies including Servlets, JSP, and XML. Should have hands on experience in developing web applications. · Excellent skills in any of the RDBMS tools like Oracle, MSSQL, DB2 or Sybase. Should be proficient in SQL commands · Experience in client side technologies like HTML, Java Script, CSS · Experience working with any of the Servlet containers or Enterprise containers like Jakarta tomcat, BEA Web logic, IBM WebSphere JBOSS. Should have knowledge in using any of the mentioned servers like deployment, configuring & troubleshooting · Knowledge in web services, JMS, Middleware tools (like Web Methods, Vitria), Web frameworks (like Struts, spring), Design Patterns (like MVC I, MVC II etc). · Knowledge of components for enterprise architecture like enterprise beans is a plus *Title : Sr Big Data Engineer* *Location: Sunnyvale, CA * *Duration: 12 months+ * *Prefer local and GC or USC* • Minimum of 10 years of experience in a professional software development environment. • Lead large scale initiatives implemented by cross-functional teams • 3+ years of Cassandra or related NoSQL experience. 
• 2+ years of Spark or Hadoop experience
• 4+ years working with message-oriented architectures like JMS, AMQP, Kafka, or equivalent
• In-depth knowledge of designing server-side web applications.
• Experience in object-oriented programming (Java, J2EE, Spring or another MVC framework), Service Oriented Architectures, etc. Strong problem-solving and debugging skills are required.
• Knowledge of or experience with Solr search or Spring Batch is a definite plus.
• Strong technical leadership skills
• Demonstrated experience designing, developing and deploying in a fast-paced environment embracing agile practices like Scrum/Lean and Scrumban.

Thanks and Regards!
Ganesh C | Staffing Manager
Gangboard LLC
Direct: 302 703 7764
gan...@gang-board.com
www.gang-board.com

Disclaimer: Under Bill S.1618 Title III passed by the 105th U.S. Congress this mail cannot be considered spam as long as we include contact information and a REMOVE link for removal from our mailing list. To be removed from our mailing list reply with "remove" and include your "original email address/addresses" in the subject heading. We will immediately update it accordingly. We apologize for the inconvenience if any caused. -- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group
Required : Data Engineer with AWS - Malvern, PA
Share resumes at brijes...@kenfill.co

We have an urgent requirement for the *Data Engineer with AWS* at *Malvern, PA*.

*Job Title: Senior Data Engineer*
*Location: Malvern, PA*
*Experience: 7+ years*

JD:
· Work as a data engineer (Hadoop and AWS) on the migration of code and data for applications residing on an on-prem Hadoop cluster to an AWS cluster, using Hive, Python and shell for data processing and the NGA toolset for code migration
· Build data pipelines between AWS Cloud and on-prem
· Create a data ingestion framework in the cloud using Sqoop, with S3 for storage
· Configure EMR and schedule batch jobs using Control-M
· Integrate Tableau with EMR for reporting

*Best Regards,*
Brijesh Kumar
Kenfill Techno Solutions
USA: 219-209-4155

-- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
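The pipeline the JD describes (extract from an on-prem store, land the data as flat files, transform for reporting) can be sketched, purely illustratively, in standard-library Python. Everything here is a hypothetical stand-in: sqlite3 plays the on-prem RDBMS that Sqoop would read, a temp directory plays the S3 landing zone, and the table/file names are invented for the example.

```python
import csv
import sqlite3
import tempfile
from pathlib import Path

# Stand-in for the on-prem source database (Sqoop's source in the real pipeline).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(1, "east", 10.0), (2, "west", 5.0), (3, "east", 7.5)])

# Stand-in for the S3 landing zone.
landing = Path(tempfile.mkdtemp()) / "orders.csv"

# "Ingest": extract rows and land them as a flat file.
with landing.open("w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "region", "amount"])
    writer.writerows(db.execute("SELECT id, region, amount FROM orders"))

# "Transform": aggregate the landed file for a reporting layer (Hive/Tableau in the JD).
totals = {}
with landing.open(newline="") as f:
    for row in csv.DictReader(f):
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])

print(totals)  # {'east': 17.5, 'west': 5.0}
```

In the actual stack, the extract step would be a `sqoop import` targeting S3, and the aggregate would be a Hive query or Spark job scheduled by Control-M; the shape of the flow is the same.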
Required : Data Engineer - Thousand Oaks, CA
Share resumes at brijes...@kenfill.co Role: Data Engineer Duration: 12+ months contract Location: Thousand Oaks, CA Interview mode: 1st Skype – final will be Skype/telephonic. *NO OPT is accepted.* Skills: · Big data · AWS · Python · Spark · ETL · Hadoop *Best Regards**,* Brijesh Kumar Kenfill Techno Solutions USA : 219-209-4155 -- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
Data Engineer (Hadoop) || Bradenton, FL
Kindly share suitable resumes ASAP to neh...@nityo.com || Desk No: 609-853-0818 x2105

Role: Data Engineer (Hadoop)
Location: Bradenton, FL
Duration of contract: 1+ year
Client: TCS / Foot Locker

Required/Desired Skills:
Skill | Mandatory/Knowledge | Amount of Experience
Data Engineer | Mandatory | 8 Years
Hadoop | Mandatory | 8 Years
Kafka | Mandatory | 8 Years
Hadoop ecosystem tools | Mandatory | 8 Years

Job Description:
Data Engineers with excellent Spark and SQL skills; 7+ years of experience, with a minimum of 4+ years of development experience on Big Data related projects
Experience with Java, Spark, Scala, Hadoop, Kafka
Practical knowledge of Big Data frameworks and MapReduce programming
Experience with Hadoop ecosystem tools (HDFS, MapReduce, Spark, YARN)
Should be good in SQL
Is self-motivated, takes initiative, and thrives in an ambiguous work environment
Demonstrated ability to complete multiple tasks under pressure with a high degree of flexibility
Should be able to understand the data architecture and functionality in a short span of time

Thanks & Regards,
Neha Gupta
Desk no: 609-853-0818 Ext 2105
Email id: neh...@nityo.com
LinkedIn: www.linkedin.com/in/nehag6 (www.nityo.com)

-- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
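The "practical knowledge of MapReduce programming" asked for above comes down to three phases: a map step emitting key/value pairs, a shuffle grouping values by key, and a reduce step aggregating each group. A toy single-process word-count sketch (no Hadoop or Spark assumed; the framework normally runs these phases distributed):

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Emit (word, 1) for every word in an input line.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values; for word count, a sum.
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big pipelines", "big data"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))
print(counts)  # {'big': 3, 'data': 2, 'pipelines': 1}
```

The same structure maps directly onto a Hadoop `Mapper`/`Reducer` pair or a Spark `flatMap`/`reduceByKey` chain.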
Req : Big Data Engineer OR Hadoop Engineer ( Min 10 Years ) | San Francisco, CA
*Hi,*
*Hope you are doing great!*
*I'm sharing our client requirement details below.*
*Please reply to anc...@1pointsys.com*

Req: Big Data Engineer OR Hadoop Engineer (Min 10 Years)
Location: San Francisco, CA
Duration: 6-month contract
NO OPT and NO H1B. Local to CA only (onsite interview). LinkedIn a must.

*Responsibilities:*
- Owner of the core company data pipeline, responsible for scaling up the data processing flow to meet the rapid data growth at Lyft
- Consistently evolve the data model and data schema based on business and engineering needs
- Implement systems tracking data quality and consistency
- Develop tools supporting self-service data pipeline management (ETL)
- SQL and MapReduce job tuning to improve data processing performance

*Experience & Skills:*
- Extensive experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase, Parquet)
- Proficient in at least one SQL dialect (MySQL, PostgreSQL, SQL Server, Oracle)
- Good understanding of SQL engines and able to conduct advanced performance tuning
- Strong skills in a scripting language (Python, Ruby, Perl, Bash)
- Experience with workflow management tools (Airflow, Oozie, Azkaban, UC4)
- Comfortable working directly with data analysts to bridge business requirements with data engineering

*Anchit Bajpai*
*Technical Recruiter*
1 Point System LLC
Unit 103, 206 N College St, Pineville, North Carolina, 28134
*D: 803-369-3436*
*E:* anc...@1pointsys.com
Hangout: bjpaia1point...@gmail.com

Note: We respect your Online Privacy. This is not an unsolicited mail. Under Bill S.1618 Title III passed by the 105th U.S. Congress this mail cannot be considered Spam as long as we include Contact information and a method to be removed from our mailing list.
If you are not interested in receiving our e-mails then please reply with a "remove" in the subject line and mention all the e-mail addresses to be removed with any e-mail addresses, which might be diverting the e-mails to you. We are sorry for the inconvenience caused to you. -- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
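The "SQL job tuning" responsibility in the posting above usually starts with checking whether a filtering predicate can use an index instead of a full table scan. A minimal illustration with the standard-library sqlite3 module (a stand-in for whichever RDBMS the role actually uses; `rides` and `idx_rides_city` are invented names, and the exact plan wording varies by SQLite version):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE rides (id INTEGER PRIMARY KEY, city TEXT, fare REAL)")
db.executemany("INSERT INTO rides (city, fare) VALUES (?, ?)",
               [("sf", 12.0), ("la", 9.0), ("sf", 20.0)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index.
    return " ".join(row[3] for row in db.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(fare) FROM rides WHERE city = 'sf'"
before = plan(query)   # a full table scan ("SCAN ..."), since no index covers `city`
db.execute("CREATE INDEX idx_rides_city ON rides (city)")
after = plan(query)    # now an index search via idx_rides_city

print(before)
print(after)
```

The same workflow (run the planner, add or adjust an index, re-check the plan) carries over to Hive/Presto via their own `EXPLAIN` statements, which is the day-to-day shape of the tuning work described above.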
Data engineer role/ data analyst role
Hi,

Please help me out by providing a good resource for this requirement at *s...@puresoftinc.com*

*Data engineer role / Data analyst role*
*They are shortlisting candidates today only.*

*GoLang is just a plus, not a must.* The main thing is a well-rounded candidate with *3-5+ years in SQL Data Engineering*; any kind of development experience *(C++) is a plus.* This is a *data engineer / data analyst role*.

Bachelor's degree
• 4+ years of experience with *ETL and advanced SQL skills* - adept at queries, report writing, and presenting findings
• Expertise in Data Analysis, Data Profiling, and SQL Tuning
• Expertise in translating business requirements into project design, development, and execution
• Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
• C++/GoLang experience a plus. Ability to clearly communicate capabilities, opportunities, and recommendations to both technical and nontechnical audiences
• Experience working in Data warehouse ETL & BI platforms and a good understanding of related development activities and challenges
• Strong knowledge of and experience with reporting, databases (SQL etc.), and programming (ETL frameworks)
• Experience in understanding source data from various platforms and mapping it into an entity-relationship (ER) model for data integration and reporting
• Deep understanding of data architecture and data modeling best practices and guidelines for different data and analytic platforms

*Thanks & Regards*
*Om Shiv*
*Puresoft, Inc*
*W: +1 408-442-3664 EXT: 4425*
*Email: s...@puresoftinc.com || Hangout: kesharioms...@gmail.com*
*Website: www.puresoftinc.com*

*This message contains information that may be privileged or confidential and is the property of Puresoft, Inc.
It is intended only for the person to whom it is addressed. If you are not the intended recipient, you are not authorized to read, print, retain copy, disseminate, distribute, or use this message or any part thereof. If you receive this message in error, please notify the sender immediately and delete all copies of this message. Puresoft, Inc does not accept any liability for virus infected mails.* -- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
Data Engineer
*Hello All,*
*Hope you all are doing well.*
*I am attaching a job description for your review. If you are interested, please reply to me with an updated resume at my e-mail* *khush...@technocraftsol.com*

*Position: Data Engineer*
*Location: Summit, NJ (must be local)*
*Duration: 6 months*
*F2F is required*

POSITION: Contractor, Senior Data Engineer
SUPERVISOR: Director of *Big Data Architecture* and Engineering
DEPARTMENT: IT

PREREQUISITES
· Bachelor's degree in computer science, systems analysis or a related study, or equivalent experience
· Minimum of 5 years of hands-on experience with Data Integration, *ETL* and/or Data Engineering
· Minimum of 3 years of hands-on experience with *Talend* used in conjunction with Hadoop MapReduce/Spark/Hive/Kafka
· Minimum of 5 years of hands-on experience with Big Data tools and techniques
· Excellent interpersonal skills in areas such as teamwork, influence, facilitation and negotiation
· Problem solver

Summary: Celgene has established a Big Data capability that provides actionable insights, informs decisions throughout the product life cycle, and helps improve patient lives. Celgene's modern Big Data Platform includes a Data Lake that stores and provides easy and secure access to data needed by various functions for reporting and analytics. The Senior Data Engineer is responsible for ad-hoc data ingestions, building/testing industrialized data pipelines using Talend, data profiling and exploration, and data preparation enabling analysis by data scientists, and helps data scientists with Big Data tools, e.g. Spark, Impala, PySpark, sparklyr. The role works closely with the Data Engineering Lead, the Big Data Platform Engineer, the team responsible for data ingestion/integration, and the data scientists/analysts.

Thanks and Regards,
*Khushboo Singh*
IT Recruiter
*Technocraft Solutions LLC | Partner with Xpedantic LLC.
* *Email*:* khush...@technocraftsol.com * *LinkedIn: *linkedin.com/in/khushboo-singh-b67750119 <https://www.linkedin.com/in/khushboo-singh-b67750119> www.technocraftsol.com | www.xpedantic.com -- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
Datastage Consultant // Data Engineer // Kronos WFM Solution // Business Analysis
Hi Folks,

Hope you are doing good. Please go through the JD and share the suitable profiles with me.
Please reach me at: *Email: sant...@humacinc.com* *Call: +1 623-242-2594*
*Required: passport number/copy for submission*
*Note: No OPT, CPT, GC-EAD*

*Position: Datastage Consultant*
*Location: Roanoke, WA*
*Job description:*
DataStage Technology Lead with DataStage skillset based out of Roanoke, WA. Will have to interact and work with an offshore team. Primary responsibilities are onsite coordination, design and build, reviews, technical leadership and implementation.
1. Teradata
2. Design and Build
3. Reviews
4. Offshore Coordination

*Role: Data Engineer (SQL + ETL + Python)*
*Location: Hillsboro, OR*
*Please find the job description:*
· Strong experience with relational SQL on Teradata, Oracle or Snowflake
· Experience building cloud-scalable, real-time and high-performance data lake solutions
· Language: Python
· Strong experience in any ETL tool
· Experience with source control tools such as GitHub and the related dev process
· Experience with workflow scheduling tools
· Understanding of microservice architecture
· Preferred: strong understanding of developing complex data solutions
· Willing to learn new skills and technologies
· Has a passion for data solutions
· Understands data structures and algorithms
· Understands solution and technical design
· Has a strong problem-solving and analytical mindset
· Able to influence and communicate effectively, both verbally and in writing, with team members
· Able to quickly pick up new programming languages, technologies, and frameworks

*Job Title: Technical Test Lead | Oracle Industry Solutions | Kronos WFM Solution*
*Location: Seattle, WA - 98134*
*Job Details: Must Have Skills (top 3 technical skills only)*
1. Kronos WFM
2. Automation
3. Big Data Knowledge
*Nice to have skills (top 2 only)*
1. Kronos WFM
2. Big Data
*Detailed Job Description:*
QA Analyst with Kronos WFM, Automation and DW experience
Minimum years of experience: 5+
*Top 3 responsibilities you would expect the subcon to shoulder and execute:*
1. Stakeholder Management
2. Good Communication Skills
3. Automation Skills
*Interview Process:* A telephonic and video interview is mandatory in case the candidate is non-local to the client location; please make sure the candidates submitted are comfortable with this arrangement.

*Job Title: Senior Consultant | Cards and Payments | Cards Business Analysis*
*Location: Phoenix, AZ 85054*
*Contract duration: 12 Months*
*Job Details: Must Have Skills (top 3 technical skills only):*
1. Product Owner know-how
2. Agile Experience
3. Cards, Payments knowledge
*Detailed Job Description:*
Client is looking for people with proven credentials having delivered in an agile environment as Product Owners. Define user stories, prioritize the backlog, maintain the conceptual and technical integrity of the features, streamline execution of the program, prepare for and participate in PI planning, participate in demos, define feature acceptance criteria.
*Minimum years of experience: 6 years*
*Top 3 responsibilities you would expect the subcon to shoulder and execute:*
1. Product Owner
2. Use case and backlog tracking
3. Streamlining program execution

--
*Best Regards,*
*Santosh Kumar*
*IT ANALYST*
*Mail: sant...@humacinc.com*
*Hangouts: recruiter.lo...@gmail.com*
*"Hire Character, Train Skill"*
LinkedIn: *www.linkedin.com/in/santosh-kumar-98b6ba15b*

-- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at h
Re: Requirement for Big Data Engineer in Cincinnati, Ohio..
Hello, Hope you are doing well! Please find the attached resume of one of our consultant for the below position and let me know your interest. *Thanks & Regards,* *Naga Manikanta* | *IT Recruiter* Vintech Solutions, Inc | *Sales Team* ERP & IT SERVICES | Consulting - Development - Staffing Email: manika...@vintech.com | O: 314-989-9000 X 734 | F: 314-989-9009 <https://www.linkedin.com/in/mani-b-468a3313b/> <https://www.facebook.com/Vintechsolutions> <http://www.vintech.com/> On Mon, Aug 13, 2018 at 10:30 AM, Vamshi Krishna wrote: > Hi, > > If you are interested and available for the Job, Please revert back with > latest resume and other details required for submission to > vam...@techorbit.com > > *Role: Big Data Engineer* > > *Location: Cincinnati, Ohio* > > *Duration : 6 Months + * > > > > *Need 9 + year’s experience consultant. * > > > > Skills/Experience: > > > > Data modeling, SQL, Hive, Sqoop, Pig > > > > Candidate Name > > > > Tel No > > > > E-mail ID > > > > Skype ID > > > > Present location > > > > Last 4 Digit SSN > > > > Highest Degree of Education with Year of Passing > > > > Work Authorization & Validity > > > > LinkedIn ID > > > > Rate on C2C/W2 > > > > DOB > > > > Onsite availability (post-selection) > > > > Total onsite experience, working in US > > > > Overall relevant experience of candidate > > > > > > *Regards,* > > > > *Vamshi* > > *vam...@techorbit.com * > > *972-646-2158* > > [image: cid:image002.jpg@01D35CE0.BBC5D2D0] > > *1300 W Walnut Hill Ln. #260, Irving, TX 75038 > <https://goo.gl/maps/hRJfzPqWBbL2>. * > > *www.techorbit.com <http://www.techorbit.com/>* > > > > > > -- > You received this message because you are subscribed to the Google Groups > "CorptoCorp" group. > To unsubscribe from this group and stop receiving emails from it, send an > email to corptocorp+unsubscr...@googlegroups.com. > To post to this group, send email to corptocorp@googlegroups.com. > Visit this group at https://groups.google.com/group/corptocorp. 
> For more options, visit https://groups.google.com/d/optout. > -- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout. SHIVA S.docx Description: MS-Word 2007 document
Fwd: Requirement for Big Data Engineer in Cincinnati, Ohio..
Hi, If you are interested and available for the Job, Please revert back with latest resume and other details required for submission to vam...@techorbit.com *Role: Big Data Engineer* *Location: Cincinnati, Ohio* *Duration : 6 Months + * *Need 9 + year’s experience consultant. * Skills/Experience: Data modeling, SQL, Hive, Sqoop, Pig Candidate Name Tel No E-mail ID Skype ID Present location Last 4 Digit SSN Highest Degree of Education with Year of Passing Work Authorization & Validity LinkedIn ID Rate on C2C/W2 DOB Onsite availability (post-selection) Total onsite experience, working in US Overall relevant experience of candidate *Regards,* *Vamshi* *vam...@techorbit.com * *972-646-2158* [image: cid:image002.jpg@01D35CE0.BBC5D2D0] *1300 W Walnut Hill Ln. #260, Irving, TX 75038 <https://goo.gl/maps/hRJfzPqWBbL2>. * *www.techorbit.com <http://www.techorbit.com/>* -- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
6 requirements Big Data Architect// developer// Data science // Data Engineer // AWS DevOps--
*Hi Folks,*
*Hope you are doing good. Please go through the JD and share the suitable profiles with me.*
*Please reach me at: Email: sant...@humacinc.com Call: +1 623-242-2594*
*Required: passport number/copy for submission*
*Note: No OPT, CPT, GC-EAD*

*Please check the requirements available for the day:*

*1. Big Data Architect*
*Experience: 10+*
*Location: Hillsboro, OR*
*Note: NO OPT, CPT, GC-EAD*
*Must have skills:*
1. Expert-level design and development experience on Cloud
2. AWS
3. Hadoop
4. Spark
5. Scala
6. Sqoop
7. SQL
8. Python

*2. EMR/Spark/Python experts*
*Experience: 8+*
*Location: Hillsboro, OR*
*Note: NO OPT, CPT, GC-EAD*

*3. Data Scientist*
*Experience: 10+*
*Location: Hillsboro, OR*
*Note: NO OPT, CPT, GC-EAD*
Job Description: We are looking for someone with 8-10 years of experience manipulating data sets and building statistical/machine learning models, who has an MS or PhD in Statistics, Mathematics, or Computer Science, and has the following background/skills:
• Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and experience in feature engineering and choosing the proper ML algorithm
• Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with their applications
• Experience in Data DAD (Discover/Access/Distill) using statistical computing languages (R, Python, etc.) to manipulate data and draw insights from large data sets
• Experience creating machine learning models using ML toolkits (RCloud, H2O, etc.) and visualizing them for determining patterns
• Excellent written and verbal communication skills for working across teams
• A drive to learn and master new technologies and techniques

*4. Data Engineer (SQL + ETL + Python)*
*Experience: 10+*
*Location: Hillsboro, OR*
*Note: NO OPT, CPT, GC-EAD*
*Please find the job description:*
· Strong experience with relational SQL on Teradata, Oracle or Snowflake
· Experience building cloud-scalable, real-time and high-performance data lake solutions
· Language: Python
· Strong experience in any ETL tool
· Experience with source control tools such as GitHub and the related dev process
· Experience with workflow scheduling tools
· Understanding of microservice architecture
· Preferred: strong understanding of developing complex data solutions
· Willing to learn new skills and technologies
· Has a passion for data solutions
· Understands data structures and algorithms
· Understands solution and technical design
· Has a strong problem-solving and analytical mindset
· Able to influence and communicate effectively, both verbally and in writing, with team members
· Able to quickly pick up new programming languages, technologies, and frameworks

*5. Hadoop Developer*
*Experience: 8+*
*Location: Hillsboro, OR*
*Note: NO OPT, CPT, GC-EAD*
*Must Have Skills (top 3 technical skills only):*
1. Hadoop
2. Hive
3. Pig
Detailed Job Description: Bachelor's degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education. At least 4-5 years of experience in the Big Data space. Strong Hadoop MapReduce, Hive, Pig, Sqoop, Oozie (MUST). Candidate should have hands-on experience with Java, APIs, Spring (MUST). Good exposure to columnar NoSQL DBs like HBase. End-to-end delivery experience on complex, high-volume, high-velocity projects. Good experience with at least one scripting language.

*6. AWS DevOps*
*Experience: 8+*
*Location: Hillsboro, OR*
*Note: NO OPT, CPT, GC-EAD*
*Must have skills:*
1. ALGORITHMS
2. AMAZON WEB SERVICES
3. CASSANDRA
4. CHEF
5. DATA STRUCTURES
-- *Best Regards,* *Santosh Kumar* *IT ANALYST * *Mail: **sant...@humacinc.com * *Hangouts:* *recruiter.lo...@gmail.com * *" Hire Character, Train Skill "* LinkedIn : *www.linkedin.com/in/santosh-kumar-98b6ba15b <http://www.linkedin.com/in/santosh-kumar-98b6ba15b/>* -- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
4 Urgent Positions: Java Full Stack // AWS/DevOps // Network Engineer // Data Engineer
*Hi Partner,*
*Hope you are doing good! Please go through the requirement mentioned below and share the suitable profiles.*
*Mail: sant...@humacinc.com*
*Feel free to call +1 623-242-2594*
*Required: passport number/copy for submission*

*Job title: Java Full Stack*
*Location: Phoenix, AZ*
*Contract: Long term*
*We have two immediate requirements at AMEX for a US Java full stack Developer, PHX location, with in-depth knowledge of:*
· Spring Framework online & batch, Spring Boot
· REST and SOAP APIs, API security
· Microservices
· Oracle and NoSQL - Couchbase

*INFOSYS - NIKE*
*Job Title: AWS/DevOps*
*Location: Hillsboro, OR*
*Experience: 8+*
Must have skills:
1. ALGORITHMS
2. AMAZON WEB SERVICES
3. CASSANDRA
4. CHEF
5. DATA STRUCTURES

*Job title: Network Engineer*
*Location: Norwalk, CT*
*Experience: 8+*
· Designing and implementing new network solutions and/or improving the efficiency of current networks
· Installing, configuring and supporting network equipment including routers, proxy servers, switches, WAN accelerators, DNS and DHCP
· Procuring network equipment and managing subcontractors involved with network installation
· Configuring firewalls, routing and switching to maximize network efficiency and security
· Maximizing network performance through ongoing monitoring and troubleshooting
· Arranging scheduled upgrades
· Investigating faults in the network
· Updating network equipment to the latest firmware releases
· Reporting network status to key stakeholders
· An analytical mind
· An ability to learn new technologies quickly
· Good time management skills
· An ability to follow processes
· Strong documentation skills
· Good communication skills - both written and verbal
· Commercial and business awareness

*Role: Data Engineer*
*Location: Boston, MA*
*Contract: 12 months*
*Experience: 8+*
*Please find the job description:*
· Strong experience with relational SQL on Teradata, Oracle or Snowflake
· Experience building cloud-scalable, real-time and high-performance data lake solutions
· Language: Python
· Strong experience in any ETL tool
· Experience with source control tools such as GitHub and the related dev process
· Experience with workflow scheduling tools
· Understanding of microservice architecture
· Preferred: strong understanding of developing complex data solutions
· Willing to learn new skills and technologies
· Has a passion for data solutions
· Understands data structures and algorithms
· Understands solution and technical design
· Has a strong problem-solving and analytical mindset
· Able to influence and communicate effectively, both verbally and in writing, with team members
· Able to quickly pick up new programming languages, technologies, and frameworks

--
*Best Regards,*
*Santosh Kumar*
*IT ANALYST*
*Mail: sant...@humacinc.com*
*Hangouts: recruiter.lo...@gmail.com*
*"Hire Character, Train Skill"*
LinkedIn: *www.linkedin.com/in/santosh-kumar-98b6ba15b*

-- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
Sr. Big Data Engineer :: NYC, NY
Hello Partners,

Kindly go through the below job description and revert with suitable candidate details to my *Email: rajnish.mish...@rangtech.com or Call: 732-201-4112*

*Position: Sr. Big Data Engineer*
*Location: NYC, NY*
*Visa Required: US Citizen, Green Card, H1B*
*Required Key Skills: AWS, Python, Apache, NodeJS*

*Responsibilities:*
- Collaborate with cross-functional teams to conceptualize, build, test and ship software solutions that meet business and market needs.
- Push cloud and big data technologies to the limits to help better our solutions and services.
- Work with geographically dispersed team members.
- Implement scalable and cost-effective software systems that meet business requirements.
- Develop high-quality software using industry-standard best practices including unit testing, code reviews and continuous integration.
- Detect deviations from the project plan and take corrective measures to bring the project on course.
- Formulate and document the best practices and architectures needed to develop and manage highly resilient data solutions on public and private PaaS platforms.

*Qualification and Skill Set:*
- BS or MS in Computer Science or equivalent work experience.
- 5+ years of *enterprise-class software development* experience is a must.
- 2+ years of experience building *AWS data pipelines using Apache Spark, Spark SQL, AWS Glue, S3 Data Lake, and Redshift* is a must.
- Experience with one of the RDBMSs and JDBC - Oracle/MySQL/Postgres - is a must.
- 2+ years of hands-on experience using *Python with data munging and statistical libraries*.
- Solid understanding of software development best practices and methodologies.
- 2+ years of hands-on software development experience, preferably with Java and Node.js, is nice to have.
- Hands-on experience with distributed data handling and storage technologies like Apache Kafka, Confluent, Apache Flume, Avro.
- Experience with any of the NoSQL datastores such as ElasticSearch, MongoDB, DynamoDB, Cassandra.
- Working knowledge of any of the data visualization tools like Tableau, Kibana, Amazon QuickSight.
- Experience working with Git, Subversion or other SCM tools.
- Strong understanding of query languages and the tradeoffs between relational and non-relational systems.
- Experience working in a team-oriented, collaborative environment.
- Excellent communication and collaborative problem-solving skills.
- Can lead by example and motivate fellow developers to meet challenging goals.

Thanks,
*Rajnish Mishra*
*Social Media Recruiter*
*Email: rajnish.mish...@rangtech.com*
*Call: 732-201-4112*
*Phone: (732) 947-4119 Ext 301*

-- You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To post to this group, send email to corptocorp@googlegroups.com. Visit this group at https://groups.google.com/group/corptocorp. For more options, visit https://groups.google.com/d/optout.
AWS data architect/developers || Data Engineer || Informatica Administration || Infosys-
*Hi Folks,*

*Greetings for the day! Please go through the job descriptions below and share suitable profiles.*
*Email: sant...@humacinc.com | Call: +1 623-242-2594*
*Passport number/copy required for submission. Note: No OPT, CPT, GC-EAD.*

*Role: AWS Data Architect/Developer*
*Location: Seattle, WA*
*Experience required: 12 years*
*Contract: 12 months*
*Must-have skills:* Java, AWS, and data architecture
*Detailed job description:* We have a couple of urgent needs for AWS data architects/developers for the Seattle, WA location: a solid Java, AWS, and data architect cum developer. Big data knowledge is a plus.
*Nice-to-have skills:* Data lake knowledge; big data knowledge

*Role: Data Engineer*
*Location: Boston, MA*
*Experience required: 8+ years*
*Contract: 12 months*
*Job description:*
· Strong experience with relational SQL: Teradata, Oracle, or Snowflake
· Experience building cloud-scalable, real-time, high-performance data lake solutions
· Language: Python
· Strong experience in any ETL tool
· Experience with source control tools such as GitHub and the related dev process
· Experience with workflow scheduling tools
· Understanding of microservice architecture
Preferred:
· Strong understanding of developing complex data solutions
· Willing to learn new skills and technologies
· Has a passion for data solutions
· Understands data structures and algorithms
· Understands solution and technical design
· Has a strong problem-solving and analytical mindset
· Able to influence and communicate effectively, both verbally and in writing, with team members
· Able to quickly pick up new programming languages, technologies, and frameworks

*Job Title: Informatica Administrator*
*Location: Hillsboro, OR 97124*
*Contract duration: 6 months*
*Must-have skills:*
1. Intensive Informatica
2. PowerCenter Platform Admin
*Detailed job description:* Intensive Informatica PowerCenter platform administration; Informatica on Cloud, AWS.
*Top 3 responsibilities you would expect the subcon to shoulder and execute:*
1. Intensive Informatica
2. PowerCenter platform administration
*Interview process (is face-to-face required?):* No

*Thanks & Regards...*
*Santosh*
*IT Analyst, Humac Inc.*
*2730 W Agua Fria Freeway, Suite #204*
*Phoenix, AZ 85027*
*Web: www.humacinc.com | E: sant...@humacinc.com*
*Ph: +1 623-242-2594 | Hangouts: recruiter.lo...@gmail.com*
*LinkedIn: https://www.linkedin.com/in/santosh-kumar-98b6ba15b/*
Required: Sr. Data Engineer (Hadoop and AWS) - Malvern, PA
Share resumes at brijes...@kenfill.co

Job Title: Senior AWS / Data Engineer
Location: Malvern, PA
Experience: 8+ years

JD:
* Work as a data engineer (Hadoop and AWS) on the migration of code and data for applications residing in an on-prem Hadoop cluster to an AWS cluster, using Hive, Python, and shell for data processing and the NGA toolset for code migration
* Build data pipelines between AWS Cloud and on-prem
* Create a data ingestion framework in the cloud using Sqoop, with S3 for storage
* Configure EMR and schedule batch jobs using Control-M
* Integrate Tableau with EMR for reporting

*Best Regards,*
*Brijesh Kumar*
*USA: 219-209-4155*
Data Scientist OR Data Engineer @ Irving, TX
*Send resume to raj...@valconusa.com .*
*Please, no OPT/H1 transfer candidates. Need visa copy with I-94 or passport number.*

*Send the resume with the below information for your candidate:*
Full Name:
Contact number (Cell/Home):
Email ID:
Skype:
Current Location:
Preferred Location:
Pay Rate:
Employer Details for C2C Submissions:
Work status in USA (if H1B, H1B validity):
Availability for an interview:
Availability to start:
Education qualifications with passing year:

*Job Title*: Data Scientist OR Data Engineer
*Location*: Irving, TX
*Interview Mode*: Skype/WebEx
*Duration*: 12+ Months Contract Position

*Qualifications:*
· The architect must have experience leading and delivering large projects
· Experience with Agile application development methodologies
· The ability to lead projects made up of multiple teams
· Experience in designing and architecting complex solutions
· Data modeling, Erwin (or equivalent) tool experience (a must), data analysis, Kimball and Inmon design concepts
· Proven experience understanding and translating business needs into a comprehensive, cohesive architecture that meets all functional and non-functional requirements is required
· Strong communication skills are also required, with proven experience influencing others outside of the project reporting hierarchy, including senior leadership

*Experience in the following areas:*
· Transformation of legacy applications and associated projects/efforts that have heavy data integration and analytics
· Architecting modern data architectures that include the Hortonworks stack, the Hadoop platform, Spark, Hive, Governed Data Discovery (GDD), and other emerging technologies that promote scalable and performant data ingestion, processing, and exploration
· Architecture strategy and approach for data and analytics capabilities and patterns, through collaboration with our internal IT and business partners.

*Rajesh* | *T: +1 703-468-8165* | raj...@valconusa.com
Sr. Android Developer @ San Francisco, CA, Atlanta GA and Dallas TX // Data Scientist OR Data Engineer @ Irving, TX // Sr. / Lead ServiceNow Developer @ Atlanta GA
*Send resume to raj...@valconusa.com *

*Job Title*: Sr. Android Developer
*Location*: San Francisco, CA; Atlanta, GA; and Dallas, TX
*Interview Mode*: Skype/WebEx
*Duration*: 12+ Months Contract Position

In this role, you will be responsible for hands-on design and development of mobile software applications, libraries, and services that interface with wearable sensor platforms. This position requires the ability to participate in all aspects of the software project, from requirements definition to product delivery. The candidate will be a self-driven person who can not only identify problems but also come up with creative ways to solve them. But it doesn't end there. Here's a summary of this great opportunity.

*Job Description*
· Develop and maintain native Android mobile applications, libraries, and services
· Work with internal and external stakeholders to identify use cases and interfaces, to provide the stakeholders with the right solution
· Work with a multi-disciplinary team in an agile, fast-paced development environment to define, create, and maintain the software product
· Write and execute unit and integration tests; perform and support system-level troubleshooting
· Write requirements and design documents in compliance with our internal processes
· Execute all development projects in compliance with company and regulatory guidelines

*Qualifications*
· B.S. in computer science, software engineering, computer engineering, electrical engineering, or a related area of study
· 2+ years of software design and development experience
· Familiarity with mobile software design and development concepts, or a strong interest in learning to develop mobile software
· Understanding of mobile application design patterns
· Familiarity with communication interfaces (e.g. BT 2.0, BLE, wireless interfacing) is desired but not required
· Understanding of cloud interfacing concepts (e.g. JSON, RESTful interfacing)
· Familiarity with embedded software concepts
· Understanding of the software development life cycle
· Demonstrated ability to understand projects at the system level
· Ability to communicate effectively, in writing or verbally, with various stakeholders, including hardware engineers, software engineers, scientists, technicians, clinical, regulatory, and marketing
· Medical device development experience or familiarity with FDA guidelines for medical device development is desired but not required
· Excellent presentation skills; excellent communication skills (verbal and written)
· Ability to read, analyze, and interpret complex documents; ability to write presentations and other documents using original or innovative techniques or style
· Excellent organizational skills along with strong attention to detail
· Ability to work both independently and collaboratively with small, cross-functional teams
· Highly proficient with the Microsoft Office Suite

*--*

*Job Title*: Data Scientist OR Data Engineer
*Location*: Irving, TX
*Interview Mode*: Skype/WebEx
*Duration*: 12+ Months Contract Position

*Qualifications:*
· The architect must have experience leading and delivering large projects
· Experience with Agile application development methodologies
· The ability to lead projects made up of multiple teams
· Experience in designing and architecting complex solutions
· Data modeling, Erwin (or equivalent) tool experience (a must), data analysis, Kimball and Inmon design concepts
· Proven experience understanding and translating business needs into a comprehensive, cohesive architecture that meets all functional and non-functional requirements is required
· Strong communication skills are also required, with proven experience influencing others outside of the project reporting hierarchy, including senior leadership

*Experience in the following areas:*
· Transformation of legacy applications and associated projects/efforts that have heavy data integration and analytics
· Architecting modern data architectures that include the Hortonworks stack, the Hadoop platform, Spark, Hive, Governed Data Discovery (GDD), and other emerging technologies that promote scalable and performant data ingestion, processing, and exploration
· Architecture strategy and approach for data and analytics capabilities and patterns, through collaboration with our internal IT and business partners

*-*

*Job Title*: Sr. / Lead ServiceNow Developer
*Location*: Atlanta, GA
*Interview Mode*: Skype/WebEx
*Duration*: 12+ Months Contract Position

*Duties & Responsibilities:*
· Works with the Product team in an agile scrum to understand the challenges that users face in their day-to-day work and partners with them to design viable solutions
Infosys requirements: Data Engineer // UI (React JS) Developer // Hadoop Developer
*Hi Partner,*

*Hope you are doing well! Please go through the requirements mentioned below and share suitable profiles.*
*Mail: sant...@humacinc.com | Feel free to call +1 623-242-2594*
*Passport number/copy required for submission.*

*Position: Data Engineer*
*Location: Boston, Massachusetts*
*Client: NIKE*
*Duration: 1 year*
*VISA: H1B, H4-EAD, GC, USC*
*Required Skills:*
8-10 years of total experience; more experienced candidates can be considered only if they are interested in working in a junior, hands-on developer/engineer role. 5+ years of development experience in data management/integration/BI tools such as Informatica, Business Objects/Tableau, etc. The role will take direction from a Tech Lead and/or Data Architect and will help in developing data management solutions. Knowledge of data modeling and of ETL patterns and/or standards in the data management space is preferred.

*Role: UI (React JS) Developer*
*Location: Phoenix, AZ*
*Client: AMEX*
*Contract: long term*
*VISA: OPT-EAD*
*Skills:*
React JS
Angular JS
HTML5, CSS3
*Detailed Description:* Need a strong UI/front-end developer for client Amex at Phoenix.

*Job Title: Hadoop Developer*
*Location: Phoenix, AZ*
*Client: AMEX*
*Contract: long term*
*VISA: OPT-EAD*
*Job Details:*
*Must-have skills (top 3 technical skills only):*
1. Hadoop ecosystem
2. Java full stack
3. Hive, Pig, Spark, Python, Kafka, MapR, and Java Spring (advanced)
*Detailed Job Description:* Candidate should have Big Data and Hadoop hands-on experience; proficient in Hive, Pig, Spark, Python, Kafka, MapR, and Java Spring (advanced).
*Top 3 responsibilities you would expect the subcon to shoulder and execute:*
1. Engage with the client to understand business requirements
2. Convert the requirements into design and work products
3. Engage with all stakeholders in necessary technical and business communications

*Best Regards,*
*Santosh Kumar*
*IT ANALYST*
*Mail: sant...@humacinc.com*
*Hangouts: recruiter.lo...@gmail.com*
*"Hire Character, Train Skill"*
LinkedIn: www.linkedin.com/in/santosh-kumar-98b6ba15b
Required: Sr. Data Engineer (Hadoop and AWS) - Malvern, PA
Share resumes at brijes...@gmail.com

Job Title: Senior AWS / Data Engineer
Location: Malvern, PA
Experience: 8+ years

JD:
* Work as a data engineer (Hadoop and AWS) on the migration of code and data for applications residing in an on-prem Hadoop cluster to an AWS cluster, using Hive, Python, and shell for data processing and the NGA toolset for code migration
* Build data pipelines between AWS Cloud and on-prem
* Create a data ingestion framework in the cloud using Sqoop, with S3 for storage
* Configure EMR and schedule batch jobs using Control-M
* Integrate Tableau with EMR for reporting

*Best Regards,*
*Brijesh Kumar*
*USA: 219-209-4155*
Urgent: Big Data Engineer/Developer || Boston, MA
*No OPTs or GC-EAD || Max rate $55-60/hr on C2C!! Must need: passport copy and number*

Please find an urgent requirement on C2C, and please respond ASAP with your profile. Please send resumes to neh...@nityo.com

Role: Big Data Engineer/Developer (8 years)
Location: Boston, MA
Duration: 12+ Months
Implementation Partner: TCS (Tata Consultancy Services)

Technical/Functional Skills:
- MapReduce, Spark, Storm, Kafka
- Apache Hadoop
- Apache Spark
- NoSQL
- SQL
- Data visualization
- General-purpose programming languages

Thanks & Regards,
Neha Gupta
Team Lead
Desk: 609-853-0818 * 2105
neh...@nityo.com
neha.gupta1...@gmail.com
www.nityo.com
Big Data Engineer : : Mountain View, CA
*Hi,*

*Immediate requirement. Please reply to vin...@1pointsys.com *

*Role: Big Data Engineer*
*Location: Mountain View, CA*
*Interview: Phone and Skype*
*VISA: H1, GC-EAD, GC, USC*

There are *6 of these positions* and the client wants to move quickly.
*Must have extensive knowledge and experience in Big Data engineering and be strong with Python, Java, and Linux.*

*Job Description*
Your core responsibility will be to maintain and scale our infrastructure for analytics as our data volume and needs continue to grow at a rapid pace. This is a high-impact role, where you will be driving initiatives affecting teams and decisions across the company. You'll be a great fit if you thrive when given ownership, as you would be the key decision maker in the realm of architecture and implementation.

*Responsibilities*
- Main duties include: database design, programming, building front-end tools, creating workflows, and system automation
- Architect systems and end-to-end solutions that provide fast, efficient, and reliable interfaces to heterogeneous data and metadata for internal users of the analytics infrastructure
- Automate existing processes and create systems that favor self-service data consumption
- Own the quality of our analytics data
- Implement a robust monitoring and logging framework that guarantees the traceability of inevitable incidents
- Evaluate whether the best solution for each problem at hand is to build, buy, or contract the work
- Interface with data scientists, analysts, product managers, and all other customers of the analytics infrastructure to understand their needs and expand the infrastructure as we grow

*Requirements*
- BS/BA in Computer Science/Engineering or a relevant technical field, with 7 years of experience as a software engineer and/or data engineer and/or front-end engineer and/or full-stack engineer
- *Strong with Python, Java, and Linux*
- Ability to manage data warehouse plans and communicate them to internal clients
- At least 7 years of experience as a data engineer, or in a role that required expertise in data pipeline technologies
- Strong overall programming skills; able to write modular, maintainable, high-quality code
- Strong in web-based programming (CSS, HTML, PHP)
- Experience with one or more data visualization tools/libraries such as Tableau, Plotly, Infogram, Material Design
- Strong Python programming, especially with Python machine learning and data mining libraries (SciPy, pandas, NumPy)
- Strong in Linux and shell scripting
- Specialized experience with at least one of HDFS, EMR, Redshift, Spark, Flink, or Presto
- Experience with a SQL RDBMS is required
- Experience with client/server and RESTful architectures, and with tools like Jenkins and Rundeck
- Experience with the basics of data mining, clustering, and classification; comfortable working with large matrices of data effectively
- Familiarity with CUDA, BLAST, and machine learning frameworks such as TensorFlow, Torch, PyTorch, and DIGITS

*Vineet Mishra*
*Sr. Technical Recruiter*
*1 Point System LLC*
Unit 103, 206 N College St, Pineville, North Carolina, United States - 28134
*W*: www.1pointsys.com
*P*: 803-317-2541
*E*: vin...@1pointsys.com