Dear Folks,
Please find the hot requirements below and respond immediately to
[email protected]


Open Reqs: 1. Sr. Workday Integration Consultant // 2. Sr. Technical Hadoop
Architect // 3. Sr. Talend Developer with Big Data // 4. Hadoop Admin //
5. AWS Admin


*Sr. Workday Integration Consultant*
*Location – Foster City*
*Duration – 1 Year*
*Must have 10+ years of total experience. Please do not send resumes with
only 7 years of experience.*
• Ability to help clients resolve integration issues requiring in-depth
expertise in the solution
• Demonstrated integration experience in Human Resources-related domains
(Payroll, Benefits, Recruiting, etc.) and/or Financials Management.
• Coordinate assigned deliverable schedules to ensure client’s timelines
are met. Provide guidance to clients or service partners on integration
strategies.
• Contribute to the customers’ experience with integration products, tools,
and services in a way that results in high customer satisfaction
• Lead project teams through integration discovery and scoping phases of
their implementation projects
• Engage where appropriate in the sales cycle to help scope integration
projects
• 10+ years implementing integration solutions with ERP software;
PeopleSoft is a must
• Sound understanding of the HCM domain
• Experience with HCM applications like PeopleSoft and Workday
• Experience with migrating data from PeopleSoft to Workday
• Must have worked with reporting and data extraction from PeopleSoft
• Experience with inbound and outbound integrations with Workday is a plus
• Experience with at least one full implementation of Workday HCM
• Good analytical skills and database background
• Ability to write and understand SQL scripts
• Team player and self-driven
• Ability to work with Business, IT, and Vendor teams
• Command of Service-Oriented Architecture concepts
• EDI, Web Services, XML, XSLT, Java, .NET, Informatica, or other
integration technologies

*Sr. Technical Hadoop Architect*
*Location – Campbell CA*
*Duration – 6 Months*
*Must have 12+ years of total experience.*
• 12+ years of solid knowledge of object-oriented design principles and
development
• Recent hands-on experience in designing and developing Java-based,
secure, high-availability, enterprise-level platforms/products
• Direct experience with architectural and design patterns such as
n-tier, lambda, etc.
• 3+ years of hands-on experience with Hadoop ecosystem technologies
like HBase, MapReduce, Spark, Pig, Hive, Flume, Sqoop, Cloudera Impala,
ZooKeeper, Oozie, Hue, Kafka
• Experience with search systems (Lucene, Solr, Elasticsearch)
• Experience in capacity planning, cluster design and deployment, and
troubleshooting and tuning of clusters
• Exposure to high-availability configurations, Hadoop cluster
connectivity and tuning, and Hadoop security configurations
• Strong experience with Cloudera/Hortonworks/MapR versions along with
monitoring/alerting tools (Nagios, Ambari, Cloudera Manager)
• Experience in Hadoop cluster migrations or upgrades
• Strong knowledge of RESTful web services
• Strong scripting skills in Python / shell scripting
• Deep knowledge of and experience with the J2EE platform, Hibernate,
Spring
• Strong knowledge of SQL and of MS SQL Server, Oracle, MySQL, or
PostgreSQL
• Knowledge of hybrid cloud solutions (Azure, Google Compute Engine, and
Amazon Cloud) with hands-on experience in direct API integration is an
advantage
• Must have a track record of delivering products and experience with
the complete software lifecycle, including software requirements and
design specification, definition, and creation/verification of
engineering test procedures

*Sr. Talend Developer with Big Data*
*Location – Campbell CA*
*Duration – 6 Months*
*Must have 12+ years of total experience.*
• Proven understanding of and related experience with Hadoop, HBase,
Hive, Pig, Sqoop, Flume, and/or MapReduce
• Experience working with Talend for Big Data
• Basic UNIX OS working skills and shell scripting skills
• Prefer experience with Hortonworks Hadoop distribution components and
custom packages
• Help design, develop, and deploy the ETL processes for the DWH team
using Hadoop (Pig, Hive) on the Hortonworks distribution
• Create, debug, and execute Talend mappings, sessions, and tasks.
• Participate in development and defect resolution on existing ETL
processes
• Collaborate with the Data Warehouse team to design and develop
required ETL processes; performance-tune ETL programs/scripts.
• Maintain and enhance ETL code; work with the QA and DBA teams to fix
performance issues.
• Responsible for the documentation of all Extract, Transform and Load
(ETL) processes
• Experience working with Hortonworks Hadoop
• Excellent RDBMS (Oracle, SQL Server) knowledge for development using
SQL/PL/SQL.
• Analyze complex distributed production deployments and make
recommendations to optimize performance.
• Team player excited to work in a fast-paced environment; Agile
experience preferred.

*Hadoop Admin*
*Location – Chicago IL*
*Duration – 3 Months*
*Must have 12+ years of total experience.*
• Installation and configuration of the selected Hadoop version
(HDFS/HDInsight) and components thereof
• Installation and configuration of the selected ETL tool
(Talend/Informatica)
• Installation of RDBMS as needed (SQL Server)
• Setting up the MicroStrategy and Tableau environments and adding them
to the new infrastructure
• Working with and advising client infrastructure teams on the following
activities:
• Setting up zoning/security between the on-premise and cloud
infrastructure
• Provisioning the required VMs based on the architecture/tools that are
finalized (Hadoop, ETL, RDBMS) in the enterprise AWS/Azure cloud
• Single sign-on authentication from client laptops to cloud/on-premise
servers/VMs
• Configuration and management of firewalls/ports between servers in the
cloud infrastructure
• VPN tunneling setup for data pull from the on-premise datacenter to
cloud applications
• Ensuring that network bandwidth between the on-premise datacenter and
cloud infrastructure is optimal for data pull
• High availability, backups, and disaster recovery management

*AWS Admin*
*Location – Campbell CA*
*Duration – 4 Weeks*
*Must have 10+ years of total experience.*
Extensive hands-on experience building and maintaining production web
infrastructures on the Amazon Web Services (AWS) cloud platform.
Deep experience with some of the following AWS components: EC2, S3,
Elastic IPs, EBS, Security Groups, Route 53, VPC, ElastiCache, and
CloudFormation.
Strong understanding of the HTTP protocol.
Strong knowledge of the following:
• Connectivity between two AWS clouds
• Firewall configuration (between two AWS clouds / on-premise to AWS
cloud)
• Authentication (SAML 2.0 compliant; integration with Okta/Windows AD)
Support the setup, configuration, and maintenance of an effective AWS cloud
environment.
Ensure automatic provisioning of network and storage resources, governed
and controlled by automation, standards, roles, and policies.
Help develop a cloud architecture strategy and resolve architectural
issues.
Contribute to the creation and maintenance of a cloud technology strategy
and roadmap to influence IT decision making.
Understand security best practices, policies and standards to design highly
secure cloud architectures for internal and external cloud solutions.
Collaborate with stakeholders to understand business requirements and build
cloud architecture designs that meet the business needs.


Thanks & Regards
Prasad


6500 Dublin Blvd., Suite 209
Dublin, CA 94568
Email - [email protected]
Fax - 925-905-4370
URL - www.zen-sol.com

Offices: Dublin, CA; Frisco, TX; Framingham, MA
