Hello Everyone,


Hope you are doing well.



Please send me your consultant's resume at *d...@technocraftsol.com*, or
feel free to contact me at *614-662-1008*, for the positions of *iScala
Developer* and *Big Data DevOps Engineer* below.


*Position Title:           iScala Developer*

*Location:                   Moore, OK, 73160*


*Description*

The resource will be responsible for developing a set of *SQL programs* to
extract, cleanse, transform, and transfer data between two systems.
Knowledge of SQL and Crystal Reports is a plus.



*Position Responsibilities*



The *iScala ERP* runs in a disparate environment: each hotel has its own
instance of the ERP on a separate server. The integration is currently
working smoothly, with no issues today. The future direction is to operate
from a centralized server and application environment. There is therefore a
need to review the current integration programs, evaluate their ability to
scale, verify that they follow proper coding standards with error and
exception handling, and develop new interfaces, reports, and custom forms.
The integration will support 100+ hotels.



*Following are the proposed integrations required:*

• *ERP – iScala to reconciliation tool – Blackline*

• *eProcurement – Birchstreet to AP workflow tool – Genpact ESM (two-way)*

• *Genpact ESM to iScala (two-way)*

• *iScala to Hyperion – for daily reports*

• Gather and analyze business requirements from business owners, evaluate
the feasibility and provide level of effort estimates to complete the
enhancements/change requests.

• Design and develop technical/functional solutions for addressing business
requirements in the iScala application.

• Independently write functional, technical design documents, test scripts
and coordinate Business UAT.

• Recommend changes in development, maintenance and system standards.

• Work on all third-party applications integrated with the iScala
application to ensure processes are in place to run all external interfaces
on an agreed-upon schedule, configuring and maintaining scheduled
concurrent requests.

• Ensure program code is written according to documented software
development processes and standards.

• Develop efficient and effective test plans and protocols for all software
development solutions.

• Perform impact analysis, obtain approvals, and collaborate with other
team members within IT and the business before production changes are made.

*Additional must-have skills:*

• Experience developing technical components in *MS SQL DB*

• Experience developing reports in *Crystal Reports*

• Proficiency in *MS Servers*

• Experience with the *iScala application* is a plus



*Personal Attributes: *

• Strong written and oral communications skills.

• Strong systems/process orientation with demonstrated analytical thinking,
organization skills and problem solving skills.

• Ability to work in a team-oriented, collaborative environment.

• Ability to quickly pick up new tools and technologies.



*Deliverables*

• Data flow diagrams

• Technical specs for integration and common components

• Code components for interfaces, reports, and custom forms

• Test cases



*Position:        Big Data/DevOps Consultant*


*Duration:       Long term*

*Location:       Basking Ridge, NJ*


*Core Keywords:*

·         *Hadoop, Hive, HBase, Flume, Spark, Storm, Kafka, Ambari, Nagios,
Ganglia, Cloudera, Mahout, Talend, Sqoop, Oozie, Python, Java, Pig, RHEL,
scripting*



*Experience:*

·         Bachelor's or Master's degree in Computer Science, Computer
Engineering, or related field

·         6+ years' experience performing *DevOps primarily on Hadoop
ecosystems* using the stack elements above, *having owned/maintained 2 or
more unique running systems* in that time period

·         6+ years of scripting experience with *Python, R, Scala, Pig,
Oozie, Java or similar*

·         3+ years of recent experience designing or maintaining secured
environments using *Kerberos, PKI, ACLs, etc*

·         2+ years of *ETL experience with tools like Flume, Sqoop, Talend
or* *similar*



*Responsibilities*:

·         Streamline and enhance the day-to-day operational workflow of an
enterprise-level Hadoop environment.

·         Constantly monitor, measure, and debug performance of a system
streaming GBs of data per day, focusing on data-driven metrics and
reliability and verification of data flows.

·         Work closely with Big Data Architect and Business Owners to
ensure performance of system is consistent with intended design and
business cases, while looking for ways to simplify processes, improve data
ingestion, analysis, and delivery, and optimize the use of resources.

·         Suggest future improvements, risks, challenges, or strategies for
the platform as it develops and grows into the future.

·         Create and present reports, presentations, and visualizations to
technical leads and executives demonstrating functionality of the platform
and justifying operational behavior.



*Required Skills:*

·         Thorough and extensive knowledge of the *Hadoop ecosystem and
distributed computing, including but not limited to Hadoop, Hive, HBase,
MapReduce, Zookeeper, YARN, Flume, Tez, Spark, Storm, Kafka, Ambari,
Mahout, Flink, Talend, Sqoop, Oozie, Zeppelin*

·         Expert at writing and debugging multiple scripting languages *(R,
Python, Java, Pig, Oozie)* for low-level processing, scheduling tasks,
analytics, and similar

·         Understand multiple *Linux distributions at a very deep level
(RHEL required)* running in the cloud, containers, or bare metal

·         Expert in monitoring and debugging tools and practices, such as
Ganglia, Nagios, Cloudera Manager, and more, capable of surfacing
performance metrics and other KPIs to leadership to provide operational
summaries and checkpoints

·         Knowledge of modern security best practices and techniques for
encrypting data in transit and at rest, protecting data privacy without
sacrificing performance or data analysis capabilities

·         Knowledge and experience interacting with application servers and
web servers such as Nginx, Redis, IBM WebSphere, Tomcat, WebLogic, etc.

·         Experience with ETL applications and techniques using *Flume,
Sqoop, Talend, Sybase, etc*.

·         Experience with virtualization technologies and cloud platforms



*Other Desired Skills:*

·         Excellent interpersonal, oral, and written communication skills

·         Highly motivated and success-driven with a strong sense of
ownership

·         Comfortable working in a fast-paced, Agile, competitive
environment

·         Ability to work independently and in group environments

·         Ability to problem-solve effectively and efficiently



*Thanks and Regards*


*Dev*

Sr. IT Recruiter

*Email* : d...@technocraftsol.com

*Contact*: 614-662-1008

*Yahoo ID / Gmail ID* :  divyanshutechnocraft

*Linkedin : *www.linkedin.com/pub/dev-grover/a7/88/a/

www.technocraftsol.com

www.xdimensiontech.com

Partner with XDimension Technology



Note: Technocraft Solutions LLC works with Direct Clients and Preferred
Vendors Nationwide.

Your confirmation would mean that you understand the level of Technocraft
Solutions LLC's association with the mentioned project and that you will
not approach the Technocraft Solutions LLC client directly.

-- 
You received this message because you are subscribed to the Google Groups "Open 
Source Erp & Crm" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to open-source-erp-crm+unsubscr...@googlegroups.com.
To post to this group, send email to open-source-erp-crm@googlegroups.com.
Visit this group at https://groups.google.com/group/open-source-erp-crm.
For more options, visit https://groups.google.com/d/optout.