*Data Architect*

*Cincinnati, OH*

*12 months*

*Must have 10+ years of experience*

*SKYPE & F2F*



*Job Description:*

·         Extensive experience with data architecture, big data, data
lakes, data modeling (less critical), and strong functional knowledge of
Hadoop

·         Good understanding of best practices in Master Data Management,
Data Standards/Definitions, Data Quality, and Big Data.

·         Experience with major big data solutions such as Hadoop,
MapReduce, Hive, HBase, MongoDB, and Cassandra.

·         Experience with big data tools such as Impala, Oozie, Mahout,
Flume, ZooKeeper, and/or Sqoop is a big plus.

·         Should have a firm understanding of major programming/scripting
languages such as Java, PHP, Ruby, Python, and/or R, along with Linux
shell scripting, and experience with ETL tools such as Informatica and
Pentaho.

·         Should have experience designing solutions for multiple large
data warehouses, with a good understanding of cluster and parallel
architecture, high-scale or distributed RDBMS, and/or NoSQL platforms.

·         Good understanding of business process taxonomy in the finance,
credit risk, and fraud areas.

·         TOGAF certification is desirable.

·         Individual contributor role; will be involved in strategic
discussions with customer leadership to build out this team.

·         Must have completed at least two end-to-end Hadoop implementation
projects in the banking industry.

·         Must be open to working EST shift hours.



*Essential duties and responsibilities include the following:*

·         Contribute to the overall data architecture strategy and work
with customer leadership to define appropriate data architecture policies
based on the analysis, design and creation of the enterprise data
warehouses.

·         Perform detailed analysis of business processes to define
architecture requirements.

·         Benchmark systems, analyze system bottlenecks, and propose
solutions to eliminate them.

·         The initial focus of the data architect will be the
implementation of the customer's big data/Hadoop platform, the selection
and implementation of an integrated enterprise data model, and external
reporting.

·         Evaluate data models; diagnose, research, and resolve SQL and
database-related faults and performance problems.

·         Function as an expert in the design, development, modification,
and debugging of data models, databases, etc.

·         Perform feasibility analysis on potential future data
implementations.

·         The individual must be able to assess a variety of data-related
technologies, platforms, and standards, and share a viewpoint aligned with
business goals.

·         Perform cost-benefit analysis, if needed, in support of IT
planning, technology evaluation and budget management in coordination with
customer leadership.

·         Keep current with database, data modeling, and data warehousing
trends and technological innovations in order to make appropriate
recommendations.

·         Responsible for the overall design and development of the vision
underlying the proposed big data solution.





Thanks & Regards

Starc

st...@inteletechglobal.com

3214210611
