*Hi,*

*Please look at the position below and, if you feel it is a good fit, please
send me your updated resume.*



*Position: Hadoop Architect*
*Location: Fort Worth, TX*
*Duration: 18+ Months (Extendable)*
*Interview Mode: Phone / Skype*

*Job Description*
An experienced Hadoop administrator/architect is needed to provide subject
matter expertise on all aspects of Hortonworks (required) and Cloudera
(preferred) Hadoop deployments.

*Job Responsibilities*

·         Architecting scalable & highly available Hadoop environments

·         Working with business and application development teams to
provide effective technical designs aligning with industry best practices

·         Capturing the as-is operational state and working with team
members to define the roadmap and future state

·         Defining secure, highly reliable integration strategies for
applications and customer systems

·         Interfacing with other groups such as security, network,
compliance, storage, etc.

·         Providing subject matter expertise, training, and direction to
team members.

·         Recommending and aiding in establishing development, testing and
documentation standards

·         Monitoring and ensuring compliance with architectural and
development standards

·         Identifying and recommending new technologies, architectures,
processes and tools to increase efficiency and productivity

·         Engaging with external vendors to evaluate products and lead POC
development

·         Working with multiple products and technologies at all tiers of
the architecture to guide the design and implementation of innovative,
scalable and robust solutions

·         Installing, configuring, monitoring, and administering large Hadoop
clusters

·         Designing and participating in the testing of DR, replication, and
high-availability solutions

·         Implementing Kerberos, Knox, Ranger, and other security
enhancements

·         Arranging and managing maintenance windows to minimize the impact of
outages on end users

·         Independently managing, leading, and executing complex projects
with cross-functional teams

*Job Requirements*

·         4+ years of experience on Hadoop clusters; minimum of 2 years
with the Hortonworks distribution

·         Expert-level experience in Hadoop infrastructure design and
deployment

·         8+ years of experience in Linux-based systems or database
administration

·         Experience with analytical tools, languages, or libraries

·         Hands-on experience with production deployments of Hadoop
applications

·         Strong understanding of best practices and standards for Hadoop
application design and implementation

·         Hadoop administration experience that includes:

o    Installing, configuring, monitoring, and administering large Hadoop
clusters

o    Backup & Recovery of HDFS

o    DR, Replication & High Availability of Hadoop infrastructure

o    Securing the cluster using Kerberos, LDAP Integration, and/or Centrify

o    Managing & scheduling jobs

o    Managing Hadoop queues, access controls, user quotas, etc.

o    Capacity planning, configuration management, monitoring, debugging,
and performance tuning

·         Experience monitoring and troubleshooting using a variety of open
source and proprietary toolsets

·         Hands-on experience with Big Data use cases & development

·         Experience with Hadoop tools/languages/concepts such as HDP 2.0+,
HDFS, Hive, Oozie, Sqoop, Pig, Flume, Spark, Kafka, Solr, HBase, Ranger,
Knox, MapReduce, etc.

·         Expert understanding of MapReduce and how the Hadoop Distributed
File System works

·         Understanding of enterprise ETL tools (e.g. DataStage)

·         Experience with related/complementary open source software
platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef,
Puppet)

·         Understanding of relational databases (RDBMS), SQL, and NoSQL
databases

·         Ability to coordinate and prioritize multiple tasks in a
fast-paced environment

·         Strong verbal/written communication and presentation skills

·         Experience with highly scalable or distributed RDBMSs (Teradata,
Netezza, Greenplum, etc.) is preferred

·         Knowledge of cloud computing infrastructure and considerations
for scalable, distributed systems is preferred

·         Experience with Statistical Modeling, Data Mining and/or Machine
Learning is preferred

·         Ability to build relationships and work effectively with
cross-functional, international teams







Regards,

Ranjeet Kumar

Technical Recruiter

VSG Business Solutions

Phone: 302-261-3207, Ext. 112

Email: ranj...@vsgbusinesssolutions.com

Hangouts: vsgranjeet
