Job Type: *6 Months Contract*


Location: *Palo Alto, CA*



Client: *Verizon*


Big Data Architect



Local candidates only



*Overview*

The Verizon Big Data team is looking for a Big Data Architect with
expert-level experience in web services, messaging, and Big Data
technologies. You will be part of the team building one of the largest Big
Data platforms in the world, capable of ingesting hundreds of terabytes of
data consumed for Business Analytics, Operational Analytics, Text
Analytics, and Data Services, and you will build Big Data solutions for
various Verizon business units. You will architect the Big Data platform
and its solutions.



*Responsibilities:*

Architect, design, and build a fault-tolerant, scalable Big Data platform
based primarily on the Hadoop ecosystem.

Build a high-throughput messaging framework to transport high-volume data
(a minimal producer sketch appears after this list).

Use different protocols as needed for different data services
(NoSQL/JSON/REST/JMS).

Develop a framework to deploy RESTful web services.

Build ETL, distributed caching, transactional and messaging services.

Architect and build a security-compliant user-management framework for the
multi-tenant Big Data platform.

Build High-Availability (HA) architectures and deployments primarily using
big data technologies.

Create and manage data pipelines.
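
For illustration only, here is a minimal sketch of the kind of
high-throughput Kafka producer referenced above, written in Java. The
broker address, topic name, key, and payload are assumptions made for the
example, not details from this posting.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Minimal Kafka producer sketch. Broker address, topic name, key, and
// payload are assumptions for illustration, not details from this posting.
public class IngestProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("acks", "1");               // favor throughput over strongest durability
        props.put("linger.ms", "20");         // allow a small batching delay
        props.put("compression.type", "lz4"); // compress batches on the wire

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("ingest-events", "device-42", "{\"bytes\":1024}");
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("sent to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // close() flushes any buffered records
    }
}

The linger.ms and compression settings are the usual levers for raising
producer throughput, trading a small amount of latency for larger batches.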



*Desired Skills:*

·         Bachelor's degree in Computer Science, Management Information
Systems, or a related field.

·         At least 12 years of experience building and managing complex
products/solutions.

·         Proven track record of architecting distributed solutions that
handle very high volumes of data.

·         Strong understanding of virtual machine technologies, physical
machines, networking and storage systems.

·         7+ years of experience with distributed, highly-scalable,
multi-node environments.

·         Expert-level experience with Big Data technologies
(Lucene/Solr/Elasticsearch, Hive, HBase, Spark, Kafka, YARN, Storm, Splunk,
Vertica/Cassandra), and an understanding of the concepts and technology
ecosystem around both real-time/streaming and batch processing in Hadoop.

·         Expert-level experience with Couchbase Server.

·         Expert-level experience with GlusterFS.

·         Experience with DevOps tooling (Puppet, Chef, Python, Ansible)

·         Experience developing RESTful web services with the Spring
framework (a minimal sketch appears after this list)

·         Working knowledge of web technologies and protocols
(NoSQL/JSON/REST/JMS)

·         7+ years of experience working in Linux/Unix environment.

·         Expert-level experience architecting, building, and maintaining an
enterprise-grade, petabyte-scale Hadoop store.

·         Expert-level experience with shell scripting, Perl, and Python.

·         Expert-level experience in Java.

·         Experience in Scala programming and knowledge of the SNMP protocol
and NetFlow data are a plus.

Most importantly, be a good team player with a willingness to learn and to
mentor team members.
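
For illustration only, here is a minimal sketch of a RESTful endpoint built
with Spring Boot, as referenced in the skills list above. The class name,
endpoint path, and response payload are hypothetical assumptions, not
details from this posting.

import java.util.Map;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Minimal Spring Boot REST sketch. The class name, endpoint path, and
// response payload are assumptions for illustration, not details from
// this posting.
@SpringBootApplication
@RestController
public class DataServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(DataServiceApplication.class, args);
    }

    // Hypothetical read-only endpoint returning a small JSON document.
    @GetMapping("/api/v1/metrics/{deviceId}")
    public Map<String, Object> metrics(@PathVariable("deviceId") String deviceId) {
        return Map.of("deviceId", deviceId, "bytesIngested", 1024L);
    }
}

In practice such a service would sit in front of the Big Data stores listed
above (HBase, Hive, Couchbase, and so on) rather than return hard-coded
values.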





*Thanks & Regards*

*RK*

*Sales Manager*

Gtalk/Yahoo: grk2403

*Ph: (925) 973-0000 x 219 (Off) | Fax: (408) 273-6002*

*Email: r...@allianceit.com | http://www.allianceit.com*
