*Please send resumes to [email protected]*


*Big Data Architect*

Rate: DOE

Location: San Ramon, CA



We are looking for an architect with strong Cassandra expertise.
The candidate should have the following skills and at least three years of
working experience with Cassandra.

   - Thorough knowledge and experience of Cassandra architecture, data
   modeling, and development.
   - Solid data warehouse experience.
   - ETL/ELT experience on traditional and Big Data platforms.
   - Working experience with RDBMS and NoSQL databases.
   - Hands-on experience in Java and Scala.


This role provides architectural guidance to the data engineering team:
defining strategies and developing and delivering solutions that enable the
collection, processing, and management of information from a variety of
sources, and the subsequent delivery of that information to audiences in
support of key business processes.

The architect is expected to be self-driven, take charge of identifying and
fixing problems, and excel in a fast-changing, growing environment.

The Big Data architect understands multiple Big Data platforms and can
apply their power, flexibility, and performance to digital programs.

*Responsibilities:*

• Work with customers, users, architects, and application designers to
define the data requirements and structure for the application.
• Provide technology guidance, evaluate tools, perform POCs, and design and
document solutions.
• Establish and maintain policies, procedures, and standards for data
management and processing.
• Identify, assess, and solve complex business problems for the area of
responsibility, where analysis of situations or data requires in-depth
evaluation of variable factors.
• Lead the analysis, modeling, and design of the application data structure,
storage, integration, deployment, and support.
• Work with customers to implement security and access controls.
• Design and implement integrated solutions with analytics and machine
learning modules.
• Work with the integration solution architects and designers to design the
integration solution.
• Design and document conceptual/logical models and implement physical
models.
• Assist in determining the cross-application data standards, data
distribution standards, and tuning strategies.
• Work with various NoSQL databases and in-memory, indexing, and search
tools to implement complex solutions.
• Work with different relational databases, ETL/ELT tools, and Big Data
technologies from the Hadoop ecosystem, such as Hadoop, Hive, Pig, and HBase.
• Review the database deliverables throughout development to ensure quality
and traceability to requirements and adherence to all quality management
plans and standards.
• Support the development and test teams with the creation of test data.
Ensure test data conforms to data security requirements.
• Exercise some latitude in decision-making; act independently to determine
methods and procedures on new assignments.
• Manage medium to large teams and/or work efforts while remaining hands-on
with technologies as an individual contributor.

*Required Skills & Experience:*

• Good understanding of the SDLC process and experience working in an agile
environment.
• Minimum of 10 years of experience working on data warehouse and
integration solutions.
• Experience in data analytics with Big Data.
• Good working experience with data integration and warehousing concepts:
dimensional and relational models, ETL tools, and reporting.
• Good experience integrating multiple Big Data solutions with legacy
database systems.
• Experience processing large amounts of structured and unstructured data.
• Expert-level knowledge of Hadoop ecosystem components: Hadoop, MapReduce,
Pig, Hive, Solr, Elasticsearch, Spark, Kafka, Storm, Falcon, Oozie, HAWQ,
GemFire XD, etc.
• Expert-level knowledge of one or more NoSQL databases: HBase, Cassandra,
or MongoDB.
• Advanced skills in one or more scripting languages (e.g., Python, UNIX
shell scripts).
• Ability to quickly understand business problems and find patterns and
insights.
• Ability to quickly learn new technologies and work effectively in a very
dynamic environment.
• Proven ability to build, manage and foster a team-oriented environment
• Proven ability to work creatively and analytically in a problem-solving
environment
• Excellent communication (written and oral) and interpersonal skills
• Excellent leadership and management skills

*Additional Preferred Qualifications:*

• Master’s degree or higher in Computer Science or related field
• Working experience in IoT projects
• Good knowledge of data science and data analytics and the use of machine
learning algorithms.
• Deep knowledge in data mining, machine learning, natural language
processing, or information retrieval.
• Strong knowledge of and experience with statistics, and potentially other
advanced math as well.



Thanks,
*Annie Green*

Lead Recruiter,

Zenithsoft Systems Inc.,
2975 Bowers Ave, Suite 327, Santa Clara CA 95051
Phone: +1 (925)-592-6011
www.zenithsoftsystems.com
*It's our pleasure to serve you!*
Serving IT industry since Year 2001
