*Hello Partners,*

*Please review the following two (2) requirements and share your candidates' 
resumes with me directly at [email protected] / ph: 919-689-5606.*

*Please share candidate resumes with contact details. Please email me if 
I'm not able to respond to your calls.*

 

*1) Role: Data Modeler*

*Client: Sumitomo*

*Location: NYC, NY*

*Duration: Long term*

*Any visa accepted (OPT, CPT, H1B, H1B Transfer, H4-EAD, GC-EAD, E3, L1, 
L2-EAD, USC, and GC).*

 

*The objectives of this position include, but are not limited to:*


   - Minimum of 8 years of IT experience.
   - Elicit business requirements for information (data) modeling focusing 
   on data objects
   - Visually depict information requirements for projects/subject areas 
   where data are well-defined and relationships are reflected
   - Create As-Is and To-Be information models reflecting business-related 
   data elements
   - Model Conceptual, Logical, and Physical Information Models / Data Models 
   in the Rational Software Architect (RSA) tool
   - Generate Data Dictionary in a spreadsheet from the information model

*Required:*


   - One to five years of experience with data modeling
   - Previous experience modeling conceptual and logical information data 
   models using UML
   - Proficient with Microsoft Office Suite including Visio
   - Ability to obtain and maintain a security clearance.


*2) Role: Sr. Data Engineer / Big Data / Hadoop Developer*
*Location: Richmond, VA or McLean, VA*
*Client: Capital One*
*Duration: Long term*
*Any visa accepted*
*Minimum Experience: 8+ years (please share resumes with a minimum of 
8 years' experience).*
*Rate: Negotiable as per experience*

*Responsibilities of the role: *

   - Build data pipeline frameworks to automate high-volume and 
   real-time data delivery to our cloud platform
   - Build the infrastructure required for optimal extraction, 
   transformation, and loading of data from a wide variety of data sources 
   using SQL and AWS ‘big data’ technologies
   - Develop and enhance applications using a modern technology stack such 
   as Java, Python, shell scripting, Scala, Postgres, AngularJS, React, and 
   cloud-based data warehousing services such as Snowflake.
   - Perform unit tests and conduct reviews with other team members to make 
   sure your code is rigorously designed, elegantly coded, and effectively 
   tuned for performance.

*Required Experience:*

   - 5+ years of experience building data pipelines and using ETL tools to 
   solve complex business problems in an Agile environment
   - 5+ years of experience in at least one scripting language (SQL, 
   Python, Perl, JavaScript, Shell) 
   - 3+ years of experience using relational database systems (Snowflake, 
   PostgreSQL, or MySQL) 
   - 3+ years of experience working on streaming data applications (Spark 
   Streaming, Kafka, Kinesis, or Flink)
   - 3+ years of experience in big data technologies (MapReduce, Cassandra, 
   Accumulo, HBase, Spark, Hadoop, HDFS, AVRO, MongoDB, or Zookeeper).
   - 2+ years of experience with Amazon Web Services (AWS).

-- 
You received this message because you are subscribed to the Google Groups 
"Resumes" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/resumes/8f0882a6-d6e8-418d-ba6b-504a52bd2310o%40googlegroups.com.
