*Hi,*


*We have an opening for a Data Engineer (ETL).*


*Job Title: Data Engineer (ETL)*

*Location: Irvine, CA*

*Duration: Long-Term Contract*


*Job Description:*




• Bachelor’s degree (or higher) in Computer Science, Engineering, Math, Science, or a related technical discipline required.

• 7+ years of experience in design and development, with expert knowledge of data warehouse architecture and ETL processes. Strong ETL programming skills.

• Proven skill in ETL (Extraction, Transformation, and Load): extracting data from a specific source, applying transformation rules, and loading it into the data warehouse.

• Expert SQL skills (T-SQL, stored procedure programming, functions, triggers) and a willingness to code.

• Experience with, or the ability to learn, C# and .NET.

• Experience gathering and processing raw data (structured and unstructured) at scale, including writing scripts, web scraping, calling APIs, and writing SQL queries on a variety of platforms.

• System Administration/Architecture experience. 

• Data Warehousing: expertise in relational databases/multidimensional data 
warehouse design 

• Ability to diagnose and resolve system bottlenecks 

• Experience in capacity planning for computation and storage needs 

• Competency in developing solutions for diverse and complex business needs.

• Proactive self-starter with the ability to work independently with minimal supervision.

• A collaborative working style and the ability to work both independently and within a team.

• Excellent verbal and written communication skills 

• Ability to work under pressure, adapt to changing environments, and manage multiple large projects.


*How you stand out:*

• MS degree in a technical field or equivalent

• Experience with SQL Server

• Experience building SSIS packages in Microsoft Visual Studio

• Experience building systems that handle datasets ranging from gigabytes to terabytes.

• Experience implementing Big Data technologies (e.g., Pig, Hive, Spark, HBase, Presto, Sqoop, Hadoop, Impala).

• Knowledge of the AWS data stack: S3, EMR, Data Pipeline

• Technical experience with cloud infrastructure

• Project Management Skills and Tools 

• Experience working in complex global environment 

• Container tools experience with Docker or Kubernetes.

• Knowledge of code repositories such as Git/GitHub, TFS, SVN/Subversion, or Perforce.

• Scripting in one of the following: Python, Bash, CMD, PowerShell, Go, Node.js, or Ruby.



*Thanks & Regards,*

*Naga vamshi*

*Work: *+1 734-928-2201 Ext:445

*Direct: *+1 734-928-2256

*Email: *nagavamsh...@greatlogics.com

*Web: *www.greatlogics.com



You received this message because you are subscribed to the Google Groups 
"CorptoCorp" group.