*Role: ETL Architect*

*Location: Denver, CO (Day 1 Onsite)*

*Duration: Contract*

*Client: Dish*




*Role & Responsibilities*

*What You Must Have*

·         12+ years of IT experience in Data Engineering, Data Quality,
Data Migrations, Data Architecture, Data Lake formation, and Data Analytics.

·         5+ years of solid hands-on experience with AWS services such as
S3, EMR, VPC, EC2, IAM, EBS, RDS, Glue, Lambda, Lake Formation, etc.

·         Must have experience producing architecture documents for small
to large solution implementations.

·         In-depth understanding of Spark architecture, including Spark
Core, Spark SQL, DataFrames, Spark Streaming, Spark MLlib, etc. Experience
handling very high-volume streaming data in various formats such as JSON,
XML, and Avro, including Snappy-compressed data.

·         Good exposure to Kafka, including capacity planning, partition
planning, and read/write throughput design.

·         Must have worked with Big Data technologies and have good
knowledge of MapReduce and Spark.

·         Must have very good working exposure to different kinds of
databases: RDBMS, NoSQL (columnar and document), distributed databases,
cloud databases, in-memory databases, etc.

·         Python exposure is an added advantage.
