Hi,
Please find the requirement below.
*Any visa is fine, but no transfers.*
*Max rate: $60/hr on C2C. Please do not send resumes above that rate.*
*Azure API development experience is an absolute must. Please do not send
profiles without it.*
Position: Azure Data Engineer
Location: 100% Remote
Duration: 12+ Months Contract
Job Description:
Required Skills:
   - Strong Azure Data Services experience
   - MUST HAVE: Azure API development experience
   - Scala or PySpark
   - Strong SQL knowledge
Key Responsibilities:
- Designs and automates deployment of our distributed system for
ingesting and transforming data from various types of sources (relational,
event-based, unstructured).
- Designs and implements framework to continuously monitor and
troubleshoot data quality and data integrity issues.
   - Implements data governance processes and methods for managing
   metadata, access, and retention of data for internal and external users.
   - Designs and provides guidance on building reliable, efficient, scalable,
   and high-quality data pipelines with monitoring and alerting mechanisms
   that combine a variety of sources using ETL/ELT tools or scripting
   languages.
- Designs and implements physical data models to define the database
structure.
   - Optimizes database performance through efficient indexing and table
   relationships.
- Participates in optimizing, testing, and troubleshooting of data
pipelines.
   - Designs, develops, and operates large-scale data storage and processing
   solutions using different distributed and cloud-based platforms for storing
   data (e.g., Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo,
   DynamoDB, and others).
   - Uses innovative and modern tools, techniques, and architectures to
   partially or completely automate the most common, repeatable, and tedious
   data preparation and integration tasks in order to minimize manual,
   error-prone processes and improve productivity.
- Assists with renovating the data management infrastructure to drive
automation in data integration and management.
   - Ensures the timeliness and success of critical analytics initiatives
   by using agile development methodologies such as DevOps, Scrum, and Kanban.
   - Coaches and develops less experienced team members.
Skillset:
- Hands-on Azure Data experience
- Tools to have: Databricks, Synapse, Data Factory, etc.
   - Strong functional knowledge of Big Data
- Hands-on experience automating the ETL process
   - No specific tool is required; Jenkins is preferred
- Scala or PySpark knowledge (Scala preferred)
Thanks,
Manohar Reddy
IndSoft, Inc
630-524-0011
--
You received this message because you are subscribed to "rtc-linux".
Membership options at http://groups.google.com/group/rtc-linux .
Please read http://groups.google.com/group/rtc-linux/web/checklist
before submitting a driver.
---
You received this message because you are subscribed to the Google Groups
"rtc-linux" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to [email protected].
To view this discussion on the web visit
https://groups.google.com/d/msgid/rtc-linux/CAPY1naGnEq0ARZvKCYURZEKru3M199s9mFbDMF_fKrqPTGy46A%40mail.gmail.com.