Hi,

This is *Abdul Raheem* from *ExaTech Inc*. I have an immediate opening
described below. Please review it and let me know if you are interested.



*Role: GCP Data Engineer*

Location: Dallas, TX or Hartford, CT (hybrid; local candidates preferred)

Duration: 12+ months

Interview: Video




*Must have: GCP and Teradata*


*Job Description:*

We are seeking a skilled Data Engineer to support a high-impact enterprise
data migration initiative. The goal is to migrate data warehouse assets and
ETL pipelines from Teradata to Google Cloud Platform (GCP). The role
involves hands-on development, testing, and optimization of data pipelines
and warehouse structures in GCP, ensuring minimal disruption and maximum
performance.



*Key Responsibilities:*

·         Lead and execute migration of data and ETL workflows from
Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow,
Dataproc, and Composer (Airflow).

·         Analyze and map existing Teradata workloads to appropriate GCP
equivalents.

·         Rewrite SQL logic, scripts, and procedures in GCP-compliant
formats (e.g., standard SQL for BigQuery).

·         Collaborate with data architects and business stakeholders to
define migration strategies, validate data quality, and ensure compliance.

·         Develop automated workflows for data movement and transformation
using GCP-native tools and/or custom scripts (Python/Java).

·         Optimize data storage, query performance, and costs in the cloud
environment.

·         Implement monitoring, logging, and alerting for all migration
pipelines and production workloads.
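To make the SQL-conversion responsibility concrete, here is a minimal, purely illustrative Python sketch of the kind of mechanical rewrites involved when porting Teradata SQL to BigQuery standard SQL (e.g., `SEL` shorthand, `ZEROIFNULL`, `ADD_MONTHS`). The function name and rule set are hypothetical; a real migration would use a proper SQL translator, not regexes.

```python
import re

# Illustrative only: a few mechanical Teradata -> BigQuery rewrites.
# Real migrations require a full parser/translator; these regexes
# cover toy cases to show the shape of the work.
TERADATA_TO_BQ = [
    # Teradata's SEL shorthand -> standard SELECT
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),
    # ADD_MONTHS(d, n) -> DATE_ADD(d, INTERVAL n MONTH)
    (re.compile(r"\bADD_MONTHS\(([^,]+),\s*([^)]+)\)", re.IGNORECASE),
     r"DATE_ADD(\1, INTERVAL \2 MONTH)"),
    # ZEROIFNULL(x) -> IFNULL(x, 0)
    (re.compile(r"\bZEROIFNULL\(([^)]+)\)", re.IGNORECASE),
     r"IFNULL(\1, 0)"),
]

def to_bigquery_sql(teradata_sql: str) -> str:
    """Apply a handful of mechanical Teradata -> BigQuery rewrites."""
    out = teradata_sql
    for pattern, repl in TERADATA_TO_BQ:
        out = pattern.sub(repl, out)
    return out

print(to_bigquery_sql(
    "SEL ZEROIFNULL(amt), ADD_MONTHS(order_dt, 3) FROM orders"
))
# -> SELECT IFNULL(amt, 0), DATE_ADD(order_dt, INTERVAL 3 MONTH) FROM orders
```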



*Required Skills:*

·         6+ years of experience in Data Engineering, with at least 2 years
in GCP.

·         Strong hands-on experience in Teradata data warehousing, BTEQ,
and complex SQL.

·         Solid knowledge of GCP services: BigQuery, Dataflow, Cloud
Storage, Pub/Sub, Composer, and Dataproc.

·         Experience with ETL/ELT pipelines using tools like Informatica,
Apache Beam, or custom scripting (Python/Java).

·         Proven ability to refactor and translate legacy logic from
Teradata to GCP.

·         Familiarity with CI/CD, Git, and DevOps practices in cloud data
environments.

·         Strong analytical, troubleshooting, and communication skills.



*Preferred Qualifications:*

·         GCP certification (e.g., Professional Data Engineer).

·         Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on
GCP.

·         Experience working in the healthcare, retail, or finance domains.

·         Knowledge of data governance, security, and compliance in cloud
ecosystems.



Thanks and regards,

*Abdul Raheem, Sr. Talent Acquisition Lead*

*Email: rah...@exatechinc.com*

Skype & Hangout: mdabdulrahee...@gmail.com

*4555 Lake Forest Drive, Suite 650 | Cincinnati, OH 45242*

An E-Verified Company

USA-Canada-INDIA

-- 
You received this message because you are subscribed to "rtc-linux".
Membership options at http://groups.google.com/group/rtc-linux .
Please read http://groups.google.com/group/rtc-linux/web/checklist
before submitting a driver.