Hello Vendors,
Hope you are doing well!
Candidate should be available on Saturday.
Please make sure this profile has not been submitted to IMPETUS/AMEX in the last
4-5 months.
Visa :: OPT EAD, H1B, or USC only; candidates with 5 years of experience are workable.
Not workable :: Do not share H4 EAD, GC EAD, or GC candidates.

Please share your interest and resume at *vi...@stglobaltech.com*.


*Role: Big Data Engineer with GCP. Location: Phoenix, AZ. Locals only.*
Job Description:
We are looking for a Big Data Engineer with expertise in Google Cloud
Platform (GCP) to design, develop, and optimize large-scale data processing
systems. The ideal candidate will have experience working with GCP data
services, big data frameworks, and data pipeline orchestration to drive
scalable and efficient data solutions.
Key Responsibilities:
• Design, develop, and maintain end-to-end data pipelines on GCP.
• Work with BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and other
GCP services for data processing.
• Optimize data storage, retrieval, and transformation processes for
scalability and performance.
• Develop and maintain ETL/ELT pipelines using Apache Spark, Apache Beam,
or Cloud Data Fusion.
• Ensure data quality, governance, and security within the cloud
environment.
• Collaborate with data scientists, analysts, and application teams to
deliver data-driven solutions.
• Automate data workflows and orchestration using Cloud Composer (Apache
Airflow).
• Implement real-time data streaming solutions using Pub/Sub, Kafka, or
similar tools.
• Monitor and troubleshoot data pipelines to ensure reliability and
performance.
• Work with Terraform, CloudFormation, or Infrastructure as Code (IaC) for
environment setup and automation.
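To make the pipeline responsibilities above concrete, here is a minimal, stdlib-only Python sketch of the transform-and-validate stage of such a pipeline. The record fields (`user_id`, `event_time`, `amount`) are illustrative assumptions, not any actual schema; in practice this logic would run inside a Dataflow/Beam job or a Composer-orchestrated task.

```python
import json
from datetime import datetime, timezone

def transform(records):
    """Normalize raw event records: parse timestamps to UTC, convert
    amounts to integer cents, and drop malformed rows (a simple
    data-quality gate). Field names are hypothetical for this sketch."""
    out = []
    for rec in records:
        try:
            ts = datetime.fromisoformat(rec["event_time"])
            out.append({
                "user_id": rec["user_id"],
                "event_time": ts.astimezone(timezone.utc).isoformat(),
                "amount_cents": round(float(rec["amount"]) * 100),
            })
        except (KeyError, ValueError):
            continue  # skip rows with missing fields or unparsable values
    return out

raw = [
    {"user_id": "u1", "event_time": "2024-03-01T10:00:00+00:00", "amount": "12.50"},
    {"user_id": "u2", "event_time": "not-a-date", "amount": "3.00"},
]
clean = transform(raw)
print(json.dumps(clean, indent=2))
```

The malformed second row is dropped rather than failing the whole batch, which mirrors the dead-letter/validation pattern used in production pipelines.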
Required Skills & Qualifications:
• 10+ years of experience in Big Data Engineering with a focus on GCP.
• Hands-on experience with Google Cloud BigQuery, Dataflow, Dataproc, Cloud
Composer (Airflow), and Pub/Sub.
• Strong programming skills in Python, Java, or Scala.
• Experience with SQL, NoSQL databases, and data warehousing concepts.
• Expertise in Apache Spark, Apache Beam, or Hadoop ecosystems.
• Familiarity with real-time data processing and streaming technologies.
• Knowledge of CI/CD, DevOps practices, and Infrastructure as Code (IaC).
• Strong understanding of data governance, security, and compliance best
practices.
• Experience with Terraform, Kubernetes, or Docker is a plus.
• GCP certification (e.g., Professional Data Engineer) is a plus.
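As a toy illustration of the real-time streaming skills listed above, the following stdlib-only Python sketch computes tumbling-window counts, the same aggregation pattern a Pub/Sub + Dataflow job applies at scale. The window size and event shape are assumptions for the example.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp_secs, key) events into fixed tumbling windows
    and count events per key per window."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (42, "click"), (61, "view"), (119, "click")]
result = tumbling_window_counts(events)
# windows: [0, 60) -> click: 2 ; [60, 120) -> view: 1, click: 1
```

In a managed streaming runner the windowing, watermarking, and late-data handling are provided by the framework; this sketch only shows the core grouping logic.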
Preferred Qualifications:
• Experience working with multi-cloud or hybrid cloud environments.
• Familiarity with machine learning workflows and MLOps.
• Experience integrating GCP services with third-party tools and APIs.

-- 
You received this message because you are subscribed to the Google Groups 
"Powerbuilder Assignments" group.
To view this discussion visit 
https://groups.google.com/d/msgid/powerbuilder-assignments/CALm%3D67-dOM1-%2Bkp8BZgGK_nBc8b9BZP6wVsKztZKZEc1VdF%2BVw%40mail.gmail.com.
