fenglu-g commented on a change in pull request #4083: [AIRFLOW-3211] Reattach
to GCP Dataproc jobs upon Airflow restart
URL: https://github.com/apache/incubator-airflow/pull/4083#discussion_r227528926
##########
File path: airflow/contrib/hooks/gcp_dataproc_hook.py
##########
@@ -33,12 +33,82 @@ def __init__(self, dataproc_api, project_id, job, region='global',
         self.dataproc_api = dataproc_api
         self.project_id = project_id
         self.region = region
+
+        # Check if the job to submit is already running on the cluster.
+        # If so, don't resubmit the job.
+        try:
+            cluster_name = job['job']['placement']['clusterName']
+        except KeyError:
+            self.log.error('Job to submit is incorrectly configured.')
+            raise
+
+        jobs_on_cluster_response = dataproc_api.projects().regions().jobs().list(
+            projectId=self.project_id,
+            region=self.region,
+            clusterName=cluster_name).execute()
+
+        UUID_LENGTH = 9
Review comment:
This seems a bit error-prone. How about the following instead? Modify the
submit-job operator interface to take a new parameter, job_dedupe_regex. When
set, the hook matches existing jobs against job_dedupe_regex and skips a new
submission if a match is found and the other conditions are met.
This way, the hook implementation is entirely decoupled from the operator
implementation with respect to job naming, and it stays open to other ways of
deduplicating Dataproc jobs.
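A minimal sketch of what I have in mind, assuming the Dataproc v1 jobs.list
response shape (each job carrying reference.jobId and status.state); the
helper name _find_duplicate_job and the choice of non-terminal states are
illustrative only, not part of this PR:

```python
import re

# States in which an existing job should block a resubmission.
# Assumption: these Dataproc job states count as "still alive".
NON_TERMINAL_STATES = {'PENDING', 'SETUP_DONE', 'RUNNING'}


def _find_duplicate_job(dataproc_api, project_id, region, cluster_name,
                        job_dedupe_regex):
    """Return the first existing job whose jobId matches job_dedupe_regex
    and is not yet in a terminal state, or None if there is no match.
    """
    if not job_dedupe_regex:
        return None
    pattern = re.compile(job_dedupe_regex)
    # Same jobs.list call the hook already makes above.
    response = dataproc_api.projects().regions().jobs().list(
        projectId=project_id,
        region=region,
        clusterName=cluster_name).execute()
    for existing in response.get('jobs', []):
        job_id = existing.get('reference', {}).get('jobId', '')
        state = existing.get('status', {}).get('state', '')
        if pattern.match(job_id) and state in NON_TERMINAL_STATES:
            return existing
    return None
```

The hook's submit path would then call this helper first and, on a match,
attach to the returned job instead of submitting a new one (jobs.list
pagination is elided here for brevity).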