josh-fell commented on code in PR #31188:
URL: https://github.com/apache/airflow/pull/31188#discussion_r1194193072


##########
airflow/providers/dbt/cloud/operators/dbt.py:
##########
@@ -154,17 +159,28 @@ def execute(self, context: Context):
                 return self.run_id
             else:
                 end_time = time.time() + self.timeout
-                self.defer(
-                    timeout=self.execution_timeout,
-                    trigger=DbtCloudRunJobTrigger(
-                        conn_id=self.dbt_cloud_conn_id,
-                        run_id=self.run_id,
-                        end_time=end_time,
-                        account_id=self.account_id,
-                        poll_interval=self.check_interval,
-                    ),
-                    method_name="execute_complete",
-                )
+                job_run_info = JobRunInfo(account_id=self.account_id, run_id=self.run_id)
+                job_run_status = self.hook.get_job_run_status(**job_run_info)
+                if not DbtCloudJobRunStatus.is_terminal(job_run_status):
+                    self.defer(
+                        timeout=self.execution_timeout,
+                        trigger=DbtCloudRunJobTrigger(
+                            conn_id=self.dbt_cloud_conn_id,
+                            run_id=self.run_id,
+                            end_time=end_time,
+                            account_id=self.account_id,
+                            poll_interval=self.check_interval,
+                        ),
+                        method_name="execute_complete",
+                    )
+                elif job_run_status == DbtCloudJobRunStatus.SUCCESS.value:
+                    self.log.info("Job run %s has completed successfully.", str(self.run_id))
+                    return self.run_id
+                elif job_run_status in (
+                    DbtCloudJobRunStatus.CANCELLED.value,

Review Comment:
   Great question. I think this could be handled in a separate PR though.
   
   I could see hard-failing the task on user cancellation being either expected 
or unexpected behavior, depending on the use case. Perhaps a new parameter 
could control how cancelled runs are handled?
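
   One way that suggestion could look, as a minimal standalone sketch. The 
`on_cancelled` parameter name, the `"fail"`/`"skip"` values, and the plain 
`RuntimeError` are assumptions for illustration, not part of this PR; the 
status values mirror the dbt Cloud API's terminal statuses.
   ```python
   from enum import Enum


   class DbtCloudJobRunStatus(Enum):
       # Subset of the provider's run statuses (values match the dbt Cloud API).
       SUCCESS = 10
       ERROR = 20
       CANCELLED = 30


   def handle_terminal_status(job_run_status: int, on_cancelled: str = "fail") -> bool:
       """Decide what a terminal status means for the task.

       ``on_cancelled`` is the hypothetical parameter floated above:
       "fail" hard-fails on a user-cancelled run, "skip" treats it as a
       soft stop and returns False instead of raising.
       """
       if job_run_status == DbtCloudJobRunStatus.SUCCESS.value:
           return True
       if job_run_status == DbtCloudJobRunStatus.CANCELLED.value and on_cancelled == "skip":
           return False
       raise RuntimeError(f"Job run terminated with non-success status {job_run_status}.")
   ```
   With `on_cancelled="skip"`, a cancelled run would no longer hard-fail the 
task, which is the kind of opt-in behavior the follow-up PR could expose.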



##########
airflow/providers/dbt/cloud/operators/dbt.py:
##########
@@ -154,17 +159,28 @@ def execute(self, context: Context):
                 return self.run_id
             else:
                 end_time = time.time() + self.timeout
-                self.defer(
-                    timeout=self.execution_timeout,
-                    trigger=DbtCloudRunJobTrigger(
-                        conn_id=self.dbt_cloud_conn_id,
-                        run_id=self.run_id,
-                        end_time=end_time,
-                        account_id=self.account_id,
-                        poll_interval=self.check_interval,
-                    ),
-                    method_name="execute_complete",
-                )
+                job_run_info = JobRunInfo(account_id=self.account_id, run_id=self.run_id)
+                job_run_status = self.hook.get_job_run_status(**job_run_info)

Review Comment:
   This hook method does have logging lines of
   ```python
   self.log.info("Getting the status of job run %s.", str(run_id))
   ```
   and 
   ```python
   self.log.info(
       "Current status of job run %s: %s", str(run_id), DbtCloudJobRunStatus(job_run_status).name
   )
   ```
   which should handle the state logging.
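
   For context on the `**job_run_info` unpacking in the diff: `JobRunInfo` in 
the provider is a `TypedDict`, so its keys map directly onto the hook method's 
keyword arguments. A minimal standalone sketch of the pattern (the stub 
`get_job_run_status` here is illustrative, not the provider's actual hook 
method, and the hard-coded return value is an assumption):
   ```python
   from typing import Optional, TypedDict


   class JobRunInfo(TypedDict):
       """Mirrors the provider's TypedDict of job-run lookup keys."""

       account_id: Optional[int]
       run_id: int


   def get_job_run_status(account_id: Optional[int], run_id: int) -> int:
       # Stub standing in for DbtCloudHook.get_job_run_status; the real method
       # calls the dbt Cloud API and logs the current status before returning it.
       return 10  # illustrative terminal status, e.g. SUCCESS


   job_run_info = JobRunInfo(account_id=None, run_id=42)
   status = get_job_run_status(**job_run_info)  # TypedDict keys become keyword args
   ```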



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
