pankajkoti commented on code in PR #34018:
URL: https://github.com/apache/airflow/pull/34018#discussion_r1313488200


##########
airflow/providers/google/cloud/operators/bigquery.py:
##########
@@ -443,6 +443,10 @@ def execute(self, context: Context) -> None:  # type: ignore[override]
                     method_name="execute_complete",
                 )
             self._handle_job_error(job)
+            # job.result() returns a RowIterator. Mypy expects an instance of SupportsNext[Any] for
+            # the next() call which the RowIterator does not resemble to. Hence, ignore the arg-type error.
+            records = next(job.result())  # type: ignore[arg-type]
+            self.check_value(records)

Review Comment:
   I gave the DAG a local run and saw that the task was not getting deferred at all, i.e. it does not defer even with deferrable=True set.
   
   This happens because of https://github.com/apache/airflow/blob/3ae6b4e86fe807c00bd736c59df58733df2b9bf9/airflow/providers/google/cloud/operators/bigquery.py#L307

   The running state of the job evaluates to False, so the operator does not defer. Had it deferred, the trigger would perform this check and pass execution on to execute_complete in the operator.
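   
   For anyone following along, a tiny standalone sketch of the behaviour I am describing; `FakeJob`, `execute`, and the returned strings are illustrative stand-ins, not the operator code:
   
   ```python
   # Illustrative only: a fast BigQuery job is already finished by the time the
   # operator inspects it, so the deferral branch is never taken.
   class FakeJob:
       def __init__(self, running: bool) -> None:
           self._running = running

       def running(self) -> bool:
           return self._running


   def execute(job: FakeJob) -> str:
       if job.running():
           # Long-running job: defer; the trigger performs the check on the
           # triggerer and hands control back to execute_complete.
           return "deferred"
       # Fast job: the operator never defers, so the value check must run right here.
       return "checked inline"


   print(execute(FakeJob(running=True)))   # -> deferred
   print(execute(FakeJob(running=False)))  # -> checked inline
   ```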
   
   But in scenarios where it does not defer, I have added this check later to actually achieve what the operator is meant to do :)
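   
   To make the added lines concrete, here is a small standalone sketch of the data shape they operate on. The query and table name are placeholders, and iter() is just how this sketch grabs the first row; the PR itself keeps next(job.result()) with a targeted mypy ignore:
   
   ```python
   from google.cloud import bigquery

   # Assumes default credentials/project from the environment, as with any BigQuery client.
   client = bigquery.Client()
   job = client.query("SELECT COUNT(*) FROM `my_dataset.my_table`")  # placeholder query

   rows = job.result()           # RowIterator yielding Row objects
   first_row = next(iter(rows))  # first row of the result set
   records = list(first_row)     # e.g. [42] -> this is what check_value(records) receives
   print(records)
   ```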
   
   Sorry I missed explaining this earlier, hope it makes sense now. 
   
   And yes, we cannot remove the check from the Trigger code: when the task does defer, it goes to the Triggerer, which performs the check there and then returns to execute_complete in the operator.
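   
   Roughly, the deferred path then looks like the sketch below; the event keys ("status", "records", "message") are illustrative, not copied from the trigger's actual payload:
   
   ```python
   from airflow.exceptions import AirflowException


   def handle_trigger_event(event: dict):
       # Stand-in for the operator's execute_complete: by the time this runs on the
       # worker, the trigger has already performed the check over on the triggerer.
       if event.get("status") == "error":
           raise AirflowException(event.get("message"))
       return event.get("records")


   print(handle_trigger_event({"status": "success", "records": [42]}))  # illustrative payload
   ```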


