TobKed commented on a change in pull request #11726:
URL: https://github.com/apache/airflow/pull/11726#discussion_r517393241



##########
File path: airflow/providers/google/cloud/operators/dataflow.py
##########
@@ -324,6 +344,23 @@ class DataflowTemplatedJobStartOperator(BaseOperator):
             `https://cloud.google.com/dataflow/pipelines/specifying-exec-params
             <https://cloud.google.com/dataflow/docs/reference/rest/v1b3/RuntimeEnvironment>`__
     :type environment: Optional[dict]
+    :param wait_until_finished: (Optional)
+        If True, wait for the end of pipeline execution before exiting.
+        If False, it only waits for the job to start (``JOB_STATE_RUNNING``).
+
+        The default behavior depends on the type of pipeline:
+
+        * for the streaming pipeline, wait for jobs to start,
+        * for the batch pipeline, wait for the jobs to complete.
+
+        .. warning::
+
+            You cannot call the ``PipelineResult.wait_until_finish`` method in your
+            pipeline code for the operator to work properly, i.e. you must use
+            asynchronous execution. Otherwise, your pipeline will always wait
+            until finished. For more information, look at:

Review comment:
`always wait until finished` means that the task will wait for the terminal state of the job. This is the current behaviour for batch jobs and is kept as the default for backward compatibility. However, a user may not want to wait for the end of a batch job after it has started successfully, but instead continue further in the DAG.
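
The decision described above could be sketched like this (a minimal illustration only, not the PR's actual code; the helper name `resolves_to_wait` is hypothetical):

```python
def resolves_to_wait(is_streaming: bool, wait_until_finished=None) -> bool:
    """Return True if the task should wait for the job's terminal state.

    Sketch of the defaults described above: batch jobs wait until finished,
    streaming jobs only wait for JOB_STATE_RUNNING, and an explicit
    ``wait_until_finished`` value overrides either default.
    """
    if wait_until_finished is not None:
        return wait_until_finished
    return not is_streaming


# Backward-compatible default for batch jobs: wait for the terminal state.
assert resolves_to_wait(is_streaming=False) is True
# Streaming default: only wait for the job to start.
assert resolves_to_wait(is_streaming=True) is False
# A user who wants to proceed in the DAG right after a batch job starts:
assert resolves_to_wait(is_streaming=False, wait_until_finished=False) is False
```

With `wait_until_finished=False`, the task succeeds once the job is running, so downstream tasks are not blocked on the batch job completing.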



