This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new 94145446d1a Update documentation for Dataflow operators (#46954)
94145446d1a is described below

commit 94145446d1a803af3dc2424b4d3bee30c91d795d
Author: VladaZakharova <[email protected]>
AuthorDate: Mon Feb 24 10:41:06 2025 +0100

    Update documentation for Dataflow operators (#46954)
    
    * Update doc for Dataflow operators for update
    
    * Update providers/google/docs/operators/cloud/dataflow.rst
    
    Co-authored-by: Ankit Chaurasia <[email protected]>
    
    ---------
    
    Co-authored-by: Ulada Zakharava <[email protected]>
    Co-authored-by: Ankit Chaurasia <[email protected]>
---
 providers/google/docs/operators/cloud/dataflow.rst | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/providers/google/docs/operators/cloud/dataflow.rst b/providers/google/docs/operators/cloud/dataflow.rst
index 44c48d736b1..f4b4408f591 100644
--- a/providers/google/docs/operators/cloud/dataflow.rst
+++ b/providers/google/docs/operators/cloud/dataflow.rst
@@ -344,6 +344,14 @@ Here is an example how you can use this operator:
     :start-after: [START howto_operator_delete_dataflow_pipeline]
     :end-before: [END howto_operator_delete_dataflow_pipeline]
 
+Updating a pipeline
+^^^^^^^^^^^^^^^^^^^
+Once a streaming pipeline has been created and is running, its configuration cannot be changed because it is immutable. To make any modifications, update the pipeline's definition (e.g., update your code or template) and then submit a new job. Essentially, you will be creating a new instance of the pipeline with the desired updates.
+
+For batch pipelines, if a job is currently running and you want to update its configuration, you must cancel the job. This is because once a Dataflow job has started, it becomes immutable. Although batch pipelines are designed to process a finite amount of data and will eventually be completed on their own, you cannot update a job that is in progress. If you need to change any parameters or the pipeline logic while the job is running, you will have to cancel the current run and then launch [...]
+
+If the batch pipeline has already been completed successfully, then there is no running job to update; the new configuration will only be applied to the next job submission.
+
 .. _howto/operator:DataflowJobStatusSensor:
 .. _howto/operator:DataflowJobMetricsSensor:
 .. _howto/operator:DataflowJobMessagesSensor:
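
The paragraphs added above reduce to a small decision rule: whether a new pipeline configuration can take effect depends on the job type and its current state. Below is a minimal plain-Python sketch of that rule; the helper function itself is hypothetical (not part of the Google provider), and only the ``JOB_STATE_*`` strings mirror real Dataflow job states.

```python
# Hypothetical helper illustrating the update rules described in the new
# documentation paragraphs; this is NOT an API of the Google provider.

def action_for_new_config(job_type: str, job_state: str) -> str:
    """Return what must happen for an updated pipeline definition to take effect.

    job_type:  "streaming" or "batch"
    job_state: a Dataflow job state such as "JOB_STATE_RUNNING" or "JOB_STATE_DONE"
    """
    if job_state == "JOB_STATE_DONE":
        # A finished job cannot be updated; the new configuration simply
        # applies to the next submission.
        return "submit a new job"
    if job_type == "streaming":
        # A running streaming job is immutable: create a new instance of the
        # pipeline with the desired updates.
        return "submit a new job with the updated definition"
    # A running batch job is also immutable: it must be cancelled first.
    return "cancel the running job, then submit a new job"
```

For example, ``action_for_new_config("batch", "JOB_STATE_RUNNING")`` returns ``"cancel the running job, then submit a new job"``, matching the batch-pipeline paragraph above.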
