This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-9-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f299149da4681b680050fc5e5cf68ce667a31102
Author: raphaelauv <[email protected]>
AuthorDate: Thu Apr 25 06:40:06 2024 +0200

    doc: dynamictaskmapping pythonoperator op_kwargs (#39242)
    
    Co-authored-by: raphaelauv <[email protected]>
    (cherry picked from commit 6781c632e3ba7fd4404f21c8b932e1374a4a9322)
---
 .../authoring-and-scheduling/dynamic-task-mapping.rst  | 18 ++++++++++++++++++
 1 file changed, 18 insertions(+)

diff --git a/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst b/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst
index acf9cfba3d..739bc6fee9 100644
--- a/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst
+++ b/docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst
@@ -291,6 +291,24 @@ Sometimes an upstream needs to specify multiple arguments to a downstream operat
 
 This produces two task instances at run-time printing ``1`` and ``2`` respectively.
 
+It is also possible to mix ``expand_kwargs`` with most of the operator's arguments, such as the ``op_kwargs`` of the ``PythonOperator``:
+
+.. code-block:: python
+
+    def print_args(x, y):
+        print(x)
+        print(y)
+        return x + y
+
+
+    PythonOperator.partial(task_id="task-1", python_callable=print_args).expand_kwargs(
+        [
+            {"op_kwargs": {"x": 1, "y": 2}, "show_return_value_in_logs": True},
+            {"op_kwargs": {"x": 3, "y": 4}, "show_return_value_in_logs": 
False},
+        ]
+    )
+
+
 Similar to ``expand``, you can also map against a XCom that returns a list of dicts, or a list of XComs each returning a dict. Re-using the S3 example above, you can use a mapped task to perform "branching" and copy files to different buckets:
 
 .. code-block:: python

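For readers following along, here is a minimal, self-contained sketch of the pattern this commit documents, placed inside a complete DAG file. The dag_id, schedule, and start_date are hypothetical and not part of the commit; the partial/expand_kwargs call itself mirrors the snippet added above.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def print_args(x, y):
        print(x)
        print(y)
        return x + y


    with DAG(
        dag_id="example_expand_kwargs_op_kwargs",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ):
        # Each dict in the list maps onto one task instance: op_kwargs is
        # forwarded to the Python callable, while show_return_value_in_logs
        # is consumed by the operator itself.
        PythonOperator.partial(task_id="task-1", python_callable=print_args).expand_kwargs(
            [
                {"op_kwargs": {"x": 1, "y": 2}, "show_return_value_in_logs": True},
                {"op_kwargs": {"x": 3, "y": 4}, "show_return_value_in_logs": False},
            ]
        )

As the documentation text notes, this produces two mapped task instances at run time, each calling print_args with its own keyword arguments.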