dstandish commented on a change in pull request #19214:
URL: https://github.com/apache/airflow/pull/19214#discussion_r736126450
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -224,11 +224,11 @@ is automatically set to true.
Note, If you manually set the ``multiple_outputs`` parameter the inference is disabled and
the parameter value is used.
-Adding dependencies to decorated tasks from regular tasks
----------------------------------------------------------
-The above tutorial shows how to create dependencies between python-based tasks. However, it is
-quite possible while writing a DAG to have some pre-existing tasks such as :class:`~airflow.operators.bash.BashOperator` or :class:`~airflow.sensors.filesystem.FileSensor`
-based tasks which need to be run first before a python-based task is run.
+Adding dependencies to decorated tasks from traditional tasks
Review comment:
```suggestion
Adding dependencies between decorated tasks and traditional tasks
```
the dependency is mutual, not exactly _from_ one _to_ another
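For what it's worth, the mutual reading shows in how ``>>`` works: it only records an edge, so either side can be a decorated task or a traditional operator. A toy sketch of the mechanism (made-up class and attribute names, not Airflow's actual implementation):

```python
# Toy model of dependency wiring via ">>"; illustrative only,
# not Airflow's real classes.
class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = set()    # task_ids this task waits for
        self.downstream = set()  # task_ids that wait for this task

    def __rshift__(self, other):
        # "self >> other" records the edge on BOTH tasks,
        # which is why the dependency reads as mutual.
        self.downstream.add(other.task_id)
        other.upstream.add(self.task_id)
        return other  # returning the right side allows a >> b >> c

file_task = Task("wait_for_file")       # stand-in for a FileSensor
order_data = Task("extract_from_file")  # stand-in for a @task function

file_task >> order_data
```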
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -224,11 +224,11 @@ is automatically set to true.
Note, If you manually set the ``multiple_outputs`` parameter the inference is disabled and
the parameter value is used.
-Adding dependencies to decorated tasks from regular tasks
----------------------------------------------------------
-The above tutorial shows how to create dependencies between python-based tasks. However, it is
-quite possible while writing a DAG to have some pre-existing tasks such as :class:`~airflow.operators.bash.BashOperator` or :class:`~airflow.sensors.filesystem.FileSensor`
-based tasks which need to be run first before a python-based task is run.
+Adding dependencies to decorated tasks from traditional tasks
+-------------------------------------------------------------
+The above tutorial shows how to create dependencies between TaskFlow functions. However, it is
+quite possible while writing a DAG to have some traditional tasks (such as :class:`~airflow.operators.bash.BashOperator` or :class:`~airflow.sensors.filesystem.FileSensor`)
+which need to be run first before a TaskFlow function is run.
Review comment:
```suggestion
TaskFlow functions may be combined with traditional tasks in a dag.
```
This may be venturing into nitpickery here, but it seems a bit verbose
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -251,18 +251,41 @@ Building this dependency is shown in the code below:
file_task >> order_data
-In the above code block, a new python-based task is defined as ``extract_from_file`` which
+In the above code block, a new TaskFlow function is defined as ``extract_from_file`` which
reads the data from a known file location.
In the main DAG, a new ``FileSensor`` task is defined to check for this file. Please note
that this is a Sensor task which waits for the file.
-Finally, a dependency between this Sensor task and the python-based task is specified.
+Finally, a dependency between this Sensor task and the TaskFlow function is specified.
-Consuming XCOMs with decorated tasks from regular tasks
----------------------------------------------------------
-You may additionally find it necessary to consume an XCOM from a pre-existing task as an input into python-based tasks.
+Consuming XComs from traditional tasks as inputs
+------------------------------------------------
+You may find it necessary to consume an XCom from a traditional task, either pushed within the task's execution
+or via its return value, as an input into downstream tasks. You can access the pushed XCom (also known as an
+``XComArg``) by simply utilizing the ``.output`` property exposed for all operators.
Review comment:
```suggestion
``XComArg``) with the ``.output`` property exposed for all operators.
```
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -251,18 +251,41 @@ Building this dependency is shown in the code below:
file_task >> order_data
-In the above code block, a new python-based task is defined as ``extract_from_file`` which
+In the above code block, a new TaskFlow function is defined as ``extract_from_file`` which
reads the data from a known file location.
In the main DAG, a new ``FileSensor`` task is defined to check for this file. Please note
that this is a Sensor task which waits for the file.
-Finally, a dependency between this Sensor task and the python-based task is specified.
+Finally, a dependency between this Sensor task and the TaskFlow function is specified.
-Consuming XCOMs with decorated tasks from regular tasks
----------------------------------------------------------
-You may additionally find it necessary to consume an XCOM from a pre-existing task as an input into python-based tasks.
+Consuming XComs from traditional tasks as inputs
+------------------------------------------------
+You may find it necessary to consume an XCom from a traditional task, either pushed within the task's execution
+or via its return value, as an input into downstream tasks. You can access the pushed XCom (also known as an
+``XComArg``) by simply utilizing the ``.output`` property exposed for all operators.
-Building this dependency is shown in the code below:
+By default, using the ``.output`` property to retrieve an XCom result is the equivalent of:
+
+.. code-block:: python
+
+ task_instance.xcom_pull(task_ids="my_task_id", key="return_value")
+
+To retrieve an XCom result for a key other than ``return_value``, you can use:
+
+.. code-block:: python
+
+ my_op = MyOperator(...)
+ my_op_output = my_op.output["some_other_xcom_key"]
+ # OR
+ my_op_output = my_op.output.get("some_other_xcom_key")
+
+.. note::
+  Using the ``.output`` property as an input to another task is supported only for operator parameters
+ listed as a ``template_field``.
+
+In the code example below, a :class:`~airflow.providers.http.operators.http.SimpleHttpOperator` result
+is captured via :doc:`XComs </concepts/xcoms>`. This XCom result, which is the task output, is then passed
+to a TaskFlow function to parse the response as JSON.
Review comment:
```suggestion
to a TaskFlow function which parses the response as JSON.
```
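Since the ``xcom_pull`` equivalence above is the crux of the section, a toy model of how an ``XComArg`` resolves may help readers; this is a sketch of the idea only, with made-up class names, not Airflow's internals:

```python
# Toy sketch of XComArg resolution; illustrative only, not Airflow code.
class ToyXComArg:
    def __init__(self, task_id, key="return_value"):
        self.task_id = task_id
        self.key = key

    def __getitem__(self, key):
        # my_op.output["some_other_xcom_key"] swaps in a different key
        return ToyXComArg(self.task_id, key)

    def get(self, key):
        # mirrors the my_op.output.get("some_other_xcom_key") spelling
        return self[key]

    def resolve(self, xcom_store):
        # Equivalent of task_instance.xcom_pull(task_ids=..., key=...)
        return xcom_store[(self.task_id, self.key)]

# Simulated XCom backend: (task_id, key) -> pushed value
store = {
    ("get_api_results", "return_value"): '{"ok": true}',
    ("get_api_results", "status_code"): 200,
}

output = ToyXComArg("get_api_results")  # plays the role of my_op.output
print(output.resolve(store))            # default key is "return_value"
print(output["status_code"].resolve(store))
```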
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -279,18 +302,43 @@ Building this dependency is shown in the code below:
return json.loads(api_results)
- parsed_results = parsed_results(get_api_results_task.output)
+ parsed_results = parsed_results(api_results=get_api_results_task.output)
+
+Not only can you use traditional operator outputs as inputs for TaskFlow functions, but also as inputs to
+other traditional operators. In the example below, the output from the :class:`~airflow.providers.amazon.aws.transfers.salesforce_to_s3.SalesforceToS3Operator`
+task (which is an S3 URI for a destination file location) is used as an input for the :class:`~airflow.providers.amazon.aws.operators.s3_copy_object.S3CopyObjectOperator`
+task to copy the same file to a date-partitioned storage location in S3 for long-term storage in a data lake.
Review comment:
does this particular example move out of the realm of TaskFlow and back
into simply XCom (in which case maybe it belongs in an xcom doc)? Or is
`.output` considered taskflow-specific? I thought taskflow was really about
decorated tasks?
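Re the ``template_field`` caveat in the diff: the reason only templated parameters accept ``.output`` is that XComArg resolution happens while template fields are rendered. A toy illustration of that restriction (made-up names, not Airflow internals):

```python
# Toy illustration of why only template_fields resolve lazy values;
# not Airflow's actual rendering code.
class ToyOperator:
    # Only fields named here get lazy values resolved at runtime,
    # mirroring Airflow's template_fields behavior.
    template_fields = ("bash_command",)

    def __init__(self, bash_command, retries):
        self.bash_command = bash_command
        self.retries = retries

    def render(self, resolve):
        # Walk only the declared template fields and resolve any
        # lazy values (stand-ins for XComArg) found there.
        for name in self.template_fields:
            value = getattr(self, name)
            if callable(value):
                setattr(self, name, resolve(value))

lazy = lambda: "echo hello"  # stand-in for some upstream task's .output
op = ToyOperator(bash_command=lazy, retries=lazy)
op.render(lambda v: v())
print(op.bash_command)  # resolved, because it is a template field
print(op.retries)       # left as-is, because it is not
```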
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -251,18 +251,41 @@ Building this dependency is shown in the code below:
file_task >> order_data
-In the above code block, a new python-based task is defined as ``extract_from_file`` which
+In the above code block, a new TaskFlow function is defined as ``extract_from_file`` which
reads the data from a known file location.
In the main DAG, a new ``FileSensor`` task is defined to check for this file. Please note
that this is a Sensor task which waits for the file.
-Finally, a dependency between this Sensor task and the python-based task is specified.
+Finally, a dependency between this Sensor task and the TaskFlow function is specified.
-Consuming XCOMs with decorated tasks from regular tasks
----------------------------------------------------------
-You may additionally find it necessary to consume an XCOM from a pre-existing task as an input into python-based tasks.
+Consuming XComs from traditional tasks as inputs
+------------------------------------------------
+You may find it necessary to consume an XCom from a traditional task, either pushed within the task's execution
+or via its return value, as an input into downstream tasks. You can access the pushed XCom (also known as an
Review comment:
```suggestion
A TaskFlow function can consume an XCom from a traditional task. You can access the pushed XCom (also known as an
```
a little more direct this way.
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -224,11 +224,11 @@ is automatically set to true.
Note, If you manually set the ``multiple_outputs`` parameter the inference is disabled and
the parameter value is used.
-Adding dependencies to decorated tasks from regular tasks
----------------------------------------------------------
-The above tutorial shows how to create dependencies between python-based tasks. However, it is
-quite possible while writing a DAG to have some pre-existing tasks such as :class:`~airflow.operators.bash.BashOperator` or :class:`~airflow.sensors.filesystem.FileSensor`
-based tasks which need to be run first before a python-based task is run.
+Adding dependencies to decorated tasks from traditional tasks
Review comment:
another thought:
```suggestion
Adding dependencies between ``TaskFlow`` tasks and traditional tasks
```
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -224,11 +224,11 @@ is automatically set to true.
Note, If you manually set the ``multiple_outputs`` parameter the inference is disabled and
the parameter value is used.
-Adding dependencies to decorated tasks from regular tasks
----------------------------------------------------------
-The above tutorial shows how to create dependencies between python-based tasks. However, it is
-quite possible while writing a DAG to have some pre-existing tasks such as :class:`~airflow.operators.bash.BashOperator` or :class:`~airflow.sensors.filesystem.FileSensor`
-based tasks which need to be run first before a python-based task is run.
+Adding dependencies to decorated tasks from traditional tasks
+-------------------------------------------------------------
+The above tutorial shows how to create dependencies between TaskFlow functions. However, it is
Review comment:
my suggestion was `TaskFlow functions may be combined with traditional tasks in a dag`, replacing the whole
paragraph. But anyway, this is sort of tangential to your PR; just figured I'd mention it since we're
looking at it. I can always follow up with language cleanup later.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]