josh-fell commented on a change in pull request #19214:
URL: https://github.com/apache/airflow/pull/19214#discussion_r736515521
##########
File path: docs/apache-airflow/tutorial_taskflow_api.rst
##########
@@ -279,18 +302,43 @@ Building this dependency is shown in the code below:
return json.loads(api_results)
- parsed_results = parsed_results(get_api_results_task.output)
+ parsed_results = parsed_results(api_results=get_api_results_task.output)
+
+Not only can you use traditional operator outputs as inputs for TaskFlow functions, but also as inputs to
+other traditional operators. In the example below, the output from the :class:`~airflow.providers.amazon.aws.transfers.salesforce_to_s3.SalesforceToS3Operator`
+task (which is an S3 URI for a destination file location) is used as an input for the :class:`~airflow.providers.amazon.aws.operators.s3_copy_object.S3CopyObjectOperator`
+task to copy the same file to a date-partitioned storage location in S3 for long-term storage in a data lake.
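
The mechanics described above can be sketched without Airflow itself. The snippet below is a toy model of the pattern, not Airflow's actual implementation: every class and function name (`TinyXComArg`, `TinyOperator`, the `xcoms` dict) is illustrative only. It shows the key idea being discussed, that `.output` hands back a lazy reference to a task's result, which downstream tasks resolve to a concrete value at execution time:

```python
import json


class TinyXComArg:
    """Toy stand-in for Airflow's XComArg: a lazy reference to a task's result."""

    def __init__(self, store: dict, task_id: str) -> None:
        self.store = store
        self.task_id = task_id

    def resolve(self):
        # In real Airflow, resolution pulls from the XCom backend at runtime.
        return self.store[self.task_id]


class TinyOperator:
    """Toy stand-in for a traditional operator."""

    def __init__(self, store: dict, task_id: str, fn) -> None:
        self.store = store
        self.task_id = task_id
        self.fn = fn

    @property
    def output(self) -> TinyXComArg:
        # Mirrors the spirit of BaseOperator.output: a reference, not a value.
        return TinyXComArg(self.store, self.task_id)

    def execute(self, *args):
        # Resolve any lazy references among the inputs before running.
        resolved = [a.resolve() if isinstance(a, TinyXComArg) else a for a in args]
        self.store[self.task_id] = self.fn(*resolved)


xcoms: dict = {}
get_api_results = TinyOperator(xcoms, "get_api_results", lambda: '{"status": "ok"}')
parse_results = TinyOperator(xcoms, "parse_results", lambda raw: json.loads(raw))

get_api_results.execute()
parse_results.execute(get_api_results.output)  # .output used as an upstream input
print(xcoms["parse_results"])  # → {'status': 'ok'}
```

Because the reference is resolved only inside `execute`, the same `.output` handle can just as well feed another "traditional" operator, which is exactly the case the doc change above is describing.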
Review comment:
IMO the TaskFlow API (and, more broadly, AIP-31) is about a more functional
DAG authoring style plus passing data between tasks via `XComArgs`. The
`.output` property is a means of accessing `XComArgs` for TaskFlow functions and
traditional tasks alike. Moving this example to another place in the
documentation would separate it from that context and feel disjointed.
Having answers to "what can I do with the TaskFlow API and a functional DAG
style?"-type questions in this doc seems useful.
Although I'm not opposed to adding this information to the `XComs` concepts
page as well.