BasPH commented on code in PR #27540:
URL: https://github.com/apache/airflow/pull/27540#discussion_r1025180942
##########
docs/apache-airflow/concepts/taskflow.rst:
##########
@@ -81,6 +81,104 @@ To use logging from your task functions, simply import and use Python's logging
Every logging line created this way will be recorded in the task log.
+Passing Arbitrary Objects As Arguments
+--------------------------------------
+
+.. versionadded:: 2.5.0
+
+As mentioned, TaskFlow uses XCom to pass variables to each task. This requires
+that any variables used as arguments be serializable. Out of the box, Airflow
+supports all built-in types (such as ``int`` or ``str``) as well as objects
+decorated with ``@dataclass`` or ``@attr.define``. The following example shows
+the use of a ``Dataset``, which is decorated with ``@attr.define``, together
+with TaskFlow.
+
+.. note::
+
+    An additional benefit of using ``Dataset`` is that it is automatically
+    registered as an ``inlet`` when used as an input argument. It is also
+    automatically registered as an ``outlet`` if the return value of your task
+    is a ``Dataset`` or a ``list[Dataset]``.
+
+.. code-block:: python
+
+ import json
+ import pendulum
+ import requests
+
+ from airflow import Dataset
+ from airflow.decorators import dag, task
+
+ SRC = Dataset("https://www.ncei.noaa.gov/access/monitoring/climate-at-a-glance/global/time-series/globe/land_ocean/ytd/12/1880-2022.json")
+ now = pendulum.now()
+
+
+ @dag(start_date=now, schedule="daily", catchup=False)
Review Comment:
```suggestion
@dag(start_date=now, schedule="@daily", catchup=False)
```
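Not blocking, but for anyone skimming this section later: the reason ``@dataclass`` arguments are XCom-friendly is that they can be reduced to built-in types and rebuilt on the other side. A rough standalone sketch of that property (the ``ClimateRecord`` class is illustrative, not from this PR, and the ``asdict``/``json`` round trip is only conceptually similar to what Airflow's serializer actually does):

```python
import json
from dataclasses import asdict, dataclass


@dataclass
class ClimateRecord:
    # Illustrative class, not part of the PR; all fields are
    # built-in, JSON-friendly types.
    year: int
    anomaly_c: float


record = ClimateRecord(year=2022, anomaly_c=0.86)

# A @dataclass instance can be flattened to built-in types and
# reconstructed, which is the property serialization relies on.
payload = json.dumps(asdict(record))
restored = ClimateRecord(**json.loads(payload))
assert restored == record
```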
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]