kaxil commented on a change in pull request #8962:
URL: https://github.com/apache/airflow/pull/8962#discussion_r443078056



##########
File path: docs/concepts.rst
##########
@@ -173,6 +213,62 @@ Each task is a node in our DAG, and there is a dependency from task_1 to task_2:
 We can say that task_1 is *upstream* of task_2, and conversely task_2 is *downstream* of task_1.
 When a DAG Run is created, task_1 will start running and task_2 waits for task_1 to complete successfully before it may start.
 
+.. _concepts:task_decorator:
+
+Python task decorator
+---------------------
+
+Airflow ``task`` decorator converts any Python decorated function to a Python Airflow operator.
+The decorated function can be called once to set the arguments and key arguments for operator execution.

Review comment:
       ```suggestion
   Airflow ``task`` decorator converts any Python function to an Airflow operator.
   The decorated function can be called once to set the arguments and key arguments for operator execution.
   ```
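
The decorator mechanics being documented here can be illustrated with a toy, Airflow-free sketch. Note that ``FakeOperator`` and this ``task`` helper are hypothetical stand-ins for illustration, not Airflow's actual classes:

```python
# Toy sketch (NOT Airflow's implementation): a decorator that wraps a plain
# Python function in a minimal operator-like object, the way the docs
# describe ``@task`` turning a function into an operator.
class FakeOperator:
    def __init__(self, func, task_id):
        self.func = func
        self.task_id = task_id

    def execute(self, *args, **kwargs):
        # Running the operator just calls the wrapped function.
        return self.func(*args, **kwargs)


def task(func):
    # Hypothetical helper: derives the task_id from the function name.
    return FakeOperator(func, task_id=func.__name__)


@task
def hello_world():
    return "hello world!"
```

After decoration, ``hello_world`` is no longer a plain function but an operator-like object carrying a ``task_id``, which is the behavior the docs paragraph is describing.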

##########
File path: docs/concepts.rst
##########
@@ -173,6 +213,62 @@ Each task is a node in our DAG, and there is a dependency from task_1 to task_2:
 We can say that task_1 is *upstream* of task_2, and conversely task_2 is *downstream* of task_1.
 When a DAG Run is created, task_1 will start running and task_2 waits for task_1 to complete successfully before it may start.
 
+.. _concepts:task_decorator:
+
+Python task decorator
+---------------------
+
+Airflow ``task`` decorator converts any Python decorated function to a Python Airflow operator.
+The decorated function can be called once to set the arguments and key arguments for operator execution.
+
+
+.. code:: python
+
+  with DAG('my_dag', start_date=datetime(2020, 5, 15)) as dag:
+
+    @dag.task
+    def hello_world():
+      print('hello world!')
+
+
+    # Also...
+
+    from airflow.decorators import task
+
+    @task
+    def hello_name(name: str):
+      print(f'hello {name}!')
+
+    hello_name('Airflow users')
+
+Task decorator captures returned values and sends them to the :ref:`XCom backend <concepts:xcom>`. By default, returned
+value is saved as a single XCom value. You can set ``multiple_outputs`` key argument to ``True`` to unroll dictionaries,
+lists or tuples into separate XCom values. This can be used with regular operators to create
+:ref:`functional DAGs <concepts:functional_dags>`.
+
+Calling a decorated function returns an ``XComArg`` instance. You can use it to set templated fields on downstream
+operators.
+
+You can call a decorated function more than once in a DAG. The decorated function will automatically generate unique
+a ``task_id`` for each generated operator.

Review comment:
       ```suggestion
   You can call a decorated function more than once in a DAG. The decorated function will automatically generate
   a unique ``task_id`` for each generated operator.
   ```
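
The ``multiple_outputs`` behavior described in the diff can be sketched without Airflow. The ``push_to_xcom`` helper below is hypothetical and only illustrates the unrolling described in the docs; it is not Airflow's XCom API:

```python
# Toy sketch (NOT Airflow's code): how multiple_outputs=True could unroll a
# returned dict into separate XCom entries instead of one combined value.
def push_to_xcom(task_id, value, multiple_outputs=False):
    xcom = {}
    if multiple_outputs and isinstance(value, dict):
        # Each key becomes its own XCom entry, addressable downstream.
        for key, item in value.items():
            xcom[(task_id, key)] = item
    else:
        # Default: the whole return value is stored as a single entry.
        xcom[(task_id, "return_value")] = value
    return xcom


single = push_to_xcom("get_user", {"name": "Ada", "age": 36})
unrolled = push_to_xcom("get_user", {"name": "Ada", "age": 36}, multiple_outputs=True)
```

With the default, downstream tasks see one value under ``return_value``; with unrolling, each dictionary key is retrievable on its own, which is what makes the decorated task composable with regular operators.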

##########
File path: docs/concepts.rst
##########
@@ -173,6 +213,62 @@ Each task is a node in our DAG, and there is a dependency from task_1 to task_2:
 We can say that task_1 is *upstream* of task_2, and conversely task_2 is *downstream* of task_1.
 When a DAG Run is created, task_1 will start running and task_2 waits for task_1 to complete successfully before it may start.
 
+.. _concepts:task_decorator:
+
+Python task decorator
+---------------------
+
+Airflow ``task`` decorator converts any Python decorated function to a Python Airflow operator.
+The decorated function can be called once to set the arguments and key arguments for operator execution.
+
+
+.. code:: python
+
+  with DAG('my_dag', start_date=datetime(2020, 5, 15)) as dag:
+
+    @dag.task
+    def hello_world():
+      print('hello world!')
+
+
+    # Also...
+
+    from airflow.decorators import task
+
+    @task
+    def hello_name(name: str):
+      print(f'hello {name}!')
+
+    hello_name('Airflow users')

Review comment:
       ```suggestion
     with DAG('my_dag', start_date=datetime(2020, 5, 15)) as dag:

         @dag.task
         def hello_world():
             print('hello world!')


         # Also...

         from airflow.decorators import task

         @task
         def hello_name(name: str):
             print(f'hello {name}!')

         hello_name('Airflow users')
   ```

##########
File path: docs/concepts.rst
##########
@@ -173,6 +213,62 @@ Each task is a node in our DAG, and there is a dependency from task_1 to task_2:
 We can say that task_1 is *upstream* of task_2, and conversely task_2 is *downstream* of task_1.
 When a DAG Run is created, task_1 will start running and task_2 waits for task_1 to complete successfully before it may start.
 
+.. _concepts:task_decorator:
+
+Python task decorator
+---------------------
+
+Airflow ``task`` decorator converts any Python decorated function to a Python Airflow operator.
+The decorated function can be called once to set the arguments and key arguments for operator execution.
+
+
+.. code:: python
+
+  with DAG('my_dag', start_date=datetime(2020, 5, 15)) as dag:
+
+    @dag.task
+    def hello_world():
+      print('hello world!')
+
+
+    # Also...
+
+    from airflow.decorators import task
+
+    @task
+    def hello_name(name: str):
+      print(f'hello {name}!')
+
+    hello_name('Airflow users')
+
+Task decorator captures returned values and sends them to the :ref:`XCom backend <concepts:xcom>`. By default, returned
+value is saved as a single XCom value. You can set ``multiple_outputs`` key argument to ``True`` to unroll dictionaries,
+lists or tuples into separate XCom values. This can be used with regular operators to create
+:ref:`functional DAGs <concepts:functional_dags>`.
+
+Calling a decorated function returns an ``XComArg`` instance. You can use it to set templated fields on downstream
+operators.
+
+You can call a decorated function more than once in a DAG. The decorated function will automatically generate unique
+a ``task_id`` for each generated operator.
+
+.. code:: python
+
+  with DAG('my_dag', start_date=datetime(2020, 5, 15)) as dag:
+
+    @dag.task
+    def update_user(user_id: int):
+      ...
+
+    # Avoid generating this list dynamically to keep dag topology stable between DAG runs

Review comment:
       ```suggestion
       # Avoid generating this list dynamically to keep DAG topology stable between DAG runs
   ```
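
The unique-``task_id``-per-call behavior discussed above can be sketched with a small, Airflow-free counter. The ``__N`` suffix scheme below is an assumption for illustration; Airflow's actual de-duplication scheme may differ:

```python
# Toy sketch (NOT Airflow's code): generating a unique task_id each time the
# same decorated function is called within one DAG.
_counters = {}


def unique_task_id(name):
    # First call keeps the bare name; later calls get a numeric suffix
    # (hypothetical "__N" scheme, chosen here only for illustration).
    n = _counters.get(name, 0)
    _counters[name] = n + 1
    return name if n == 0 else f"{name}__{n}"
```

Calling ``unique_task_id("update_user")`` three times yields three distinct ids, which is why the docs can safely recommend invoking a decorated function once per element of a static list.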




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

