jedcunningham commented on code in PR #32169:
URL: https://github.com/apache/airflow/pull/32169#discussion_r1258974375


##########
docs/apache-airflow/howto/setup-and-teardown.rst:
##########
@@ -0,0 +1,162 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Setup and Teardown
+~~~~~~~~~~~~~~~~~~
+
+In data workflows it's common to create a resource (such as a compute resource), use it to do some work, and then tear it down. Airflow provides setup and teardown tasks to support this need.
+
+Key features of setup and teardown tasks (illustrated in the sketch after this list):
+
+  * If you clear a task, its setups and teardowns will be cleared.
+  * By default, teardown tasks are ignored for the purpose of evaluating dag run state.
+  * A teardown task will run if its setup was successful, even if its work tasks failed.
+  * Teardown tasks are ignored when setting dependencies against task groups.
+  * A setup task must always have a teardown and vice versa. You may use EmptyOperator as a setup or teardown.
+
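+As a rough illustration of these points, here is a minimal sketch; the dag id and task names are made up purely for this example:
+
+.. code-block:: python
+
+  import datetime
+
+  from airflow import DAG
+  from airflow.operators.bash import BashOperator
+  from airflow.operators.empty import EmptyOperator
+
+  # A made-up dag purely to illustrate the bullets above.
+  with DAG(dag_id="setup_teardown_sketch", start_date=datetime.datetime(2023, 1, 1), schedule=None):
+      # Only the teardown does real work here, so an EmptyOperator stands in
+      # as its required setup counterpart.
+      noop_setup = EmptyOperator(task_id="noop_setup")
+      do_work = BashOperator(task_id="do_work", bash_command="echo working")
+      cleanup = BashOperator(task_id="cleanup", bash_command="rm -rf /tmp/scratch")
+
+      # cleanup runs as long as noop_setup succeeded, even if do_work fails,
+      # and clearing do_work also clears noop_setup and cleanup.
+      noop_setup >> do_work >> cleanup.as_teardown(setups=noop_setup)
+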
+How setup and teardown work
+""""""""""""""""""""""""""""
+
+Basic usage
+"""""""""""
+
+Suppose you have a dag that creates a cluster, runs a query, and deletes the cluster. Without using setup and teardown tasks you might set these relationships:
+
+.. code-block:: python
+
+  create_cluster >> run_query >> delete_cluster
+
+To enable ``create_cluster`` and ``delete_cluster`` as setup and teardown tasks, we mark them as such using the ``as_setup`` and ``as_teardown`` methods and add an upstream / downstream relationship between them:
+
+.. code-block:: python
+
+  create_cluster.as_setup() >> run_query >> delete_cluster.as_teardown()
+  create_cluster >> delete_cluster
+
+For convenience we can do this in one line by passing ``create_cluster`` to the ``as_teardown`` method:
+
+.. code-block:: python
+
+  create_cluster >> run_query >> delete_cluster.as_teardown(setups=create_cluster)
+
+Additionally, if we have multiple tasks to wrap, we can use the teardown as a context manager:
+
+.. code-block:: python
+
+  with delete_cluster.as_teardown(setups=create_cluster):
+      [RunQueryOne(), RunQueryTwo()] >> DoSomeOtherStuff()
+      WorkOne() >> [do_this_stuff(), do_other_stuff()]
+
+This will set ``create_cluster`` to run before the tasks in the context, and ``delete_cluster`` after them.
+
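+Spelled out with explicit arrows (using hypothetical variable names for the tasks instantiated above, purely for illustration), the ordering this establishes is roughly:
+
+.. code-block:: python
+
+  # Hypothetical variable names for the tasks created in the context above.
+  create_cluster >> [run_query_one, run_query_two, work_one]
+  [run_query_one, run_query_two] >> do_some_other_stuff
+  work_one >> [do_this_stuff_task, do_other_stuff_task]
+  [do_some_other_stuff, do_this_stuff_task, do_other_stuff_task] >> delete_cluster.as_teardown(setups=create_cluster)
+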
+Note that if you are attempting to add an already-instantiated task to a setup context, you need to do it explicitly:
+
+.. code-block:: python
+
+  with my_teardown_task as scope:
+      scope.add_task(work_task)  # work_task was already instantiated elsewhere
+
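+Putting the pieces together (again with illustrative names only), the pattern looks roughly like this:
+
+.. code-block:: python
+
+  # work_task is defined before the context is opened, so it is not picked up
+  # automatically and has to be added with add_task.
+  work_task = RunQueryOne()
+
+  with delete_cluster.as_teardown(setups=create_cluster) as scope:
+      scope.add_task(work_task)
+      RunQueryTwo()  # created inside the context, so it is added automatically
+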
+Observations:

Review Comment:
   I feel like these observations should be moved back up before the context manager. They don't make sense here.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
