Khrol commented on a change in pull request #6638: [AIRFLOW-6043] Update dag reference to all tasks in sub_dag
URL: https://github.com/apache/airflow/pull/6638#discussion_r352555018
##########
File path: tests/models/test_dag.py
##########
@@ -968,3 +968,14 @@ def test_duplicate_task_ids_for_same_task_is_allowed(self):
self.assertEqual(t1, t2)
self.assertEqual(dag.task_dict, {t1.task_id: t1, t3.task_id: t3})
self.assertEqual(dag.task_dict, {t2.task_id: t2, t3.task_id: t3})
+
+ def test_sub_dag_updates_all_references_while_deepcopy(self):
+ with DAG("test_dag", start_date=DEFAULT_DATE) as dag:
+ t1 = DummyOperator(task_id='t1')
+ t2 = DummyOperator(task_id='t2')
+ t3 = DummyOperator(task_id='t3')
+ t1 >> t2
+ t2 >> t3
+
+ sub_dag = dag.sub_dag('t2', include_upstream=True, include_downstream=False)
Review comment:
This code is not related to the SubDag operator at all; it is about something
completely different. The naming is very confusing, true.
This `sub_dag` method is called from
https://github.com/apache/airflow/blob/master/airflow/www/views.py#L1349
when the Graph UI is filtered by root. It takes some mental effort to follow
what happens in `def graph` to render the corresponding view, but the logic
there is fine. The real problem is the incorrect deepcopy optimization in
`sub_dag`.
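The deepcopy pitfall described above can be sketched without Airflow. The `Dag` and `Task` classes below are hypothetical stand-ins for `airflow.models.DAG` and its operators, not the real implementation: if a "sub dag" copy reuses the original task objects as an optimization, each task's `.dag` back-reference still points at the original dag, which is the kind of stale reference this PR's fix (updating the dag reference on every copied task) addresses.

```python
import copy


class Task:
    """Toy stand-in for an Airflow operator: keeps a back-reference to its dag."""
    def __init__(self, task_id, dag):
        self.task_id = task_id
        self.dag = dag
        dag.task_dict[task_id] = self


class Dag:
    """Toy stand-in for airflow.models.DAG (hypothetical, for illustration)."""
    def __init__(self, dag_id):
        self.dag_id = dag_id
        self.task_dict = {}

    def sub_dag_buggy(self, task_ids):
        """'Optimized' partial copy: the dag is copied but the task objects are
        reused, so every task's .dag still points at the original dag."""
        sub = copy.copy(self)
        sub.dag_id = self.dag_id + '.sub'
        sub.task_dict = {tid: t for tid, t in self.task_dict.items()
                         if tid in task_ids}
        return sub

    def sub_dag_fixed(self, task_ids):
        """Copy the tasks too and update their dag reference to the new dag."""
        sub = copy.copy(self)
        sub.dag_id = self.dag_id + '.sub'
        sub.task_dict = {}
        for tid in task_ids:
            task = copy.copy(self.task_dict[tid])
            task.dag = sub  # the fix: point every copied task at the new dag
            sub.task_dict[tid] = task
        return sub


dag = Dag('test_dag')
t1 = Task('t1', dag)
t2 = Task('t2', dag)

buggy = dag.sub_dag_buggy({'t1'})
fixed = dag.sub_dag_fixed({'t1'})
print(buggy.task_dict['t1'].dag is dag)    # True: stale back-reference
print(fixed.task_dict['t1'].dag is fixed)  # True: reference updated
```

The real `DAG.sub_dag` is more involved (it also prunes upstream/downstream relations), but the stale `.dag` back-reference is the same shape of bug.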
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services