GitHub user Easthy edited a comment on the discussion: DAGS disappear from UI [2.5.1]

Tried versions 2.5.3 and 2.6.3 — the issue persists.
Reverted to 2.3.2, and everything works fine.

The DAG that contains a large number of tasks dynamically generated from an Airflow Variable disappears from the UI (other DAGs, which don't use a Variable to create tasks, are fine).
Running `airflow dags reserialize` temporarily brings it back, but it disappears again shortly after.

In the logs, I found the following:
```
[2025-04-09T15:28:27.304+0000] {processor.py:157} INFO - Started process 
(PID=****) to work on /airflow/dags/***/source_data.py
[2025-04-09T15:28:27.306+0000] {processor.py:826} INFO - Processing file 
/airflow/dags/***/source_data.py for tasks to queue
[2025-04-09T15:28:27.307+0000] {dagbag.py:541} INFO - Filling up the DagBag 
from /airflow/dags/***/source_data.py
[2025-04-09T15:28:27.381+0000] {s3_cleanup_operator.py:32} WARNING - 
UserWarning: The `template_fields` value for S3CleanupOperator is a string but 
should be a list or tuple of strings. Wrapping it in a list automatically.
[2025-04-09T15:28:34.507+0000] {processor.py:836} INFO - DAG(s) 
dict_keys(['***.source_data']) retrieved from /airflow/dags/***/source_data.py

[2025-04-09T15:29:47.971+0000] {processor.py:157} INFO - Started process 
(PID=****) to work on /airflow/dags/***/source_data.py
[2025-04-09T15:29:47.972+0000] {processor.py:826} INFO - Processing file 
/airflow/dags/***/source_data.py for tasks to queue
[2025-04-09T15:29:47.973+0000] {dagbag.py:541} INFO - Filling up the DagBag 
from /airflow/dags/***/source_data.py
[2025-04-09T15:29:48.042+0000] {s3_cleanup_operator.py:32} WARNING - 
UserWarning: The `template_fields` value for S3CleanupOperator is a string but 
should be a list or tuple of strings. Wrapping it in a list automatically.
[2025-04-09T15:29:58.013+0000] {processor.py:836} INFO - DAG(s) 
dict_keys(['***.source_data']) retrieved from /airflow/dags/***/source_data.py

[2025-04-09T15:30:38.548+0000] {processor.py:273} WARNING - Killing 
DAGFileProcessorProcess (PID=****)
```
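As an aside, the UserWarning in the log looks cosmetic but is easy to silence: `template_fields` on S3CleanupOperator is apparently declared as a bare string instead of a list/tuple of strings. A minimal sketch of why Airflow warns (the field name `prefix` is a guess, not taken from the actual operator):

```python
# Airflow expects template_fields to be a list/tuple of attribute names.
# A bare string "works" only because Airflow wraps it in a list for you,
# which is what the UserWarning is about.

bad_template_fields = "prefix"       # triggers the UserWarning
good_template_fields = ("prefix",)   # expected form: tuple of strings

# A string is itself iterable, but over characters, not field names --
# hence the automatic wrapping and the warning:
assert list(bad_template_fields) == ["p", "r", "e", "f", "i", "x"]
assert list(good_template_fields) == ["prefix"]
```

Changing the declaration to a one-element tuple on the operator class should make the warning go away; it is unlikely to be related to the DAG disappearing, though.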

It happens on an AWS EC2 t3.xlarge (4 vCPU, 16 GB RAM) with a gp2 SSD volume. Raising the DAG import timeout to 300 seconds didn't help. The machine is not under RAM or CPU pressure during DAG parsing.
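One detail from the log that may matter: the `Killing DAGFileProcessorProcess` entry at 15:30:38 comes roughly 51 seconds after the processor started at 15:29:47, which matches the default `[core] dag_file_processor_timeout = 50` rather than the import timeout, so raising only the import timeout would not prevent the kill. A hedged sketch of the config change (the values are illustrative, not a confirmed fix):

```ini
# airflow.cfg -- illustrative values
[core]
dagbag_import_timeout = 300
dag_file_processor_timeout = 300
```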

GitHub link: 
https://github.com/apache/airflow/discussions/30364#discussioncomment-13051733
