carlos54 opened a new issue, #59272:
URL: https://github.com/apache/airflow/issues/59272
### Apache Airflow version
Other Airflow 2/3 version (please specify below)
### If "Other Airflow 2/3 version" selected, which one?
3.1.4
### What happened?
Description:
The DAG processor correctly identifies and logs an import error in a DAG file, but the corresponding error message and stack trace are not visible in the Airflow UI (no "Import Errors" tab appears).
```
DAG File Processing Stats

Bundle   File Path                            PID    Current Duration    # DAGs   # Errors  Last Duration    Last Run At
-------  -----------------------------------  -----  ------------------  -------  --------  ---------------  -------------------
sandbox  sandbox_dag005_dataset_producer.py                              1        0         0.26s            2025-12-10T11:38:29
sandbox  sandbox_dag_demo.py                                             0        1         0.21s            2025-12-10T11:38:29
```
The error is correctly caught in the DAG processor log file:
```
{
  "timestamp": "2025-12-10T11:33:23.270722Z",
  "level": "error",
  "event": "AirflowException(\"Task Policy validation failed for 'DO_DATASET':\\n[DATASET OUTLETS] Dataset URI 'event_data01_dispo' in task 'DO_DATASET' must start with 'sandbox_'.\")",
  "logger": "airflow.models.dagbag.DagBag",
  "filename": "dagbag.py",
  "lineno": 556,
  "error_detail": [
    {
      "exc_type": "AirflowException",
      "exc_value": "Task Policy validation failed for 'DO_DATASET':\n[DATASET OUTLETS] Dataset URI 'event_data01_dispo' in task 'DO_DATASET' must start with 'sandbox_'.",
      "exc_notes": [],
      "syntax_error": null,
      "is_cause": false,
      "frames": [
        {"filename": "/usr/local/lib/python3.12/site-packages/airflow/models/dagbag.py", "lineno": 552, "name": "bag_dag"},
        {"filename": "/usr/local/lib/python3.12/site-packages/airflow/settings.py", "lineno": 183, "name": "task_policy"},
        {"filename": "/usr/local/lib/python3.12/site-packages/pluggy/_hooks.py", "lineno": 512, "name": "__call__"},
        {"filename": "/usr/local/lib/python3.12/site-packages/pluggy/_manager.py", "lineno": 120, "name": "_hookexec"},
        {"filename": "/usr/local/lib/python3.12/site-packages/pluggy/_callers.py", "lineno": 167, "name": "_multicall"},
        {"filename": "/usr/local/lib/python3.12/site-packages/pluggy/_callers.py", "lineno": 121, "name": "_multicall"},
        {"filename": "/opt/airflow/config/cluster_policy.py", "lineno": 173, "name": "task_policy"}
      ],
      "is_group": false,
      "exceptions": []
    }
  ]
}
```
This prevents developers from quickly identifying the source of such failures, especially ones raised by the cluster_policy.
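For reference, the policy that raises the exception is roughly like the following minimal sketch; the prefix check and the message wording are reconstructed from the log above and are not the actual contents of `config/cluster_policy.py`:

```python
# Hypothetical sketch of config/cluster_policy.py, reconstructed from the log above.
from airflow.exceptions import AirflowException


def task_policy(task):
    """Reject tasks whose outlet asset/dataset URIs do not start with 'sandbox_'."""
    for outlet in getattr(task, "outlets", None) or []:
        uri = getattr(outlet, "uri", str(outlet))
        if not uri.startswith("sandbox_"):
            raise AirflowException(
                f"Task Policy validation failed for '{task.task_id}':\n"
                f"[DATASET OUTLETS] Dataset URI '{uri}' in task '{task.task_id}' "
                f"must start with 'sandbox_'."
            )
```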
### What you think should happen instead?
The "Import Errors" view in the Airflow UI should be display .
### How to reproduce
The `[dag_processor]` settings in airflow.cfg:
```
dag_bundle_config_list: '{{ .Values.DAGS.bundle_list }}'
dag_bundle_storage_path: '/dags/airflow/bundles'
refresh_interval: 60
bundle_refresh_check_interval: 15
print_stats_interval: 60
parsing_processes: 2
dag_file_processor_timeout: 60
min_file_process_interval: 30
stale_dag_threshold: 300
```
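Any DAG file that violates the policy reproduces the behaviour. A minimal sketch, assuming Airflow 3 Asset outlets (the file name and asset URI are taken from the stats and log above, the rest is illustrative):

```python
# dags/sandbox_dag_demo.py -- minimal reproduction sketch (hypothetical content).
from airflow.providers.standard.operators.empty import EmptyOperator
from airflow.sdk import DAG, Asset

with DAG(dag_id="sandbox_dag_demo", schedule=None):
    # The outlet URI lacks the required 'sandbox_' prefix, so task_policy raises
    # AirflowException while the DAG processor parses this file. The error is
    # logged (see above) but never appears in the UI's "Import Errors" view.
    EmptyOperator(task_id="DO_DATASET", outlets=[Asset("event_data01_dispo")])
```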
### Operating System
redhat/ubi9:9.6
### Versions of Apache Airflow Providers
apache-airflow==3.1.4 \
structlog==25.5.0 \
psycopg2-binary==2.9.11 \
asyncpg==0.31.0 \
apache-airflow-providers-fab==3.0.3 \
apache-airflow-providers-redis==4.4.0 \
apache-airflow-providers-git==0.1.0 \
apache-airflow-providers-cncf-kubernetes==10.11.0 \
flask-limiter==3.12 \
redis==5.3.1 \
authlib==1.6.5 \
PyJWT==2.10.1 \
cryptography==42.0.8 \
requests==2.32.5
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### Anything else?
_No response_
### Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [x] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)