mikoloay opened a new issue, #58995:
URL: https://github.com/apache/airflow/issues/58995
### Apache Airflow version
3.1.3
### If "Other Airflow 2/3 version" selected, which one?
_No response_
### What happened?
Hi, this could arguably be a feature request, but I decided to file it
as a bug since the behaviour is present in Airflow 2.
In Airflow 3, when a dag script takes too long to parse and the dagbag
parsing timeout is hit, the dag processor subprocess is killed immediately
and no info appears in the UI, as if the dag script didn't exist.
Here's a log line from a dag processor pod:
```
[error ] Processor for DagFileInfo(rel_path=PosixPath('repo/timeout_dag.py'), bundle_name='dags-folder', bundle_path=PosixPath('/opt/airflow/dags'), bundle_version=None) with PID 115 started 50 ago killing it.
```
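For context, Airflow 2 surfaced these timeouts because the dagbag import timeout fired *inside* the parsing process (via SIGALRM, as far as I can tell), so the resulting exception could be caught and stored as an import error. Below is a minimal, Unix-only sketch of that pattern; it is not the actual Airflow code, and `parse_with_timeout` and `import_errors` are illustrative names:

```python
import signal

# Sketch only, not Airflow source: filepath -> error message, i.e. what the
# Dag Import Errors view would display.
import_errors: dict[str, str] = {}


def parse_with_timeout(filepath: str, timeout_s: int) -> None:
    """Parse a dag file, converting a parse timeout into a recordable error."""

    def _on_timeout(signum, frame):
        raise TimeoutError(f"DagBag import timeout hit while parsing {filepath}")

    old_handler = signal.signal(signal.SIGALRM, _on_timeout)
    signal.alarm(timeout_s)  # SIGALRM fires inside this same process
    try:
        # Stand-in for importing the dag module.
        with open(filepath) as f:
            exec(f.read(), {"__file__": filepath})
    except Exception as e:
        # Because the timeout raised *here*, it can be recorded like any
        # other import error instead of vanishing with a killed subprocess.
        import_errors[filepath] = str(e)
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)
```

In Airflow 3 the timeout is instead enforced from the outside by killing the parsing subprocess, so there is no exception left to catch and nothing gets written as an import error.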
### What you think should happen instead?
Airflow users should still be able to see timeout errors in the Dag Import
Errors section of the UI, as they can in Airflow 2.
### How to reproduce
The following script can be used to replicate this behaviour:
```python
import time

from airflow.sdk import DAG

# Sleeping at module top level keeps the file parsing until the
# dag processor timeout is hit.
with DAG(
    dag_id="timeout_dag",
    schedule=None,
):
    time.sleep(10000)
```
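The result can also be checked without the UI: assuming `airflow dags list-import-errors` behaves the same in 3.x as in 2.x, running it after the dag processor has attempted this file should show a timeout entry for `timeout_dag.py`. Since no error is recorded at all in Airflow 3, the list presumably comes back empty there as well.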
### Operating System
Debian GNU/Linux 12 (bookworm)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### Anything else?
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [x] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)