melicheradam opened a new issue, #57712: URL: https://github.com/apache/airflow/issues/57712
### Apache Airflow Provider(s)

standard

### Versions of Apache Airflow Providers

apache-airflow-providers-standard==1.9.0

### Apache Airflow version

3.1.1 / 2.10.5

### Operating System

Linux, MacOS, no difference

### Deployment

Official Apache Airflow Helm Chart

### Deployment details

Reproduced with both the Helm chart deployment and docker compose.

### What happened

When a task holds an HTTP connection open indefinitely, the `execution_timeout` parameter is not respected: the task is never killed and keeps running. Killing the task manually via the UI works as expected, but the automatic timeout does not. This is really problematic when you are controlling task concurrency, because the stuck task blocks other runs.

Interestingly, the logs show the task as timed out, yet it stays in the running state.

<img width="1141" height="486" alt="Image" src="https://github.com/user-attachments/assets/18b09bc7-deb6-4db0-a6ac-44d7ddf7d5c8" />

```
[2025-11-02 11:31:20] INFO - New Python virtual environment created in /tmp/venv_cache/venv-da1df057 source=airflow.task.operators.airflow.providers.standard.decorators.python_virtualenv._PythonVirtualenvDecoratedOperator loc=python.py:861
[2025-11-02 11:31:20] INFO - Use 'pickle' as serializer. source=airflow.task.operators.airflow.providers.standard.decorators.python_virtualenv._PythonVirtualenvDecoratedOperator loc=python.py:509
[2025-11-02 11:31:20] INFO - Executing cmd: /tmp/venv_cache/venv-da1df057/bin/python /tmp/venv-callsouh24fg/script.py /tmp/venv-callsouh24fg/script.in /tmp/venv-callsouh24fg/script.out /tmp/venv-callsouh24fg/string_args.txt /tmp/venv-callsouh24fg/termination.log /tmp/venv-callsouh24fg/airflow_context.json source=airflow.utils.process_utils loc=process_utils.py:188
[2025-11-02 11:31:20] INFO - Output: source=airflow.utils.process_utils loc=process_utils.py:192
[2025-11-02 11:32:18] ERROR - Process timed out pid=91 source=task loc=timeout.py:37
```

### What you think should happen instead

The task should be killed automatically once `execution_timeout` is exceeded.

### How to reproduce

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.sdk import task

default_args = {
    "venv_cache_path": "/tmp/venv_cache",
    "requirements": ["requests"],
    "execution_timeout": timedelta(minutes=1),
    "priority_weight": 10,
}

with DAG(
    dag_id="blocking_dag",
    start_date=datetime(2025, 1, 30),
    schedule=None,
    default_args=default_args,
):

    @task.virtualenv()
    def block_x_seconds_http_venv(blocking_time):
        """Block on an HTTP request (runs in a virtualenv subprocess)."""
        import requests

        url = "https://httpbun.org/delay/{}".format(blocking_time)
        print(f"Making blocking HTTP request to: {url}")
        response = requests.get(url)
        print(f"Response status: {response.status_code}")
        print(f"Response time: {response.elapsed.total_seconds():.2f} seconds")
        return response.status_code

    @task()
    def block_x_seconds_http(blocking_time):
        """Block on an HTTP request (runs in-process)."""
        import requests

        url = "https://httpbun.org/delay/{}".format(blocking_time)
        print(f"Making blocking HTTP request to: {url}")
        response = requests.get(url)
        print(f"Response status: {response.status_code}")
        print(f"Response time: {response.elapsed.total_seconds():.2f} seconds")
        return response.status_code

    block_x_seconds_http_venv.override(task_id="block_venv_300_seconds")(300)
    block_x_seconds_http.override(task_id="block_300_seconds")(300)
```
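One note on the repro itself: `requests.get` is deliberately called without a client-side `timeout`, so the HTTP call blocks for the full server-side delay. As a stop-gap in the task body (not a fix for `execution_timeout`), the call can be bounded on the client side; a minimal sketch, with an arbitrary 30-second value purely for illustration:

```python
import requests

# Stop-gap sketch only: this bounds the HTTP call itself, it does not make
# execution_timeout work. The 30-second value is arbitrary.
url = "https://httpbun.org/delay/300"
try:
    # A single float applies to both the connect and the read timeout.
    response = requests.get(url, timeout=30)
    print(f"Response status: {response.status_code}")
except requests.exceptions.Timeout:
    print("Request timed out on the client side")
```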
### Anything else

Complete logs after the task finally finishes (i.e. once the blocking HTTP call returns):

```
[2025-11-02 11:31:18] INFO - Using CPython 3.12.12 interpreter at: /usr/python/bin/python source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:18] INFO - Creating virtual environment with seed packages at: /tmp/venv_cache/venv-da1df057 source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:19] INFO - + pip==25.3 source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:19] INFO - Executing cmd: uv pip install --python /tmp/venv_cache/venv-da1df057/bin/python -r /tmp/venv_cache/venv-da1df057/requirements.txt source=airflow.utils.process_utils loc=process_utils.py:188
[2025-11-02 11:31:19] INFO - Output: source=airflow.utils.process_utils loc=process_utils.py:192
[2025-11-02 11:31:19] INFO - Using Python 3.12.12 environment at: /tmp/venv_cache/venv-da1df057 source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - Resolved 5 packages in 614ms source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - Prepared 5 packages in 304ms source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - Installed 5 packages in 2ms source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - + certifi==2025.10.5 source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - + charset-normalizer==3.4.4 source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - + idna==3.11 source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - + requests==2.32.5 source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - + urllib3==2.5.0 source=airflow.utils.process_utils loc=process_utils.py:196
[2025-11-02 11:31:20] INFO - New Python virtual environment created in /tmp/venv_cache/venv-da1df057 source=airflow.task.operators.airflow.providers.standard.decorators.python_virtualenv._PythonVirtualenvDecoratedOperator loc=python.py:861
[2025-11-02 11:31:20] INFO - Use 'pickle' as serializer. source=airflow.task.operators.airflow.providers.standard.decorators.python_virtualenv._PythonVirtualenvDecoratedOperator loc=python.py:509
[2025-11-02 11:31:20] INFO - Executing cmd: /tmp/venv_cache/venv-da1df057/bin/python /tmp/venv-callsouh24fg/script.py /tmp/venv-callsouh24fg/script.in /tmp/venv-callsouh24fg/script.out /tmp/venv-callsouh24fg/string_args.txt /tmp/venv-callsouh24fg/termination.log /tmp/venv-callsouh24fg/airflow_context.json source=airflow.utils.process_utils loc=process_utils.py:188
[2025-11-02 11:31:20] INFO - Output: source=airflow.utils.process_utils loc=process_utils.py:192
[2025-11-02 11:32:18] ERROR - Process timed out pid=91 source=task loc=timeout.py:37
[2025-11-02 11:36:22] ERROR - Task failed with exception source=task loc=task_runner.py:972
AirflowTaskTimeout: Timeout, PID: 91
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 920 in run
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 1302 in _execute_task
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py", line 416 in wrapper
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/decorator.py", line 252 in execute
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py", line 416 in wrapper
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/standard/operators/python.py", line 491 in execute
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py", line 416 in wrapper
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/standard/operators/python.py", line 216 in execute
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/standard/operators/python.py", line 888 in execute_callable
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/standard/operators/python.py", line 578 in _execute_python_callable_in_subprocess
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/utils/process_utils.py", line 177 in execute_in_subprocess
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/utils/process_utils.py", line 195 in execute_in_subprocess_with_kwargs
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/sdk/execution_time/timeout.py", line 38 in handle_timeout
```
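My working theory, which is an assumption on my side rather than a confirmed read of the task runner code: the SIGALRM-based timeout raises `AirflowTaskTimeout` in the supervising task process, but raising an exception in the parent does not by itself terminate the virtualenv child process that is still blocked on the HTTP call, so nothing actually dies until the call returns. Below is a minimal standalone sketch of that general Python behaviour (plain `signal`/`subprocess`, no Airflow involved; names like `FakeTaskTimeout` are made up for the illustration):

```python
import signal
import subprocess
import sys

class FakeTaskTimeout(Exception):
    """Stand-in for AirflowTaskTimeout; this sketch has no Airflow dependency."""

def handle_timeout(signum, frame):
    # Raising here interrupts the parent's wait(), but sends nothing to the child.
    raise FakeTaskTimeout(f"Timeout, child PID: {child.pid}")

# The child simulates a task body stuck on a long blocking call.
child = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(300)"])

signal.signal(signal.SIGALRM, handle_timeout)
signal.alarm(5)  # stands in for execution_timeout

try:
    child.wait()
except FakeTaskTimeout as exc:
    print(exc)
    print("Child still running after the timeout fired:", child.poll() is None)
    # The child only goes away if the parent explicitly terminates it.
    child.kill()
    child.wait()
finally:
    signal.alarm(0)
```

If that theory holds, the fix probably needs the timeout path to explicitly terminate the spawned child (or its process group) rather than only raising in the parent, but I have not dug deep enough into the task runner to be sure.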
### Are you willing to submit PR?

- [x] Yes I am willing to submit a PR!

### Code of Conduct

- [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
