tirkarthi commented on issue #63573: URL: https://github.com/apache/airflow/issues/63573#issuecomment-4060624307
This looks like a backend-response issue: when a task is retried, `task_instance_history` doesn't immediately get a new entry; instead the `task_instance` row is updated with the new `try_number` while its `duration` keeps the previous try's value. The tries endpoint then takes the existing history entries and appends the current task instance, whose duration was never reset, so it still reports the last try's duration. https://github.com/apache/airflow/blob/56090bd8e466052c30e9a565197c2d4cfb8b81c8/airflow-core/src/airflow/api_fastapi/core_api/routes/public/task_instances.py#L342-L346

```
mysql> select try_number, start_date, task_id, state, run_id, duration from task_instance where task_id = "fail_task" order by run_id;
+------------+----------------------------+-----------+---------+--------------------------------------+----------+
| try_number | start_date                 | task_id   | state   | run_id                               | duration |
+------------+----------------------------+-----------+---------+--------------------------------------+----------+
|          2 | 2026-03-14 19:41:00.203532 | fail_task | running | scheduled__2026-03-14T01:00:00+00:00 |  43.7898 |
+------------+----------------------------+-----------+---------+--------------------------------------+----------+
1 row in set (0.01 sec)

mysql> select try_number, start_date, task_id, state, run_id, duration from task_instance_history where task_id = "fail_task" order by run_id;
+------------+----------------------------+-----------+--------+--------------------------------------+----------+
| try_number | start_date                 | task_id   | state  | run_id                               | duration |
+------------+----------------------------+-----------+--------+--------------------------------------+----------+
|          1 | 2026-03-14 19:40:00.024494 | fail_task | failed | scheduled__2026-03-14T01:00:00+00:00 |  43.7898 |
+------------+----------------------------+-----------+--------+--------------------------------------+----------+
```

Reproduction DAG:

```python
from __future__ import annotations

from datetime import timedelta

from pendulum import datetime

from airflow.exceptions import AirflowException
from airflow.sdk import DAG, task

with DAG(
    dag_id="gh63573",
    schedule="0 1 * * *",
    start_date=datetime(2026, 1, 1),
    catchup=False,
    default_args={"retries": 3, "retry_delay": timedelta(seconds=10)},
):

    @task
    def fail_task():
        import time

        time.sleep(1000)
        raise AirflowException("fail")

    fail_task()
```
