pluto0007 opened a new issue, #58381:
URL: https://github.com/apache/airflow/issues/58381
Apache Airflow Version: 3.1.0
Python: 3.x
Metadata DB: Postgres
AIRFLOW_HOME: /opt/airflow
What happened
After upgrading to Airflow 3.1.0, the system no longer writes task-level
success or running events into the log table, so /api/v2/eventLogs and the
UI Audit Log show no such entries.
Only failed and state mismatch events are written for tasks.
DAG-level success events are still logged, but task-level ones are not.
This is a regression from Airflow 2.10.x, where task-level success and
running events were always logged.
Reproducible example
A DAG (hiport_v2) with a single task (hiport_v2_1) is run multiple times,
and the task completes successfully in every run.
Then we query the audit log table:
SELECT dttm, event, dag_id, task_id, run_id
FROM log
WHERE dag_id = 'hiport_v2'
ORDER BY dttm DESC
LIMIT 20;
Actual output (sample):
dttm                          | event          | dag_id    | task_id     | run_id
------------------------------+----------------+-----------+-------------+------------------------------------------
2025-10-22 14:19:53.705162+0  | trigger_dagrun | hiport_v2 |             |
2025-10-16 12:30:02.359807+0  | failed         | hiport_v2 | hiport_v2_1 | manual__2025-10-16T12:18:43.261104+00:00
2025-10-16 12:30:02.355367+0  | state mismatch | hiport_v2 | hiport_v2_1 | manual__2025-10-16T12:18:43.261104+00:00
...
There are no success or running events at all.
A global event summary confirms this:
SELECT event, COUNT(*)
FROM log
GROUP BY event
ORDER BY 2 DESC;
Output:
failed          | 4102
state mismatch  | 4102
cli_api_server  |   51
trigger_dag_run |   38
...
(no "success")
(no "running")
REST API evidence
Calling:
GET /api/v2/eventLogs?dag_id=hiport_v2&order_by=-when
Returns only failed/state mismatch events, never success.
Even DAG runs that succeed show no task-level success or running events.
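For reference, this is roughly how we call the endpoint (a minimal sketch;
the host, the bearer token, and the "event_logs" response field reflect our
deployment and our reading of the v2 API schema):

import requests

BASE_URL = "http://localhost:8080"  # placeholder for our Airflow host
TOKEN = "<jwt>"                     # placeholder bearer token

resp = requests.get(
    f"{BASE_URL}/api/v2/eventLogs",
    params={"dag_id": "hiport_v2", "order_by": "-when"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Collect the distinct event names returned for this DAG.
events = {entry["event"] for entry in resp.json()["event_logs"]}
print(sorted(events))  # on 3.1.0: no "success", no "running"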
Expected Behavior
Task-level audit logging should:
- insert a success event for every task that finishes successfully,
- insert a running event for every task that starts,
- show these events in:
  - the UI ("Audit Log Events"),
  - /api/v2/eventLogs,
  - the log table.
This was the behavior in Airflow 2.10.x and is also suggested in the 3.x
audit log documentation (task lifecycle events include success/running/failure).
Actual Behavior
Only task-level failure events are recorded (failed, state mismatch).
No task-level success or running events are ever written.
/api/v2/eventLogs cannot be used to consume full lifecycle events anymore.
DAG-level success events still appear.
Why this matters
We use the official REST API (/api/v2/eventLogs) to drive an external
orchestration/audit service that depends on complete event history.
With Airflow 3.1.x, task-level success events disappear completely, breaking
downstream integrations.
How to Reproduce
1. Create any DAG with one or more tasks (a minimal sketch follows this list).
2. Run it successfully.
3. Query the audit log table:
   SELECT * FROM log WHERE dag_id = '<your_dag>' ORDER BY dttm DESC;
4. Observe that failed and state mismatch are logged, but no success/running
   events appear.
5. Check /api/v2/eventLogs; it also shows no success events.
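For concreteness, a minimal DAG of the kind we can provide (identifiers are
arbitrary; imports follow the Airflow 3 Task SDK):

from airflow.sdk import dag, task


@dag(schedule=None, catchup=False)
def audit_log_repro():
    @task
    def succeed():
        # Finishes successfully; on 2.10.x this produced "running" and
        # "success" rows in the log table, on 3.1.0 it produces neither.
        return "ok"

    succeed()


audit_log_repro()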
Additional Notes
Metadata backend: Postgres
Scheduler/Webserver run as root and use:
AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=postgresql://airflowdbo:****@localhost:5432/airflow_db
Behavior is the same for all DAGs, not only one.
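The check we run directly against the metadata DB looks like this (a sketch
using psycopg2; the DSN mirrors the connection string above with the password
redacted, and the lowercase event names match the summary output earlier):

import psycopg2

# Same database as AIRFLOW__DATABASE__SQL_ALCHEMY_CONN (password redacted).
conn = psycopg2.connect("postgresql://airflowdbo:****@localhost:5432/airflow_db")
with conn, conn.cursor() as cur:
    # Split counts into DAG-level (task_id IS NULL) and task-level rows.
    cur.execute(
        """
        SELECT event, (task_id IS NOT NULL) AS task_level, COUNT(*)
        FROM log
        WHERE event IN ('success', 'running', 'failed')
        GROUP BY 1, 2
        ORDER BY 3 DESC
        """
    )
    for event, task_level, count in cur.fetchall():
        print(event, task_level, count)
conn.close()

# On 3.1.0, no row with task_level = true appears for 'success' or 'running'.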
Request
Please confirm whether this is:
- an intentional change in Airflow 3.x audit logging, or
- a regression/bug where task lifecycle events (success/running) are no
  longer logged.
If this is not intentional, can task-level success/running audit logging be
restored in a 3.1.x+ patch?
We can provide more logs or a minimal DAG if needed.