ferruzzi commented on code in PR #50957:
URL: https://github.com/apache/airflow/pull/50957#discussion_r2138330321
##########
airflow-core/src/airflow/api_fastapi/core_api/datamodels/dag_run.py:
##########
@@ -68,6 +69,7 @@ class DAGRunResponse(BaseModel):
end_date: datetime | None
data_interval_start: datetime | None
data_interval_end: datetime | None
+ deadlines: list[DeadlineResponse] | None
Review Comment:
I brought it up on the dev call last week and we're going to make (a
variation on) the change you suggested. Here's what the lifecycle of a Deadline
will look like, using DagRun as an example:
- When a new DagRun is created:
  - If the DAG has a deadline, calculate the value and store it along with
    the dag_id, run_id, etc. in the `deadline` table
- When a DagRun finishes:
  - If (and only if) the current time is before the calculated deadline,
    remove the deadline from the `deadline` table
- The scheduler loop will query the `deadline` table to see if any deadlines
  have expired:
  - If yes, the callback is sent to the Triggerer to process
  - Once the callback has run, the Triggerer will move the failed deadline
    to a new table (working name is just `missed_deadlines`?)
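The steps above can be sketched in miniature. This is a hedged illustration only, not the actual implementation: the two dicts stand in for the `deadline` and `missed_deadlines` tables (the real code would use ORM models and run these steps in the scheduler/Triggerer), and the function names and field set (dag_id, run_id, deadline time) are hypothetical simplifications.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical in-memory stand-ins for the `deadline` and
# `missed_deadlines` tables, keyed by (dag_id, run_id).
deadlines: dict[tuple[str, str], datetime] = {}
missed_deadlines: dict[tuple[str, str], datetime] = {}


def on_dagrun_created(dag_id: str, run_id: str, deadline: datetime) -> None:
    # Step 1: when a new DagRun is created, store the calculated deadline.
    deadlines[(dag_id, run_id)] = deadline


def on_dagrun_finished(dag_id: str, run_id: str, now: datetime) -> None:
    # Step 2: remove the deadline only if the run finished before it expired.
    key = (dag_id, run_id)
    if key in deadlines and now < deadlines[key]:
        del deadlines[key]


def scheduler_sweep(now: datetime) -> list[tuple[str, str]]:
    # Step 3: the scheduler loop finds expired deadlines and hands the
    # callbacks to the Triggerer; once each callback has run, the row is
    # moved into `missed_deadlines` (step 4). Both steps are collapsed
    # into one pass here for brevity.
    expired = [key for key, dl in deadlines.items() if dl <= now]
    for key in expired:
        missed_deadlines[key] = deadlines.pop(key)
    return expired
```

With this sketch, a run that finishes in time disappears from `deadlines` and never reaches `missed_deadlines`, while a run that overshoots is archived by the sweep.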
So the `deadline` table will hold all upcoming and unprocessed deadlines,
and the `missed_deadlines` table will hold all expired deadlines which have
been resolved. If the serialized DAG has a deadline and the run appears in
neither table, the deadline will be assumed to have been met successfully.
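That two-table lookup rule can be expressed as a small pure function. Everything here is a hypothetical sketch of the convention described above, not Airflow API: the `DeadlineState` enum, the function name, and passing the tables as plain sets are all illustrative assumptions.

```python
from enum import Enum

RunKey = tuple[str, str]  # (dag_id, run_id)


class DeadlineState(Enum):
    PENDING = "pending"  # still in the deadline table, not yet expired/processed
    MISSED = "missed"    # archived into missed_deadlines after the callback ran
    MET = "met"          # in neither table: assumed completed before the deadline


def deadline_state(
    key: RunKey,
    deadline_table: set[RunKey],
    missed_table: set[RunKey],
) -> DeadlineState:
    # Resolve a run's deadline state purely from table membership.
    if key in deadline_table:
        return DeadlineState.PENDING
    if key in missed_table:
        return DeadlineState.MISSED
    return DeadlineState.MET
```

The nice property of this scheme is that "met" needs no storage at all: it is the default inferred from absence in both tables.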
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]