o-nikolas commented on code in PR #55088:
URL: https://github.com/apache/airflow/pull/55088#discussion_r2320520849


##########
airflow-core/src/airflow/models/deadline.py:
##########
@@ -355,6 +355,31 @@ def _evaluate_with(self, *, session: Session, **kwargs: Any) -> datetime:
 
             return _fetch_from_db(DagRun.queued_at, session=session, **kwargs)
 
+    class AverageRuntimeDeadline(BaseDeadlineReference):
+        """A deadline that calculates the average runtime from past DAG 
runs."""
+
+        required_kwargs = {"dag_id"}
+
+        @provide_session
+        def _evaluate_with(self, *, session: Session, **kwargs: Any) -> datetime:
+            from airflow.models import DagRun
+
+            dag_id = kwargs["dag_id"]
+
+            # Query for completed DAG runs with both start and end dates
+            query = select(func.avg(func.extract("epoch", DagRun.end_date - DagRun.start_date))).filter(
+                DagRun.dag_id == dag_id, DagRun.start_date.isnot(None), DagRun.end_date.isnot(None)
+            )
+
+            avg_seconds = session.execute(query).scalar()

Review Comment:
   We can't error on zero, otherwise we'll never get past zero, right? I think the best we can do is log a warning saying that we can't yet compute an average.
   
   Then it's up to documentation to instruct people ("hey, you really should only add this deadline to an established DAG"), but surely people will not follow that (and can't in all cases when deploying DAGs via CI/CD).
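   As a rough illustration of what I mean (standalone sketch, not Airflow code; the function name and signature are hypothetical, and the real fix would live inside `_evaluate_with`): warn and return no deadline when there are no completed runs to average over, rather than raising.
   
   ```python
   import logging
   from datetime import datetime, timedelta, timezone
   
   log = logging.getLogger(__name__)
   
   def deadline_from_average(avg_seconds, base_time=None):
       """Return base_time + average runtime, or None when no completed
       runs exist yet to average over (hypothetical helper for illustration)."""
       base_time = base_time or datetime.now(timezone.utc)
       if avg_seconds is None or avg_seconds <= 0:
           # No completed DagRuns yet (the AVG query returned NULL or zero).
           # We can't raise here, or a brand-new DAG would never get past
           # this point -- log a warning and skip the deadline instead.
           log.warning("No completed runs to average yet; skipping deadline calculation.")
           return None
       return base_time + timedelta(seconds=avg_seconds)
   ```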
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

Reply via email to