nikie commented on issue #18304:
URL: https://github.com/apache/airflow/issues/18304#issuecomment-949030839


   @uranusjr 
   There is already a case where Airflow marks deadlocked dag runs as failed: 
https://github.com/apache/airflow/blob/34e586a162ad9756d484d17b275c7b3dc8cefbc2/airflow/models/dagrun.py#L520
   Maybe it would be better to fail the run in our case as well, for consistency? Scheduling is already a fairly magical thing, so adding more magic like "running but not actually running" or toggling the dag run off and on would complicate it even more. With the "running but not actually running" solution, bug reports saying that "active dag runs exceed the max_active_runs setting" are likely to appear.
   
   We could try to extend the above check to also fire when `not none_depends_on_past` (i.e. some tasks have "on past" dependencies), `max_active_runs` has already been reached, and there are no running tasks in other runs.
   The "max active runs reached" state could be passed from the method 
`SchedulerJob._schedule_dag_run`, which calls the `DagRun.update_state`.
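   A minimal sketch of what the extended condition might look like inside `DagRun.update_state` (a fragment, not a full method; `max_active_runs_reached` and `_has_running_tis_in_other_runs` are hypothetical names introduced for illustration, and the surrounding variables only follow the shape of the existing deadlock check):

```python
# Hypothetical fragment inside DagRun.update_state. Assumptions (not
# existing Airflow API): `max_active_runs_reached` is a new argument
# passed in by SchedulerJob._schedule_dag_run, and
# `_has_running_tis_in_other_runs` is a helper sketched further below.
deadlocked = unfinished_tasks and not ready_tis

if deadlocked and none_depends_on_past:
    # Existing behaviour: no task depends on past runs, yet nothing is
    # schedulable -- the run is deadlocked, so fail it.
    self.log.error('Deadlock; marking run %s failed', self)
    self.set_state(State.FAILED)
elif (
    deadlocked
    and not none_depends_on_past      # some tasks have depends_on_past
    and max_active_runs_reached       # no further runs can be started
    and not self._has_running_tis_in_other_runs(session)
):
    # Proposed extension: the depends_on_past dependencies can never be
    # satisfied, because max_active_runs keeps the runs they wait on from
    # progressing. Fail this run as well, for consistency.
    self.log.error('Deadlock via max_active_runs; marking run %s failed', self)
    self.set_state(State.FAILED)
```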
   What would be the best way to check for running tasks in other runs?
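   One possibility, as a minimal sketch (the helper name is hypothetical, and it assumes an Airflow version where `TaskInstance` carries `run_id`):

```python
from airflow.models.taskinstance import TaskInstance
from airflow.utils.state import State


def _has_running_tis_in_other_runs(self, session) -> bool:
    """Hypothetical DagRun helper: is any task instance of this DAG
    currently RUNNING in a dag run other than this one?"""
    return (
        session.query(TaskInstance)
        .filter(
            TaskInstance.dag_id == self.dag_id,
            TaskInstance.run_id != self.run_id,
            TaskInstance.state == State.RUNNING,
        )
        .first()
        is not None
    )
```

   Depending on the intended semantics, `QUEUED` (and perhaps other non-terminal states) might need to be included as well; on Airflow versions before 2.2, `TaskInstance` is keyed on `execution_date` rather than `run_id`.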
   

