soltanianalytics commented on issue #13407:
URL: https://github.com/apache/airflow/issues/13407#issuecomment-908381563


   My core use case is re-running DAGs that are not currently running.
   
   So I have a DAG with `max_active_runs=1` and `catchup=True`. This DAG 
depends on the previous `DagRun`s being successful. I implement this logic via 
a sensor that senses the success of the last task of the previous `DagRun`. If 
the previous `DagRun` failed, the current one will keep sensing into the abyss, 
and might eventually time out as well. Then I might have `n` `DagRun`s that I 
want to re-run, and `n` can be in the dozens. If I just clear the tasks of all 
the `DagRun`s I want to re-run, they may not start in chronological order; but 
since every run except the oldest begins with a sensor that only succeeds once 
the previous `DagRun` has succeeded, the whole set can only complete in 
chronological order anyway.
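
   To illustrate the gating effect, here is a toy model (plain Python, not 
Airflow code; the `simulate` helper, the integer "run dates", and the 
`earliest` parameter are all made up for this sketch): no matter what order 
the scheduler picks the cleared runs up in, each run's sensor blocks until the 
previous run has succeeded, so the runs can only finish chronologically.

```python
from collections import deque

def simulate(cleared_order, earliest):
    """Toy model of n cleared DagRuns whose first task is a sensor on the
    previous run's success. `cleared_order` is the arbitrary order the
    scheduler picks runs up; returns the order runs actually complete."""
    succeeded = set()
    queue = deque(cleared_order)
    completed = []
    while queue:
        run = queue.popleft()
        # The sensor passes only for the earliest run, or once the
        # immediately preceding run date has succeeded.
        if run == earliest or (run - 1) in succeeded:
            succeeded.add(run)
            completed.append(run)
        else:
            # Sensor keeps poking; the run gets another turn later.
            queue.append(run)
    return completed

print(simulate([3, 1, 2], earliest=1))  # -> [1, 2, 3]
```

   The same reasoning shows why deleting a run from the middle of the chain 
would leave every later run sensing forever.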
   
   I can make that happen if I let the currently active `DagRun` fail before 
clearing tasks and setting the `DagRun` states to `running`. _Most of the 
time_, that should do the trick. If not, I'll just delete all relevant 
`DagRun`s, and the scheduler will recreate them in chronological order 
(thanks to `catchup=True`).
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

