Hello there, I have a DAG which is only ever triggered externally, via a POST to the http://airflow-server/api/experimental/dags/my_triggered_dag/dag_runs endpoint.
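For context, the trigger call looks roughly like this (the host name and the conf payload are just placeholders):

```python
import requests

# Rough sketch of the external trigger; the payload contents are examples only.
resp = requests.post(
    "http://airflow-server/api/experimental/dags/my_triggered_dag/dag_runs",
    json={"conf": {"reason": "external event"}},
)
resp.raise_for_status()
```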
I have max_active_runs=1 on the DAG to enforce that only one run happens at a time. If a run is currently in progress, subsequent automated requests to the trigger URL end up queued for execution after the already-running job completes. Although the work is idempotent, it would be inefficient to execute the runs that queued up, since the first run already completes the work needed. It's also possible that an error could cause the DAG to be triggered hundreds of times in a short window. I'd rather these subsequent requests were simply dropped or ignored.

Is there a way to limit the number of queued (to-be-run) DAG runs for a specific DAG? That is, can the scheduler simply stop queuing runs of a given DAG beyond a certain point? Thanks!
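For reference, the DAG itself is defined roughly like this (the start date and the task body are placeholders, not the real work):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

# Sketch of the externally triggered DAG described above.
dag = DAG(
    dag_id="my_triggered_dag",
    start_date=datetime(2019, 1, 1),  # placeholder
    schedule_interval=None,           # never scheduled, only triggered via the API
    max_active_runs=1,                # at most one run executing at a time
    catchup=False,
)

do_work = BashOperator(
    task_id="do_work",
    bash_command="echo 'doing the idempotent work'",  # placeholder for the real task
    dag=dag,
)
```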
