philippefutureboy opened a new issue, #24010:
URL: https://github.com/apache/airflow/issues/24010
### Apache Airflow version
2.2.3
### What happened
Following #19699, we fail DagRuns via the REST API. However, failing a DagRun via the API:
```python
import json

import requests

res = requests.patch(
    f"{AIRFLOW_HOST}/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}",
    headers=headers,
    auth=auth,
    data=json.dumps({"state": "failed"}),
)
```
does not update the state of queued tasks that are part of said DagRun,
which over time can cause the scheduler's task slots (max_active_tasks) to
fill up and clog the DAG.
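The slot exhaustion can be sketched with a small simulation (illustration only, not Airflow's actual scheduler accounting; the numbers are hypothetical). The assumption, per the symptom above, is that queued task instances from failed runs are never released and keep counting against max_active_tasks:

```python
# Simulated slot accounting (illustration only; not Airflow scheduler code).
# Assumption: queued task instances still count against max_active_tasks
# even after their DagRun has been failed, as described in this report.

def free_slots(max_active_tasks, runs):
    """Slots left after subtracting every still-queued task instance.

    runs: list of dicts like {"state": "failed", "queued_tasks": 2}
    """
    stuck = sum(r["queued_tasks"] for r in runs)  # queued TIs never released
    return max_active_tasks - stuck

# Each failed run leaks 2 queued task instances; with max_active_tasks=4,
# two such runs clog the DAG completely.
runs = [{"state": "failed", "queued_tasks": 2}]
print(free_slots(4, runs))   # → 2 slots remain
runs.append({"state": "failed", "queued_tasks": 2})
print(free_slots(4, runs))   # → 0 slots: new runs schedule but never queue
```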
Full explanation can be found here:
https://www.loom.com/share/f44d43562ae64626994b3e1d22d0fdd4
### What you think should happen instead
Failing a DagRun via the API (via the UI or otherwise) should result in its queued
tasks being marked as failed (or skipped, or another terminal state of your choice) to
avoid the buildup of queued tasks in older DagRuns.
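The desired behavior could be sketched as follows (a hypothetical fix, not Airflow code; state names follow Airflow's task-instance state values, and the terminal-state set is an assumption): when a DagRun is moved to "failed", every task instance still in a non-terminal state is failed too, while finished tasks keep their state.

```python
# Hypothetical fix sketch: on DagRun failure, release every task instance
# that is not yet in a terminal state. Illustrative only, not Airflow code.

TERMINAL = {"success", "failed", "skipped", "upstream_failed", "removed"}

def resolve_tis_on_dagrun_failure(ti_states, target="failed"):
    """Return the task-instance states after failing the DagRun."""
    return [s if s in TERMINAL else target for s in ti_states]

print(resolve_tis_on_dagrun_failure(["queued", "success", "scheduled"]))
# → ['failed', 'success', 'failed']  (queued/scheduled released, success kept)
```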
### How to reproduce
This is how I would attempt to reproduce it:
1. Create a DAG with max_active_tasks = 2 and two DummyOperator tasks
2. Create a DagRun for said DAG
3. Queue both tasks manually (via UPDATE statements on the metadata database)
4. Fail the DagRun
5. Trigger a second DagRun
6. Notice that its tasks are scheduled, but never queued.
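Step 3's manual queueing could look like the following. This is a sketch against a throwaway in-memory SQLite stand-in for the metadata database; in a real deployment the UPDATE would run against the actual metadata DB, and the table/column names (`task_instance`, `run_id`, `state`) are assumptions based on Airflow 2.2's schema. The dag_id and run_id values are made up for illustration:

```python
import sqlite3

# Throwaway stand-in for the Airflow metadata DB with only the columns the
# UPDATE touches. Table/column names assume Airflow 2.2's task_instance
# schema; dag_id/run_id values below are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE task_instance (task_id TEXT, dag_id TEXT, run_id TEXT, state TEXT)"
)
conn.executemany(
    "INSERT INTO task_instance VALUES (?, ?, ?, ?)",
    [
        ("dummy_1", "repro_dag", "manual__2022-05-30", None),
        ("dummy_2", "repro_dag", "manual__2022-05-30", None),
    ],
)

# Step 3: force both tasks into the "queued" state by hand.
conn.execute(
    "UPDATE task_instance SET state = 'queued' "
    "WHERE dag_id = ? AND run_id = ?",
    ("repro_dag", "manual__2022-05-30"),
)
states = [row[0] for row in conn.execute("SELECT state FROM task_instance")]
print(states)  # → ['queued', 'queued']
```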
### Operating System
apache/airflow:2.2.3-python3.8
### Versions of Apache Airflow Providers
```
apache-airflow-providers-celery==2.1.0; python_version >= "3.6" and python_version < "4.0" \
    --hash=sha256:b9de35a66bf3a37b57eae55f2cb175bed3c321e5221a53d86976dcca9f9b8944 \
    --hash=sha256:f59fd9848be981faad31f89896020d1d416e025703acf97ecf848cbdd9b8a76d
apache-airflow-providers-ftp==2.0.1; python_version >= "3.6" and python_version < "4.0" \
    --hash=sha256:c4f5b2fa61bae3f4281bcc0b8c2c29eda81a2107a00aafd50781f395feadd156 \
    --hash=sha256:37232dbd2e26c1774e42e598ae9594e4daaebd1c2d2d68ce6c1d533a5ce0cad3
apache-airflow-providers-google==6.3.0; python_version >= "3.6" and python_version < "4.0" \
    --hash=sha256:b59c707a26a2afa95065a3c425004ac89bbefa74927bff1629effdfffcb2e669 \
    --hash=sha256:fc4281ea00b5bc83ae3a1f2c2dbe55fc479918fcae703b2bb8167409b16187fd
apache-airflow-providers-http==2.0.2; python_version >= "3.6" and python_version < "4.0" \
    --hash=sha256:4927de9045fa2cf5d2f00790707cf43c9a8d032c8d6dfcf7126909fcb1e33db4 \
    --hash=sha256:1f1f7c4e6e1425b20ddb553b77311c19841444ce12392c9796be0d25212dd036
apache-airflow-providers-imap==2.1.0; python_version >= "3.6" and python_version < "4.0" \
    --hash=sha256:7bb815192e5cbd9c20d1a12eb6c71f8362e49bb366f660a638935f414b2ba94f \
    --hash=sha256:a7ecc72a6e82003159dba4fba0ae72b73e4743b64c58fc984919cdd4eef7d44c
apache-airflow-providers-papermill==2.2.0; python_version >= "3.6" and python_version < "4.0" \
    --hash=sha256:d6713f010c35d38fb27a4193ae88c97f170eeb775182a5df593fbbe0373d0206 \
    --hash=sha256:24256be81798b300007bea4159ec49a25bb4d9822fcbb4bd514b6fc2b9393d70
apache-airflow-providers-postgres==2.4.0; python_version >= "3.6" and python_version < "4.0" \
    --hash=sha256:5b12826bc94eb955547b3c892ba913c8f18a43a79778348e2112fa7e6874fb82 \
    --hash=sha256:b6fab6252a524ae81b14732e9e28a03db15751ad1b09edcf8cf7a0f002bc8626
apache-airflow-providers-sqlite==2.0.1; python_version >= "3.6" and python_version < "4.0" \
    --hash=sha256:9a991e10f8b7bc4028ff3b389f280607e06423f97d4327b136383e6a52d9fcf9 \
    --hash=sha256:4e1ed0f2d25e3c3aecd5575dd46a78799bd205ba3c5d53b0248057fc30dd2aa9
```
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### Anything else
This occurs roughly once every 1-2 months (but YMMV depending on your
max_active_tasks and how often you fail a DagRun with queued tasks).
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)