ephraimbuddy commented on a change in pull request #16177:
URL: https://github.com/apache/airflow/pull/16177#discussion_r642650643
##########
File path: airflow/config_templates/config.yml
##########
@@ -167,10 +167,8 @@
       default: "32"
     - name: dag_concurrency
       description: |
-        The maximum number of task instances allowed to run concurrently in each DAG. To calculate
-        the number of tasks that is running concurrently for a DAG, add up the number of running
-        tasks for all DAG runs of the DAG. This is configurable at the DAG level with ``concurrency``,
-        which is defaulted as ``dag_concurrency``.
+        The maximum number of task instances allowed to run concurrently in each DAG Run for that
Review comment:
```suggestion
        The maximum number of task instances allowed to run concurrently across the DAG Runs for that
```
It appears the behaviour is across the DagRuns for that DAG. Wrong implementation? I checked by setting `dag_concurrency` to 5 and triggering the DAG multiple times. Only 5 tasks run across the DagRuns at a time, and the log message refers to the DAG's `task concurrency` (a bit confusing):
```
[2021-05-30 06:28:06,727] {scheduler_job.py:1025} INFO - DAG example_bash_operator has 5/5 running and queued tasks
[2021-05-30 06:28:06,727] {scheduler_job.py:1033} INFO - Not executing <TaskInstance: example_bash_operator.also_run_this 2021-05-30 06:26:58.716604+00:00 [scheduled]> since the number of tasks running or queued from DAG example_bash_operator is >= to the DAG's task concurrency limit of 5
```
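
For context, a minimal sketch (not part of the PR) of a DAG that reproduces the check described above, assuming Airflow 2.1-era APIs: the DAG-level ``concurrency`` argument defaults to ``[core] dag_concurrency``, and setting it to 5 here is illustrative, as are the dag_id and task commands.

```python
# Illustrative sketch, assuming Airflow 2.1-era imports and the DAG-level
# ``concurrency`` parameter (which defaults to [core] dag_concurrency).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="concurrency_check",       # hypothetical dag_id, not from the PR
    start_date=datetime(2021, 5, 30),
    schedule_interval=None,
    concurrency=5,                    # per-DAG cap across all of its DAG runs
) as dag:
    # With more than 5 of these tasks eligible across triggered runs, the
    # scheduler keeps at most 5 running or queued at any one time.
    tasks = [
        BashOperator(task_id=f"sleep_{i}", bash_command="sleep 60")
        for i in range(10)
    ]
```

Triggering it several times (e.g. `airflow dags trigger concurrency_check`) should produce the same scheduler log line quoted above once five of its tasks are running or queued, which matches the "across the DAG Runs" reading rather than a per-DAG-Run limit.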