potiuk opened a new issue, #38167:
URL: https://github.com/apache/airflow/issues/38167

   ### Discussed in https://github.com/apache/airflow/discussions/38165
   
   <div type='discussions-op-text'>
   
   <sup>Originally posted by **HenryLee19** March 14, 2024</sup>
   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### If "Other Airflow 2 version" selected, which one?
   
   2.8.3
   
   ### What happened?
   
   I changed "max_active_runs_per_dag" from the default of 16 to 2 in the Airflow config, then manually triggered a DAG 10 times via the UI and observed 10 instances of the DAG in the "running" state.
   
   ### What you think should happen instead?
   
   I expected there to be 2 instances in the "running" state and 8 in the 
"queued" state. 
   
   (Note: setting "max_active_runs" at the DAG level works. It is only the global "max_active_runs_per_dag" that appears not to be applied.)
   
   ### How to reproduce
   
   1. in airflow.cfg, change "max_active_runs_per_dag" from the default of 16 
to 2
   2. restart Airflow (Docker container, in my case)
   3. trigger a DAG via the UI play button 10x
   4. observe 10 instances of the DAG in the "running" state
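   
   For reference, step 1 amounts to editing the `[core]` section of airflow.cfg; a minimal sketch, assuming the default config layout:
   
   ```ini
   # airflow.cfg -- [core] section
   [core]
   # Global cap on concurrently running DAG runs per DAG (default: 16)
   max_active_runs_per_dag = 2
   ```
   
   Equivalently, the option can be set via the environment variable `AIRFLOW__CORE__MAX_ACTIVE_RUNS_PER_DAG=2`. Running `airflow config get-value core max_active_runs_per_dag` inside the scheduler container is a quick way to confirm the value the scheduler actually sees.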
   
   ### Operating System
   
   Debian GNU/Linux 12 (bookworm)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   Used the docker-compose found here: 
https://airflow.apache.org/docs/apache-airflow/2.8.3/docker-compose.yaml
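   
   In that compose setup, config overrides are typically passed as environment variables in the shared `x-airflow-common` `environment` block; a sketch of how the setting above would be applied there (the key name follows Airflow's `AIRFLOW__<SECTION>__<OPTION>` convention):
   
   ```yaml
   # docker-compose.yaml -- excerpt of the shared environment block
   x-airflow-common:
     environment:
       AIRFLOW__CORE__MAX_ACTIVE_RUNS_PER_DAG: '2'
   ```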
   
   ### Anything else?
   
   DAG definition:
   
   ```python
   from datetime import datetime
   import time
   
   from airflow import DAG
   from airflow.operators.python import PythonOperator
   
   default_args = {
       'owner': 'airflow',
       'depends_on_past': False,
       'start_date': datetime(2024, 1, 1),
       'email': ['[email protected]'],
       'email_on_failure': False,
       'email_on_retry': False,
       'retries': 0,
   }
   
   
   def _task1(**context):
       time.sleep(120)
       return
   
   
   with DAG(
       dag_id='dag1',
       default_args=default_args,
       schedule=None,
       catchup=False
   ) as dag:
   
       task1 = PythonOperator(
           task_id='task1',
           python_callable=_task1
       )
   ```
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   </div>

