stas-snow commented on issue #22193:
URL: https://github.com/apache/airflow/issues/22193#issuecomment-1067079346
Code:
```python
import logging
import time
from random import uniform

# Airflow packages
from airflow.decorators import dag, task

logger = logging.getLogger(__name__)

default_dag_args = {
    'owner': 'airflow',
    'start_date': '2022-03-01',
    'schedule_interval': '@daily',
    'retries': 0,
    'catchup': True,
    'max_active_runs': 1,
    'depends_on_past': False,
}


@dag(default_args=default_dag_args)
def test2_dag():
    @task()
    def start():
        logging.info("Starting DAG. Going to sleep")
        time.sleep(uniform(1, 5) * 60)
        logging.info("First task, exiting")
        return None

    @task()
    def do():
        time.sleep(uniform(1, 5) * 60)
        return None

    @task()
    def end():
        logging.info("Finishing...")
        time.sleep(3)
        return None

    # Define dependency between tasks
    start() >> do() >> end()


# Start the DAG
test_pipeline = test2_dag()
```
Screenshot showing multiple DAG runs all in the "Running" state:
<img width="546" alt="image"
src="https://user-images.githubusercontent.com/93541868/158224423-fae9b635-abd2-4be5-a292-fa2c6d3c5398.png">
The documentation says there should be no more than `max_active_runs` DAG runs in the Running state:
```
max_active_runs – maximum number of active DAG runs, beyond this number of
DAG runs in a running state, the scheduler won’t create new active DAG runs
```
I would expect only one DAG run in the Running state, given `max_active_runs = 1`.
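One thing worth checking (an observation, not a confirmed diagnosis of this issue): `schedule_interval`, `catchup`, and `max_active_runs` are DAG-level parameters, while `default_args` is only forwarded to each task's operator, so a `max_active_runs` placed inside `default_args` may be silently ignored rather than applied to the DAG. A minimal sketch of the same DAG with those keys moved into the `@dag()` call itself (assuming the Airflow 2.x TaskFlow API; `test2_dag_fixed` is a hypothetical name for illustration):

```python
from datetime import datetime

from airflow.decorators import dag, task

# Only true per-task defaults remain in default_args.
default_args = {
    'owner': 'airflow',
    'retries': 0,
    'depends_on_past': False,
}


@dag(
    schedule_interval='@daily',   # DAG-level option, not a task default
    start_date=datetime(2022, 3, 1),
    catchup=True,
    max_active_runs=1,            # applied to the DAG, limiting concurrent runs
    default_args=default_args,
)
def test2_dag_fixed():
    @task()
    def start():
        return None

    start()


test_pipeline = test2_dag_fixed()
```

With this layout the scheduler sees `max_active_runs=1` on the DAG object itself, which is what the quoted documentation describes.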