eladkal commented on issue #49508:
URL: https://github.com/apache/airflow/issues/49508#issuecomment-2834521949

   It seems that starvation can happen. Also reported in 
https://github.com/apache/airflow/issues/45636
   
   Given two pools, `default_pool` and `special_pool`, and two DAGs:
   
    ```python
    from datetime import datetime
    import time

    from airflow import DAG
    from airflow.decorators import task

    default_args = {
        'owner': 'airflow',
        'start_date': datetime(2023, 2, 1)
    }

    with DAG('scheduler1', schedule="@daily", catchup=True, default_args=default_args):

        @task()
        def sleep_it1():
            time.sleep(3600)  # 1 hour
        sleep_it1()

    with DAG('scheduler2', schedule=None, catchup=False, default_args=default_args):

        @task(pool="special_pool")
        def sleep_it2():
            time.sleep(60)  # 1 minute

        sleep_it2()
    ```
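
   For the repro above, `special_pool` has to exist before `sleep_it2` can be queued. It can be created with the Airflow CLI; the slot count and description here are arbitrary choices for the example:

```shell
airflow pools set special_pool 16 "pool for sleep_it2"
```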
   
   
   The scheduler1 DAG creates many DAG runs because catchup is enabled. Once those runs exist there is a large backlog of tasks to complete. Now create a run of scheduler2: the task sleep_it2 will not be executed until all runs of sleep_it1 have finished, even though in each scheduler loop the scheduler cannot actually schedule any more sleep_it1 tasks (default_pool is full) but could schedule sleep_it2 (special_pool has free slots).
   I waited for more than 30 minutes and sleep_it2 still did not get scheduled, although it could have been.
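
   The failure mode can be illustrated with a toy model (a sketch, not Airflow's actual scheduler code): if each loop only examines the oldest N queued tasks, and all N slots of that window are taken by tasks from a saturated pool, tasks from other pools are never even considered. The `scheduler_loop` function, `TI` class, and `max_tis_per_query` limit below are hypothetical simplifications of that idea:

```python
# Toy model of the starvation pattern described above. Assumption: the
# scheduler examines only a bounded window of queued tasks per loop,
# oldest first, and pool slots gate what actually runs.
from dataclasses import dataclass

@dataclass
class TI:
    """A minimal stand-in for a queued task instance."""
    dag_id: str
    pool: str
    date: int  # ordering key, standing in for the execution date

def scheduler_loop(queued, pool_slots, max_tis_per_query=4):
    """Return the tasks chosen to run in one scheduler loop."""
    # Only the oldest `max_tis_per_query` tasks are examined -- this
    # bounded window is what lets a large backlog shadow other pools.
    window = sorted(queued, key=lambda ti: ti.date)[:max_tis_per_query]
    free = dict(pool_slots)
    chosen = []
    for ti in window:
        if free.get(ti.pool, 0) > 0:
            free[ti.pool] -= 1
            chosen.append(ti)
    return chosen

# 100 backlogged catchup runs of scheduler1 in default_pool, plus one
# scheduler2 task in special_pool queued after them.
queued = [TI("scheduler1", "default_pool", d) for d in range(100)]
queued.append(TI("scheduler2", "special_pool", 100))

# default_pool is fully occupied by already-running sleep_it1 tasks,
# while special_pool has plenty of free slots.
picked = scheduler_loop(queued, {"default_pool": 0, "special_pool": 16})
print([ti.dag_id for ti in picked])  # -> []
```

   The examined window holds only sleep_it1 tasks, none of which can get a default_pool slot, so nothing is scheduled at all: sleep_it2 starves despite special_pool being free, which matches the observed behaviour.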
   
   <img width="1619" alt="Image" src="https://github.com/user-attachments/assets/caab1d8e-8220-400f-b7b6-d6803e31457c" />
   
   This happens both in Airflow 2 and Airflow 3.

