Hi all,

I am currently facing a problem with the prioritized scheduling of tasks from 
two DAGs in a scaling PoC for our testing workflow automation.

I run an Airflow 2.0.1 instance on a Kubernetes cluster with 20 schedulers and 
20 Celery workers. I have defined two large DAGs that I want to execute with 
different priorities on this cluster.
Each DAG consists of 10,000 Python tasks that are executed in parallel. The 
tasks of the first DAG each have priority_weight 0, while the tasks of the 
second DAG each have priority_weight 9. All tasks run in the same default_pool 
(see the sketch below). My assumption is that when I trigger both DAGs, the 
second DAG's tasks are executed first, before the tasks of the first DAG are 
queued.
What I actually see, however, is that the tasks are queued interleaved: tasks 
from the first and the second DAG are executed alternately.
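
For reference, this is roughly how the two DAGs are defined. This is only a 
minimal sketch; the DAG ids, dates, and task callables below are placeholders, 
not the real code:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def noop():
    # Placeholder for the real test workload.
    pass


def build_dag(dag_id, priority):
    dag = DAG(
        dag_id=dag_id,
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        catchup=False,
    )
    # 10,000 independent tasks, all in the default pool, all sharing the same
    # priority_weight within one DAG.
    for i in range(10000):
        PythonOperator(
            task_id=f"task_{i}",
            python_callable=noop,
            priority_weight=priority,
            pool="default_pool",
            dag=dag,
        )
    return dag


dag_low = build_dag("poc_dag_low", priority=0)    # first DAG, priority_weight 0
dag_high = build_dag("poc_dag_high", priority=9)  # second DAG, priority_weight 9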

What do I need to configure or implement to get the tasks executed in the order 
I assumed?

Thank you in advance,

Chris

Mit freundlichen Grüßen / Best regards

Christian Lellmann

Engineering Deterministic open loop (XC-AD/ESB4)
Robert Bosch GmbH | Hessbruehlstrasse 21 | 70565 Stuttgart-Vaihingen | GERMANY | www.bosch.com
