GitHub user MaksymSofer edited a discussion: How does parallelism work in
KubernetesExecutor?

Hello,
I use Airflow 2.10.1 on GKE with `parallelism` set to 32.
I have three DAGs that, even in theory, can produce at most 12 simultaneous tasks,
but I constantly see the scheduler reporting
`Executor parallelism limit reached. 0 open slots.`

Checking the executor code, I see that `open_slots` is defined as:

[open_slots = self.parallelism - len(self.running)](https://github.com/apache/airflow/blob/945083250d2a4c58344c4d2db64f83b6caa06a6a/airflow/executors/base_executor.py#L244C13-L244C62)
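For context, here is a minimal sketch of how I understand that slot accounting (simplified from the linked `base_executor.py`; the class here is a stripped-down stand-in, only the `open_slots` formula is taken from the source):

```python
# Simplified sketch of the base executor's slot accounting (not the real class).
# `running` holds keys of task instances the executor still considers active;
# if entries are never removed, open slots eventually drop to 0.

class BaseExecutorSketch:
    def __init__(self, parallelism: int = 32):
        self.parallelism = parallelism
        self.running: set = set()  # task-instance keys believed to be running

    @property
    def open_slots(self) -> int:
        # Mirrors: open_slots = self.parallelism - len(self.running)
        return self.parallelism - len(self.running)


executor = BaseExecutorSketch(parallelism=32)
executor.running.update(("dag_a", f"task_{i}") for i in range(32))
print(executor.open_slots)  # 0 -> "Executor parallelism limit reached. 0 open slots."
```

So if finished tasks are not removed from `running`, the executor would report 0 open slots even though nothing is actually executing.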

I also noticed that KubernetesExecutor leaves "Succeeded" pods behind - is it possible
that those pods are still being counted in the `running` set?
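To check whether completed pods are piling up, I count worker pods per phase with something like the sketch below (it assumes the `kubernetes` Python client is installed, and the namespace and label selector are placeholders for my setup, not anything mandated by Airflow):

```python
# Hypothetical diagnostic: count Airflow worker pods per phase to see
# whether "Succeeded" pods accumulate. Namespace and label are assumptions.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster
v1 = client.CoreV1Api()

pods = v1.list_namespaced_pod(
    namespace="airflow",                       # assumed namespace
    label_selector="kubernetes_executor=True",  # assumed worker-pod label
)

by_phase: dict[str, int] = {}
for pod in pods.items:
    by_phase[pod.status.phase] = by_phase.get(pod.status.phase, 0) + 1

print(by_phase)  # e.g. {"Running": 3, "Succeeded": 29}
```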

Please help me.




GitHub link: https://github.com/apache/airflow/discussions/47805
