nicolamarangoni commented on issue #24538:
URL: https://github.com/apache/airflow/issues/24538#issuecomment-1181562132

   @potiuk I have several pods running Airflow 2.3.3. In some of them I set the 
KubernetesExecutor, in others the CeleryExecutor with 2 workers. Some of the 
pods with Celery look fine, but they have at most 100 DAGs and very few 
concurrently running DAGs (maybe 2-3 at most).
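   For reference, the executor in each deployment is selected via the usual core 
setting (shown here as environment variables purely for illustration; the same 
value can be set in airflow.cfg under [core] executor):

       # Celery-based deployments
       AIRFLOW__CORE__EXECUTOR=CeleryExecutor
       # Kubernetes-based deployments
       AIRFLOW__CORE__EXECUTOR=KubernetesExecutor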
   The pods with the CeleryExecutor and many DAGs (> 150), on the other hand, 
have the scheduler crashing with the same error message and the same stack 
trace that @meetri posted.
   I cannot tell how many concurrent DAGs/Jobs would be running on those pods 
because the scheduler crashes right after importing the DAGs.
   What other information would be useful for analysis?
   

