GitHub user dimon222 created a discussion: How to use sharding for schedulers?
Hi there. I'm on 2.11.0 with the Celery executor. I've reached 6,000 DAGs on my cluster, and my scheduler loop now takes an enormous amount of time, slowing down task scheduling and turnaround. I've run multiple schedulers and played with the max-dagrun settings and the parsing sort mode, but I feel it's time to shard the work each scheduler gets. Would saving DAGs to independent folders and pointing each scheduler at a separate folder help reduce the burden of parsing the entire set of DAGs on every single scheduler loop? I don't want to deploy extra Airflow clusters; I'm explicitly interested in keeping one big one.

GitHub link: https://github.com/apache/airflow/discussions/56294
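To make the folder-per-scheduler idea concrete, here is a minimal sketch of what such a setup could look like, assuming each scheduler instance overrides the standard `AIRFLOW__CORE__DAGS_FOLDER` environment variable while all instances share the same metadata database. The paths and the shard layout below are illustrative assumptions, not something from the discussion, and this is not an officially supported sharding mechanism:

```shell
# Hypothetical sharding sketch: split the DAG files into per-scheduler
# subfolders, then start each scheduler with its own dags_folder override.
# Paths below are made up for illustration.

# Scheduler instance A: parses only the DAG files under shard1
export AIRFLOW__CORE__DAGS_FOLDER=/opt/airflow/dags/shard1
airflow scheduler

# Scheduler instance B (typically on a different host or container):
# parses only the DAG files under shard2
export AIRFLOW__CORE__DAGS_FOLDER=/opt/airflow/dags/shard2
airflow scheduler
```

In this sketch every scheduler still talks to the same metadata database and the same Celery broker, so tasks from all shards land in one executor pool; each scheduler's DAG-file processor only re-parses its own subset. Whether the webserver and other components behave cleanly when their `dags_folder` differs from the schedulers' is exactly the kind of question the post is asking.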
