Jorricks commented on issue #13542: URL: https://github.com/apache/airflow/issues/13542#issuecomment-877087945
> https://github.com/apache/airflow/blob/b0f7f91fe29d1314b71c76de0f11d2dbe81c5c4a/airflow/jobs/scheduler_job.py#L336
>
> This particular line looks like it limits the query results to the maximum number of task instance executions (via `parallelism`). Doesn't this restrict the results to less than what CAN be scheduled? There shouldn't be a limit here, as the for loop underneath goes through the results and schedules the queued tasks, right?

I agree with you that this limit prevents actual TaskInstances from being scheduled in that run. This is also why I increased the `parallelism` option to 1000 and increased the `pool` size. However, I am a bit puzzled that increasing these two doesn't solve your issue with Airflow 2.1.1.
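For reference, a minimal sketch of how I raised those two settings (the value 1000 is just the figure mentioned above, and `default_pool` is assumed to be the pool in use; adjust both to your deployment):

```bash
# Raise the scheduler-wide cap on concurrently running task instances
# (equivalent to setting parallelism = 1000 under [core] in airflow.cfg)
export AIRFLOW__CORE__PARALLELISM=1000

# Enlarge the slot count of the default pool via the Airflow 2.x CLI
airflow pools set default_pool 1000 "enlarged default pool"
```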
