Hi Airflow Team,

We are building a use case in which we need to run 1000 parallel tasks on a
schedule.
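
To make the shape of the workload concrete, here is a minimal sketch of the
kind of fan-out DAG we have in mind (the task names, the callable, and the
Airflow 1.x import path are illustrative, not our production code):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator


    def process_partition(partition_id):
        # Placeholder for the real per-task work.
        print("processing partition %s" % partition_id)


    dag = DAG(
        dag_id="parallel_fanout_example",
        start_date=datetime(2019, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    )

    # Fan out N independent tasks; in production N would be 1000.
    for i in range(100):
        PythonOperator(
            task_id="task_%04d" % i,
            python_callable=process_partition,
            op_kwargs={"partition_id": i},
            dag=dag,
        )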

For this we are planning to use Airflow with Postgres as the metadata
database and Celery as the executor.
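
For reference, the relevant parts of our airflow.cfg look roughly like this
(hosts, credentials, and the Redis broker are placeholders; option names are
the Airflow 1.x ones):

    [core]
    executor = CeleryExecutor
    sql_alchemy_conn = postgresql+psycopg2://airflow:***@pg-host:5432/airflow
    # Upper bound on task instances running concurrently across the installation.
    parallelism = 1000

    [celery]
    broker_url = redis://redis-host:6379/0
    result_backend = db+postgresql://airflow:***@pg-host:5432/airflow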

While testing with 100 parallel tasks in Airflow, our observation is:
  - 100 Postgres connections are opened and remain open until the tasks
complete.

So our assumption is that each task opens one connection and keeps it open
until the task completes.
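
For what it is worth, this is how we counted the connections during the test
(a rough sketch; host and credentials are placeholders):

    import psycopg2

    # Point this at the Airflow metadata database.
    conn = psycopg2.connect(
        host="pg-host", dbname="airflow", user="airflow", password="***"
    )
    cur = conn.cursor()

    # Count client connections to the metadata DB, grouped by state,
    # while the 100-task test DAG is running.
    cur.execute(
        "SELECT state, count(*) FROM pg_stat_activity "
        "WHERE datname = %s GROUP BY state",
        ("airflow",),
    )
    for state, count in cur.fetchall():
        print(state, count)

    cur.close()
    conn.close()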

Now, in order to run 1000 parallel tasks, will it require 1000 Postgres
connections?

Is there any optimization or connection pooling available for this?
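
We did find the SQLAlchemy pool options below in airflow.cfg, but our
understanding is that each running task is a separate process with its own
engine and pool, so these would not cap the total connection count across
workers (the values shown are just the ones we are experimenting with). Is
that understanding correct, or would an external pooler such as PgBouncer be
the usual approach?

    [core]
    # Per-process SQLAlchemy pool settings; since every task instance runs
    # in its own process, each process gets its own pool of this size.
    sql_alchemy_pool_size = 5
    sql_alchemy_max_overflow = 10
    sql_alchemy_pool_recycle = 1800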

Please suggest.


Regards,
Abhay
