I see. There are a couple of possible reasons — for example, you don't have enough Celery workers, or the workers are faulty. You may want to inspect the Celery state using the web-based Flower monitoring tool <http://docs.celeryproject.org/en/latest/userguide/monitoring.html>. Alternatively, try deleting all queued tasks; the Airflow scheduler should re-generate and re-queue those task instances.
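As a rough sketch of the two suggestions above — the exact commands depend on your Airflow/Celery versions and your broker URL, so treat these as assumptions to adapt, not a recipe:

```shell
# Inspect worker/queue state with Flower (Airflow ships a wrapper command
# in the Celery-executor setups of that era):
airflow flower
# ...then browse to http://localhost:5555 to check active workers and queues.

# Check worker liveness directly from the Celery CLI
# (assumes your broker/app settings are importable as airflow.executors.celery_executor):
celery -A airflow.executors.celery_executor inspect ping

# Purge all queued (not yet running) tasks so the scheduler can re-queue them.
# WARNING: this drops every pending message in the broker.
celery -A airflow.executors.celery_executor purge
```

After purging, the scheduler should notice the task instances are no longer queued and re-submit them on its next loop; if it doesn't, clearing the affected task instances in the Airflow UI forces a re-schedule.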
On Fri, Jun 1, 2018 at 4:30 PM Pedro Machado <pe...@205datalab.com> wrote:
> Using postgres and redis running in their containers. The set up is based
> on the astronomer open set up:
> https://github.com/astronomerio/astronomer/blob/master/examples/airflow-enterprise/docker-compose.yml