We noticed that over the past few days, tasks keep getting stuck in the
queued state.  Looking into Celery, we see that the task had failed:

Traceback (most recent call last):
  File "/python/lib/python2.7/site-packages/celery/app/trace.py", line 367, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/python/lib/python2.7/site-packages/celery/app/trace.py", line 622, in __protected_call__
    return self.run(*args, **kwargs)
  File "/python/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 59, in execute_command
    raise AirflowException('Celery command failed')
AirflowException: Celery command failed
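For context, the last frame in the traceback suggests the worker-side code does
roughly the following (a simplified sketch, not the actual Airflow source; the
real function also logs and passes the configured environment):

```python
# Simplified sketch of what celery_executor.execute_command appears to do,
# based on the traceback above.  Assumption: the worker shells out to run
# the task command and converts any non-zero exit into a generic exception.
import subprocess


class AirflowException(Exception):
    """Stand-in for airflow.exceptions.AirflowException."""


def execute_command(command):
    try:
        subprocess.check_call(command, shell=True)
    except subprocess.CalledProcessError:
        # The original exit status is swallowed here, which is why the
        # only signal we see is the opaque "Celery command failed".
        raise AirflowException('Celery command failed')
```

If this is accurate, the failure is raised inside the Celery worker, so the
underlying cause never reaches the scheduler, which would be consistent with
the task instance sitting in the queued state.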


Why does Airflow not learn about this failure and recover?  And what can
we do to prevent it?

Thanks for taking the time to read this.
