Thanks Sergei.


On Mar 3, 2017 10:33 PM, "Sergei Iakhnin" <[email protected]> wrote:

This happened to me too. I ended up putting them in queued status via a db
update and then they got scheduled.
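
For reference, a rough sketch of that kind of db update via the Airflow ORM
(assuming Airflow 1.x; the dag_id, task_id and execution_date values below are
placeholders for the stuck task):

from datetime import datetime

from airflow import settings
from airflow.models import TaskInstance
from airflow.utils.state import State

# Look up the stuck task instance in the metadata db and reset its state
# to QUEUED so the scheduler picks it up again.
session = settings.Session()
ti = (
    session.query(TaskInstance)
    .filter(
        TaskInstance.dag_id == "my_dag",
        TaskInstance.task_id == "my_task",
        TaskInstance.execution_date == datetime(2017, 3, 2),
    )
    .one()
)
ti.state = State.QUEUED
session.merge(ti)
session.commit()
session.close()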

On Thu, 2 Mar 2017, 15:27 twinkle, <[email protected]> wrote:

> Hi,
>
> We plan to use Airflow along with Celery as the backend.
> Today, within a DAG run, despite some of the tasks in the DAG being shown as
> successful, Airflow was not scheduling the next eligible tasks.
> Looking at Celery Flower, the following exception was observed:
>
> Traceback (most recent call last):
>   File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/celery/app/trace.py", line 367, in trace_task
>     R = retval = fun(*args, **kwargs)
>   File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/celery/app/trace.py", line 622, in __protected_call__
>     return self.run(*args, **kwargs)
>   File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 45, in execute_command
>     raise AirflowException('Celery command failed')
> AirflowException: Celery command failed
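>
> For context, execute_command at that line in celery_executor.py appears to
> just shell out to run the task command and raise when the subprocess exits
> non-zero, roughly like the following (paraphrased from the Airflow 1.x
> source, so exact details may differ by version):
>
> import subprocess
>
> from airflow.exceptions import AirflowException
>
> def execute_command(command):
>     # Run the "airflow run ..." command handed to the Celery worker;
>     # any non-zero exit code surfaces as this generic failure in Flower.
>     try:
>         subprocess.check_call(command, shell=True)
>     except subprocess.CalledProcessError:
>         raise AirflowException('Celery command failed')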
>
> There have been no failure logs on the Airflow side, and it has marked the
> task as succeeded.
>
> Looking at the metadata table, I found the state of the task recorded as
> FAILURE. It seems like some link is broken: Airflow realises the failure to
> some extent, which is why it stopped scheduling further tasks, but not
> completely, as the UI showed a different state.
>
> Has anyone else experienced it?
>
> Regards,
> Twinkle
>
--

Sergei
