I should have been clearer: in my case the tasks were marked as SHUTDOWN.
This was reflected in the *state* column of the *task_instance* table.
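
For anyone who wants to check the same thing, this is roughly how I
inspected it. A minimal sketch, assuming a SQLAlchemy-reachable metadata
DB; the connection string is a placeholder for your own setup:

from sqlalchemy import create_engine, text

# Placeholder URI; point this at your Airflow metadata database.
engine = create_engine("postgresql://airflow:airflow@localhost:5432/airflow")

with engine.connect() as conn:
    # Task states are stored lowercase ("shutdown", "success", "failed").
    rows = conn.execute(
        text("SELECT dag_id, task_id, execution_date, state "
             "FROM task_instance WHERE state = :state"),
        {"state": "shutdown"},
    )
    for row in rows:
        print(row)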

On Sun, Mar 5, 2017 at 2:18 PM Bolke de Bruin <[email protected]> wrote:

> Can you provide a bit more detail on “SUCCEEDED” vs “FAILURE”? We use the
> db as a state keeper, and only the task itself can mark SUCCESS or FAILED.
> So I am wondering, where did you see those states?
>
> B
>
> > On 2 Mar 2017, at 15:27, twinkle <[email protected]> wrote:
> >
> > Hi,
> >
> > We plan to use Airflow along with Celery as the backend.
> > Today, within a DAG run, Airflow was not scheduling the next tasks even
> > though it showed some of the tasks in the DAG as successful. Looking at
> > Celery Flower, the following exception was observed:
> >
> > Traceback (most recent call last):
> >   File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/celery/app/trace.py", line 367, in trace_task
> >     R = retval = fun(*args, **kwargs)
> >   File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/celery/app/trace.py", line 622, in __protected_call__
> >     return self.run(*args, **kwargs)
> >   File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 45, in execute_command
> >     raise AirflowException('Celery command failed')
> > AirflowException: Celery command failed
> >
> > There were no failure logs on the Airflow side, and it marked the task
> > as succeeded.
> >
> > Looking at the metadata table, I found the state of the task as FAILURE.
> > It seems like some link is broken: Airflow recognises the failure to some
> > extent, since it stopped scheduling further tasks, but not completely,
> > as the UI showed a different state.
> >
> > Has anyone else experienced this?
> >
> > Regards,
> > Twinkle
>
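
For context on the traceback above: the Celery-side task in
airflow/executors/celery_executor.py is essentially a thin subprocess
wrapper. A minimal sketch of what execute_command does, simplified
from the 1.x source rather than copied exactly (the broker/backend
configuration is omitted and the app setup is a placeholder):

import subprocess

from celery import Celery

from airflow.exceptions import AirflowException

# In the real module the app is configured from airflow.cfg
# (broker_url, celery_result_backend); this is a placeholder.
app = Celery('airflow.executors.celery_executor')

@app.task
def execute_command(command):
    # Run the "airflow run ..." CLI command in a shell. A non-zero
    # exit code raises CalledProcessError, which is converted into
    # AirflowException, so Celery records the task as FAILURE.
    try:
        subprocess.check_call(command, shell=True)
    except subprocess.CalledProcessError:
        raise AirflowException('Celery command failed')

That would explain the mismatch: the subprocess can write a terminal
state to the task_instance row before exiting non-zero, so the Celery
result backend shows FAILURE while Airflow's own metadata shows
something else.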

Sergei
