Re: Airflow with Celery

2018-05-16 Thread Driesprong, Fokko
I had similar issues with Airflow running the Celery executor.

The celery_result_backend should be a persistent database such as Postgres or
MySQL. What broker are you using? I would recommend Redis or RabbitMQ,
whichever you prefer.

Cheers, Fokko
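
For reference, the relevant airflow.cfg keys might look like the sketch below (hostnames, credentials, and database names are placeholders; adjust them to your environment):

```ini
[celery]
# Broker that queues the tasks (Redis shown; use amqp://... for RabbitMQ)
broker_url = redis://redis-host:6379/0
# Persistent result backend -- a real database, not the broker itself
celery_result_backend = db+postgresql://airflow:airflow@postgres-host:5432/airflow
```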



Re: Airflow with Celery

2018-05-15 Thread David Capwell
What I find is that we hit this when Celery rejects tasks. Since we don't do
work on the worker hosts themselves, we solve it by over-provisioning task
slots in Celery.
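
The over-provisioning described above is controlled by the worker's concurrency setting; in airflow.cfg it might look like this (Airflow 1.x key name, value illustrative):

```ini
[celery]
# Number of task slots each Celery worker runs concurrently; raising this
# over-provisions workers whose tasks mostly wait on remote systems
celeryd_concurrency = 32
```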



Re: Airflow with Celery

2018-05-15 Thread Andy Cooper
I have had very similar issues when there was a problem with the connection
string pointing to the message broker. Triple-check those connection
strings and attempt to connect outside of Airflow.
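
One quick way to test the broker connection outside of Airflow is a raw TCP connection to the host and port from the broker URL. This is a stdlib-only sketch (the helper names and example URLs are illustrative, not Airflow APIs); it only proves the port is reachable, not that credentials are valid:

```python
import socket
from urllib.parse import urlparse

DEFAULT_PORTS = {"redis": 6379, "amqp": 5672}

def broker_addr(url):
    """Extract (host, port) from a broker URL like redis://host:6379/0."""
    parsed = urlparse(url)
    host = parsed.hostname or "localhost"
    port = parsed.port or DEFAULT_PORTS.get(parsed.scheme, 6379)
    return host, port

def broker_reachable(url, timeout=5.0):
    """Return True if a plain TCP connection to the broker succeeds."""
    try:
        with socket.create_connection(broker_addr(url), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False for the URL in your airflow.cfg, the problem is networking or the connection string, not Airflow itself.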



Airflow with Celery

2018-05-15 Thread Goutham Pratapa
Hi all,

I have been using Airflow with the Celery executor in the background.

https://hastebin.com/sipecovomi.ini --> airflow.cfg

https://hastebin.com/urutokuvoq.py   --> the DAG I have been using

The DAG always stays in the running state.

Airflow Flower shows nothing in the tasks or in the broker.

Did I miss anything? Can anyone help me in this regard?


-- 
Cheers !!!
Goutham Pratapa


Re: Airflow and Celery - co-ordination issue

2017-03-05 Thread twinkle
Thanks Sergei.


Re: Airflow and Celery - co-ordination issue

2017-03-03 Thread Sergei Iakhnin
This happened to me too. I ended up putting them back into the queued state
via a DB update, and then they got scheduled.
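
The workaround above amounts to an UPDATE against Airflow's metadata database. Here is a self-contained sketch using an in-memory SQLite stand-in for the task_instance table (the real table has more columns and your DB is likely Postgres/MySQL; the dag_id and column subset here are illustrative):

```python
import sqlite3

# In-memory stand-in for Airflow's metadata DB.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE task_instance "
    "(task_id TEXT, dag_id TEXT, execution_date TEXT, state TEXT)"
)
conn.execute(
    "INSERT INTO task_instance VALUES ('t1', 'my_dag', '2017-03-02', 'failure')"
)

# The workaround: flip stuck tasks back to 'queued' so the scheduler
# picks them up again on its next loop.
conn.execute(
    "UPDATE task_instance SET state = 'queued' "
    "WHERE dag_id = ? AND state = 'failure'",
    ("my_dag",),
)
state = conn.execute("SELECT state FROM task_instance").fetchone()[0]
```

Run the equivalent UPDATE against your own metadata database, scoped tightly (dag_id, execution_date) so you don't requeue unrelated tasks.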

-- 

Sergei


Airflow and Celery - co-ordination issue

2017-03-02 Thread twinkle
Hi,

We plan to use Airflow with Celery as the backend.
Today, within a DAG run, Airflow was not scheduling the next tasks even
though it showed some of the tasks in the DAG as successful. Looking at
Celery Flower, we observed the following exceptions:

Traceback (most recent call last):
  File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/celery/app/trace.py", line 367, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/celery/app/trace.py", line 622, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/allocation/.pyenv/versions/2.7.12/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 45, in execute_command
    raise AirflowException('Celery command failed')
AirflowException: Celery command failed
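
For context, line 45 of celery_executor.py in the traceback corresponds to roughly this logic: the Celery task shells out to the airflow command it was handed and converts a nonzero exit code into an exception, which Celery then records as a task FAILURE. This is a simplified sketch (RuntimeError stands in for AirflowException to keep it self-contained):

```python
import subprocess

def execute_command(command):
    """Run the shell command handed to the Celery worker; raise on failure."""
    try:
        # Nonzero exit from the `airflow run ...` subprocess raises
        # CalledProcessError, which we surface as the failure Celery records.
        subprocess.check_call(command, shell=True)
    except subprocess.CalledProcessError:
        raise RuntimeError("Celery command failed")
```

So "Celery command failed" means the underlying airflow subprocess exited nonzero; the real cause is in that subprocess's own logs, not in Celery.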

There were no failure logs on the Airflow side, and it marked the
task as succeeded.

Looking at the metadata table, however, I found the state of the task as FAILURE.
It seems some link is broken: Airflow partially registers the failure
(which is why it stopped scheduling further tasks), but not completely,
since the UI shows a different state.

Has anyone else experienced this?

Regards,
Twinkle