Hi Walther,

Thank you for the suggestion!

No, I use MySQL as the result backend, but it seems that Flower uses the
same queues for its monitoring purposes.
The issue seemed to resolve itself after I restarted all the workers and
changed the RabbitMQ setup to remove the queues that had no consumers.
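
For anyone hitting the same thing: instead of deleting the queues by hand, Celery can be told to let its event/monitoring queues expire on their own. A minimal sketch of a Celery config module follows; the two setting names are real Celery options (lowercase form as in Celery 4+; older releases use the uppercase CELERY_-prefixed equivalents), but the values here are purely illustrative.

```python
# celeryconfig.py -- sketch only; values are illustrative, not taken
# from our production setup.

# Delete a monitoring/event queue (the kind Flower creates) once it has
# had no consumers for 60 seconds, so abandoned queues do not pile up
# in RabbitMQ.
event_queue_expires = 60.0

# Drop individual event messages (e.g. "worker-heartbeat") that sit
# unconsumed in the queue for longer than 5 seconds.
event_queue_ttl = 5.0
```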

Best, Dima

On Sun, Jun 18, 2017 at 4:55 PM, Georg Walther <[email protected]>
wrote:

> Hi Dima,
>
>
> do you use RabbitMQ as the Celery result backend?
> If so try using e.g. Redis as result backend (parameter
> "celery_result_backend" in the airflow.cfg) while
> keeping RabbitMQ as the message broker (broker_url).
>
>
> Best,
>
> Georg
>
> On Mon, May 29, 2017 at 1:59 PM, Dmitry Smirnow <[email protected]>
> wrote:
>
> > Hi,
> >
> > I've noticed that in RabbitMQ, which is used as the broker for Airflow,
> > thousands of heartbeat messages from workers are piling up (type:
> > "worker-heartbeat"). The Airflow version I use is 1.7.1.3.
> >
> > I googled around, and it seems that those are the events used by Celery
> > Flower for monitoring.
> > I may have misunderstood something, but it seems that to stop those
> > messages I should, for example, set some Celery settings to make the
> > unused queues expire.
> > What would be the right way to deal with this? I'm really not sure which
> > config I should touch. Any ideas are welcome, and if I need to provide
> > more info about the configuration, please suggest which.
> >
> > Thank you in advance,
> > Best regards, Dima
> >
> > --
> >
> > Dmitry Smirnov (MSc.)
> > Data Engineer @ Yousician
> > mobile: +358 50 3015072
> >
>
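
For reference, Georg's suggestion above boils down to a small airflow.cfg change: keep RabbitMQ as the broker and point the result backend elsewhere. A sketch, assuming hypothetical hostnames, ports, and credentials (none of these values come from the thread):

```ini
# airflow.cfg -- sketch only; hosts and credentials are placeholders.
[celery]
# Keep RabbitMQ as the message broker...
broker_url = amqp://guest:guest@localhost:5672//
# ...but move result storage off RabbitMQ, e.g. onto Redis,
# so results and Flower's event traffic stop sharing queues.
celery_result_backend = redis://localhost:6379/0
```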



-- 

Dmitry Smirnov (MSc.)
Data Engineer @ Yousician
mobile: +358 50 3015072
