Norio Akagi created AIRFLOW-1979:
------------------------------------
Summary: Redis celery backend does not work on 1.9.0 (configuration is
ignored)
Key: AIRFLOW-1979
URL: https://issues.apache.org/jira/browse/AIRFLOW-1979
Project: Apache Airflow
Issue Type: Bug
Components: celery, worker
Affects Versions: 1.9.0
Reporter: Norio Akagi
The worker tries to connect to RabbitMQ based on the default settings and
shows the error below:
{noformat}
[2018-01-09 16:45:42,778] {driver.py:120} INFO - Generating grammar tables from
/usr/lib/python2.7/lib2to3/Grammar.txt
[2018-01-09 16:45:42,802] {driver.py:120} INFO - Generating grammar tables from
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
[2018-01-09 16:45:43,051] {configuration.py:206} WARNING - section/key
[celery/celery_ssl_active] not found in config
[2018-01-09 16:45:43,051] {default_celery.py:41} WARNING - Celery Executor will
run without SSL
[2018-01-09 16:45:43,052] {__init__.py:45} INFO - Using executor CeleryExecutor
[2018-01-09 16:45:43,140: WARNING/MainProcess]
/usr/local/lib/python2.7/dist-packages/celery/apps/worker.py:161:
CDeprecationWarning:
Starting from version 3.2 Celery will refuse to accept pickle by default.
The pickle serializer is a security concern as it may give attackers
the ability to execute any command. It's important to secure
your broker from unauthorized access when using pickle, so we think
that enabling pickle should require a deliberate action and not be
the default choice.
If you depend on pickle then you should set a setting to disable this
warning and to be sure that everything will continue working
when you upgrade to Celery 3.2::
CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
You must only enable the serializers that you will actually use.
warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))
[2018-01-09 16:45:43,240: ERROR/MainProcess] consumer: Cannot connect to
amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
Trying again in 2.00 seconds...
{noformat}
I deploy Airflow on Kubernetes, so each component (web, scheduler, worker, and
flower) is containerized and distributed among nodes. I set
{{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}}
and {{AIRFLOW__CELERY__BROKER_URL}} as environment variables, and they are
visible when I run {{printenv}} in a container, but they appear to be
completely ignored. Moving these values to {{airflow.cfg}} doesn't work either.
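For reference, the variables are set in the worker container roughly like this (the Redis host, port, and database numbers are illustrative placeholders, not my actual values):
{noformat}
# Illustrative values only; the real Redis endpoint differs in my cluster.
export AIRFLOW__CELERY__BROKER_URL="redis://redis-host:6379/0"
export AIRFLOW__CELERY__CELERY_RESULT_BACKEND="redis://redis-host:6379/1"
{noformat}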
It worked perfectly on 1.8 and suddenly stopped working when I upgraded
Airflow to 1.9.
Do you have any idea what may cause this configuration issue?
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)