Thanks anyway!

And no, I am not deploying the setup to AWS EC2. Currently I am just running the
Docker containers locally.

Best,
Sophie



________________________________
From: Aaron Grubb <[email protected]>
Sent: Tuesday, April 7, 2020 4:59 PM
To: [email protected] <[email protected]>
Subject: RE: Tasks not running with celery gevent/eventlet worker


That solved the issue for me so I didn’t pursue it beyond bringing it up in 
Slack. Just out of curiosity, are you trying to deploy this setup on AWS EC2? I 
had some suspicions that this issue might have been specific to that hosting 
solution.



From: Sophie Herrmann <[email protected]>
Sent: Tuesday, April 7, 2020 10:27 AM
To: [email protected]
Subject: Re: Tasks not running with celery gevent/eventlet worker



Hi,



Thanks for the quick answer! Sadly that is not an option in our setup.



It seems to me that Airflow 1.10.9 does not properly create the gevent celery
worker. When I check the workers in Flower's Web UI (localhost:5555) I see the
worker, but when I click on the worker's name I just get "Unknown worker
'celery@dfa2e29c0b96'".



Did you find any other workaround that does not require the celery CLI? Do you
know if there is an open issue for this, or whether it will be fixed in Airflow
2.0?



Cheers,

Sophie

________________________________

From: Aaron Grubb <[email protected]>
Sent: Tuesday, April 7, 2020 3:05 PM
To: [email protected] <[email protected]>
Subject: RE: Tasks not running with celery gevent/eventlet worker



I think this is the same issue I encountered. Try running the celery workers 
manually (i.e. using the celery command to launch them).
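For illustration, a manual launch along those lines might look like the sketch below. This is only an assumption of the intended command: the queue name, concurrency, and log level are placeholders to adapt to your setup; the app module path is Airflow 1.10's bundled Celery app.

```shell
# Sketch: start a Celery worker with the gevent pool directly via the
# celery CLI instead of "airflow worker". Queue ("sensor") and
# concurrency (16) are illustrative placeholders.
celery worker \
  --app airflow.executors.celery_executor \
  --pool gevent \
  --concurrency 16 \
  --queues sensor \
  --loglevel INFO
```

One plausible reason a manual launch behaves differently: the celery CLI entrypoint checks for `-P gevent`/`-P eventlet` and applies the gevent monkey patching before the application is imported, which a wrapper command may do too late or not at all.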



From: Sophie Herrmann <[email protected]>
Sent: Tuesday, April 7, 2020 8:21 AM
To: [email protected]
Subject: Tasks not running with celery gevent/eventlet worker



Hi everyone,



I am using Airflow 1.10.9 with the CeleryExecutor (Docker image based on
puckel/docker-airflow) and I am trying to switch one worker from the prefork
pool to a gevent/eventlet pool.
With prefork everything works as expected, but when I use a gevent/eventlet
worker instead, tasks do not seem to be picked up.



I checked the logs: the scheduler queues the task and sends it to the worker.
The worker then receives the task, but after that nothing else happens
(execution never starts).

Below are the last log lines I get from the worker:



[2020-04-07 12:13:13,376: INFO/MainProcess] Received task: 
airflow.executors.celery_executor.execute_command[32ba869c-80b8-47b5-8eca-aadd64c7fd7c]

[2020-04-07 12:13:13,614: INFO/MainProcess] Scaling up 1 processes.

[2020-04-07 12:13:13,630: DEBUG/MainProcess] TaskPool: Apply <function 
_fast_trace_task at 0x7fb8af20fdd0> 
(args:('airflow.executors.celery_executor.execute_command', 
'32ba869c-80b8-47b5-8eca-aadd64c7fd7c', {'lang': 'py', 'task': 
'airflow.executors.celery_executor.execute_command', 'id': 
'32ba869c-80b8-47b5-8eca-aadd64c7fd7c', 'shadow': None, 'eta': None, 'expires': 
None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 
'32ba869c-80b8-47b5-8eca-aadd64c7fd7c', 'parent_id': None, 'argsrepr': 
"[['airflow', 'run', 'jb-10baee25-8b57-46d1-bb9b-75380504df33', 
'cancel_sensor', '2020-04-07T12:13:09.378481+00:00', '--local', '--pool', 
'default_pool', '-sd', 
'/usr/local/airflow/dags/dag_jb-10baee25-8b57-46d1-bb9b-75380504df33.py']]", 
'kwargsrepr': '{}', 'origin': 'gen76@f2822480e988', 'reply_to': 
'e07ad7c8-e252-3533-a528-11b3eb712a22', 'correlation_id': 
'32ba869c-80b8-47b5-8eca-aadd64c7fd7c', 'hostname': 'celery@9ede4f9fb1d1', 
'delivery_info': {'exchange': '', 'routing_key': 'sensor', 'priority': 0, 
'redelivered': None}, 'args': [['airflow', 'run', 
'jb-10baee25-8b57-46d1-bb9b-75380504df33', 'cancel_sensor',... kwargs:{})



In the Airflow UI the task stays in the "queued" state and never changes.



Any idea what could be the problem?



Best,

Sophie



