victorjourne commented on issue #28380:
URL: https://github.com/apache/airflow/issues/28380#issuecomment-1356891596

After testing many Celery backend configurations, the solution I found is the combination of:
- Forcing Celery **not to store the task return value** by setting the environment variable `CELERY_IGNORE_RESULT='True'` ([docs](https://docs.celeryq.dev/en/stable/userguide/configuration.html#std-setting-task_ignore_result)); see the sketch after this list.
 
  Fortunately, Airflow stores the task result through XCom, so it can still be processed further.
- Using the bug fix from @potiuk: #28283
   
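For reference, here is a minimal sketch of the plain-Celery option that `CELERY_IGNORE_RESULT` corresponds to (assuming a standalone Celery app rather than Airflow's bundled configuration; the broker URL is a placeholder):

```python
# Minimal sketch, assuming a standalone Celery app; the broker URL is a placeholder.
from celery import Celery

app = Celery("worker", broker="redis://localhost:6379/0")

# task_ignore_result is the documented Celery option: task return values are
# simply not written to the result backend.
app.conf.task_ignore_result = True

@app.task
def io_task(url):
    # with task_ignore_result set, this return value is never stored by Celery
    return url
```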
However, this seems inconsistent with this Airflow [diagram](https://airflow.apache.org/docs/apache-airflow/stable/executor/celery.html#task-execution-process): according to it, the task status in the metadata database should be updated from the result backend table, which is now empty. I would dig deeper into the code if I had the time.
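If someone wants to verify this locally, a quick sketch (assuming a SQL database result backend, where Celery writes to its default `celery_taskmeta` table; the connection URI is hypothetical):

```python
# Quick check, assuming a SQL database result backend; adapt the URI to your deployment.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://airflow:airflow@localhost/airflow")  # placeholder URI
with engine.connect() as conn:
    count = conn.execute(text("SELECT count(*) FROM celery_taskmeta")).scalar()
    print(f"rows in celery_taskmeta: {count}")  # stays at 0 with CELERY_IGNORE_RESULT='True'
```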
   
In any case, the whole green-thread issue is related to the way Airflow calls the **result backend**: something prevents the Celery workers from stopping. I should investigate this more, but I am quite astonished to be the first user to run into it, since concurrently calling **IO tasks** with green threads is a fairly common pattern. To achieve that, do you use `CeleryExecutor`, or the `LocalExecutor`?
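For context, this is the kind of pattern I mean (an illustrative, non-Airflow sketch; the URLs are placeholders): many IO-bound calls handled concurrently by green threads.

```python
# Illustrative sketch, not Airflow-specific: green threads (gevent) running
# many IO-bound requests concurrently. The URLs are placeholders.
from gevent import monkey

monkey.patch_all()  # make blocking IO cooperative

import gevent
import requests  # imported after patching so its sockets are cooperative

def fetch(url):
    return requests.get(url, timeout=10).status_code

urls = ["https://example.com"] * 20
jobs = [gevent.spawn(fetch, u) for u in urls]
gevent.joinall(jobs)
print([job.value for job in jobs])
```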
   
   

