ian-byrne edited a comment on issue #16025: URL: https://github.com/apache/airflow/issues/16025#issuecomment-888489320
I am having a similar issue where all Celery tasks are marked as state=FAILURE. All of the tasks show success in the Airflow UI and appear to have worked properly, moving data to the database. The Celery error found in Flower is as follows:

```
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/celery/app/trace.py", line 412, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/celery/app/trace.py", line 704, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 88, in execute_command
    _execute_in_fork(command_to_exec)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 99, in _execute_in_fork
    raise AirflowException('Celery command failed on host: ' + get_hostname())
airflow.exceptions.AirflowException: Celery command failed on host: 380778ca99d1
```

EDIT: after poking around a bit longer, it looks like the CloudWatch logging integration could be the issue, see #13824.
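For context on why Celery can show FAILURE while the Airflow UI shows success: the traceback above points at `_execute_in_fork` in the Celery executor, which runs the task command in a forked child and raises `AirflowException` whenever that child exits non-zero. The sketch below is a simplified, hypothetical illustration of that mechanism (the function name `_execute_in_fork_sketch` and the exact control flow are my own, not the actual Airflow source); it shows how a problem during child shutdown, such as a remote-logging handler like CloudWatch failing to flush, can fail the Celery task even after the task instance was marked success.

```python
import os
import subprocess

from airflow.exceptions import AirflowException
from airflow.utils.net import get_hostname


def _execute_in_fork_sketch(command_to_exec):
    """Simplified sketch (not the actual Airflow implementation) of how the
    Celery executor runs a task command in a forked child and maps the child's
    exit status onto the Celery task result."""
    pid = os.fork()
    if pid == 0:
        # Child process: run the `airflow tasks run ...` command. Anything that
        # makes this process exit non-zero -- including an exception raised
        # while tearing down a remote logging handler -- counts as a failure,
        # even if the task instance state was already set to success.
        ret = subprocess.call(command_to_exec)
        os._exit(ret)

    # Parent process (the Celery worker): a non-zero wait status from the child
    # is what triggers the AirflowException seen in the Flower traceback, so
    # Celery records the task as FAILURE regardless of the Airflow UI state.
    _, status = os.waitpid(pid, 0)
    if status != 0:
        raise AirflowException("Celery command failed on host: " + get_hostname())
```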