tomerav0 opened a new issue #17956:
URL: https://github.com/apache/airflow/issues/17956
### Apache Airflow version
2.1.1
### Operating System
Container-Optimized OS
### Versions of Apache Airflow Providers
Packages installed via pip:
google-cloud-bigquery
pandas
gcsfs
google-cloud-storage
google-cloud-logging
google-cloud-firestore
google-oauth2-tool
google-auth
psutil
boto3
### Deployment
Composer
### Deployment details
Image version:
composer-1.17.0-preview.10-airflow-2.1.1
configs:
worker_concurrency: 12
operation_timeout: 60
dag_dir_list_interval: 60
scheduler_zombie_task_threshold: 86400
dagbag_import_timeout: 60000
dags_are_paused_at_creation: True
dag_file_processor_timeout: 60000
auth_backend: airflow.api.auth.backend.default
enable_experimental_api: True
Our system generates DAGs dynamically, with 780 active DAGs running at
different times (not all together).
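For context, a minimal sketch of the dynamic-DAG pattern described above, at the same scale (~780 DAGs built from a config list). All names here (`make_dag`, `configs`, the DAG ids) are assumptions, not the reporter's actual code, and a small stub stands in for `airflow.DAG` so the sketch is self-contained:

```python
# Hypothetical sketch of dynamic DAG generation (~780 DAGs).
# The `DAG` class below is a stand-in for airflow.DAG so the
# sketch runs without an Airflow installation.

class DAG:
    def __init__(self, dag_id, schedule_interval=None,
                 is_paused_upon_creation=True):
        self.dag_id = dag_id
        self.schedule_interval = schedule_interval
        self.is_paused_upon_creation = is_paused_upon_creation


def make_dag(dag_id, schedule):
    # Factory returning one DAG; dags_are_paused_at_creation=True
    # in the deployment config, mirrored here.
    return DAG(dag_id, schedule_interval=schedule,
               is_paused_upon_creation=True)


# Each entry would normally come from an external config source.
configs = [(f"dynamic_dag_{i}", "@hourly") for i in range(780)]

for dag_id, schedule in configs:
    # Registering each DAG object in globals() is how Airflow's
    # DagBag discovers dynamically created DAGs in a module.
    globals()[dag_id] = make_dag(dag_id, schedule)
```

With this pattern, the scheduler parses one file and discovers all 780 DAGs, so parsing and worker load scales with the size of the config list.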
### What happened
The DAGs fail when Celery fails with the error:
```
{celery_executor.py:120} ERROR - Failed to execute task maximum recursion depth exceeded.
```
```
ERROR - Task airflow.executors.celery_executor.execute_command[c1065b14-8105-4d0d-9e69-32621f4ee534]
raised unexpected: AirflowException('Celery command failed on host: airflow-worker-6bfd475c6d-pkvls')
```
```
Traceback (most recent call last):
  File "/opt/python3.8/lib/python3.8/logging/__init__.py", line 1159, in close
    stream.close()
OSError: [Errno 125] Operation canceled
```
```
{celery_executor.py:120} ERROR - Failed to execute task [Errno 2] No such file or directory: '/tmp/tmp_xp_yfjs'
```
<img width="1038" alt="Screenshot 2021-09-01 at 13 04 39"
src="https://user-images.githubusercontent.com/5702095/131653375-f8c192e9-ebef-4c43-a844-33d008e995da.png">
<img width="1047" alt="Screenshot 2021-09-01 at 13 08 10"
src="https://user-images.githubusercontent.com/5702095/131653383-e83d2c64-5a9e-4f71-87e2-556835ad69b3.png">
<img width="1047" alt="Screenshot 2021-09-01 at 13 09 01"
src="https://user-images.githubusercontent.com/5702095/131653476-d02c004d-7ba7-4e61-a60a-a250e4754a91.png">
### What you expected to happen
Celery not to crash.
### How to reproduce
It happens when there are a lot of DAGs running. I am not sure about the
exact number, but the more DAGs there are, the more often it fails.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)