mrybas opened a new issue, #40861:
URL: https://github.com/apache/airflow/issues/40861
### Official Helm Chart version
1.14.0 (latest released)
### Apache Airflow version
v2.9.3
### Kubernetes Version
v1.29.5
### Helm Chart configuration
```
executor: "CeleryKubernetesExecutor"
config:
  logging:
    remote_logging: "True"
    remote_base_log_folder: s3://path/to/logs/
    remote_log_conn_id: ceph_default
    encrypt_s3_logs: 'False'
logs:
  persistence:
    enabled: true
    size: 10Gi
```
### Docker Image customizations
_No response_
### What happened
When using CeleryKubernetesExecutor, logs for tasks routed to the Kubernetes queue are not preserved after the worker pod is deleted. They are written neither to the persistent volume configured with
```
logs:
  persistence:
    enabled: true
    size: 10Gi
```
nor to the S3 remote log store configured with
```
config:
  logging:
    remote_logging: "True"
    remote_base_log_folder: s3://path/to/logs/
    remote_log_conn_id: ceph_default
    encrypt_s3_logs: 'False'
```
At the same time, tasks executed by the Celery executor save their logs to both persistent storage and S3.
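To check what the Kubernetes-queue worker actually resolves, a small diagnostic callable can log the effective `[logging]` options from inside the task (a sketch using the standard `airflow.configuration.conf` API; the helper name is only for illustration):
```
import logging

from airflow.configuration import conf


def log_effective_logging_config():
    # Log the [logging] options the worker process actually resolved,
    # to see whether remote logging is enabled inside the spawned pod.
    for key in ("remote_logging", "remote_base_log_folder", "remote_log_conn_id"):
        logging.info("%s = %s", key, conf.get("logging", key))
```
Running this callable once on the `kubernetes` queue and once on the default Celery queue would show whether the two worker types end up with different settings.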
### What you think should happen instead
After the worker pod is deleted, the task logs should still be available in the S3 remote log store.
### How to reproduce
Deploy Airflow from the Helm chart with
```
executor: "CeleryKubernetesExecutor"
config:
  logging:
    remote_logging: "True"
    remote_base_log_folder: s3://path/to/logs/
    remote_log_conn_id: ceph_default
    encrypt_s3_logs: 'False'
```
and run the following DAG:
```
import datetime
import logging
from time import sleep

import airflow
from airflow.operators.python import PythonOperator

with airflow.DAG(
    "sample_celery_kubernetes",
    start_date=datetime.datetime(2022, 1, 1),
    schedule_interval=None,
) as dag:

    def kubernetes_example():
        logging.info("This task runs using KubernetesExecutor")
        sleep(10)
        logging.info("Task completed")

    def celery_example():
        logging.info("This task runs using CeleryExecutor")
        sleep(10)
        logging.info("Task completed")

    # To run with KubernetesExecutor, set queue to kubernetes
    task_kubernetes = PythonOperator(
        task_id="task-kubernetes",
        python_callable=kubernetes_example,
        dag=dag,
        queue="kubernetes",
    )

    # To run with CeleryExecutor, omit the queue argument
    task_celery = PythonOperator(
        task_id="task-celery", python_callable=celery_example, dag=dag
    )

    _ = task_kubernetes >> task_celery
```
The task routed to the Kubernetes queue reports `*** No logs found on s3 for `, while the Celery task reports `*** Found logs in s3:`.
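To confirm what actually lands in the bucket, the remote log prefix can be listed through the same connection (a sketch; the bucket name and prefix below are placeholders matching the redacted `s3://path/to/logs/` value):
```
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

# Same connection as remote_log_conn_id; bucket/prefix are placeholders.
hook = S3Hook(aws_conn_id="ceph_default")
print(hook.list_keys(bucket_name="path", prefix="to/logs/"))
```
Consistent with the messages above, keys would be expected only for the Celery task, not for the task that ran on the `kubernetes` queue.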
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)