vba commented on issue #45516:
URL: https://github.com/apache/airflow/issues/45516#issuecomment-2587306687
Hi @potiuk,
> Can you please explain what executor and what log volume configuration you have?
As you can see in the stack trace in the middle of my issue, the executor is
`kubernetes_executor`. As for the volume used for logging, here is the
configuration (as shown in the issue):
```ini
base_log_folder = /opt/airflow/logs
remote_logging = True
remote_log_conn_id = s3_airflow_logs
delete_local_logs = False
google_key_path =
remote_base_log_folder = s3://the-bucket
```
Here is how the logs volume is mounted in the k8s cluster:
```yml
# ...
spec:
  containers:
    - volumeMounts:
        - mountPath: /opt/airflow/logs
          name: logs
  # ...
  volumes:
    - emptyDir:
        sizeLimit: 10Gi
      name: logs
```
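In case it helps reproduce, here is a sketch of the same `[logging]` settings expressed as environment variables in the pod spec, using Airflow's `AIRFLOW__SECTION__KEY` override convention (values copied from the ini config above; the fragment itself is hypothetical, not part of my actual manifest):

```yml
# Hypothetical container env fragment; equivalent to the [logging] ini above.
env:
  - name: AIRFLOW__LOGGING__BASE_LOG_FOLDER
    value: /opt/airflow/logs
  - name: AIRFLOW__LOGGING__REMOTE_LOGGING
    value: "True"
  - name: AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID
    value: s3_airflow_logs
  - name: AIRFLOW__LOGGING__DELETE_LOCAL_LOGS
    value: "False"
  - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
    value: s3://the-bucket
```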
> I believe it might have something to do with the volume you are using
to store the logs. This looks very much like the volume does not allow files
to be written and read concurrently. I think it would be great if you could
check that and see what type of volume you have there.
I don't think my problem is purely a configuration issue. If I downgrade my
Airflow instance to version `2.4.3`, everything works fine on the same k8s
infrastructure. A quick test inside the container reveals no difficulty:
```shell
echo "my test" > /opt/airflow/logs/log.txt && cat /opt/airflow/logs/log.txt
...
my test
```
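The sequential `echo`/`cat` above does not exercise concurrent access, though. A closer probe of the "concurrent write and read" hypothesis would be a sketch like the following, which appends lines from one thread while another tails the same file, roughly mimicking a task writing logs while they are read back (the function and file names are illustrative, not from Airflow itself):

```python
import os
import tempfile
import threading


def concurrent_rw_check(path, n_lines=1000):
    """Append lines from one thread while another tails the same file.

    Returns the number of complete lines observed by the reader; on a
    volume that supports concurrent read/write this should equal n_lines.
    """
    open(path, "a").close()  # make sure the file exists before tailing
    done = threading.Event()
    seen = []

    def writer():
        with open(path, "a") as f:
            for i in range(n_lines):
                f.write(f"line {i}\n")
                f.flush()  # push each line to the OS as it is written
        done.set()

    def reader():
        with open(path, "r") as f:
            while True:
                line = f.readline()
                if line:
                    seen.append(line)
                elif done.is_set():
                    # drain anything written between the empty read and done
                    seen.extend(f.readlines())
                    break

    t_w = threading.Thread(target=writer)
    t_r = threading.Thread(target=reader)
    t_r.start()
    t_w.start()
    t_w.join()
    t_r.join()
    # Count newlines rather than list entries, in case a read ever
    # catches a partially written line.
    return "".join(seen).count("\n")


if __name__ == "__main__":
    # Point `path` at a file on the volume under test, e.g. something
    # like /opt/airflow/logs/concurrency_probe.log inside the pod.
    with tempfile.TemporaryDirectory() as d:
        n = concurrent_rw_check(os.path.join(d, "probe.log"))
        print(f"read {n} of 1000 lines while they were being written")
```

Running it against the `emptyDir`-backed mount and seeing all lines read back would support my point that the volume itself handles concurrent access fine.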
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]