GitHub user vba added a comment to the discussion: Unable to see logs in the 
web UI when the job is running

Hi @potiuk, 

> Can you please explain what executor and what log volume configuration you 
> have?

As shown in the stack trace in the middle of my issue, the executor is 
`kubernetes_executor`. As for the volume used for logging, here is the 
configuration (also shown in the issue):

```ini
base_log_folder = /opt/airflow/logs
remote_logging = True
remote_log_conn_id = s3_airflow_logs
delete_local_logs = False
google_key_path = 
remote_base_log_folder = s3://the-bucket
```

Here is how the log volume is mounted in the k8s cluster:

```yml
# ...
spec:
  containers:
  - volumeMounts:
    - mountPath: /opt/airflow/logs
      name: logs
# ...
  volumes:
  - emptyDir:
      sizeLimit: 10Gi
    name: logs
```

> I believe it might have something to do with the volume you are using to 
> store the logs. This looks very much like the volume does not allow 
> concurrently writing and reading files. I think it would be great if you 
> could check that and see what type of volume you have there.

I don't think my problem is purely a configuration issue: if I downgrade my 
Airflow instance to version `2.4.3`, everything works fine on the same k8s 
infrastructure. A quick write/read test on the log volume reveals no problem:
```shell
echo "my test" > /opt/airflow/logs/log.txt && cat /opt/airflow/logs/log.txt
...
my test
```
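To also exercise the concurrent-access scenario you suspect (one process writing a log file while another reads it, as a running task and the webserver would), here is a quick sketch; the file name is a placeholder for illustration, and the directory defaults to a temp dir so it can run anywhere:

```shell
# Concurrent write/read check for the log volume: a background reader
# follows the file while the foreground writer appends lines to it.
# On the cluster, set LOGDIR=/opt/airflow/logs before running.
LOGDIR="${LOGDIR:-$(mktemp -d)}"
LOGFILE="$LOGDIR/concurrent_test.txt"
: > "$LOGFILE"

# Reader: follow the file in the background, like the log-serving process.
tail -f "$LOGFILE" & READER=$!

# Writer: append five lines with a short delay, like a running task.
for i in 1 2 3 4 5; do
  echo "line $i" >> "$LOGFILE"
  sleep 0.2
done

kill "$READER"
wc -l < "$LOGFILE"   # expect 5: all writes landed while the reader was attached
```

If the `tail` output interleaves with the writes and the final line count is 5, the volume handles concurrent readers and writers, which would rule out the emptyDir as the culprit.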



GitHub link: 
https://github.com/apache/airflow/discussions/45624#discussioncomment-11823672
