kuikeelc opened a new issue, #30991:
URL: https://github.com/apache/airflow/issues/30991
### Apache Airflow version
2.6.0
### What happened
Hi, since Airflow version 2.3.2 we have been unable to read the remote logs stored on Google Cloud Storage from the Apache Airflow UI. The message shown in v2.6.0 is the following:
`*** No logs found in GCS; ti=%s <TaskInstance: mydag.my_task
manual__2023-05-01T12:25:55.937029+00:00 [success]>`
We do see the log files on Google Cloud Storage, though, and they are in the
expected location.
The logging configuration is the following:
```
[logging]
base_log_folder = /home/airflow/logs
remote_logging = True
remote_base_log_folder = gs://{{ .Values.airflow.logs.bucket }}
colored_console_log = True
colored_log_format = [%%(blue)s%%(asctime)s%%(reset)s] %%(log_color)s%%(levelname)s%%(reset)s - %%(log_color)s%%(message)s%%(reset)s
colored_formatter_class = airflow.utils.log.colored_log.CustomTTYColoredFormatter
log_format = [%%(asctime)s] %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s
log_filename_template = dag_id={{ "{{" }} ti.dag_id {{ "}}" }}/run_id={{ "{{" }} ti.run_id {{ "}}" }}/task_id={{ "{{" }} ti.task_id {{ "}}" }}/{%% if ti.map_index >= 0 %%}map_index={{ "{{" }} ti.map_index {{ "}}" }}/{%% endif %%}attempt={{ "{{" }} ti.try_number {{ "}}" }}.log
log_processor_filename_template = {{ .Values.airflow.logs.log_processor_filename_template }}
dag_processor_manager_log_location = /home/airflow/logs/dag_processor_manager/dag_processor_manager.log
```
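To make the expected layout concrete: if I read the `log_filename_template` above correctly, the remote object path is the template rendered per task instance, joined onto `remote_base_log_folder`. A minimal sketch of that path construction (the bucket name and all task identifiers below are hypothetical examples, not taken from our deployment):

```python
# Sketch: mirror the Jinja log_filename_template above in plain Python to
# predict where a task's log object should live in the GCS bucket.
def expected_log_path(base: str, dag_id: str, run_id: str,
                      task_id: str, try_number: int,
                      map_index: int = -1) -> str:
    """dag_id=…/run_id=…/task_id=…/[map_index=…/]attempt=….log"""
    parts = [f"dag_id={dag_id}", f"run_id={run_id}", f"task_id={task_id}"]
    if map_index >= 0:  # mapped tasks get an extra directory level
        parts.append(f"map_index={map_index}")
    return f"{base}/" + "/".join(parts) + f"/attempt={try_number}.log"

# Hypothetical bucket and identifiers, for illustration only.
print(expected_log_path(
    "gs://my-logs-bucket", "mydag",
    "manual__2023-05-01T12:25:55.937029+00:00", "my_task", 1))
# → gs://my-logs-bucket/dag_id=mydag/run_id=manual__2023-05-01T12:25:55.937029+00:00/task_id=my_task/attempt=1.log
```

An object at exactly this path does exist in our bucket, which is why the "No logs found in GCS" message is surprising.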
We changed some of the logging settings to see whether one of them caused the
bug, but to no avail.
I reported this issue earlier, but it was never resolved; the bug remains.
### What you think should happen instead
When we log in to the Airflow webserver, open a task run, and click the log
button, we expect to see the log file. However, we can ONLY see the logs while
the task is running, not once it is in e.g. the success or failed state.
### How to reproduce
Set `remote_logging = True` with a `gs://` remote base folder so logs are uploaded to Google Cloud Storage, run a task to completion, then try to view its logs in the webserver.
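A minimal, untemplated version of the relevant settings that should reproduce it (the bucket name is a hypothetical placeholder; `remote_log_conn_id` is shown with its default value):

```
[logging]
base_log_folder = /home/airflow/logs
remote_logging = True
remote_base_log_folder = gs://my-logs-bucket
remote_log_conn_id = google_cloud_default
```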
### Operating System
Linux, but this is a Python issue
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==8.0.0
apache-airflow-providers-celery==3.1.0
apache-airflow-providers-cncf-kubernetes==6.1.0
apache-airflow-providers-common-sql==1.4.0
apache-airflow-providers-docker==3.6.0
apache-airflow-providers-elasticsearch==4.4.0
apache-airflow-providers-ftp==3.3.1
apache-airflow-providers-google==10.0.0
apache-airflow-providers-grpc==3.1.0
apache-airflow-providers-hashicorp==3.3.1
apache-airflow-providers-http==4.3.0
apache-airflow-providers-imap==3.1.1
apache-airflow-providers-microsoft-azure==6.0.0
apache-airflow-providers-mysql==5.0.0
apache-airflow-providers-odbc==3.2.1
# Editable install with no version control
(apache-airflow-providers-onesecondbefore==71.0.0)
-e
/home/airflow/logs/.hidden/providers/apache-airflow-providers-onesecondbefore
apache-airflow-providers-postgres==5.4.0
apache-airflow-providers-redis==3.1.0
apache-airflow-providers-sendgrid==3.1.0
apache-airflow-providers-sftp==4.2.4
apache-airflow-providers-slack==7.2.0
apache-airflow-providers-snowflake==4.0.5
apache-airflow-providers-sqlite==3.3.2
apache-airflow-providers-ssh==3.6.0
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
docker, helm, k8s
### Anything else
It always occurs.
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)