dkondidatov opened a new issue, #58771:
URL: https://github.com/apache/airflow/issues/58771

   ### Apache Airflow version
   
   3.1.3
   
   ### If "Other Airflow 2/3 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   Hi! We are setting up Airflow 3+ in our environment and have run into significant issues with the log display in the Airflow Web UI. Because of problems with the Elasticsearch integration and changes in the log format, we decided to store our logs in an S3 bucket. After finishing the configuration and getting our logs stored in S3, we ran into problems accessing the logs through the Web UI.
   
   What happened?
   - the DAG started and successfully finished its tasks
   - the logs were uploaded to the S3 bucket
   - in the Web UI I try to view the logs, but nothing is displayed when **All Sources** is selected
   <img width="1970" height="1258" alt="Image" src="https://github.com/user-attachments/assets/6f326a91-abb5-438c-9554-d9937f64901b" />
   
   - if I manually pick any source except **airflow.sdk.bases.hooks**, the logs are shown
   <img width="1985" height="1266" alt="Image" src="https://github.com/user-attachments/assets/274cfcec-213a-4d11-b77f-ff30b49621c1" />
   
   - if I add **airflow.sdk.bases.hooks** back, the display shows nothing again
   <img width="1971" height="1247" alt="Image" src="https://github.com/user-attachments/assets/8ce1cb23-0b54-4a39-9c38-92cef829cdb7" />
   
   Also, I want to report two additional problems:
   - scrolling in the log display window feels off: log lines disappear in the middle of the window while scrolling.
   - the log message below should be emitted at the WARNING level, not DEBUG. It was difficult to find out why the pod was stuck at the S3 connection step.
   ```log
   {"timestamp":"2025-11-27T10:16:37.079377Z","level":"debug","event":"Missing endpoint_url in extra config of AWS Connection with id CENSORED. Using default AWS service 
   ```
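
   As a stopgap, a targeted logger override could surface this message without enabling DEBUG globally. A minimal sketch, assuming the message is emitted by the Amazon provider's base AWS hook module (the logger name is a guess and may need adjusting):

```python
import logging

# Assumption: the "Missing endpoint_url ..." message comes from the
# Amazon provider's base AWS hook; adjust the logger name if needed.
aws_logger = logging.getLogger("airflow.providers.amazon.aws.hooks.base_aws")

# Surface DEBUG records from this one logger only, instead of
# setting logging_level = DEBUG for the whole deployment.
aws_logger.setLevel(logging.DEBUG)
```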
   
   ### What you think should happen instead?
   
   Logs must be available without any manual source selection, and log scrolling must be fixed so that lines do not disappear.
   
   ### How to reproduce
   
   - start an airflow:3.1.3-python3.10 cluster with the apache-airflow Helm chart
   - configure remote logging to S3
   - configure the connection with the `"extra": {"config_kwargs": {"s3": {"addressing_style": "virtual"}}}` parameter
   - run a DAG and wait for its logs to be uploaded to S3
   - try to access the task logs in the Airflow Web UI
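
   For reference, the Extra field of the connection must be a single JSON object, so the `config_kwargs` key has to be wrapped in braces. A minimal sketch that builds and validates the payload:

```python
import json

# The whole Extra value is one JSON object wrapping config_kwargs.
extra = {"config_kwargs": {"s3": {"addressing_style": "virtual"}}}

# Serialized form to paste into the connection's Extra field in the UI
# (or to pass to `airflow connections add ... --conn-extra`).
extra_json = json.dumps(extra)
print(extra_json)
```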
   
   
   ### Operating System
   
   The airflow:3.1.3-python3.10 image from Docker Hub was used
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==9.16.0
   apache-airflow-providers-celery==3.13.0
   apache-airflow-providers-cncf-kubernetes==10.9.0
   apache-airflow-providers-common-compat==1.8.0
   apache-airflow-providers-common-io==1.6.4
   apache-airflow-providers-common-messaging==2.0.0
   apache-airflow-providers-common-sql==1.28.2
   apache-airflow-providers-daskexecutor==1.1.1
   apache-airflow-providers-docker==4.4.4
   apache-airflow-providers-elasticsearch==6.3.4
   apache-airflow-providers-fab==3.0.1
   apache-airflow-providers-ftp==3.13.2
   apache-airflow-providers-git==0.0.9
   apache-airflow-providers-google==18.1.0
   apache-airflow-providers-grpc==3.8.2
   apache-airflow-providers-hashicorp==4.3.3
   apache-airflow-providers-http==5.4.0
   apache-airflow-providers-imap==3.9.3
   apache-airflow-providers-microsoft-azure==12.8.0
   apache-airflow-providers-mysql==6.3.4
   apache-airflow-providers-odbc==4.10.2
   apache-airflow-providers-openlineage==2.7.3
   apache-airflow-providers-postgres==6.4.0
   apache-airflow-providers-presto==5.9.3
   apache-airflow-providers-redis==4.3.2
   apache-airflow-providers-sendgrid==4.1.4
   apache-airflow-providers-sftp==5.4.1
   apache-airflow-providers-slack==9.4.0
   apache-airflow-providers-smtp==2.3.1
   apache-airflow-providers-snowflake==6.6.0
   apache-airflow-providers-sqlite==4.1.2
   apache-airflow-providers-ssh==4.1.5
   apache-airflow-providers-standard==1.9.1
   apache-airflow-providers-yandex==4.2.0
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   KubernetesExecutor with the default pod template was used
   
   Custom airflow.cfg parameters:
   
   ```yaml
   config:
     core:
       dag_run_conf_overrides_params: "False"
       test_connection: "Enabled"
       dagbag_import_timeout: "20"
       dag_file_processor_timeout: "30"
       dag_ignore_file_syntax: "glob"
       remote_logging: "True"
     scheduler:
       max_dagruns_per_loop_to_schedule: "30"
       scheduler_health_check_threshold: "60"
       parsing_processes: "2"
       standalone_dag_processor: "True"
     kubernetes: 
       worker_pods_creation_batch_size: "120"
     secrets:
       backend: "airflow.providers.hashicorp.secrets.vault.VaultBackend"
       backend_kwargs: '{"url": "CENSORED", "auth_mount_point": "CENSORED", 
"auth_type": "kubernetes", "mount_point": "CENSORED", "kubernetes_role": 
"airflow", "kubernetes_jwt_path": 
"/var/run/secrets/kubernetes.io/serviceaccount/token", "connections_path": 
"app-airflow-ml/stage/connections", "variables_path": 
"app-airflow-ml/stage/variables", "config_path": null}'
     logging:
       level: "DEBUG"
       logging_level: "DEBUG"
       remote_logging: "True"
       remote_base_log_folder: "s3://CENSORED/logs"
       remote_log_conn_id: "CENSORED"
     smtp:
       smtp_host: "CENSORED"
       smtp_mail_from: "CENSORED"
       smtp_port: "CENSORED"
     webserver:
       expose_config: "True"
       show_trigger_form_if_no_params: "True"
     dag_processor:
       refresh_interval: "60"
   ```
   
   ### Anything else?
   
   No relevant logs are shown in the Airflow pods.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

