ldacey opened a new issue, #34022:
URL: https://github.com/apache/airflow/issues/34022

   ### Apache Airflow version
   
   2.7.0
   
   ### What happened
   
   After updating to 2.7.0, my worker is unable to read or write logs to Azure Blob Storage. My configuration has not changed since May 2022, so a recent change must have broken it.
   
   My container is "airflow-logs", and I save production logs under the "wasb-airflow-logs" folder/prefix. There are other 'folders' for different environments.
   
   docker stack variables:
   ```python
   AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS: log_config.LOGGING_CONFIG
   AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: wasb-airflow-logs
   ```
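   For context, env vars of this form map onto `airflow.cfg` options via Airflow's documented `AIRFLOW__{SECTION}__{KEY}` naming convention; a minimal sketch of that mapping:
   
   ```python
   def airflow_env_var(section: str, key: str) -> str:
       # Airflow reads config overrides from AIRFLOW__{SECTION}__{KEY},
       # with the section and key upper-cased.
       return f"AIRFLOW__{section.upper()}__{key.upper()}"
   
   print(airflow_env_var("logging", "remote_base_log_folder"))
   # AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
   ```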
   
   log_config.py
   ```python
   from copy import deepcopy
   from airflow.config_templates.airflow_local_settings import 
DEFAULT_LOGGING_CONFIG
   
   REMOTE_BASE_LOG_FOLDER = "wasb://[email protected]"
   LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
   ```
   
   After the 2.7.0 upgrade I noticed that my worker was spamming the following 
error message. It seems like Airflow suddenly thinks "wasb-airflow-logs" is the 
container instead of the prefix?
   
   ```
   [2023-09-01 14:39:29,785: ERROR/ForkPoolWorker-11] Could not write logs to 
wasb-airflow-logs/dag_id=
   azure.core.exceptions.ResourceNotFoundError: The specified container does 
not exist.
   ```
   
   And my tasks all started with this error:
   
   ```
   *** tried listing blobs with 
prefix=wasb-airflow-logs/dag_id=...attempt=1.log and container=airflow-logs
   *** could not list blobs The requested URI does not represent any resource 
on the server.
   ```
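   For reference, the way a full `wasb://` URL should split into container and prefix (using a hypothetical `myaccount` storage account, since mine is redacted above) can be sketched with stdlib parsing:
   
   ```python
   from urllib.parse import urlparse
   
   # Hypothetical full remote log URL; "myaccount" is a placeholder.
   url = "wasb://airflow-logs@myaccount.blob.core.windows.net/wasb-airflow-logs"
   
   parsed = urlparse(url)
   container = parsed.netloc.split("@")[0]  # container name before the "@"
   prefix = parsed.path.lstrip("/")         # blob folder/prefix after the host
   
   print(container)  # airflow-logs
   print(prefix)     # wasb-airflow-logs
   ```
   
   The errors above look like the prefix is being used where the container name should be.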
   
   For now, I had to revert to using `AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER` 
instead, which at least allows the worker to run, but it still cannot read or 
write logs.
   
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   Every change I made to `REMOTE_BASE_LOG_FOLDER` failed with the same error:
   
   ```
   ImportError: Unable to load custom logging from log_config.LOGGING_CONFIG 
due to Incorrect remote log configuration. Please check the configuration of 
option 'host' in section 'elasticsearch' if you are using Elasticsearch. In the 
other case, 'remote_base_log_folder' option in the 'logging' section.
   ```
   
   Here are the things I tried:
   
   ```python
   REMOTE_BASE_LOG_FOLDER = "wasb-airflow-logs"
   REMOTE_BASE_LOG_FOLDER = "wasb://[email protected]"
   REMOTE_BASE_LOG_FOLDER = 
"wasb://[email protected]/wasb-airflow-logs/"
   LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
   ```
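   As far as I can tell, this ImportError comes from Airflow's scheme-based handler selection failing to match the folder value; conceptually (a simplified sketch, not the actual Airflow code) the dispatch looks something like:
   
   ```python
   def pick_remote_handler(remote_base_log_folder: str) -> str:
       # Simplified sketch of scheme-based remote handler selection;
       # the real logic lives in airflow_local_settings and varies by version.
       if remote_base_log_folder.startswith("s3://"):
           return "S3TaskHandler"
       elif remote_base_log_folder.startswith("gs://"):
           return "GCSTaskHandler"
       elif remote_base_log_folder.startswith("wasb"):
           return "WasbTaskHandler"
       raise ImportError("Incorrect remote log configuration.")
   ```
   
   Note that a bare prefix like "wasb-airflow-logs" also starts with "wasb", which may be related to it being mistaken for a container.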
   
   ### Operating System
   
   Ubuntu 22.04
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
   docker stack deploy
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

