ldacey commented on issue #34022:
URL: https://github.com/apache/airflow/issues/34022#issuecomment-1705454732

   After pinning celery, I have no issues with the workers. I still cannot read 
or write logs on Azure though for some reason.
   
   The prefix and container shown in the log below are correct, but after my update to 2.7.0 there are no remote logs. Log from the task (reading remote logs):
   > *** tried listing blobs with prefix=wasb-airflow-logs/dag_id=zendesk/run_id=scheduled__2023-09-04T06:51:00+00:00/task_id=support.transform.custom-fields/attempt=1.log and container=airflow-logs
   > *** could not list blobs The requested URI does not represent any resource on the server.
   
   Logs from the worker are below. Is the worker trying to write to a container called "wasb-airflow-logs" instead of "airflow-logs"? I am not sure why this differs from the read side, which is correct (container is "airflow-logs" and prefix is "wasb-airflow-logs"). As mentioned, this configuration has worked for a long time with no changes.
   
   > [2023-09-04 15:26:26,796: ERROR/ForkPoolWorker-3] Could not write logs to wasb-airflow-logs/dag_id=example/run_id=scheduled__2023-09-01T14:10:00+00:00/task_id=offered.extract/attempt=2.log
   > Traceback (most recent call last):
   > azure.core.exceptions.ResourceNotFoundError: The specified container does not exist.
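
   For reference, this is the sort of direct check that can be run from the worker image with azure-storage-blob to confirm the container name and prefix independently of the task handler (the connection string and names below are placeholders, not my real values):
   ```python
   # Quick sanity check from the worker image: does the container exist, and
   # what is listed under the prefix the task handler reports?
   # NOTE: connection string and names are placeholders for illustration only.
   from azure.storage.blob import BlobServiceClient

   CONN_STR = "<storage-account-connection-string>"
   CONTAINER = "airflow-logs"
   PREFIX = "wasb-airflow-logs/dag_id=zendesk/"

   service = BlobServiceClient.from_connection_string(CONN_STR)
   container = service.get_container_client(CONTAINER)

   print("container exists:", container.exists())
   for blob in container.list_blobs(name_starts_with=PREFIX):
       print(blob.name)
   ```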
   
   
   
   log_config.py (I tested a few REMOTE_BASE_LOG_FOLDER values in this file as 
well, as you can see from my original issue).
   ```python
   from copy import deepcopy
   from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
   
   LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
   ```
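
   My understanding (I have not checked this against the 6.3.0 provider source) is that with remote logging enabled, the "task" handler in this copied config is the WasbTaskHandler and its wasb_container / wasb_log_folder keys are passed through to the handler constructor, so in principle the container can be inspected or pinned here, roughly like this:
   ```python
   from copy import deepcopy

   from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

   LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

   # Assumption: these keys exist on the "task" handler when the wasb handler
   # is selected; .get() keeps this safe to run even if they do not.
   task_handler = LOGGING_CONFIG["handlers"]["task"]
   print(task_handler.get("class"))            # expected: ...wasb_task_handler.WasbTaskHandler
   print(task_handler.get("wasb_container"))   # container the worker should write to
   print(task_handler.get("wasb_log_folder"))  # blob-name prefix inside that container

   # Pin the container explicitly as a test (hypothetical workaround, not a
   # confirmed fix for this issue).
   task_handler["wasb_container"] = "airflow-logs"
   ```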
   
   env vars:
   ```yaml
     AIRFLOW__LOGGING__REMOTE_LOGGING: "true"
     AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID: azure_blob
     AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: wasb-airflow-logs
   ```
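
   Just to spell out how I read those values: as far as I know the base log folder only needs to start with "wasb" for Airflow to select the Azure handler, and the value then becomes the blob-name prefix inside whatever container the handler targets, which matches the prefix in the "tried listing blobs" line above (my reading of the 2.x behaviour, not re-verified against the new provider):
   ```python
   import posixpath

   # Assumption: the handler joins the configured base folder with the task's
   # relative log path to build the blob name inside the target container.
   remote_base_log_folder = "wasb-airflow-logs"
   relative_path = (
       "dag_id=zendesk/run_id=scheduled__2023-09-04T06:51:00+00:00/"
       "task_id=support.transform.custom-fields/attempt=1.log"
   )

   blob_name = posixpath.join(remote_base_log_folder, relative_path)
   print(blob_name)  # matches the prefix the webserver tries to list above
   ```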
     
     
   
   pip freeze results:
   ```
   apache-airflow==2.7.0
   apache-airflow-providers-celery==3.3.3
   apache-airflow-providers-common-sql==1.7.1
   apache-airflow-providers-docker==3.7.4
   apache-airflow-providers-ftp==3.5.1
   apache-airflow-providers-google==10.7.0
   apache-airflow-providers-http==4.5.1
   apache-airflow-providers-imap==3.3.1
   apache-airflow-providers-microsoft-azure==6.3.0
   apache-airflow-providers-mysql==5.3.0
   apache-airflow-providers-odbc==4.0.0
   apache-airflow-providers-postgres==5.6.0
   apache-airflow-providers-redis==3.3.1
   apache-airflow-providers-salesforce==5.4.2
   apache-airflow-providers-sftp==4.6.0
   apache-airflow-providers-sqlite==3.4.3
   apache-airflow-providers-ssh==3.7.2
   
   adal==1.2.7
   adlfs==2023.8.0
   azure-batch==14.0.0
   azure-common==1.1.28
   azure-core==1.29.3
   azure-cosmos==4.5.0
   azure-datalake-store==0.0.53
   azure-identity==1.14.0
   azure-keyvault-secrets==4.7.0
   azure-kusto-data==4.2.0
   azure-mgmt-containerinstance==8.0.0
   azure-mgmt-core==1.4.0
   azure-mgmt-cosmosdb==9.2.0
   azure-mgmt-datafactory==1.1.0
   azure-mgmt-datalake-nspkg==3.0.1
   azure-mgmt-datalake-store==0.5.0
   azure-mgmt-nspkg==3.0.2
   azure-mgmt-resource==23.0.1
   azure-nspkg==3.0.2
   azure-servicebus==7.11.1
   azure-storage-blob==12.17.0
   azure-storage-common==2.1.0
   azure-storage-file==2.1.0
   azure-storage-file-datalake==12.12.0
   azure-synapse-spark==0.7.0
   ```
   
   

