ltken123 commented on issue #8212:
URL: https://github.com/apache/airflow/issues/8212#issuecomment-678015944


   > Having gotten remote logging working on 1.10.10, I've noticed there seems to be a difference in how the workers handle the log upload versus the webserver component. The Connection you define needs to point to the same folder as the remote_base_logs_folder. E.g. if your remote base logs folder is data/airflow/logs, then your connection used for remote logging also needs to point to it, e.g. s3://access_key:secret@data/airflow/logs. You cannot just have a generic "AWS connection" and then have it figure out which folder it needs to point to on its own using the filepath from the worker.
   
   @TRReeve, do you have an example of how you set that connection up and got it working? I've tried this a number of ways, but the logs are only ever written to S3 and are never readable in the UI.
   
   I'm just setting it in the config, i.e.:
   
   AIRFLOW_CONN_S3_URI: s3://ACCESS_KEY_ID:URL_ENCODED_SECRET_ACCESS_KEY@BUCKET/logs/
   
   AIRFLOW__CORE__REMOTE_LOG_CONN_ID: s3_uri
   
   AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW_CONN_S3_URI: s3://ACCESS_KEY_ID:URL_ENCODED_SECRET_ACCESS_KEY@BUCKET/logs/
   
   AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOG_CONN_ID: s3_uri
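   
   One thing that has tripped people up with these connection URIs is that AWS secret access keys frequently contain `/` and `+`, which break URI parsing unless they are percent-encoded. As a sanity check, here is a minimal sketch of building the URI value (the access key, secret, and `BUCKET/logs/` path below are placeholders, not my real values):
   
   ```python
   from urllib.parse import quote
   
   # Hypothetical credentials -- substitute your own.
   access_key = "AKIAEXAMPLEKEY"
   secret_key = "abc/def+ghi"  # AWS secrets often contain '/' and '+'
   
   # Percent-encode the secret so '/' and '+' survive URI parsing;
   # safe="" ensures '/' is encoded as %2F rather than left alone.
   encoded_secret = quote(secret_key, safe="")
   
   conn_uri = f"s3://{access_key}:{encoded_secret}@BUCKET/logs/"
   print(conn_uri)
   # s3://AKIAEXAMPLEKEY:abc%2Fdef%2Bghi@BUCKET/logs/
   ```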
   
   
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
