TRReeve commented on issue #8212:
URL: https://github.com/apache/airflow/issues/8212#issuecomment-629872414
@marclamberti Your answer matches how I understood it would work as well. The
logs were uploading fine into the S3 bucket, but when I went to "view logs" in
the UI it gave the "logs not found" error, with nothing in the output to
indicate it was using the s3 connection or the read_key function to retrieve
anything. I can confirm there was no need to define a task handler or write a
logging class, which is a big improvement on things.
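
For context, the UI read path uses the same settings as the upload path, so a
minimal sketch of the environment I'd expect every component (webserver
included) to need looks like the following, assuming 1.10-era [core] config
keys; the bucket name and connection id here are placeholders:

    AIRFLOW__CORE__REMOTE_LOGGING=True
    AIRFLOW__CORE__REMOTE_LOG_CONN_ID=s3_uri
    AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://my-airflow-bucket/logs
    # The connection named s3_uri, defined in its env-var form:
    AIRFLOW_CONN_S3_URI=s3://access_key:secret_key@my-airflow-bucket

If the webserver pod is missing any of these, uploads from the workers can
still succeed while "view logs" fails, which would match the behaviour I was
seeing.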
It would be really nice if I could just define AIRFLOW_CONN_S3_URI =
s3://user:pass@S3, set REMOTE_BASE_LOGS_FOLDER=s3://airflow-stuff/logs, and
have the UI build the path, but with that setup I could only get logs
uploading, not reading. My working Helm template for Airflow on k8s builds the
connection as s3://access_key:secret_key@{{ mys3path }} and sets
remote_log_path to s3://{{ mys3path }}. Aside from that it's exactly the same
as you defined above, with the same variables defined under
AIRFLOW__KUBERNETES__ENVIRONMENT_VARIABLES.
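
For what it's worth, a rough sketch of how those two values render in my case
(assuming the chart's remote_log_path value feeds remote_base_log_folder, with
{{ mys3path }} and the credentials filled in by the Helm template):

    AIRFLOW_CONN_S3_URI=s3://access_key:secret_key@{{ mys3path }}
    AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://{{ mys3path }}

The only real difference from the generic setup is that the bucket path is
templated once, so the connection and the base log folder stay in sync across
environments.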