artemus opened a new issue #8770:
URL: https://github.com/apache/airflow/issues/8770
**Apache Airflow version**: 1.10.7
**Environment**: Python 3.7, running locally
**What happened**:
I have the following configuration in airflow.cfg:
`remote_logging = True`
`remote_log_conn_id = "S3Connection"`
`remote_base_log_folder = s3://bucket/logs`
`encrypt_s3_logs = False`
I have set up a connection in the UI with type S3 and the following settings:
`{"aws_access_key_id":"xxx", "aws_secret_access_key": "xxx"}`
Executor is LocalExecutor. The scheduler is able to write logs to S3, but when I
open the UI to look at a task's logs, the page just hangs and nothing happens.
The spinner spins forever and I can't even see what the error is.
Requesting
`http://127.0.0.1:8080/admin/airflow/get_logs_with_metadata?dag_id=example_bash_operator&task_id=run_after_loop&execution_date=2020-05-07T18%3A08%3A51.232255%2B00%3A00&try_number=1&metadata=null`
directly returns an empty response error, and nothing works.
Am I doing something wrong? The S3 configuration is not well documented, and I
have seen a bunch of reports that it does not work.
**What you expected to happen**:
I expected logs to be pulled from S3 and shown in the admin UI.
**How to reproduce it**:
Install Airflow 1.10.7 locally and run any example DAG with remote logging
enabled for an S3 bucket.
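To narrow down whether reading from S3 works at all outside the webserver, something like the following can be run in the same environment (a rough sketch: I believe `airflow.hooks.S3_hook.S3Hook` and `read_key` are the 1.10.x API, and the bucket/key below are placeholders based on the default per-try log layout):

```python
# Rough debugging sketch for Airflow 1.10.x: read one uploaded log file
# through the same connection id the webserver is configured to use.
# Bucket and key below are placeholders.
from airflow.hooks.S3_hook import S3Hook

hook = S3Hook(aws_conn_id="S3Connection")

# Default layout: {dag_id}/{task_id}/{execution_date}/{try_number}.log
key = "logs/example_bash_operator/run_after_loop/2020-05-07T18:08:51+00:00/1.log"
print(hook.read_key(key, bucket_name="bucket"))
```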