izhangzhihao commented on issue #15319:
URL: https://github.com/apache/airflow/issues/15319#issuecomment-821189306
> @izhangzhihao Currently it's possible to use S3-compatible object storage.
> You have to create an S3 connection in Airflow with these extra args:
>
> ```json
> {
>   "aws_access_key_id": "your_minio_access_key",
>   "aws_secret_access_key": "your_minio_secret_key",
>   "host": "http://127.0.0.1:9000"
> }
> ```
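Instead of creating the connection through the UI, the same extras can be packed into an `AIRFLOW_CONN_<CONN_ID>` environment variable. A minimal sketch, assuming Airflow's URI-style connection format where query parameters become the connection's extras (the conn id `MINIO_S3` is a hypothetical name for illustration):

```python
from urllib.parse import quote, urlencode

# The same extras as the MinIO example above.
extra = {
    "aws_access_key_id": "your_minio_access_key",
    "aws_secret_access_key": "your_minio_secret_key",
    "host": "http://127.0.0.1:9000",
}

# Build an aws:// connection URI; values must be percent-encoded
# (note the host URL's "://" gets escaped in the query string).
uri = "aws://@/?" + urlencode(extra, quote_via=quote)

# Export this as e.g. AIRFLOW_CONN_MINIO_S3 in the scheduler/worker env.
print(uri)
```

The conn id in the env var name is whatever you set `AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID` to, uppercased.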
Yup, this works for Alibaba Cloud OSS too:
```json
{
  "region_name": "oss-cn-shanghai",
  "host": "https://airflow-logging.oss-cn-shanghai-internal.aliyuncs.com",
  "aws_access_key_id": "ak",
  "aws_secret_access_key": "sk"
}
```
and these env vars:
```yaml
- name: AIRFLOW__LOGGING__REMOTE_LOGGING
  value: "True"
- name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
  value: "s3://airflow-logging"
- name: AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID
  value: "airflow-logging"
```
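Recent Airflow versions (2.3+) also accept a JSON-serialized connection in the `AIRFLOW_CONN_*` env var, which avoids URL-encoding the extras. A sketch of building that value for the OSS connection above (note the conn id `airflow-logging` contains a hyphen, which is awkward in env var names, so an underscore id like `airflow_logging` may be needed):

```python
import json

# JSON-serialized connection (assumes Airflow >= 2.3 JSON connection
# format: a "conn_type" plus an "extra" dict, matching the extras above).
conn = {
    "conn_type": "aws",
    "extra": {
        "region_name": "oss-cn-shanghai",
        "host": "https://airflow-logging.oss-cn-shanghai-internal.aliyuncs.com",
        "aws_access_key_id": "ak",
        "aws_secret_access_key": "sk",
    },
}

# Export this string as e.g. AIRFLOW_CONN_AIRFLOW_LOGGING.
env_value = json.dumps(conn)
print(env_value)
```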