coder2j edited a comment on issue #15319:
URL: https://github.com/apache/airflow/issues/15319#issuecomment-953637106
> > @izhangzhihao Currently it's possible to use S3-compatible object storage. You have to create an S3 connection in Airflow with these extra args:
> > ```
> > {
> > "aws_access_key_id":"your_minio_access_key",
> > "aws_secret_access_key": "your_minio_secret_key",
> > "host": "http://127.0.0.1:9000"
> > }
> > ```
>
> Yup, this works for OSS:
>
> ```json
> {
> "region_name": "oss-cn-shanghai",
> "host": "https://airflow-logging.oss-cn-shanghai-internal.aliyuncs.com",
> "aws_access_key_id":"ak",
> "aws_secret_access_key": "sk"
> }
> ```
>
> and envs:
>
> ```
> - name: AIRFLOW__LOGGING__REMOTE_LOGGING
>   value: "True"
> - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
>   value: "s3://airflow-logging"
> - name: AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID
>   value: "airflow-logging"
> ```
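
For anyone not running on Kubernetes, the same settings from the quoted snippet can be exported as plain environment variables, e.g. in a docker-compose `environment:` section or a shell profile (a sketch; the names map one-to-one onto the `[logging]` section of airflow.cfg):

```
# Same values as the quoted Kubernetes env entries, expressed as shell exports.
export AIRFLOW__LOGGING__REMOTE_LOGGING=True
export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://airflow-logging
export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=airflow-logging
```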
I tried to use MinIO as a local S3 backend for Airflow logging, but it didn't work for me with the connection setup suggested above (tested on macOS):
```json
{
  "aws_access_key_id": "your_minio_access_key",
  "aws_secret_access_key": "your_minio_secret_key",
  "host": "http://127.0.0.1:9000"
}
```
Instead, look up your Docker bridge gateway with `sudo docker network inspect bridge` and use that address as the host: inside an Airflow container, `127.0.0.1` points at the container itself, not at the host where MinIO is listening. In my case the gateway is `172.17.0.1`, so updating the S3 connection as follows will work (a sketch of the gateway lookup and a quick reachability check follows the JSON).
```json
{
  "aws_access_key_id": "your_minio_access_key",
  "aws_secret_access_key": "your_minio_secret_key",
  "host": "http://172.17.0.1:9000"
}
```
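
If you want to script the gateway lookup and sanity-check the endpoint before pointing Airflow at it, something like the following should work on a default Docker setup (a sketch; `<airflow-worker>` is a placeholder for your container name, and it assumes `curl` is available inside the image):

```
# Print only the bridge network's gateway IP (e.g. 172.17.0.1).
sudo docker network inspect bridge -f '{{ (index .IPAM.Config 0).Gateway }}'

# Check that MinIO is reachable *from inside* an Airflow container,
# since that is where the task logs are uploaded from.
sudo docker exec -it <airflow-worker> curl -i http://172.17.0.1:9000/minio/health/live
```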