k-lyda opened a new issue #12236:
URL: https://github.com/apache/airflow/issues/12236
**Apache Airflow version**: 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
```
Client Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.2", GitCommit:"52c56ce7a8272c798dbc29846288d7cd9fbae032", GitTreeState:"clean", BuildDate:"2020-04-16T11:56:40Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"13", GitVersion:"v1.13.12", GitCommit:"a8b52209ee172232b6db7a6e0ce2adc77458829f", GitTreeState:"clean", BuildDate:"2019-10-15T12:04:30Z", GoVersion:"go1.11.13", Compiler:"gc", Platform:"linux/amd64"}
```
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
I want to save task logs to S3 storage and have added the corresponding configuration. Logging works fine until I add an import of the MLflow library in any of the DAG files. I don't even have to use the library; just
`from mlflow.tracking import MlflowClient`
is enough to break logging to S3.
**What you expected to happen**:
Logs should continue to be written to S3 regardless of which libraries the DAG file imports. There is probably some mismatch in the S3 credentials, but I don't see any specific error message in the logs.
**How to reproduce it**:
1. Set up logging to S3 with the proper configuration entries or environment variables:
```
AIRFLOW__CORE__REMOTE_LOGGING: "True"
AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: "s3://bucket/airflow/logs"
AIRFLOW__CORE__REMOTE_LOG_CONN_ID: "s3_connection_id"
AIRFLOW__CORE__ENCRYPT_S3_LOGS: "False"
```
2. Confirm that logging to S3 works fine in a sample DAG.
3. Add `from mlflow.tracking import MlflowClient` to the DAG file; logging to S3 now stops working.
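For reference, a minimal DAG file along the lines of the steps above (names here are illustrative, not from the original report; assumes Airflow 1.10.x and `mlflow` installed):

```python
# repro_dag.py -- illustrative repro sketch; requires apache-airflow 1.10.x and mlflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# The import below is never used; its presence alone appears to be
# enough to break remote logging to S3.
from mlflow.tracking import MlflowClient  # noqa: F401

dag = DAG(
    dag_id="s3_logging_repro",
    start_date=datetime(2020, 11, 1),
    schedule_interval=None,
)

def say_hello():
    print("hello")  # this line should appear in the S3 task log

task = PythonOperator(task_id="hello", python_callable=say_hello, dag=dag)
```

With the `mlflow` import removed, the task log for `hello` should show up under `s3://bucket/airflow/logs`; with it present, the log never reaches S3.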
**Anything else we need to know**:
This problem occurs every time MLflow is imported in any file processed by Airflow.
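As a general mechanism (not confirmed to be what MLflow actually does), a library that configures logging as an import-time side effect can silently add or redirect handlers on the root logger, which would explain log records vanishing without any error message. A stdlib-only sketch of that failure mode:

```python
import logging

root = logging.getLogger()
before = [type(h).__name__ for h in root.handlers]

# Simulate a third-party library calling logging.basicConfig() (or
# otherwise touching the root logger) when it is imported. In a fresh
# interpreter this attaches a StreamHandler to the root logger, so
# records that previously propagated to a custom remote handler may
# now take a different path.
logging.basicConfig(level=logging.INFO)

after = [type(h).__name__ for h in root.handlers]
print("root handlers before:", before)
print("root handlers after: ", after)
```

Comparing the root logger's handlers before and after the `mlflow` import (in an `airflow` shell, for instance) would show whether something like this is happening.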
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]