prakshalj0512 opened a new issue #11286:
URL: https://github.com/apache/airflow/issues/11286
**Apache Airflow version**:
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
```
Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.2",
GitCommit:"f5743093fd1c663cb0cbc89748f730662345d44d", GitTreeState:"clean",
BuildDate:"2020-09-16T21:51:49Z", GoVersion:"go1.15.2", Compiler:"gc",
Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.0",
GitCommit:"9e991415386e4cf155a24b1da15becaa390438d8", GitTreeState:"clean",
BuildDate:"2020-03-25T14:50:46Z", GoVersion:"go1.13.8", Compiler:"gc",
Platform:"linux/amd64"}
```
**Environment**:
- **Cloud provider or hardware configuration**: N/A
- **OS** (e.g. from /etc/os-release): minikube
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
The values under the `config:` section aren't being applied to the
worker pods. For example, I have the following values set up.
```yaml
config:
  core:
    dags_folder: '{{ include "airflow_dags" . }}'
    load_examples: "True"
    colored_console_log: "False"
    executor: "{{ .Values.executor }}"
    remote_log_conn_id: "s3_conn"
    remote_logging: "True"
    remote_base_log_folder: "s3://prakshal-test-bucket/"
```
The worker pods don't pick up the remote logging values. I have to either
pass them again as environment variables or bake them into the Docker image
used for the workers.
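As a rough sketch of the environment-variable workaround: Airflow reads overrides in the `AIRFLOW__{SECTION}__{KEY}` form, so mirroring the `config:` values as pod environment variables makes the workers see them (the exact values key, `env:` here, depends on the chart you are using):

```yaml
# Assumption: the chart exposes an env: list that is injected into worker pods.
env:
  - name: AIRFLOW__CORE__REMOTE_LOGGING
    value: "True"
  - name: AIRFLOW__CORE__REMOTE_LOG_CONN_ID
    value: "s3_conn"
  - name: AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER
    value: "s3://prakshal-test-bucket/"
```

This duplicates the `config:` section, which is exactly the problem being reported.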
CC: @dimberman
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]