GitHub user hcnhcn012 closed a discussion: [3.0.4][Remote Logging] "Unable to find AWS Connection ID" under KubernetesExecutor configuration

airflow version: 3.0.4
deployment method: helm
executor: KubernetesExecutor

---

I noticed this issue occurred in earlier versions and was supposed to be fixed in 3.0.4 (see: https://github.com/apache/airflow/issues/50583), but it still happens for me after upgrading Airflow by changing the Helm values:
```
defaultAirflowTag: "3.0.4"
airflowVersion: "3.0.4"
```
I can confirm that the AWS connection was created correctly (checked with `airflow connections list`), but the worker pod cannot find it. Here is my configuration.
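A possible diagnostic angle (my assumption, not confirmed as the root cause): Airflow also resolves connections from environment variables named `AIRFLOW_CONN_<CONN_ID_UPPERCASED>`, so defining that variable in the worker pod would bypass the failing lookup. A minimal sketch of the naming convention, with placeholder values:

```python
# Sketch: how Airflow derives the environment-variable name it checks
# for a given connection ID. Defining this variable in the worker pod
# template is a possible workaround if the regular lookup fails.

def conn_env_var(conn_id: str) -> str:
    """Return the env var name Airflow checks for this connection ID."""
    return f"AIRFLOW_CONN_{conn_id.upper()}"

if __name__ == "__main__":
    print(conn_env_var("worker_remote_log_s3_conn"))
    # -> AIRFLOW_CONN_WORKER_REMOTE_LOG_S3_CONN
    # The variable's value would be a connection URI, e.g. (placeholder
    # credentials, not real ones):
    #   aws://<access_key>:<secret_key>@
```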

---
Airflow:
```
[kubernetes_executor]
delete_worker_pods = False
multi_namespace_mode = False
namespace = airflow
pod_template_file = /opt/airflow/pod_templates/dynamic_deps_template.yaml
worker_container_repository = apache/airflow
worker_container_tag = 3.0.4

[logging]
colored_console_log = False
remote_logging = True
remote_base_log_folder = s3://airflow-worker-logs
remote_log_conn_id = worker_remote_log_s3_conn
```
Pod template (only the base container is shown):
```
  containers:
    - name: base
      image: "apache/airflow:3.0.4-python3.12"
      imagePullPolicy: IfNotPresent
      env:
        - name: AIRFLOW__CORE__EXECUTOR
          value: LocalExecutor
        - name: AIRFLOW__CORE__FERNET_KEY
          valueFrom:
            secretKeyRef:
              name: airflow-fernet-key
              key: fernet-key
        - name: AIRFLOW__DATABASE__SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: pai-docai-airflow-metadata
              key: connection
        - name: AIRFLOW_CONN_AIRFLOW_DB
          valueFrom:
            secretKeyRef:
              name: pai-docai-airflow-metadata
              key: connection
        - name: AIRFLOW__CORE__DAGS_FOLDER
          value: "/opt/airflow/dags/pai-docai-dataflow/paidocaidataflow/dags"
        - name: PYTHONPATH
          value: "/opt/airflow/python-packages:/opt/airflow/dags/pai-docai-dataflow:/opt/airflow/dags"
        - name: PROJECT_ROOT
          value: "/opt/airflow/dags/pai-docai-dataflow"

        - name: AIRFLOW__API__BASE_URL
          value: "http://pai-docai-airflow-api-server.airflow.svc.cluster.local:8080"
        - name: AIRFLOW__API__AUTH_BACKENDS
          value: "airflow.api.auth.backend.default"
        - name: AIRFLOW__LOGGING__REMOTE_LOGGING
          value: "True"
        - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
          value: "s3://airflow-worker-logs"
        - name: AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID
          value: "worker_remote_log_s3_conn"

      volumeMounts:
        - name: airflow-dags
          mountPath: /opt/airflow/dags
        - name: python-packages
          mountPath: /opt/airflow/python-packages
        - name: airflow-logs
          mountPath: /opt/airflow/logs

```
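As a workaround sketch (untested here, and the secret name below is a placeholder I made up), the connection could be supplied to the worker container directly via an environment variable in the pod template's `env` list, following the `AIRFLOW_CONN_<CONN_ID>` convention:

```
        - name: AIRFLOW_CONN_WORKER_REMOTE_LOG_S3_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-remote-log-conn   # placeholder secret holding an aws:// connection URI
              key: connection
```

This sidesteps the metadata/API lookup entirely, so it may help narrow down whether the problem is in connection resolution on the worker side.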
---
Original failure log:
```
{"event":"Unable to find AWS Connection ID 'worker_remote_log_s3_conn', switching to empty.","level":"warning","logger":"airflow.task.hooks.airflow.providers.amazon.aws.hooks.s3.S3Hook","timestamp":"2025-08-18T08:15:49.647380Z"}
``` 
```
{"event":"No connection ID provided. Fallback on boto3 credential strategy (region_name=None). See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html","level":"info","logger":"airflow.providers.amazon.aws.hooks.base_aws.BaseSessionFactory","timestamp":"2025-08-18T08:15:49.647765Z"}
``` 

GitHub link: https://github.com/apache/airflow/discussions/54602
