makrushin-evgenii opened a new issue, #32233:
URL: https://github.com/apache/airflow/issues/32233
### Apache Airflow version

Other Airflow 2 version (please specify below)

### What happened

Airflow 2.6.1, Chart 1.9.0

I configured logging following the ["Writing logs to Amazon S3" page](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/logging/s3-task-handler.html). It mostly works, but sometimes I encounter this error:

```
*** No logs found on s3 for ti=<TaskInstance: test_dag.test_task manual__2023-06-27T09:20:53.779786+00:00 map_index=0 [success]>
*** Could not read served logs: [Errno -3] Temporary failure in name resolution
```

There are indeed no logs in the specified path. Perhaps you are familiar with this error and can suggest its cause and/or a way to fix it.

https://github.com/apache/airflow/assets/14077902/9f1bdc14-721a-4e0b-b4dc-c6680e188727

### What you think should happen instead

_No response_

### How to reproduce

I can't reproduce it on purpose. The error appears randomly: it can affect the same task in different runs, or only some tasks within a single run. I have attached a video to illustrate.

### Operating System

Debian GNU/Linux 11 (bullseye)

### Versions of Apache Airflow Providers

_No response_

### Deployment

Official Apache Airflow Helm Chart

### Deployment details

values.yaml looks like:

```yaml
defaultAirflowRepository: "${AIRFLOW_DOCKER_IMAGE_NAME}"
defaultAirflowTag: "${AIRFLOW_DOCKER_IMAGE_TAG}"
airflowVersion: "${AIRFLOW_VERSION}"
fernetKey: "${FERNET_KEY}"
webserverSecretKey: "${WEBSERVER_SECRET_KEY}"
env:
  - name: AIRFLOW__API__AUTH_BACKENDS
    value: "airflow.api.auth.backend.basic_auth"
  - name: AIRFLOW__WEBSERVER__BASE_URL
    value: "https://${INGRESS_WEB_HOST}"
  - name: OIDC_CLIENT_ID
    value: "${OIDC_CLIENT_ID}"
  - name: OIDC_CLIENT_SECRET
    value: "${OIDC_CLIENT_SECRET}"
  - name: OIDC_ISSUER_URI
    value: "${OIDC_ISSUER_URI}"
  - name: OIDC_REDIRECT_URI
    value: "${OIDC_REDIRECT_URI}"
extraSecrets:
  airflow-postgresql-metadata:
    data: "connection: ${AIRFLOW_POSTGRESQL_METADATA}"
  dags-gitsync-ssh-key-secret:
    data: "gitSshKey: ${DAGS_GITSYNC_SSH_KEY_SECRET}"
data:
  metadataSecretName: airflow-postgresql-metadata
ingress:
  web:
    enabled: true
    hosts:
      - "${INGRESS_WEB_HOST}"
config:
  secrets:
    backend: 'airflow.providers.hashicorp.secrets.vault.VaultBackend'
    backend_kwargs: '{"auth_type": "approle", "url": "***", "role_id": "${SECRETS_BACKEND_VAULT_ROLE_ID}", "secret_id": "${SECRETS_BACKEND_VAULT_SECRET_ID}", "variables_path": "${SECRETS_BACKEND_VAULT_VARIABLES_PATH}", "connections_path": "${SECRETS_BACKEND_VAULT_CONNECTIONS_PATH}"}'
  logging:
    remote_logging: 'True'
    logging_level: 'INFO'
    remote_log_conn_id: 'aws_conn'
    remote_base_log_folder: '${LOGGING_REMOTE_BASE_LOG_FOLDER}'
dags:
  persistence:
    enabled: false
  gitSync:
    enabled: true
    repo: "[email protected]:${CI_PROJECT_PATH}.git"
    branch: "${CI_COMMIT_REF_NAME}"
    subPath: ""
    sshKeySecret: "dags-gitsync-ssh-key-secret"
    knownHosts: ***
    resources:
      limits:
        cpu: 100m
        memory: 128Mi
      requests:
        cpu: 100m
        memory: 128Mi
workers:
  serviceAccount:
    create: false
    name: "${KUBERNETES_SERVICE_ACCOUNT}"
  keda:
    enabled: false
  persistence:
    enabled: false
  kerberosSidecar:
    enabled: false
  logGroomerSidecar:
    enabled: false
  replicas: 1
  resources:
    limits:
      cpu: 1000m
      memory: 2048Mi
    requests:
      cpu: 1000m
      memory: 2048Mi
scheduler:
  serviceAccount:
    create: false
    name: "${KUBERNETES_SERVICE_ACCOUNT}"
  logGroomerSidecar:
    enabled: false
  replicas: 1
  resources:
    limits:
      cpu: 1000m
      memory: 2048Mi
    requests:
      cpu: 1000m
      memory: 1024Mi
webserver:
  serviceAccount:
    create: false
    name: "${KUBERNETES_SERVICE_ACCOUNT}"
  defaultUser:
    username: "${WEBSERVER_DEFAULT_USER_USERNAME}"
    password: "${WEBSERVER_DEFAULT_USER_PASSWORD}"
  allowPodLogReading: true
  podDisruptionBudget:
    enabled: false
  replicas: 1
  resources:
    limits:
      cpu: 1000m
      memory: 2048Mi
    requests:
      cpu: 1000m
      memory: 1024Mi
triggerer:
  enabled: true
  serviceAccount:
    create: false
    name: "${KUBERNETES_SERVICE_ACCOUNT}"
  persistence:
    enabled: false
  replicas: 1
  resources:
    limits:
      cpu: 1000m
      memory: 2048Mi
    requests:
      cpu: 1000m
      memory: 1024Mi
statsd:
  enabled: true
  serviceAccount:
    create: false
    name: "${KUBERNETES_SERVICE_ACCOUNT}"
  resources:
    limits:
      cpu: 100m
      memory: 128Mi
    requests:
      cpu: 100m
      memory: 128Mi
redis:
  enabled: true
  serviceAccount:
    create: false
    name: "${KUBERNETES_SERVICE_ACCOUNT}"
  persistence:
    enabled: false
  resources:
    limits:
      cpu: 100m
      memory: 128Mi
    requests:
      cpu: 100m
      memory: 128Mi
createUserJob:
  useHelmHooks: false
  applyCustomEnv: false
  serviceAccount:
    create: false
    name: "${KUBERNETES_SERVICE_ACCOUNT}"
  resources:
    limits:
      cpu: 1000m
      memory: 2048Mi
    requests:
      cpu: 1000m
      memory: 2048Mi
migrateDatabaseJob:
  enabled: true
  useHelmHooks: false
  applyCustomEnv: false
  serviceAccount:
    create: false
    name: "${KUBERNETES_SERVICE_ACCOUNT}"
  resources:
    limits:
      cpu: 1000m
      memory: 2048Mi
    requests:
      cpu: 1000m
      memory: 2048Mi
cleanup:
  enabled: false
postgresql:
  enabled: false
pgbouncer:
  enabled: false
dagProcessor:
  enabled: false
flower:
  enabled: false
logs:
  persistence:
    enabled: false
elasticsearch:
  enabled: false
kerberos:
  enabled: false
networkPolicies:
  enabled: false
rbac:
  create: false
  createSCCRoleBinding: false
```

### Anything else

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
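A note on the second error line: `[Errno -3] Temporary failure in name resolution` is Python's `socket.gaierror` with `EAI_AGAIN`, a transient DNS failure. It surfaces here because, after finding no logs on S3, the webserver falls back to fetching "served logs" over HTTP from the worker's hostname; if that pod is already gone or cluster DNS hiccups, the lookup fails. A minimal sketch of the failure mode (the hostname below is hypothetical, purely for illustration):

```python
import socket

# Hypothetical worker-pod hostname; a completed pod's name will typically
# no longer resolve inside the cluster.
worker_host = "airflow-worker-0.airflow-worker.airflow.svc.cluster.local"

try:
    # 8793 is the default port of the worker log server.
    socket.getaddrinfo(worker_host, 8793)
except socket.gaierror as exc:
    # Reproduces the shape of the reported message; the exact errno
    # (-2 EAI_NONAME vs. -3 EAI_AGAIN) depends on the DNS environment.
    print(f"Could not read served logs: [Errno {exc.errno}] {exc.strerror}")
```

This suggests the DNS error is a symptom, not the root cause: the interesting question is why the task logs never reached S3 in the first place.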
