chrisluedtke opened a new issue, #35388:
URL: https://github.com/apache/airflow/issues/35388

   ### Official Helm Chart version
   
   1.10.0
   
   ### Apache Airflow version
   
   2.5.1
   
   ### Kubernetes Version
   
   1.26.6
   
   ### Helm Chart configuration
   
   A few relevant configs:
   ```yaml
   executor: CeleryExecutor
   
   workers:
     persistence:
       enabled: false
   
   logs:
     persistence:
       enabled: false
   
   extraEnv: |
     - name: AIRFLOW__CORE__BASE_LOG_FOLDER
       value: "/opt/airflow/logs"
     - name: AIRFLOW__LOGGING__REMOTE_LOGGING
       value: "True"
     - name: AIRFLOW__LOGGING__LOGGING_LEVEL
       value: "INFO"
     - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
       value: "wasb-${local.customer}-${local.tenant}-airflow-logs"
     - name: AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID
       value: "wasb_remote_airflow"
     - name: AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS
       value: "config.log_config.LOGGING_CONFIG"
   ```
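
   For reference, the same settings can also be expressed through the chart's `config` section rather than `extraEnv` (a sketch only, assuming the official chart's standard `config`-to-`airflow.cfg` mapping; the values mirror the env vars above):

   ```yaml
   # Sketch: chart-native equivalent of the AIRFLOW__LOGGING__* env vars above.
   # remote_base_log_folder is omitted because its value is templated externally.
   config:
     logging:
       remote_logging: "True"
       logging_level: "INFO"
       remote_log_conn_id: "wasb_remote_airflow"
       logging_config_class: "config.log_config.LOGGING_CONFIG"
   ```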
   
   ### Docker Image customizations
   
   _No response_
   
   ### What happened
   
   I discovered that our Airflow workers were taking ~10 minutes to start up 
during a Helm release. When I investigated the pod events, I found a couple of 
`FailedMount` errors.
   
   
![image](https://github.com/apache/airflow/assets/20371880/7a279df6-2af1-4b28-a069-effd750472e4)
   
   [The docs suggest disabling worker persistence when using Celery 
workers](https://airflow.apache.org/docs/helm-chart/stable/manage-logs.html#no-persistence),
 so I disabled it.
   
   Now, while a task is running under the CeleryExecutor, I cannot access its 
logs from the webserver.
   
   ```
   *** Log file does not exist: 
/opt/airflow/logs/dag_id=replication.replication_sanity_checks.v2/run_id=manual__2023-11-02T23:00:40.593962+00:00/task_id=replication_sanity_check.get_source_table_counts/map_index=17/attempt=1.log
   *** Fetching from: 
http://airflow-worker-69f7bb5cdf-xwd67:8793/log/dag_id=replication.replication_sanity_checks.v2/run_id=manual__2023-11-02T23:00:40.593962+00:00/task_id=replication_sanity_check.get_source_table_counts/map_index=17/attempt=1.log
   *** Failed to fetch log file from worker. [Errno -2] Name or service not 
known
   ```
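
   The `Name or service not known` error suggests the webserver cannot resolve the bare worker pod name when it tries the `Fetching from:` URL. One workaround sometimes discussed (an assumption on my part, not verified for this setup) is to have workers record their pod IP instead of a hostname, via `hostname_callable`:

   ```yaml
   # Workaround sketch (assumption, not confirmed as the fix for this issue):
   # make task instances record the pod IP, which the webserver can reach
   # directly, instead of a pod hostname that has no DNS record.
   extraEnv: |
     - name: AIRFLOW__CORE__HOSTNAME_CALLABLE
       value: "airflow.utils.net.get_host_ip_address"
   ```

   With this, the log-fetch URL would target the worker's IP rather than its unresolvable pod name.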
   
   After the task has completed, the logs are successfully loaded from a remote 
source as expected.
   
   ### What you think should happen instead
   
   Even with worker persistence disabled, I expect to be able to retrieve logs 
of currently running tasks.
   
   ### How to reproduce
   
   Deploy Airflow to Kubernetes via the official Helm chart with the 
configuration shared above.
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

