benbenbang opened a new issue, #24565:
URL: https://github.com/apache/airflow/issues/24565

   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   `apache-airflow-providers-amazon==3.0.0`
   
   ### Apache Airflow version
   
   2.2.4
   
   ### Operating System
   
   Debian GNU/Linux 11 (bullseye)
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   `Executor=CeleryKubernetesExecutor`
   
   ### What happened
   
   Also posted here -> 
https://apache-airflow.slack.com/archives/CCV3FV9KL/p1655722322312679
   
   When using CeleryKubernetesExecutor, if the log file is not yet available on S3, the handler falls back to the default logging behavior instead of the custom logging module configured for the deployment.
   
   On line 136 in `airflow/providers/amazon/aws/log/s3_task_handler.py`, the fallback is
   ```python
   local_log, metadata = super()._read(ti, try_number)
   ```
   
   And since the executor is CeleryKubernetesExecutor, execution then proceeds to line 180 in `airflow/utils/log/file_task_handler.py`.
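
   The failure mode described above can be sketched with dummy classes. The names below are simplified stand-ins for the real Airflow handlers, not the actual provider code; the point is only that `S3TaskHandler`'s fallback goes straight to the parent file handler, which cannot reach a worker pod that no longer exists:

   ```python
   # Simplified sketch of the reported fallback path. The classes and
   # behavior are illustrative stand-ins, not the real Airflow code.

   class FileTaskHandler:
       """Stand-in for airflow.utils.log.file_task_handler.FileTaskHandler."""

       def _read(self, ti, try_number):
           # Under CeleryKubernetesExecutor the worker pod may already
           # be gone, so reading the local/worker log fails.
           raise ConnectionError(f"worker for {ti} is unreachable")


   class S3TaskHandler(FileTaskHandler):
       """Stand-in for the Amazon provider's S3TaskHandler."""

       def __init__(self, s3_logs):
           self.s3_logs = s3_logs  # pretend remote store

       def _read(self, ti, try_number):
           key = f"{ti}/{try_number}.log"
           if key in self.s3_logs:
               return self.s3_logs[key], {"end_of_log": True}
           # Log not on S3 yet: fall back to the parent handler,
           # bypassing any other handlers in the custom logging config.
           return super()._read(ti, try_number)


   handler = S3TaskHandler(s3_logs={})
   try:
       handler._read("my_task", 1)
   except ConnectionError as exc:
       print(f"fallback failed: {exc}")
   ```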
   
   
   ### What you think should happen instead
   
   It should at least check for, and use, the next task handler defined in the custom logging config, and only fall back to the default config when no further handlers remain.
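
   One possible shape for that behavior, sketched with hypothetical names (none of these helpers exist in Airflow today): try each remaining configured handler in order, and reach the default read only when every handler has failed.

   ```python
   # Hypothetical sketch of the proposed fallback chain. The function
   # and class names are made up for illustration; this is not Airflow API.

   def read_with_fallback(handlers, default_read, ti, try_number):
       """Try each configured task handler in order; use the default
       read only when no handler can serve the log."""
       for handler in handlers:
           try:
               return handler._read(ti, try_number)
           except Exception:
               continue  # this handler cannot serve the log yet
       return default_read(ti, try_number)


   class UnavailableHandler:
       """Dummy handler whose log is not on S3 yet."""

       def _read(self, ti, try_number):
           raise FileNotFoundError("not on S3 yet")


   class NextConfiguredHandler:
       """Dummy stand-in for the next handler in the custom config."""

       def _read(self, ti, try_number):
           return f"log for {ti} try {try_number}", {"end_of_log": True}


   def default_read(ti, try_number):
       return "*** default fallback ***", {"end_of_log": False}


   log, meta = read_with_fallback(
       [UnavailableHandler(), NextConfiguredHandler()],
       default_read, "my_task", 1,
   )
   print(log)  # served by the second configured handler, not the default
   ```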
   
   ### How to reproduce
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

