msumit commented on issue #8780:
URL: https://github.com/apache/airflow/issues/8780#issuecomment-625748213


AFAIK the task logs are uploaded to remote storage only once the task has completed, whether you use the Kubernetes, Celery, or even the Local executor. One simple solution is to store these logs on a PersistentVolume with ReadWriteMany access, so the logs won't be lost even if the pod is killed in between.
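
   For illustration, a minimal sketch of such a claim; the storage class name is an assumption, and it has to be backed by a provisioner that supports ReadWriteMany (e.g. NFS or CephFS):

   ```yaml
   # Hypothetical PVC that multiple worker pods can mount simultaneously
   # for Airflow task logs.
   apiVersion: v1
   kind: PersistentVolumeClaim
   metadata:
     name: airflow-logs
   spec:
     accessModes:
       - ReadWriteMany            # several pods read/write the same volume
     storageClassName: nfs-client # assumption: any RWX-capable class works
     resources:
       requests:
         storage: 10Gi
   ```

   Each worker pod would then mount this claim at the Airflow log directory (the `base_log_folder` setting), so logs survive the pod itself.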
   I'm not sure whether Kubernetes has some way to catch the pod-kill request and execute some code before actually killing the pod. If it does, you could write some code to upload the logs manually before the pod goes away.
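
   For what it's worth, Kubernetes does provide a `preStop` lifecycle hook that runs before the container is sent SIGTERM on a graceful deletion. A minimal sketch of using it to push logs; the bucket name, log path, and the `aws` CLI being present in the image are all assumptions:

   ```yaml
   apiVersion: v1
   kind: Pod
   metadata:
     name: airflow-worker
   spec:
     # Must be long enough for the upload to finish, or the kubelet
     # kills the container regardless.
     terminationGracePeriodSeconds: 120
     containers:
       - name: worker
         image: apache/airflow  # assumption: image has the aws CLI available
         lifecycle:
           preStop:
             exec:
               command:
                 - /bin/sh
                 - -c
                 # Hypothetical bucket and path; runs before SIGTERM is sent.
                 - aws s3 sync /opt/airflow/logs s3://my-log-bucket/logs
   ```

   Note that `preStop` only fires on graceful deletions; a hard node failure or an OOM kill would still lose the logs, which is why the shared volume above is the safer baseline.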

