arkadiusz-bach commented on issue #21548:
URL: https://github.com/apache/airflow/issues/21548#issuecomment-1042417578


This is a bug: it tries to fetch the Celery worker logs because of the following line:
   
https://github.com/apache/airflow/blob/710d6997035a5e050367db013ae6847cc90dc51a/airflow/utils/log/file_task_handler.py#L143
   
It should also check whether the executor is `CeleryKubernetesExecutor` and the task's queue is the Kubernetes one (the queue value should be taken from the config). A rough sketch of that check is below.
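Something along these lines, assuming the `[celery_kubernetes_executor] kubernetes_queue` config option; the helper name and the task-instance argument are illustrative, not the actual `file_task_handler.py` code:

```python
from airflow.configuration import conf

def ran_on_kubernetes(ti) -> bool:
    """True when CeleryKubernetesExecutor routed this task instance to its Kubernetes queue."""
    executor = conf.get("core", "executor")
    # The queue name is configurable, so read it from the config instead of hard-coding "kubernetes".
    k8s_queue = conf.get("celery_kubernetes_executor", "kubernetes_queue", fallback="kubernetes")
    return executor == "CeleryKubernetesExecutor" and ti.queue == k8s_queue
```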
   
Once that is fixed, the log will still be empty while the task is running, because the handler tries to read the container's stdout logs while, by default, task logs are written to files.
   
You will need to either add a stdout handler for the `airflow.task` logger in the logging config, or do what I did: I replaced the `read_namespaced_pod_log` call with a Kubernetes exec command that runs `tail -n 100 <log_file_path>` in the task container to get the last 100 lines of the log. A sketch of that exec approach is below.
   
As for failed or succeeded tasks, it depends on whether you have `remote_logging` enabled. If you do, the log will be fetched from the remote location; otherwise it will be empty, because it looks like you are not sharing the same volumes between the webserver and task containers, and once the task container is gone the logs are gone as well.
   
You can simply mount the same logs volume on both the webserver and the task containers and it will work; if you are not using shared volumes (or volumes at all) for logs, that is why the logs disappear.
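For the task pods, one way to do that (a sketch only; the PVC name and mount path are placeholders, and the webserver deployment would need the same claim mounted at the same path) is a `pod_override` in the task's `executor_config`:

```python
from kubernetes.client import models as k8s

# Shared logs volume backed by a PVC that the webserver also mounts.
logs_volume = k8s.V1Volume(
    name="airflow-logs",
    persistent_volume_claim=k8s.V1PersistentVolumeClaimVolumeSource(claim_name="airflow-logs-pvc"),
)
logs_mount = k8s.V1VolumeMount(name="airflow-logs", mount_path="/opt/airflow/logs")

executor_config = {
    "pod_override": k8s.V1Pod(
        spec=k8s.V1PodSpec(
            containers=[k8s.V1Container(name="base", volume_mounts=[logs_mount])],
            volumes=[logs_volume],
        )
    )
}
```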
   

