JPonte commented on a change in pull request #16264:
URL: https://github.com/apache/airflow/pull/16264#discussion_r646094037
##########
File path: airflow/providers/amazon/aws/log/s3_task_handler.py
##########
@@ -93,7 +94,10 @@ def close(self):
# read log and remove old logs to get just the latest additions
with open(local_loc) as logfile:
log = logfile.read()
- self.s3_write(log, remote_loc)
+ success = self.s3_write(log, remote_loc)
+ keep_local = conf.getboolean('logging', 'KEEP_LOCAL_LOGS')
+ if success and not keep_local:
+ shutil.rmtree(os.path.dirname(local_loc))
Review comment:
Is that actually possible? This runs in the handler's close(). Is there any
scenario where another task handler shares the same local_loc path and is
still writing to it?
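
To make the concern concrete: `shutil.rmtree(os.path.dirname(local_loc))`
removes the whole directory, not just this handler's file, so any sibling
log that another handler placed in the same directory would be deleted too.
A minimal sketch (file names are hypothetical, just for illustration):

```python
import os
import shutil
import tempfile

# Two hypothetical handlers whose local_loc files share a parent directory.
base = tempfile.mkdtemp()
loc_a = os.path.join(base, "attempt_1.log")  # this handler's log
loc_b = os.path.join(base, "attempt_2.log")  # another handler's log
for path in (loc_a, loc_b):
    with open(path, "w") as f:
        f.write("log line\n")

# shutil.rmtree(os.path.dirname(loc_a)) would also delete loc_b.
# Removing only the file that was uploaded leaves the sibling intact:
os.remove(loc_a)
sibling_survived = os.path.exists(loc_b)  # True

shutil.rmtree(base)  # cleanup of the temp directory
```

Whether two handlers can actually share a directory here depends on how
local_loc is laid out per task instance, which is exactly the question above.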
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]