samuelwbock commented on a change in pull request #4675: [AIRFLOW-3853] delete local logs after s3 upload
URL: https://github.com/apache/airflow/pull/4675#discussion_r256591973
##########
File path: airflow/utils/log/s3_task_handler.py
##########
@@ -84,7 +85,10 @@ def close(self):
             # read log and remove old logs to get just the latest additions
             with open(local_loc, 'r') as logfile:
                 log = logfile.read()
-            self.s3_write(log, remote_loc)
+            successfully_written = self.s3_write(log, remote_loc)
+            # If the file was written out, remove it to avoid duplicating logs in S3
+            if successfully_written:
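For reference, a minimal standalone sketch of the pattern under discussion (the hunk above is cut off at the `if`): `s3_write` is assumed to return a boolean success flag, and the trailing `os.remove` is an assumption, since the diff does not show the body of the conditional.

    import os

    def upload_then_remove(handler, local_loc, remote_loc):
        """Hypothetical rendering of the close() change above; not verbatim
        from the PR. handler.s3_write is assumed to return True on success."""
        if not os.path.exists(local_loc):
            return
        # read log and remove old logs to get just the latest additions
        with open(local_loc, 'r') as logfile:
            log = logfile.read()
        successfully_written = handler.s3_write(log, remote_loc)
        # If the file was written out, remove it to avoid duplicating logs
        # in S3; the os.remove call is assumed, as the hunk is truncated
        if successfully_written:
            os.remove(local_loc)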
Review comment:
Ok, I'll make those changes.
Because this bug should also pop up for users backing up logs to
Azure/GCS, I'm going to go ahead and add the same behavior to those log
file handlers as well.
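The same upload-then-delete pattern could plausibly be factored out and shared across the three remote handlers. A sketch, assuming each handler's writer (s3_write, gcs_write, wasb_write) is changed to return True on success; the helper name and the boolean returns are assumptions, not part of this PR:

    import os

    def write_and_remove(write_fn, log, local_loc, remote_loc):
        # write_fn is any remote writer (e.g. s3_write, gcs_write,
        # wasb_write), each assumed to return True only when the
        # upload actually succeeded.
        if write_fn(log, remote_loc):
            # Delete the local copy only after a confirmed upload, so a
            # failed upload never loses the only copy of the log.
            os.remove(local_loc)

Gating the delete on the success flag means a failed upload degrades to the old behavior (local copy retained) rather than silently dropping logs.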