pgagnon commented on a change in pull request #4675: [AIRFLOW-3853] delete local logs after s3 upload
URL: https://github.com/apache/airflow/pull/4675#discussion_r255340340
##########
File path: airflow/utils/log/s3_task_handler.py
##########

@@ -84,7 +85,10 @@ def close(self):
         # read log and remove old logs to get just the latest additions
         with open(local_loc, 'r') as logfile:
             log = logfile.read()
-        self.s3_write(log, remote_loc)
+        successfully_written = self.s3_write(log, remote_loc)
+        # If the file was written out, remove it to avoid duplicating logs in S3
+        if successfully_written:

Review comment:
Since this PR changes behavior that some users may rely on, I would suggest adding a configuration option so that deletion on upload must be explicitly enabled.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

With regards,
Apache Git Services
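The reviewer's suggestion could look roughly like the sketch below: gate the local-file deletion behind an opt-in flag that defaults to off, so existing users keep their local logs. The config section and key name (`core` / `delete_local_logs_on_upload`) and the `close_and_upload` helper are assumptions for illustration, not Airflow's actual API; a `ConfigParser` stands in for Airflow's configuration machinery.

```python
import os
import tempfile
from configparser import ConfigParser

# Hypothetical config key; the section/option names are illustrative only.
AIRFLOW_CFG = """
[core]
delete_local_logs_on_upload = True
"""

conf = ConfigParser()
conf.read_string(AIRFLOW_CFG)


def close_and_upload(local_loc, remote_loc, s3_write):
    """Upload a local log file via `s3_write`, then delete the local copy
    only if the upload succeeded AND the opt-in flag is enabled."""
    with open(local_loc, 'r') as logfile:
        log = logfile.read()
    successfully_written = s3_write(log, remote_loc)
    # Default to False so the deletion behavior must be explicitly enabled,
    # preserving the pre-PR behavior for users who rely on local logs.
    if successfully_written and conf.getboolean(
            'core', 'delete_local_logs_on_upload', fallback=False):
        os.remove(local_loc)
    return successfully_written
```

With the flag unset (or set to `False`), the upload proceeds exactly as before and the local file is left in place, which is the backward-compatible default the reviewer is asking for.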