samuelwbock commented on a change in pull request #4675: [AIRFLOW-3853] delete local logs after s3 upload
URL: https://github.com/apache/airflow/pull/4675#discussion_r256555290
 
 

 ##########
 File path: airflow/utils/log/s3_task_handler.py
 ##########
 @@ -169,5 +173,7 @@ def s3_write(self, log, remote_log_location, append=True):
                 replace=True,
             encrypt=configuration.conf.getboolean('core', 'ENCRYPT_S3_LOGS'),
             )
+            return True
         except Exception:
             self.log.exception('Could not write logs to %s', remote_log_location)
+        return False
 
 Review comment:
   Hey XD, while the boolean does make testing easier, I added the True/False return primarily to serve as a gate for local-log deletion. Right now, if the upload to S3 fails inside `s3_write`, the failure is logged but no exception is raised, so the caller has no way to tell whether the upload succeeded. To prevent local logs from being deleted when they haven't actually been uploaded, I added the boolean to make the upload's status explicit.
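To illustrate the pattern being discussed, here is a minimal, hypothetical sketch (not Airflow's actual `S3TaskHandler` code): the write helper returns `True`/`False` instead of raising, and the caller removes the local copy only when the upload reported success. The names `s3_write`, `close_and_cleanup`, and `upload_fn` are assumptions for illustration.

```python
import logging
import os
import tempfile

log = logging.getLogger(__name__)

def s3_write(upload_fn, log_text, remote_log_location):
    """Attempt the upload; return True on success, False on failure.

    Mirrors the diff above: the exception is logged, not re-raised,
    and the boolean makes the outcome visible to the caller.
    """
    try:
        upload_fn(log_text, remote_log_location)
        return True
    except Exception:
        log.exception('Could not write logs to %s', remote_log_location)
    return False

def close_and_cleanup(local_loc, remote_loc, upload_fn):
    """Upload the local log, deleting it only if the upload succeeded."""
    with open(local_loc) as f:
        log_text = f.read()
    if s3_write(upload_fn, log_text, remote_loc):
        os.remove(local_loc)  # safe: the log now lives in S3

# A failing upload leaves the local file in place:
with tempfile.NamedTemporaryFile('w', suffix='.log', delete=False) as f:
    f.write('task output')
    path = f.name

def failing_upload(text, loc):
    raise IOError('S3 unavailable')

close_and_cleanup(path, 's3://bucket/key.log', failing_upload)
print(os.path.exists(path))  # True: file kept because the upload failed
```

Without the boolean, `close_and_cleanup` would have to delete unconditionally (risking data loss) or parse log output to detect failure; the explicit return value is the simplest way to express "only delete what was uploaded."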

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
