shcherbin commented on a change in pull request #7375: [AIRFLOW-5231] Fix S3Hook.delete_objects method
URL: https://github.com/apache/airflow/pull/7375#discussion_r375785220
 
 

 ##########
 File path: airflow/providers/amazon/aws/hooks/s3.py
 ##########
 @@ -606,11 +606,22 @@ def delete_objects(self, bucket, keys):
             When ``keys`` is a list, it's supposed to be the list of the
             keys to delete.
         :type keys: str or list
+
+        :return: False on failure to delete at least one batch, True on success.
+        :rtype: bool
         """
         if isinstance(keys, str):
             keys = [keys]
 
-        delete_dict = {"Objects": [{"Key": k} for k in keys]}
-        response = self.get_conn().delete_objects(Bucket=bucket, Delete=delete_dict)
-
-        return response
+        s3 = self.get_conn()
+        batch = 1000
+        for i in range(0, len(keys), batch):
+            try:
+                s3.delete_objects(
+                    Bucket=bucket,
+                    Delete={"Objects": [{"Key": k} for k in keys[i:i + batch]]}
+                )
+            except ClientError as e:
+                self.log.error(e.response["Error"]["Message"])
+                return False
+        return True
 
 Review comment:
   I've changed the logic a bit due to failing tests in S3DeleteObjectsOperator.
  ClientError is raised in case of any technical error (e.g. the specified bucket does not exist or the request is throttled).
   I've also updated the tests.
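  The batching in the diff slices `keys` into groups of at most 1000, since S3's `delete_objects` API accepts at most 1000 keys per request. The slicing can be sketched in isolation (`chunked` is a hypothetical helper for illustration, not part of the hook):

```python
def chunked(seq, size=1000):
    """Yield successive slices of `seq` holding at most `size` items.

    Mirrors the `keys[i:i + batch]` slicing in the hook; 1000 is the
    per-request key limit of the S3 delete_objects API.
    """
    for i in range(0, len(seq), size):
        yield seq[i:i + size]


keys = ["key-%d" % n for n in range(2500)]
batches = list(chunked(keys))
# 2500 keys are split into batches of 1000, 1000 and 500
```

  Because `range` steps past the end gracefully, the final partial batch needs no special casing.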

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
