[
https://issues.apache.org/jira/browse/OAK-7807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16648336#comment-16648336
]
Matt Ryan commented on OAK-7807:
--------------------------------
I just uploaded [^OAK-7807.patch.3] which is slightly changed from before and
also includes similar logic in {{TestS3DataStore}}.
I ran a test today using the code both before and after the changes. Before
the changes, I ran the full S3 data store unit test suite ten times in
succession against an S3 account that was empty beforehand. After the ten
runs, the unit tests had left 35 undeleted buckets in the account. That number
is interesting because it shows the count of undeleted buckets per run is not
constant, which supports the theory that some varying condition, such as clock
drift combined with the time required to run the tests, was causing the
problem.
After making the change and deleting all the buckets in the account, I ran
the full S3 test suite ten more times in succession and no undeleted buckets
remained, so this is at least an improvement.
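To illustrate the clock-drift theory above, here is a minimal, hypothetical sketch (not the actual patch) of why an age-based bucket sweep can miss buckets, and how a skew tolerance helps. It assumes test buckets are named with an embedded creation timestamp (e.g. {{s3ds-test-<epochMillis>}}); the class, method, and bucket-name scheme are all illustrative, not taken from the Oak code.

```java
import java.util.List;
import java.util.stream.Collectors;

public class BucketCleanupSketch {
    // Hypothetical naming convention for buckets created by unit tests.
    static final String PREFIX = "s3ds-test-";

    /**
     * True if the bucket looks like a test bucket old enough to delete.
     * Subtracting a skew tolerance from the age cutoff means buckets whose
     * embedded timestamp sits just inside the window (because the local
     * clock drifted relative to the clock that named them) still get swept.
     */
    static boolean isStaleTestBucket(String name, long nowMillis,
                                     long maxAgeMillis, long skewToleranceMillis) {
        if (!name.startsWith(PREFIX)) {
            return false;
        }
        long created;
        try {
            created = Long.parseLong(name.substring(PREFIX.length()));
        } catch (NumberFormatException e) {
            return false; // not a timestamped test bucket
        }
        return created <= nowMillis - (maxAgeMillis - skewToleranceMillis);
    }

    public static void main(String[] args) {
        long now = 1_000_000L;
        List<String> buckets = List.of(
                "s3ds-test-1000",    // old test bucket: swept
                "s3ds-test-999000",  // recent test bucket: kept
                "production-data");  // not a test bucket: kept
        List<String> toDelete = buckets.stream()
                .filter(b -> isStaleTestBucket(b, now, 60_000L, 5_000L))
                .collect(Collectors.toList());
        System.out.println(toDelete); // prints [s3ds-test-1000]
    }
}
```

With the tolerance set to zero, a bucket created while the local clock ran slightly behind could sit just outside the cutoff on every sweep, which would produce exactly the kind of varying leftover count seen in the runs above.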
> [S3DataStore] S3DataStore unit tests not deleting buckets created during test
> -----------------------------------------------------------------------------
>
> Key: OAK-7807
> URL: https://issues.apache.org/jira/browse/OAK-7807
> Project: Jackrabbit Oak
> Issue Type: Bug
> Components: blob-cloud
> Affects Versions: 1.9.8
> Reporter: Matt Ryan
> Assignee: Matt Ryan
> Priority: Major
> Attachments: OAK-7807.patch, OAK-7807.patch.2, OAK-7807.patch.3
>
>
> It appears that when the S3DataStore tests run they are not properly cleaning
> up S3 buckets that were created during the execution of the test.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)