abannon opened a new pull request, #19009:
URL: https://github.com/apache/druid/pull/19009
### Description
For segments stored in S3, the segment killer finds files to remove under the
segment prefix by calling listObjectsV2, which returns at most 1,000 objects per
call. When a segment prefix contained more than 1,000 files, the segment killer
did not find and delete all of them. The fix adds pagination so that listing
continues until every object under the prefix has been seen.
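The pagination loop described above can be sketched as follows. This is a minimal, self-contained illustration, not the PR's actual code: `FakeS3Lister` and `FakeListResult` are hypothetical stand-ins that mimic the continuation-token contract of the AWS SDK's listObjectsV2 (up to a fixed page size per call, with a token returned while results are truncated).

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for one page of listObjectsV2 results.
class FakeListResult {
  final List<String> keys;
  final String nextContinuationToken; // null when this is the last page
  FakeListResult(List<String> keys, String nextContinuationToken) {
    this.keys = keys;
    this.nextContinuationToken = nextContinuationToken;
  }
  boolean isTruncated() { return nextContinuationToken != null; }
}

// Hypothetical stand-in for an S3 client that pages through a fixed key list.
class FakeS3Lister {
  private final List<String> allKeys;
  private final int pageSize;
  FakeS3Lister(List<String> allKeys, int pageSize) {
    this.allKeys = allKeys;
    this.pageSize = pageSize;
  }
  FakeListResult listObjectsV2(String continuationToken) {
    int start = continuationToken == null ? 0 : Integer.parseInt(continuationToken);
    int end = Math.min(start + pageSize, allKeys.size());
    String next = end < allKeys.size() ? Integer.toString(end) : null;
    return new FakeListResult(new ArrayList<>(allKeys.subList(start, end)), next);
  }
}

public class PaginationSketch {
  // Follows continuation tokens until the listing is no longer truncated,
  // collecting every key instead of only the first page.
  static List<String> listAllKeys(FakeS3Lister lister) {
    List<String> keys = new ArrayList<>();
    String token = null;
    FakeListResult result;
    do {
      result = lister.listObjectsV2(token);
      keys.addAll(result.keys);
      token = result.nextContinuationToken;
    } while (result.isTruncated());
    return keys;
  }

  public static void main(String[] args) {
    List<String> keys = new ArrayList<>();
    for (int i = 0; i < 2500; i++) {
      keys.add("segment/file-" + i);
    }
    // With a 1,000-item page size, three calls are needed to see all 2,500 keys.
    List<String> found = listAllKeys(new FakeS3Lister(keys, 1000));
    System.out.println(found.size());
  }
}
```

Without the do/while over the continuation token, only the first 1,000 keys would be returned, which is the bug this PR fixes.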
#### Release note
Fixed: the kill task for S3 now removes all segment files, instead of at most
1,000 files.
<hr>
##### Key changed/added classes in this PR
* Changed S3DataSegmentKiller
<hr>
This PR has:
- [x] been self-reviewed.
- [x] a release note entry in the PR description.
- [ ] added Javadocs for most classes and all non-trivial methods. Linked
related entities via Javadoc links.
- [ ] added comments explaining the "why" and the intent of the code
wherever would not be obvious for an unfamiliar reader.
- [x] added unit tests or modified existing tests to cover new code paths,
ensuring the threshold for [code
coverage](https://github.com/apache/druid/blob/master/dev/code-review/code-coverage.md)
is met.
- [ ] added integration tests.
- [ ] been tested in a test Druid cluster.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]