steveloughran opened a new pull request, #5993:
URL: https://github.com/apache/hadoop/pull/5993

   
   Initial pass at an API for bulk deletes,
   targeting S3 and any store with paged delete support.
   
   Minimal design: a RemoteIterator supplies the list of paths to delete. After
   each page is deleted, a progress report gives a running count of files
   deleted and offers the application code a way to abort an ongoing
   delete, such as after a failure.
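   
   For illustration only, a minimal sketch of what such an API could look
   like. `BulkDelete`, `PageProgress`, and the method names here are
   hypothetical placeholders, not the interface in this patch:
   
   ```java
   import java.io.IOException;
   
   import org.apache.hadoop.fs.Path;
   import org.apache.hadoop.fs.RemoteIterator;
   
   /**
    * Hypothetical sketch of a paged bulk-delete operation. A RemoteIterator
    * supplies the paths; deletes are issued in pages sized to the store's
    * limit (e.g. the 1000-object cap of an S3 DeleteObjects request), and a
    * callback after each page reports progress and may abort the rest of
    * the operation.
    */
   public interface BulkDelete {
   
     /**
      * Delete every path supplied by the iterator.
      * @param paths paths to delete
      * @param progress invoked after each page of deletes completes
      * @return total number of files deleted
      */
     long bulkDelete(RemoteIterator<Path> paths, PageProgress progress)
         throws IOException;
   
     /** Callback invoked after each page of deletes completes. */
     @FunctionalInterface
     interface PageProgress {
   
       /**
        * @param deletedSoFar running count of files deleted
        * @return true to continue with the next page; false to abort
        */
       boolean onPageDeleted(long deletedSoFar) throws IOException;
     }
   }
   ```
   
   Returning false from the callback gives the application a clean abort
   point between pages, without interrupting an in-flight page request.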
   
   ### How was this patch tested?
   
   No tests yet; working on API first.
   
   ### For code changes:
   
   - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   

