james-rms opened a new issue, #563:
URL: https://github.com/apache/arrow-rs-object-store/issues/563

   **Is your feature request related to a problem or challenge? Please describe 
what you are trying to do.**
   I'm trying to copy objects within a bucket as part of my data ingestion 
workflow. However, when these objects exceed 5GB in size, the copy fails with an 
error: the S3 `CopyObject` API has a built-in 5GB limit on the size of a single 
copy operation.
   
   **Describe the solution you'd like**
   
   I've outlined the solution I'd prefer in #561: essentially, the copy method 
should detect large copies and transparently switch to a multipart workflow.
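   
   For reference, a rough sketch of the flow I have in mind, written against the 
raw aws-sdk-s3 client rather than object_store internals. The part size, the 
threshold handling, and the error handling are illustrative assumptions, not a 
proposed implementation:
   
   ```rust
   use aws_sdk_s3::types::{CompletedMultipartUpload, CompletedPart};
   use aws_sdk_s3::Client;
   
   const COPY_LIMIT: i64 = 5 * 1024 * 1024 * 1024; // S3 single CopyObject limit
   const PART_SIZE: i64 = 1024 * 1024 * 1024; // assumed 1 GiB parts for the sketch
   
   async fn copy_object(
       client: &Client,
       bucket: &str,
       src: &str,
       dst: &str,
   ) -> Result<(), aws_sdk_s3::Error> {
       // Look up the source size to decide which path to take.
       // (content_length is Option<i64> in recent SDK versions.)
       let head = client.head_object().bucket(bucket).key(src).send().await?;
       let size = head.content_length().unwrap_or(0);
   
       if size <= COPY_LIMIT {
           // Small object: a single CopyObject call suffices.
           client
               .copy_object()
               .bucket(bucket)
               .key(dst)
               .copy_source(format!("{bucket}/{src}"))
               .send()
               .await?;
           return Ok(());
       }
   
       // Large object: transparently fall back to multipart copy via UploadPartCopy.
       let mpu = client
           .create_multipart_upload()
           .bucket(bucket)
           .key(dst)
           .send()
           .await?;
       let upload_id = mpu.upload_id().expect("upload id").to_string();
   
       let mut parts = Vec::new();
       let mut part_number = 1;
       let mut start = 0;
       while start < size {
           let end = (start + PART_SIZE - 1).min(size - 1);
           let copied = client
               .upload_part_copy()
               .bucket(bucket)
               .key(dst)
               .upload_id(&upload_id)
               .part_number(part_number)
               .copy_source(format!("{bucket}/{src}"))
               // Each part copies a byte range of the source, entirely server-side.
               .copy_source_range(format!("bytes={start}-{end}"))
               .send()
               .await?;
           parts.push(
               CompletedPart::builder()
                   .part_number(part_number)
                   .set_e_tag(
                       copied
                           .copy_part_result()
                           .and_then(|r| r.e_tag())
                           .map(String::from),
                   )
                   .build(),
           );
           part_number += 1;
           start = end + 1;
       }
   
       client
           .complete_multipart_upload()
           .bucket(bucket)
           .key(dst)
           .upload_id(upload_id)
           .multipart_upload(
               CompletedMultipartUpload::builder().set_parts(Some(parts)).build(),
           )
           .send()
           .await?;
       Ok(())
   }
   ```
   
   A production version would also abort the multipart upload on failure so 
orphaned parts don't accumulate; that's omitted here for brevity.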
   
   **Describe alternatives you've considered**
   
   Alternatively, we could expose this capability as part of the 
multipart-upload API. If that were the chosen solution, I'd prefer `copy` to 
return an error immediately when the object being copied is larger than 5GB.
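   
   If exposed through the multipart API instead, the shape might look something 
like this (entirely hypothetical; `MultipartCopy` and `put_part_copy` are 
invented for illustration and are not part of any existing object_store trait):
   
   ```rust
   use std::ops::Range;
   
   use object_store::path::Path;
   use object_store::Result;
   
   /// Hypothetical extension to the multipart-upload API; the name, method, and
   /// signature here are illustrative only.
   #[async_trait::async_trait]
   pub trait MultipartCopy {
       /// Copy `range` of the object at `src` into this upload as the next part,
       /// entirely server-side (backed by S3 UploadPartCopy).
       async fn put_part_copy(&mut self, src: &Path, range: Range<u64>) -> Result<()>;
   }
   ```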
   
   

