mehakmeet commented on pull request #2706:
URL: https://github.com/apache/hadoop/pull/2706#issuecomment-841783066


   Had merge conflicts, so I had to force push. 
   Tests: 
   ```
   [ERROR] Tests run: 1430, Failures: 1, Errors: 34, Skipped: 538
   ```
   Scale:
   ```
   [ERROR] Tests run: 151, Failures: 3, Errors: 21, Skipped: 29
   ```
   
   Most errors are multipart-upload related: 
   ```
   com.amazonaws.SdkClientException: Invalid part size: part sizes for 
encrypted multipart uploads must be multiples of the cipher block size (16) 
with the exception of the last part.
   ```
   Simply adding 16 (the padding length) to the multipart upload block size won't work: 
every part size has to be a multiple of 16, so CSE imposes that restriction on the part 
size itself. Another thing to note is that the error treats the last part as an 
exception, which makes me believe a multipart upload in CSE has to be sequential 
(or can we upload the earlier parts in parallel and then upload the last part?). So, 
on top of the HEAD calls required while downloading/listing, this could be another 
constraint on uploads with performance impact. 
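   For illustration, a minimal sketch of what aligning a configured part size to the 
cipher block size could look like. This is not the S3A implementation; the class, 
constant, and method names are hypothetical, and it only encodes the constraint 
reported by the `SdkClientException` above (every part except the last must be a 
multiple of 16 bytes):
   ```java
   public final class CsePartSizeAligner {
     // AES cipher block size in bytes; per the SdkClientException above, every
     // CSE-encrypted part except the last must be a multiple of this.
     private static final int CIPHER_BLOCK_SIZE = 16;

     /**
      * Rounds a requested part size down to the nearest multiple of the
      * cipher block size, so every non-final part satisfies the CSE constraint.
      */
     static long alignPartSize(long requestedPartSize) {
       if (requestedPartSize < CIPHER_BLOCK_SIZE) {
         throw new IllegalArgumentException(
             "part size must be at least " + CIPHER_BLOCK_SIZE + " bytes");
       }
       return requestedPartSize - (requestedPartSize % CIPHER_BLOCK_SIZE);
     }

     public static void main(String[] args) {
       // e.g. a request of 5 MB + 3 bytes is trimmed to an exact multiple of 16
       System.out.println(alignPartSize(5 * 1024 * 1024 + 3)); // prints 5242880
     }
   }
   ```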
   @steveloughran 

