dannycjones commented on code in PR #5763:
URL: https://github.com/apache/hadoop/pull/5763#discussion_r1238487030


##########
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/auth/ITestAssumeRole.java:
##########
@@ -448,7 +448,7 @@ public void testReadOnlyOperations() throws Throwable {
         policy(
             statement(false, S3_ALL_BUCKETS, S3_PATH_WRITE_OPERATIONS),
             STATEMENT_ALL_S3,
-            STATEMENT_ALLOW_SSE_KMS_READ));
+            STATEMENT_ALLOW_SSE_KMS_RW));

Review Comment:
   Why does this need changing? It's for Server Side Encryption. (Was it always broken?)



##########
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/ITestS3APrefetchingInputStream.java:
##########
@@ -118,6 +118,7 @@ private static int calculateNumBlocks(long largeFileSize, int blockSize) {
   @Test
   public void testReadLargeFileFully() throws Throwable {
     describe("read a large file fully, uses S3ACachingInputStream");
+    skipIfClientSideEncryption();

Review Comment:
   Shall we just move these into `openFS()` since we're assuming the FS it provides is not compatible with CSE for now?
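   For illustration only, the suggestion could look roughly like the sketch below: the CSE skip is performed once in the shared FS-opening helper rather than at the top of every `@Test` method. All names here (`openFS`, the `cseEnabled` flag, the exception used to signal a skip) are stand-ins, not the actual Hadoop test API; the real code would call `skipIfClientSideEncryption()` inside the helper that constructs the prefetching filesystem.

   ```java
   // Hypothetical sketch of centralizing the client-side-encryption skip.
   // In the real test suite the skip would throw a JUnit assumption
   // failure; here a plain exception stands in so the example is
   // self-contained.
   public class PrefetchSkipSketch {

     // Stand-in for the real S3A configuration check.
     static boolean cseEnabled = false;

     // Shared helper: every test that opens the FS through here inherits
     // the skip, instead of repeating it per test method.
     static String openFS() {
       if (cseEnabled) {
         throw new IllegalStateException(
             "skipped: prefetching stream assumed incompatible with CSE");
       }
       return "prefetching-fs"; // placeholder for the real filesystem
     }

     public static void main(String[] args) {
       System.out.println(openFS());
     }
   }
   ```

   The trade-off is the usual one for setup-level skips: less repetition and no risk of a newly added test forgetting the guard, at the cost of the skip applying to every test routed through the helper.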



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

