[ https://issues.apache.org/jira/browse/HADOOP-19197?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17867837#comment-17867837 ]

ASF GitHub Bot commented on HADOOP-19197:
-----------------------------------------

steveloughran commented on code in PR #6874:
URL: https://github.com/apache/hadoop/pull/6874#discussion_r1686963532


##########
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/ITestS3AEncryptionSSEKMSWithEncryptionContext.java:
##########
@@ -0,0 +1,100 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * <p>
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * <p>
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hadoop.fs.s3a;
+
+import java.io.IOException;
+import java.io.UncheckedIOException;
+import java.util.Set;
+
+import org.apache.hadoop.thirdparty.com.google.common.collect.ImmutableSet;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.s3a.impl.S3AEncryption;
+
+import static org.apache.hadoop.fs.contract.ContractTestUtils.skip;
+import static org.apache.hadoop.fs.s3a.Constants.S3_ENCRYPTION_CONTEXT;
+import static org.apache.hadoop.fs.s3a.Constants.S3_ENCRYPTION_KEY;
+import static org.apache.hadoop.fs.s3a.S3AEncryptionMethods.DSSE_KMS;
+import static org.apache.hadoop.fs.s3a.S3AEncryptionMethods.SSE_KMS;
+import static org.apache.hadoop.fs.s3a.S3ATestUtils.assume;
+import static org.apache.hadoop.fs.s3a.S3ATestUtils.getTestBucketName;
+
+/**
+ * Concrete class that extends {@link AbstractTestS3AEncryption}
+ * and tests KMS encryption with encryption context.
+ * S3's HeadObject doesn't return the object's encryption context.
+ * Therefore, we don't have a way to assert its value in code.
+ * In order to properly test if the encryption context is being set,
+ * the KMS key or the IAM user needs to have a deny statement like the one below in the policy:
+ * <pre>
+ * {
+ *     "Effect": "Deny",
+ *     "Principal": {
+ *         "AWS": "*"
+ *     },
+ *     "Action": "kms:Decrypt",
+ *     "Resource": "*",
+ *     "Condition": {
+ *         "StringNotEquals": {
+ *             "kms:EncryptionContext:project": "hadoop"
+ *         }
+ *     }
+ * }
+ * </pre>
+ * With the statement above, S3A will fail to read the object from S3 if it was encrypted
+ * without the key-value pair <code>"project": "hadoop"</code> in the encryption context.
+ */
+public class ITestS3AEncryptionSSEKMSWithEncryptionContext
+    extends AbstractTestS3AEncryption {
+
+  private static final Set<S3AEncryptionMethods> KMS_ENCRYPTION_ALGORITHMS = ImmutableSet.of(
+      SSE_KMS, DSSE_KMS);
+
+  private S3AEncryptionMethods encryptionAlgorithm;
+
+  @Override
+  protected Configuration createConfiguration() {
+    try {
+      // get the KMS key and context for this test.
+      Configuration c = new Configuration();
+      final String bucketName = getTestBucketName(c);
+      String kmsKey = S3AUtils.getS3EncryptionKey(bucketName, c);
+      String encryptionContext = S3AEncryption.getS3EncryptionContext(bucketName, c);
+      encryptionAlgorithm = S3AUtils.getEncryptionAlgorithm(bucketName, c);
+      assume("Expected a KMS encryption algorithm",
+          KMS_ENCRYPTION_ALGORITHMS.contains(encryptionAlgorithm));
+      if (StringUtils.isBlank(encryptionContext)) {
+        skip(S3_ENCRYPTION_CONTEXT + " is not set.");
+      }
+      Configuration conf = super.createConfiguration();
+      conf.set(S3_ENCRYPTION_KEY, kmsKey);

Review Comment:
   use `removeBaseAndBucketOverrides(conf, String...)` and list all the options you 
intend to set; this stops base and per-bucket overrides getting in the way. It doesn't 
work for jceks overrides, which always take priority.
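
A minimal sketch of what that could look like in `createConfiguration()`, assuming the existing `S3ATestUtils.removeBaseAndBucketOverrides(Configuration, String...)` helper and the `S3_ENCRYPTION_ALGORITHM` constant from `Constants`; the exact option list is illustrative, not part of the patch:

```java
// Illustrative sketch, not the patch itself: strip base and per-bucket
// overrides for every option this test sets, so test-time values are not
// silently overridden. As noted above, jceks-backed secrets still take priority.
import static org.apache.hadoop.fs.s3a.Constants.S3_ENCRYPTION_ALGORITHM;
import static org.apache.hadoop.fs.s3a.Constants.S3_ENCRYPTION_CONTEXT;
import static org.apache.hadoop.fs.s3a.Constants.S3_ENCRYPTION_KEY;
import static org.apache.hadoop.fs.s3a.S3ATestUtils.getTestBucketName;
import static org.apache.hadoop.fs.s3a.S3ATestUtils.removeBaseAndBucketOverrides;

@Override
protected Configuration createConfiguration() {
  // resolve the KMS key for the test bucket, as in the patch under review
  Configuration c = new Configuration();
  String kmsKey = S3AUtils.getS3EncryptionKey(getTestBucketName(c), c);

  Configuration conf = super.createConfiguration();
  // list every option set below so fs.s3a.* and fs.s3a.bucket.* overrides
  // cannot get in the way of the values this test depends on
  removeBaseAndBucketOverrides(conf,
      S3_ENCRYPTION_ALGORITHM,
      S3_ENCRYPTION_KEY,
      S3_ENCRYPTION_CONTEXT);
  conf.set(S3_ENCRYPTION_KEY, kmsKey);
  return conf;
}
```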





> S3A: Support AWS KMS Encryption Context
> ---------------------------------------
>
>                 Key: HADOOP-19197
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19197
>             Project: Hadoop Common
>          Issue Type: New Feature
>          Components: fs/s3
>    Affects Versions: 3.4.0
>            Reporter: Raphael Azzolini
>            Priority: Major
>              Labels: pull-request-available
>
> S3A properties allow users to choose the AWS KMS key 
> ({_}fs.s3a.encryption.key{_}) and the S3 encryption algorithm to be used 
> ({_}fs.s3a.encryption.algorithm{_}). In addition to the AWS KMS key, an 
> encryption context can be used as non-secret data that adds an additional 
> integrity and authenticity check on the encrypted data. However, there is no 
> option to specify the [AWS KMS Encryption 
> Context|https://docs.aws.amazon.com/kms/latest/developerguide/concepts.html#encrypt_context]
>  in S3A.
> In AWS SDK v2 the encryption context in S3 requests is set by the parameter 
> [ssekmsEncryptionContext.|https://sdk.amazonaws.com/java/api/latest/software/amazon/awssdk/services/s3/model/CreateMultipartUploadRequest.Builder.html#ssekmsEncryptionContext(java.lang.String)]
>  It receives a base64-encoded UTF-8 string holding JSON with the encryption 
> context key-value pairs. The value of this parameter could be set by the user 
> in a new property {_}*fs.s3a.encryption.context*{_}, and be stored in the 
> [EncryptionSecrets|https://github.com/apache/hadoop/blob/trunk/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/auth/delegation/EncryptionSecrets.java]
>  to later be used when setting the encryption parameters in 
> [RequestFactoryImpl|https://github.com/apache/hadoop/blob/f92a8ab8ae54f11946412904973eb60404dee7ff/hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/impl/RequestFactoryImpl.java].
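
A minimal sketch of the encoding the issue describes, assuming AWS SDK v2's `CreateMultipartUploadRequest.Builder#ssekmsEncryptionContext(String)` linked above; the bucket name, object key, and key ARN are placeholders:

```java
// Minimal sketch (not the patch): the encryption context is a JSON document of
// key-value pairs, UTF-8 encoded and then base64 encoded before being passed
// to the SDK v2 request builder via ssekmsEncryptionContext().
import java.nio.charset.StandardCharsets;
import java.util.Base64;

import software.amazon.awssdk.services.s3.model.CreateMultipartUploadRequest;
import software.amazon.awssdk.services.s3.model.ServerSideEncryption;

public class EncryptionContextEncodingExample {
  public static void main(String[] args) {
    // matches the "project": "hadoop" pair used in the test's KMS key policy
    String contextJson = "{\"project\": \"hadoop\"}";
    String encodedContext = Base64.getEncoder()
        .encodeToString(contextJson.getBytes(StandardCharsets.UTF_8));

    CreateMultipartUploadRequest request = CreateMultipartUploadRequest.builder()
        .bucket("example-bucket")                        // placeholder bucket
        .key("example/object")                           // placeholder key
        .serverSideEncryption(ServerSideEncryption.AWS_KMS)
        .ssekmsKeyId("arn:aws:kms:...:key/...")          // placeholder key ARN
        .ssekmsEncryptionContext(encodedContext)         // base64-encoded JSON
        .build();

    System.out.println(request.ssekmsEncryptionContext());
  }
}
```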



