[https://issues.apache.org/jira/browse/HADOOP-15267?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16390051#comment-16390051]

Hudson commented on HADOOP-15267:
---------------------------------

SUCCESS: Integrated in Jenkins build Hadoop-trunk-Commit #13787 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/13787/])
HADOOP-15267. S3A multipart upload fails when SSE-C encryption is enabled 
(stevel: rev e0307e53e2110cb6b418861a7471e97a013c16e2)
* (edit) 
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/MockS3AFileSystem.java
* (edit) 
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3AFileSystem.java
* (add) 
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/scale/ITestS3AHugeFilesSSECDiskBlocks.java


> S3A multipart upload fails when SSE-C encryption is enabled
> -----------------------------------------------------------
>
>                 Key: HADOOP-15267
>                 URL: https://issues.apache.org/jira/browse/HADOOP-15267
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.1.0
>         Environment: Hadoop 3.1 Snapshot
>            Reporter: Anis Elleuch
>            Assignee: Anis Elleuch
>            Priority: Critical
>             Fix For: 3.1.0
>
>         Attachments: HADOOP-15267-001.patch, HADOOP-15267-002.patch, 
> HADOOP-15267-003.patch
>
>
> When I enable SSE-C encryption in Hadoop 3.1 and set fs.s3a.multipart.size 
> to 5 MB, storing data in AWS no longer works. For example, running the 
> following code:
> {code}
> >>> df1 = spark.read.json('/home/user/people.json')
> >>> df1.write.mode("overwrite").json("s3a://testbucket/people.json")
> {code}
> shows the following exception:
> {code:java}
> com.amazonaws.services.s3.model.AmazonS3Exception: The multipart upload 
> initiate requested encryption. Subsequent part requests must include the 
> appropriate encryption parameters.
> {code}
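> For reference, the settings that reproduce this look roughly like the 
> following sketch. The property names are the documented S3A ones; the key 
> value is a placeholder, not a real key:
> {code:java}
> import org.apache.hadoop.conf.Configuration;
>
> Configuration conf = new Configuration();
> // Enable SSE-C with a customer-provided, base64-encoded 256-bit key.
> conf.set("fs.s3a.server-side-encryption-algorithm", "SSE-C");
> conf.set("fs.s3a.server-side-encryption.key", "<base64-encoded-key>");
> // Small multipart size so the multipart upload path is hit quickly.
> conf.set("fs.s3a.multipart.size", "5M");
> {code}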
> After some investigation, I discovered that hadoop-aws doesn't send the 
> SSE-C headers in the Upload Part requests, as required by the AWS 
> specification: 
> [https://docs.aws.amazon.com/AmazonS3/latest/API/mpUploadUploadPart.html]
> {quote}
> If you requested server-side encryption using a customer-provided encryption 
> key in your initiate multipart upload request, you must provide identical 
> encryption information in each part upload using the following headers.
> {quote}
>  
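> To illustrate, here is a minimal sketch against the AWS SDK v1 API that 
> hadoop-aws builds on. This is not the committed fix; bucket, key, blockFile 
> and the key material are illustrative. The point is that the same 
> SSECustomerKey must accompany the initiate request and every part request:
> {code:java}
> import java.io.File;
> import com.amazonaws.services.s3.AmazonS3;
> import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
> import com.amazonaws.services.s3.model.SSECustomerKey;
> import com.amazonaws.services.s3.model.UploadPartRequest;
>
> void uploadWithSseC(AmazonS3 s3, String bucket, String key,
>     File blockFile, String base64Key) {
>   SSECustomerKey sseKey = new SSECustomerKey(base64Key);
>
>   // The initiate request carries the SSE-C key...
>   InitiateMultipartUploadRequest init =
>       new InitiateMultipartUploadRequest(bucket, key)
>           .withSSECustomerKey(sseKey);
>   String uploadId = s3.initiateMultipartUpload(init).getUploadId();
>
>   // ...and every part request must repeat the identical key; omitting
>   // it here is what triggers the AmazonS3Exception shown above.
>   UploadPartRequest part = new UploadPartRequest()
>       .withBucketName(bucket)
>       .withKey(key)
>       .withUploadId(uploadId)
>       .withPartNumber(1)
>       .withFile(blockFile)
>       .withSSECustomerKey(sseKey);
>   s3.uploadPart(part);
> }
> {code}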
> A patch addressing the problem is attached to this issue.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
