[ https://issues.apache.org/jira/browse/HADOOP-19793?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18056038#comment-18056038 ]

ASF GitHub Bot commented on HADOOP-19793:
-----------------------------------------

ajfabbri opened a new pull request, #8225:
URL: https://github.com/apache/hadoop/pull/8225

   
   ### Description of PR
   
   Work in progress.
   
   ### How was this patch tested?
   
   ```
   mvn -Dtest=none -Dit.test="ITestS3AHugeFilesNoMultipart" -Dscale -Dfs.s3a.scale.test.huge.filesize=3G verify
   ```
   
   
   ### For code changes:
   
   - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
   
   ### AI Tooling
   
   If an AI tool was used:
   
   - [ ] The PR includes the phrase "Contains content generated by <tool>"
         where <tool> is the name of the AI tool used.
   - [ ] My use of AI contributions follows the ASF legal policy
         https://www.apache.org/legal/generative-tooling.html
   




> S3A: Regression: maximum size of a single upload is now only 2GB
> ----------------------------------------------------------------
>
>                 Key: HADOOP-19793
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19793
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs/s3
>    Affects Versions: 3.5.0, 3.4.1, 3.4.2
>            Reporter: Steve Loughran
>            Assignee: Aaron Fabbri
>            Priority: Minor
>
> In HADOOP-19221 the maximum size of a single block was changed to an int, even
> when the source is a file more than 2 GB long. As a result, such files can no
> longer be uploaded as a single block. This is relevant when working with stores
> like GCS which don't support multipart uploads.
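
As a rough illustration of the regression (not code from the patch; the class and variable names below are hypothetical), a Java `int` can only count up to 2,147,483,647 bytes, so narrowing a source file's `long` length to an `int` block size silently overflows for anything larger than roughly 2 GB:

```java
// Minimal sketch of why an int block size caps single-block uploads at ~2 GB.
// Names are illustrative only; this is not taken from the S3A code.
public class IntBlockSizeDemo {
  public static void main(String[] args) {
    long fileLength = 3L * 1024 * 1024 * 1024;   // a 3 GB source file

    // Integer.MAX_VALUE is 2_147_483_647 bytes, one byte short of 2 GiB.
    System.out.println("largest int block size: " + Integer.MAX_VALUE);

    // Narrowing the long length to an int silently overflows to a negative value.
    int blockSize = (int) fileLength;
    System.out.println("3 GB cast to int: " + blockSize);

    // A single-block upload path has to keep the length as a long, or
    // explicitly reject/split anything above Integer.MAX_VALUE.
    if (fileLength > Integer.MAX_VALUE) {
      System.out.println("file is too large to describe with an int block size");
    }
  }
}
```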


