[
https://issues.apache.org/jira/browse/HADOOP-18637?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17711126#comment-17711126
]
ASF GitHub Bot commented on HADOOP-18637:
-----------------------------------------
mukund-thakur commented on code in PR #5543:
URL: https://github.com/apache/hadoop/pull/5543#discussion_r1163358833
##########
hadoop-tools/hadoop-aws/src/main/java/org/apache/hadoop/fs/s3a/S3ADataBlocks.java:
##########
@@ -849,14 +855,29 @@ long dataSize() {
return bytesWritten;
}
+ /**
+ * Does this block have unlimited space?
+ * @return true if a block with no size limit was created.
+ */
+ private boolean unlimited() {
+ return limit < 0;
+ }
+
@Override
boolean hasCapacity(long bytes) {
- return dataSize() + bytes <= limit;
+ return unlimited() || dataSize() + bytes <= limit;
}
+ /**
+ * {@inheritDoc}.
+ * If there is no limit to capacity, return MAX_VALUE.
+ * @return capacity in the block.
+ */
@Override
long remainingCapacity() {
- return limit - bytesWritten;
+ return unlimited()
+     ? Integer.MAX_VALUE
+     : limit - bytesWritten;
  }
Review Comment:
remainingCapacity() returns long, so shouldn't this be Long.MAX_VALUE?
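The reviewer's point can be sketched with a minimal standalone class (hypothetical, not the actual S3ADataBlocks code): since remainingCapacity() returns a long, the "unlimited" branch should return Long.MAX_VALUE, otherwise an Integer.MAX_VALUE result silently caps the reported capacity at ~2 GB.

```java
// Minimal sketch of the capacity-tracking pattern under review.
// Class and field names are illustrative, not the Hadoop originals.
class DiskBlockSketch {
    private final long limit;   // < 0 means the block has no size limit
    private long bytesWritten;

    DiskBlockSketch(long limit) {
        this.limit = limit;
    }

    /** True if this block was created with no size limit. */
    private boolean unlimited() {
        return limit < 0;
    }

    boolean hasCapacity(long bytes) {
        return unlimited() || bytesWritten + bytes <= limit;
    }

    long remainingCapacity() {
        // Long.MAX_VALUE, not Integer.MAX_VALUE: the return type is long,
        // and the whole point of the change is to go past the ~2 GB int range.
        return unlimited() ? Long.MAX_VALUE : limit - bytesWritten;
    }

    public static void main(String[] args) {
        DiskBlockSketch unlimitedBlock = new DiskBlockSketch(-1);
        DiskBlockSketch boundedBlock = new DiskBlockSketch(1024);
        System.out.println(unlimitedBlock.remainingCapacity()); // 9223372036854775807
        System.out.println(boundedBlock.hasCapacity(2048));     // false
    }
}
```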
> S3A to support upload of files greater than 2 GB using DiskBlocks
> -----------------------------------------------------------------
>
> Key: HADOOP-18637
> URL: https://issues.apache.org/jira/browse/HADOOP-18637
> Project: Hadoop Common
> Issue Type: Improvement
> Components: fs/s3
> Reporter: Harshit Gupta
> Assignee: Harshit Gupta
> Priority: Major
> Labels: pull-request-available
>
> Use S3A DiskBlocks to support the upload of files greater than 2 GB.
> Currently, the maximum upload size of a single block is ~2 GB.
> cc: [~mthakur] [[email protected]] [~mehakmeet]
--
This message was sent by Atlassian Jira
(v8.20.10#820010)