[ https://issues.apache.org/jira/browse/JCLOUDS-1366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16734680#comment-16734680 ]

ASF subversion and git services commented on JCLOUDS-1366:
----------------------------------------------------------

Commit 2393c7920b65be5f42fae3376b34810c7e96815f in jclouds's branch 
refs/heads/master from Andrew Gaul
[ https://git-wip-us.apache.org/repos/asf?p=jclouds.git;h=2393c79 ]

JCLOUDS-1366: JCLOUDS-1472: Fix InputStream MPU

Previously jclouds attempted to slice non-repeatable InputStream
Payloads in order to upload sequentially.  This never worked due to
mutating the single stream via skip and close.  Also backfill test
which spuriously succeeded.
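
The failure mode the commit describes can be reproduced without jclouds: "slicing" a single non-repeatable InputStream by reading a part and then closing it destroys the stream for every later part. A minimal sketch of that idea (the `StreamSliceDemo` class and its helper are hypothetical illustrations, not code from the commit):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamSliceDemo {
    // Returns true if the second "slice" of a single shared stream fails
    // once the first slice has been read and the stream closed.
    static boolean secondSliceFails() throws IOException {
        Path tmp = Files.createTempFile("slice-demo", ".bin");
        Files.write(tmp, "0123456789".getBytes(StandardCharsets.US_ASCII));
        InputStream in = new FileInputStream(tmp.toFile());
        try {
            byte[] first = new byte[5];
            if (in.read(first) != 5) {    // first "slice": bytes 0-4
                return false;
            }
            in.close();                   // closing the part closes the ONLY stream
            try {
                in.read(new byte[5]);     // second "slice" reads a closed stream
                return false;
            } catch (IOException expected) {
                return true;              // the mutation broke every later part
            }
        } finally {
            Files.delete(tmp);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("second slice fails: " + secondSliceFails());
    }
}
```

This is why the commit stops trying to slice a single stream: skip and close mutate shared state that the next part depends on.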


> OutOfMemory when InputStream referencing to big file is used as payload
> -----------------------------------------------------------------------
>
>                 Key: JCLOUDS-1366
>                 URL: https://issues.apache.org/jira/browse/JCLOUDS-1366
>             Project: jclouds
>          Issue Type: Bug
>          Components: jclouds-blobstore
>    Affects Versions: 2.0.0, 2.0.3
>         Environment: Linux and Windows
>            Reporter: Deyan
>            Priority: Critical
>
> If I use an InputStream whose source is a large file (say 3 GB), I get an
> OutOfMemoryError. This is with default JVM options.
> Here is the code I am using to construct the blob:
> {code:java}
> File bigFile = new File(file);
> try (InputStream inputStream = new FileInputStream(bigFile)) {
>     Blob blob = blobStore.blobBuilder(blobName)
>             .payload(inputStream)
>             .contentLength(bigFile.length())
>             .contentDisposition(blobName)
>             .contentType(MediaType.OCTET_STREAM)
>             .userMetadata(ImmutableMap.of("a", "b", "test", "beta"))
>             .build();
>     blobStore.putBlob("test", blob, multipart());
> }
> {code}
> Stacktrace:
> {code:java}
> java.lang.OutOfMemoryError: Java heap space
>       at org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.getNextPayload(BasePayloadSlicer.java:101)
>       at org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.next(BasePayloadSlicer.java:90)
>       at org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.next(BasePayloadSlicer.java:63)
>       at org.jclouds.blobstore.internal.BaseBlobStore.putMultipartBlob(BaseBlobStore.java:363)
>       at org.jclouds.blobstore.internal.BaseBlobStore.putMultipartBlob(BaseBlobStore.java:349)
>       at org.jclouds.s3.blobstore.S3BlobStore.putBlob(S3BlobStore.java:262)
> {code}
> If 'bigFile' itself is used as the payload, the bug is not reproducible.
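
The reporter's last observation points at why the File payload works: a File is a repeatable source, so each part of a multipart upload can open its own stream instead of mutating one shared stream. A minimal plain-Java sketch of that idea (the `FileSliceDemo` class and `readSlice` helper are hypothetical, not jclouds API):

```java
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileSliceDemo {
    // Each part opens its OWN stream, skips to its offset, reads, and closes;
    // no shared stream is mutated, so parts are independent and repeatable.
    static byte[] readSlice(File file, long offset, int length) throws IOException {
        try (InputStream in = new FileInputStream(file)) {
            long skipped = 0;
            while (skipped < offset) {
                long s = in.skip(offset - skipped);
                if (s <= 0) {
                    throw new EOFException("could not skip to offset " + offset);
                }
                skipped += s;
            }
            byte[] part = new byte[length];
            int read = 0;
            while (read < length) {
                int n = in.read(part, read, length - read);
                if (n < 0) {
                    throw new EOFException("slice runs past end of file");
                }
                read += n;
            }
            return part;
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("file-slice", ".bin");
        Files.write(tmp, "0123456789".getBytes(StandardCharsets.US_ASCII));
        File f = tmp.toFile();
        // Two independent 5-byte parts of the same file:
        System.out.println(new String(readSlice(f, 0, 5), StandardCharsets.US_ASCII)); // 01234
        System.out.println(new String(readSlice(f, 5, 5), StandardCharsets.US_ASCII)); // 56789
        Files.delete(tmp);
    }
}
```

Passing the File (or any repeatable payload) to `blobBuilder(...).payload(...)` lets the slicer do the equivalent of this per part, which is why the OOM does not reproduce in that case.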



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
