[ https://issues.apache.org/jira/browse/JCLOUDS-1366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16312682#comment-16312682 ]

Andrew Gaul commented on JCLOUDS-1366:
--------------------------------------

42079e1392fb5b2b792f518812689854c375445f introduced this regression with the 
parallel upload feature.  Previously {{BaseBlobStore.putMultipartBlob}} 
prepared a single MPU part and uploaded it, looping until complete.  Now it 
prepares all MPU parts simultaneously and submits them to an 
{{ExecutorService}}.  Combined with JCLOUDS-814, this buffers the entire blob 
in memory and results in an {{OutOfMemoryError}}.  Instead we should limit the 
number of simultaneous uploads with {{InputStream}} payloads.  [~zack-s] 
[~dgyurdzhekliev] Could you investigate this?
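The bounding idea above can be sketched with a {{Semaphore}} that caps how many sliced parts are in flight at once, so the slicer blocks instead of preparing (and buffering) every part up front. This is a minimal standalone sketch, not the actual jclouds fix; the class, method names, and the sleep standing in for a part upload are all hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class BoundedPartUploads {

    // Submit totalParts upload tasks but never allow more than maxConcurrent
    // "sliced" parts in flight at once; returns the number of parts uploaded.
    static int uploadAllParts(int totalParts, int maxConcurrent) throws Exception {
        Semaphore permits = new Semaphore(maxConcurrent);
        ExecutorService executor = Executors.newCachedThreadPool();
        List<Future<Integer>> results = new ArrayList<>();
        try {
            for (int part = 1; part <= totalParts; part++) {
                permits.acquire();          // block before slicing the next part
                final int partNumber = part;
                results.add(executor.submit(() -> {
                    try {
                        Thread.sleep(5);    // stand-in for uploading one part
                        return partNumber;
                    } finally {
                        permits.release();  // free a slot so the next slice can be read
                    }
                }));
            }
            int uploaded = 0;
            for (Future<Integer> f : results) {
                f.get();                    // propagate any upload failure
                uploaded++;
            }
            return uploaded;
        } finally {
            executor.shutdown();
            executor.awaitTermination(10, TimeUnit.SECONDS);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("uploaded parts: " + uploadAllParts(16, 4));
    }
}
```

With this shape, memory use is bounded by maxConcurrent part-sized buffers rather than by the whole blob, regardless of how many parts the payload slices into.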

> OutOfMemory when an InputStream referencing a big file is used as payload
> --------------------------------------------------------------------------
>
>                 Key: JCLOUDS-1366
>                 URL: https://issues.apache.org/jira/browse/JCLOUDS-1366
>             Project: jclouds
>          Issue Type: Bug
>          Components: jclouds-blobstore
>    Affects Versions: 2.0.3
>         Environment: Linux and Windows
>            Reporter: Deyan
>            Priority: Critical
>
> If I use an InputStream whose source is a large file (say 3 GB), I get an 
> OutOfMemoryError. This is with default JVM options.
> Here is the code I use to construct the blob:
> {code:java}
>  File bigFile = new File(file);
>  try (InputStream inputStream = new FileInputStream(bigFile)) {
>      Blob b = blobStore.blobBuilder(blobName)
>              .payload(inputStream)
>              .contentLength(bigFile.length())
>              .contentDisposition(blobName)
>              .contentType(MediaType.OCTET_STREAM)
>              .userMetadata(ImmutableMap.of("a", "b", "test", "beta"))
>              .build();
>      blobStore.putBlob("test", b, multipart());
>  }
> {code}
> Stacktrace:
> {code:java}
> java.lang.OutOfMemoryError: Java heap space
>       at 
> org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.getNextPayload(BasePayloadSlicer.java:101)
>       at 
> org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.next(BasePayloadSlicer.java:90)
>       at 
> org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.next(BasePayloadSlicer.java:63)
>       at 
> org.jclouds.blobstore.internal.BaseBlobStore.putMultipartBlob(BaseBlobStore.java:363)
>       at 
> org.jclouds.blobstore.internal.BaseBlobStore.putMultipartBlob(BaseBlobStore.java:349)
>       at org.jclouds.s3.blobstore.S3BlobStore.putBlob(S3BlobStore.java:262)
> {code}
>  If 'bigFile' itself is used as the payload, the bug is not reproducible.
>  
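The reporter's last observation suggests a workaround until the regression is fixed: pass the File itself as the payload instead of an InputStream. A file payload is repayable, so each multipart slice can be re-opened rather than buffered in memory. This is an untested fragment assuming the same jclouds API and imports as the snippet in the report:

```java
// Workaround sketch: file payload instead of a stream payload.
File bigFile = new File(file);
Blob b = blobStore.blobBuilder(blobName)
        .payload(bigFile)                  // file, not stream; repayable
        .contentType(MediaType.OCTET_STREAM)
        .build();
blobStore.putBlob("test", b, multipart());
```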



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)