[ https://issues.apache.org/jira/browse/COMPRESS-599?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17451885#comment-17451885 ]

Gary D. Gregory edited comment on COMPRESS-599 at 12/1/21, 3:17 PM:
--------------------------------------------------------------------

I added a disabled test case class: {{Codec_decodeInts_OutOfMemoryErrorTest}}.
Note:
 - the descriptive class name
 - it is a unit test
 - it follows the standard coding convention of upper-casing constants.

Please follow the above in the future. TY.

More importantly, I do not get the same error you report; instead, I get:
{noformat}
java.lang.OutOfMemoryError: Java heap space
  at org.apache.commons.compress.harmony.unpack200.CpBands.parseCpUtf8(CpBands.java:365)
  at org.apache.commons.compress.harmony.unpack200.CpBands.read(CpBands.java:111)
  at org.apache.commons.compress.harmony.unpack200.Segment.readSegment(Segment.java:351)
  at org.apache.commons.compress.harmony.unpack200.Segment.unpackRead(Segment.java:459)
  at org.apache.commons.compress.harmony.unpack200.Segment.unpack(Segment.java:436)
  at org.apache.commons.compress.harmony.unpack200.Archive.unpack(Archive.java:155)
  at org.apache.commons.compress.harmony.unpack200.Pack200UnpackerAdapter.unpack(Pack200UnpackerAdapter.java:49)
  at org.apache.commons.compress.compressors.pack200.Pack200CompressorInputStream.<init>(Pack200CompressorInputStream.java:183)
  at org.apache.commons.compress.compressors.pack200.Pack200CompressorInputStream.<init>(Pack200CompressorInputStream.java:77)
  at org.apache.commons.compress.harmony.unpack200.tests.Codec_decodeInts_OutOfMemoryErrorTest.test(Codec_decodeInts_OutOfMemoryErrorTest.java:36)
{noformat}

But I am running against git master, and I imagine you are running 1.21.
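For downstream users who must process untrusted archives today, one caller-side mitigation is to cap how many bytes the untrusted stream may deliver before it reaches the decompressor. This is only a partial defense for this particular bug, since a tiny input can still declare huge element counts in the constant pool, but it bounds the attacker-controlled input size. A minimal JDK-only sketch; {{LimitedInputStream}} and the 1024-byte limit are illustrative, not part of Commons Compress:

```java
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

public class LimitedInputStreamDemo {

    // Hypothetical helper: throws once more than maxBytes have been read
    // from the wrapped stream.
    static final class LimitedInputStream extends FilterInputStream {
        private final long maxBytes;
        private long count;

        LimitedInputStream(InputStream in, long maxBytes) {
            super(in);
            this.maxBytes = maxBytes;
        }

        @Override
        public int read() throws IOException {
            int b = super.read();
            if (b != -1 && ++count > maxBytes) {
                throw new IOException("input exceeds " + maxBytes + " bytes");
            }
            return b;
        }

        @Override
        public int read(byte[] buf, int off, int len) throws IOException {
            int n = super.read(buf, off, len);
            if (n > 0 && (count += n) > maxBytes) {
                throw new IOException("input exceeds " + maxBytes + " bytes");
            }
            return n;
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[2048]; // stand-in for untrusted input
        try (InputStream in = new LimitedInputStream(new ByteArrayInputStream(data), 1024)) {
            byte[] buf = new byte[256];
            while (in.read(buf) != -1) {
                // drain; in real use this stream would feed the decompressor
            }
            System.out.println("no limit hit");
        } catch (IOException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

In real use the wrapped stream would be passed to {{Pack200CompressorInputStream}} instead of being drained; the limit rejects oversized inputs before the library ever sees them.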



> Memory usage in Pack200Compressor cannot be limited
> ---------------------------------------------------
>
>                 Key: COMPRESS-599
>                 URL: https://issues.apache.org/jira/browse/COMPRESS-599
>             Project: Commons Compress
>          Issue Type: Bug
>          Components: Compressors
>    Affects Versions: 1.21
>            Reporter: Dominik Stadler
>            Priority: Major
>              Labels: fuzzer, memory
>
> While fuzzing commons-compress, I found the following case where a specific
> input leads to unbounded memory allocation in the Pack200Compressor.
> There currently seems to be no way to limit this compressor's memory usage,
> so the problem cannot be avoided when processing untrusted data via
> commons-compress.
> With the following code snippet:
> {noformat}
> import java.io.ByteArrayInputStream;
> import java.io.IOException;
>
> import org.apache.commons.compress.compressors.pack200.Pack200CompressorInputStream;
> import org.apache.commons.compress.compressors.pack200.Pack200Strategy;
>
> public class Crash_0d5a0130ab3cd32f299b2a27aa76f24a0bbabae8 {
>     static final String base64Bytes =
>         "yv7QDQeW0ABgfwDuwOn8QwIGAAIBAQAAd9zc3Nzc3Nzc3Nzc3Nzc3NxuZXR3YXJl3Nzc3Nzc3Nzc3Nzc3Nzc3Nzc3Nzc3Nzc3Nzc3GluZG93cwAAAwMUAxUDZmVzdA0K";
>
>     public static void main(String[] args) throws IOException {
>         byte[] input = java.util.Base64.getDecoder().decode(base64Bytes);
>         // Triggers unbounded allocation while the constant pool is parsed
>         new Pack200CompressorInputStream(new ByteArrayInputStream(input),
>             Pack200Strategy.TEMP_FILE);
>     }
> }
> {noformat}
> The following exception happens:
> {noformat}
> Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>     at org.apache.commons.compress.harmony.pack200.Codec.decodeInts(Codec.java:169)
>     at org.apache.commons.compress.harmony.pack200.BHSDCodec.decodeInts(BHSDCodec.java:256)
>     at org.apache.commons.compress.harmony.unpack200.BandSet.decodeBandInt(BandSet.java:100)
>     at org.apache.commons.compress.harmony.unpack200.CpBands.parseCpUtf8(CpBands.java:366)
>     at org.apache.commons.compress.harmony.unpack200.CpBands.read(CpBands.java:111)
>     at org.apache.commons.compress.harmony.unpack200.Segment.readSegment(Segment.java:351)
>     at org.apache.commons.compress.harmony.unpack200.Segment.unpackRead(Segment.java:459)
>     at org.apache.commons.compress.harmony.unpack200.Segment.unpack(Segment.java:436)
>     at org.apache.commons.compress.harmony.unpack200.Archive.unpack(Archive.java:155)
>     at org.apache.commons.compress.harmony.unpack200.Pack200UnpackerAdapter.unpack(Pack200UnpackerAdapter.java:49)
>     at org.apache.commons.compress.compressors.pack200.Pack200CompressorInputStream.<init>(Pack200CompressorInputStream.java:183)
>     at org.apache.commons.compress.compressors.pack200.Pack200CompressorInputStream.<init>(Pack200CompressorInputStream.java:77)
>     at Crash_0d5a0130ab3cd32f299b2a27aa76f24a0bbabae8.main(Crash_0d5a0130ab3cd32f299b2a27aa76f24a0bbabae8.java:13)
> {noformat}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
