Brian Miller created COMPRESS-619:
-------------------------------------

             Summary: Large SevenZFile When Next Header Size is Greater than Max Int
                 Key: COMPRESS-619
                 URL: https://issues.apache.org/jira/browse/COMPRESS-619
             Project: Commons Compress
          Issue Type: Bug
          Components: Archivers
    Affects Versions: 1.21
            Reporter: Brian Miller


When reading a large archive (42GB), the following stack trace is produced:

 
{code:java}
java.io.IOException: Cannot handle nextHeaderSize 4102590414
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.assertFitsIntoNonNegativeInt(SevenZFile.java:2076) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.initializeArchive(SevenZFile.java:528) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.readHeaders(SevenZFile.java:474) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:343) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:136) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:376) ~[classes/:?]
    at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:364) ~[classes/:?]
{code}
 

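Simply opening the archive is enough to trigger the failure, since the headers are parsed in the constructor before any entry is read. A minimal sketch of the read path (the file name is a placeholder):

{code:java}
import java.io.File;
import java.io.IOException;

import org.apache.commons.compress.archivers.sevenz.SevenZFile;

public class ReadLargeArchive {
    public static void main(String[] args) throws IOException {
        // The constructor reads the start header and then tries to buffer the
        // "next header" (the archive metadata at the end of the file) in order
        // to verify its CRC, which is where the IOException above is thrown.
        try (SevenZFile sevenZFile = new SevenZFile(new File("large-archive.7z"))) {
            // Never reached for the 42GB archive described here.
            sevenZFile.getNextEntry();
        }
    }
}
{code}
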
The archive was produced using the SevenZOutputFile class and contains a large number of very small files, all added using copy (stored, uncompressed) compression. It passes the 7z tests and has the following statistics:

 
{code:java}
Files:      40872560
Size:       43708874326
Compressed: 47811464772
{code}
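For context, the archive was written roughly along these lines (a sketch, not the actual producer code; the entry names, content, and loop bound are placeholders, and running it literally would take a long time and a lot of disk):

{code:java}
import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Collections;

import org.apache.commons.compress.archivers.sevenz.SevenZArchiveEntry;
import org.apache.commons.compress.archivers.sevenz.SevenZMethod;
import org.apache.commons.compress.archivers.sevenz.SevenZMethodConfiguration;
import org.apache.commons.compress.archivers.sevenz.SevenZOutputFile;

public class WriteLargeArchive {
    public static void main(String[] args) throws IOException {
        try (SevenZOutputFile out = new SevenZOutputFile(new File("large-archive.7z"))) {
            // Store entries uncompressed ("copy" compression).
            out.setContentMethods(Collections.singletonList(
                    new SevenZMethodConfiguration(SevenZMethod.COPY)));

            byte[] content = "tiny".getBytes(StandardCharsets.UTF_8);
            // ~40 million very small entries; with this many entries the
            // archive metadata ("next header") itself grows past 2 GiB.
            for (long i = 0; i < 40_872_560L; i++) {
                SevenZArchiveEntry entry = new SevenZArchiveEntry();
                entry.setName("file-" + i + ".txt");
                out.putArchiveEntry(entry);
                out.write(content);
                out.closeArchiveEntry();
            }
        }
    }
}
{code}
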
It is failing because the next header is larger than Integer.MAX_VALUE, so a ByteBuffer big enough to hold it for the CRC check cannot be allocated.
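A minimal sketch of the kind of guard seen in the stack trace (not the exact library code): ByteBuffer.allocate only takes an int, so any header size that does not fit into a non-negative int is rejected up front.

{code:java}
import java.io.IOException;
import java.nio.ByteBuffer;

public class NextHeaderSizeGuard {
    // Mirrors the check named in the stack trace: sizes that do not fit into
    // a non-negative int are rejected before a ByteBuffer is allocated.
    static int assertFitsIntoNonNegativeInt(String what, long value) throws IOException {
        if (value < 0 || value > Integer.MAX_VALUE) {
            throw new IOException("Cannot handle " + what + " " + value);
        }
        return (int) value;
    }

    public static void main(String[] args) throws IOException {
        long nextHeaderSize = 4_102_590_414L; // value from the report, ~3.8 GiB of metadata
        int size = assertFitsIntoNonNegativeInt("nextHeaderSize", nextHeaderSize); // throws here
        ByteBuffer buf = ByteBuffer.allocate(size); // never reached; allocate() is int-only
    }
}
{code}
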



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
