[
https://issues.apache.org/jira/browse/COMPRESS-16?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Christian Grobmeier updated COMPRESS-16:
----------------------------------------
Attachment: patch-for-compress.txt
Based on the Ant patch by David Wartell, I created a patch for Compress.
I haven't committed it yet because it still needs testing - if somebody has > 8 GB
files around, please help with testing :-)
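The attached patch itself isn't reproduced in this comment, so the sketch below is only an illustration of the usual way entries above the classic 8 GiB octal limit are handled, i.e. the GNU tar base-256 size encoding. The class and method names are hypothetical and are not Commons Compress API; the actual patch may take a different route entirely.

// Minimal sketch (not the attached patch): how a 12-byte tar size field can
// represent entries beyond the 8 GiB octal limit using the GNU base-256
// convention. Names below are illustrative, not Commons Compress API.
public class TarSizeFieldSketch {

    private static final int FIELD_LENGTH = 12;            // size field width in the header
    private static final long OCTAL_MAX = 077777777777L;   // 11 octal digits = 8 GiB - 1

    /** Encodes an entry size into a 12-byte header field. */
    static byte[] encodeSize(long size) {
        byte[] field = new byte[FIELD_LENGTH];
        if (size <= OCTAL_MAX) {
            // Classic ustar: 11 octal digits plus a NUL terminator (byte 11 stays 0).
            String octal = String.format("%011o", size);
            for (int i = 0; i < 11; i++) {
                field[i] = (byte) octal.charAt(i);
            }
        } else {
            // GNU extension: big-endian binary value, flagged by setting the
            // top bit of the first byte.
            field[0] = (byte) 0x80;
            for (int i = FIELD_LENGTH - 1; i > 0; i--) {
                field[i] = (byte) (size & 0xFF);
                size >>>= 8;
            }
        }
        return field;
    }

    /** Decodes a 12-byte size field, accepting either representation. */
    static long decodeSize(byte[] field) {
        if ((field[0] & 0x80) != 0) {
            long value = field[0] & 0x7F;
            for (int i = 1; i < field.length; i++) {
                value = (value << 8) | (field[i] & 0xFF);
            }
            return value;
        }
        // trim() drops the trailing NUL/space padding before the octal parse.
        String text = new String(field, java.nio.charset.StandardCharsets.US_ASCII).trim();
        return Long.parseLong(text, 8);
    }

    public static void main(String[] args) {
        long tenGiB = 10L * 1024 * 1024 * 1024;
        System.out.println(decodeSize(encodeSize(tenGiB)));  // prints 10737418240
    }
}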
> unable to extract a TAR file that contains an entry which is 10 GB in size
> --------------------------------------------------------------------------
>
> Key: COMPRESS-16
> URL: https://issues.apache.org/jira/browse/COMPRESS-16
> Project: Commons Compress
> Issue Type: Bug
> Environment: I am using Windows XP SP3, but this should be platform-independent.
> Reporter: Sam Smith
> Fix For: 1.1
>
> Attachments: ant-8GB-tar.patch, patch-for-compress.txt
>
>
> I made a TAR file that contains an entry which is 10 GB in size.
> When I attempt to extract the file using TarInputStream, it fails with the
> following stack trace:
> java.io.IOException: unexpected EOF with 24064 bytes unread
>     at org.apache.commons.compress.archivers.tar.TarInputStream.read(TarInputStream.java:348)
>     at org.apache.commons.compress.archivers.tar.TarInputStream.copyEntryContents(TarInputStream.java:388)
> So, TarInputStream does not seem to support large (> 8 GB?) files.
> Here is something else to note: I created that TAR file using TarOutputStream,
> which did not complain when asked to write a 10 GB file into the archive. So
> does TarOutputStream have no file size limit, or does it silently create
> corrupted TAR files (which would be the worst situation of all...)?
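For reference, the reporter's "> 8 GB?" guess matches the classic ustar header exactly: the size field holds at most 11 octal digits, so the largest entry a plain header can describe is one byte short of 8 GiB. A quick check of the arithmetic (the class name is hypothetical, used only for this illustration):

// Arithmetic behind the "> 8 GB?" guess: the classic tar header stores the
// entry size as at most 11 octal digits.
public class UstarSizeLimit {
    public static void main(String[] args) {
        long maxOctal = 077777777777L;                    // largest 11-digit octal value
        System.out.println(maxOctal);                     // 8589934591
        System.out.println(maxOctal == (8L << 30) - 1);   // true: exactly 8 GiB - 1
        System.out.println(10L * 1024 * 1024 * 1024);     // 10737418240 bytes = 10 GiB,
                                                          // roughly the reported entry size
    }
}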