svn commit: r48817 - in /release/commons/compress: binaries/ source/
Author: bodewig
Date: Wed Jul 14 04:47:41 2021
New Revision: 48817

Log:
cleanup

Removed:
    release/commons/compress/binaries/commons-compress-1.20-bin.tar.gz
    release/commons/compress/binaries/commons-compress-1.20-bin.tar.gz.asc
    release/commons/compress/binaries/commons-compress-1.20-bin.tar.gz.sha512
    release/commons/compress/binaries/commons-compress-1.20-bin.zip
    release/commons/compress/binaries/commons-compress-1.20-bin.zip.asc
    release/commons/compress/binaries/commons-compress-1.20-bin.zip.sha512
    release/commons/compress/source/commons-compress-1.20-src.tar.gz
    release/commons/compress/source/commons-compress-1.20-src.tar.gz.asc
    release/commons/compress/source/commons-compress-1.20-src.tar.gz.sha512
    release/commons/compress/source/commons-compress-1.20-src.zip
    release/commons/compress/source/commons-compress-1.20-src.zip.asc
    release/commons/compress/source/commons-compress-1.20-src.zip.sha512
svn commit: r1891495 - in /commons/cms-site/trunk: conf/component_releases.properties doap/doap_compress.rdf
Author: bodewig
Date: Tue Jul 13 04:32:09 2021
New Revision: 1891495

URL: http://svn.apache.org/viewvc?rev=1891495&view=rev
Log: Commons Compress 1.21 released

Modified:
    commons/cms-site/trunk/conf/component_releases.properties
    commons/cms-site/trunk/doap/doap_compress.rdf

Modified: commons/cms-site/trunk/conf/component_releases.properties
URL: http://svn.apache.org/viewvc/commons/cms-site/trunk/conf/component_releases.properties?rev=1891495&r1=1891494&r2=1891495&view=diff
==============================================================================
--- commons/cms-site/trunk/conf/component_releases.properties (original)
+++ commons/cms-site/trunk/conf/component_releases.properties Tue Jul 13 04:32:09 2021
@@ -12,8 +12,8 @@ codecVersion=1.15
 codecReleased=2020-09-01
 collectionsVersion=4.4
 collectionsReleased=2019-07-05
-compressVersion=1.20
-compressReleased=2020-02-08
+compressVersion=1.21
+compressReleased=2021-07-12
 configurationVersion=2.7
 configurationReleased=2020-03-11
 cryptoVersion=1.1.0

Modified: commons/cms-site/trunk/doap/doap_compress.rdf
URL: http://svn.apache.org/viewvc/commons/cms-site/trunk/doap/doap_compress.rdf?rev=1891495&r1=1891494&r2=1891495&view=diff
==============================================================================
--- commons/cms-site/trunk/doap/doap_compress.rdf (original)
+++ commons/cms-site/trunk/doap/doap_compress.rdf Tue Jul 13 04:32:09 2021
@@ -36,6 +36,13 @@ commons-compress
+2021-07-12
+1.21
+
+
+
+
+commons-compress
 2020-02-08
 1.20
svn commit: r48802 - /release/commons/compress/RELEASE-NOTES.txt
Author: bodewig
Date: Tue Jul 13 03:58:57 2021
New Revision: 48802

Log:
update release notes

Modified:
    release/commons/compress/RELEASE-NOTES.txt

Modified: release/commons/compress/RELEASE-NOTES.txt
==============================================================================
--- release/commons/compress/RELEASE-NOTES.txt (original)
+++ release/commons/compress/RELEASE-NOTES.txt Tue Jul 13 03:58:57 2021
@@ -10,10 +10,11 @@ Release 1.21
 
 Compress 1.21 is the first release to require Java 8 to build and run.
 
-SevenZFileOptions has a new setting that needs to be enabled explicity
-if SevenZFile should try to recover broken archives - a feature
-introduced with Commons Compress 1.19. This is a breaking change if
-you relied on the recovery attempt.
+SevenZFileOptions has a new setting that needs to be enabled
+explicitly if SevenZFile should try to recover broken archives - a
+feature introduced with Commons Compress 1.19. This is a breaking
+change if you relied on the recovery attempt. The change was made to
+detect broken archives sooner, and to mitigate the OOM exploit.
 
 Several formats now throw IOExceptions when reading broken archives or
 streams that would have caused arbitrary RuntimeExceptions in earlier
[commons-compress] branch master updated: merge 1.21 tag and prepare for next iteration
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 90451dd merge 1.21 tag and prepare for next iteration 90451dd is described below commit 90451dd80ec8514b29cc56e7b7440b60fea0bbf0 Author: Stefan Bodewig AuthorDate: Fri Jul 9 18:54:09 2021 +0200 merge 1.21 tag and prepare for next iteration --- NOTICE.txt | 2 +- README.md | 4 +- RELEASE-NOTES.txt | 17 +++ pom.xml | 2 +- src/changes/changes.xml | 4 +- src/site/site.xml | 1 + src/site/xdoc/download_compress.xml | 26 +-- src/site/xdoc/security-reports.xml | 91 + 8 files changed, 121 insertions(+), 26 deletions(-) diff --git a/NOTICE.txt b/NOTICE.txt index 132b089..3fb4707 100644 --- a/NOTICE.txt +++ b/NOTICE.txt @@ -1,5 +1,5 @@ Apache Commons Compress -Copyright 2002-2020 The Apache Software Foundation +Copyright 2002-2021 The Apache Software Foundation This product includes software developed at The Apache Software Foundation (https://www.apache.org/). 
diff --git a/README.md b/README.md index 89ca9b4..8db17e9 100644 --- a/README.md +++ b/README.md @@ -46,7 +46,7 @@ Apache Commons Compress [![Build Status](https://travis-ci.org/apache/commons-compress.svg)](https://travis-ci.org/apache/commons-compress) [![Coverage Status](https://coveralls.io/repos/apache/commons-compress/badge.svg)](https://coveralls.io/r/apache/commons-compress) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.apache.commons/commons-compress/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.apache.commons/commons-compress/) -[![Javadocs](https://javadoc.io/badge/org.apache.commons/commons-compress/1.20.svg)](https://javadoc.io/doc/org.apache.commons/commons-compress/1.20) +[![Javadocs](https://javadoc.io/badge/org.apache.commons/commons-compress/1.21.svg)](https://javadoc.io/doc/org.apache.commons/commons-compress/1.21) [![Fuzzing Status](https://oss-fuzz-build-logs.storage.googleapis.com/badges/apache-commons.svg)](https://bugs.chromium.org/p/oss-fuzz/issues/list?sort=-opened=1=proj:apache-commons) **Note: Commons Compress currently doesn't build on JDK 14+, we will @@ -74,7 +74,7 @@ Alternatively you can pull it from the central Maven repositories: org.apache.commons commons-compress - 1.20 + 1.21 ``` diff --git a/RELEASE-NOTES.txt b/RELEASE-NOTES.txt index 65c265b..49a9e75 100644 --- a/RELEASE-NOTES.txt +++ b/RELEASE-NOTES.txt @@ -8,16 +8,17 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. Release 1.21 -Compress 1.20 now at least requires Java 8 to build and run. +Compress 1.21 is the first release to require Java 8 to build and run. -SevenZFileOptions has a new setting that needs to be enabled explicity -if SevenZFile should try to recover broken archives - a feature -introduced with Commons Compress 1.19. This is a breaking change if -you relied on the recovery attempt. 
+SevenZFileOptions has a new setting that needs to be enabled +explicitly if SevenZFile should try to recover broken archives - a +feature introduced with Commons Compress 1.19. This is a breaking +change if you relied on the recovery attempt. The change was made to +detect broken archives sooner, and to mitigate the OOM exploit. -Several formats may now throw IOExceptions when reading broken -archives or streams that would have caused arbitrary RuntimeExceptions -in earlier versions of Compress. +Several formats now throw IOExceptions when reading broken archives or +streams that would have caused arbitrary RuntimeExceptions in earlier +versions of Compress. New features: o Add writePreamble to ZipArchiveInputStream. This method could diff --git a/pom.xml b/pom.xml index 2fa43c5..bead6fa 100644 --- a/pom.xml +++ b/pom.xml @@ -24,7 +24,7 @@ commons-compress - 1.21-SNAPSHOT + 1.22-SNAPSHOT Apache Commons Compress https://commons.apache.org/proper/commons-compress/ 2002 diff --git a/src/changes/changes.xml b/src/changes/changes.xml index 67603d5..24b28c5 100644 --- a/src/changes/changes.xml +++ b/src/changes/changes.xml @@ -42,7 +42,9 @@ The type attribute can be add,update,fix,remove. Apache Commons Compress Release Notes - + + + diff --git a/src/site/xdoc/download_compress.xml b/src/site/xdoc/download_compress.xml index aeb7043..67d7cf8 100644 --- a/src/site/xdoc/download_compress.xml +++ b/src/site/xdoc/download_compress.xml @@ -113,32 +113,32 @@ limitations under the License. - + - commons-compress-1.20-bin.tar.gz - https://www.apache.org
[commons-compress] annotated tag rel/1.21 created (now 44550e7)
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a change to annotated tag rel/1.21
in repository https://gitbox.apache.org/repos/asf/commons-compress.git.

      at 44550e7  (tag)
 tagging a599d93e7411f18439a151c31d56620bad75fe1c (tag)
  length 379 bytes
      by Stefan Bodewig
      on Mon Jul 12 20:06:18 2021 +0200

- Log -----------------------------------------------------------------
RC1 of 1.21 has been accepted
-----BEGIN PGP SIGNATURE-----

iHEEABECADEWIQTOgHWiUVR77iSbwVGiEVrhX2uLcgUCYOyEmhMcYm9kZXdpZ0Bh
cGFjaGUub3JnAAoJEKIRWuFfa4tyqKsAnjyb84WluxqGTHYNDuAdSuuhQBUqAJ9Z
Jn0icpkR7AYMcYhUVWlfIbHyaA==
=o2Ls
-----END PGP SIGNATURE-----
-----------------------------------------------------------------------

No new revisions were added by this update.
svn commit: r48797 - /dev/commons/compress/ /dev/commons/compress/binaries/ /dev/commons/compress/source/ /release/commons/compress/ /release/commons/compress/binaries/ /release/commons/compress/sourc
Author: bodewig
Date: Mon Jul 12 18:03:34 2021
New Revision: 48797

Log:
vote for Commons Compress 1.21 has passed

Added:
    release/commons/compress/binaries/commons-compress-1.21-bin.tar.gz
      - copied unchanged from r48796, dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz
    release/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.asc
      - copied unchanged from r48796, dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.asc
    release/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.sha512
      - copied unchanged from r48796, dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.sha512
    release/commons/compress/binaries/commons-compress-1.21-bin.zip
      - copied unchanged from r48796, dev/commons/compress/binaries/commons-compress-1.21-bin.zip
    release/commons/compress/binaries/commons-compress-1.21-bin.zip.asc
      - copied unchanged from r48796, dev/commons/compress/binaries/commons-compress-1.21-bin.zip.asc
    release/commons/compress/binaries/commons-compress-1.21-bin.zip.sha512
      - copied unchanged from r48796, dev/commons/compress/binaries/commons-compress-1.21-bin.zip.sha512
    release/commons/compress/source/commons-compress-1.21-src.tar.gz
      - copied unchanged from r48796, dev/commons/compress/source/commons-compress-1.21-src.tar.gz
    release/commons/compress/source/commons-compress-1.21-src.tar.gz.asc
      - copied unchanged from r48796, dev/commons/compress/source/commons-compress-1.21-src.tar.gz.asc
    release/commons/compress/source/commons-compress-1.21-src.tar.gz.sha512
      - copied unchanged from r48796, dev/commons/compress/source/commons-compress-1.21-src.tar.gz.sha512
    release/commons/compress/source/commons-compress-1.21-src.zip
      - copied unchanged from r48796, dev/commons/compress/source/commons-compress-1.21-src.zip
    release/commons/compress/source/commons-compress-1.21-src.zip.asc
      - copied unchanged from r48796, dev/commons/compress/source/commons-compress-1.21-src.zip.asc
    release/commons/compress/source/commons-compress-1.21-src.zip.sha512
      - copied unchanged from r48796, dev/commons/compress/source/commons-compress-1.21-src.zip.sha512

Removed:
    dev/commons/compress/README.html
    dev/commons/compress/RELEASE-NOTES.txt
    dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz
    dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.asc
    dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.sha512
    dev/commons/compress/binaries/commons-compress-1.21-bin.zip
    dev/commons/compress/binaries/commons-compress-1.21-bin.zip.asc
    dev/commons/compress/binaries/commons-compress-1.21-bin.zip.sha512
    dev/commons/compress/source/commons-compress-1.21-src.tar.gz
    dev/commons/compress/source/commons-compress-1.21-src.tar.gz.asc
    dev/commons/compress/source/commons-compress-1.21-src.tar.gz.sha512
    dev/commons/compress/source/commons-compress-1.21-src.zip
    dev/commons/compress/source/commons-compress-1.21-src.zip.asc
    dev/commons/compress/source/commons-compress-1.21-src.zip.sha512

Modified:
    release/commons/compress/README.html
    release/commons/compress/RELEASE-NOTES.txt

Modified: release/commons/compress/README.html
==============================================================================
--- release/commons/compress/README.html (original)
+++ release/commons/compress/README.html Mon Jul 12 18:03:34 2021
@@ -1,6 +1,6 @@
-Commons-Compress 1.20
+Commons-Compress 1.21
-This is the 1.20 release of commons-compress. It is available in both binary and source distributions.
+This is the 1.21 release of commons-compress. It is available in both binary and source distributions.
Note: The tar files in the distribution use GNU tar extensions
@@ -34,12 +34,12 @@ href="https://www.apache.org/dist/common
 Always test available signatures, e.g.,
 $ pgpk -a KEYS
-$ pgpv commons-compress-1.20-bin.tar.gz.asc
+$ pgpv commons-compress-1.21-bin.tar.gz.asc
 or,
 $ pgp -ka KEYS
-$ pgp commons-compress-1.20-bin.tar.gz.asc
+$ pgp commons-compress-1.21-bin.tar.gz.asc
 or,
 $ gpg --import KEYS
-$ gpg --verify commons-compress-1.20-bin.tar.gz.asc
+$ gpg --verify commons-compress-1.21-bin.tar.gz.asc

Modified: release/commons/compress/RELEASE-NOTES.txt
==============================================================================
--- release/commons/compress/RELEASE-NOTES.txt (original)
+++ release/commons/compress/RELEASE-NOTES.txt Mon Jul 12 18:03:34 2021
@@ -5,6 +5,324 @@ compression and archive formats. These
 lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4,
 Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.
+Release 1.21
+
+
+Compress 1.21 is the first release to require Java 8 to build and run.
+
+SevenZFileOptions has a new setting that needs to be enabled explicity
+if SevenZFile should try to recover broken archives - a feature
+introduced with Co
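The README above verifies the OpenPGP signatures; the `.sha512` files shipped next to each archive can be cross-checked by computing the digest directly. A minimal sketch using only the JDK (the `Sha512Check` class name is illustrative, not part of the release; it assumes the published `.sha512` file starts with the hex digest):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Sha512Check {
    // Hex-encode the SHA-512 digest of a file, matching the format of the *.sha512 files.
    static String sha512Hex(Path file) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-512");
        byte[] digest = md.digest(Files.readAllBytes(file));
        StringBuilder sb = new StringBuilder(digest.length * 2);
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        Path archive = Path.of(args[0]); // e.g. commons-compress-1.21-bin.tar.gz
        Path checksumFile = Path.of(args[0] + ".sha512");
        // Assumption: the checksum file contains the hex digest, optionally followed by the file name.
        String expected = Files.readString(checksumFile).trim().split("\\s+")[0];
        System.out.println(sha512Hex(archive).equalsIgnoreCase(expected) ? "OK" : "MISMATCH");
    }
}
```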
[commons-compress] 01/01: prepare RC1 of Commons Compress 1.21
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to annotated tag 1.21-RC1 in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 60e3d9f6bef1e431f8738e881c051d706f81e6cf Author: Stefan Bodewig AuthorDate: Fri Jul 9 18:54:09 2021 +0200 prepare RC1 of Commons Compress 1.21 --- NOTICE.txt | 2 +- README.md | 4 ++-- pom.xml | 2 +- src/site/site.xml | 1 + src/site/xdoc/download_compress.xml | 26 +- 5 files changed, 18 insertions(+), 17 deletions(-) diff --git a/NOTICE.txt b/NOTICE.txt index 132b089..3fb4707 100644 --- a/NOTICE.txt +++ b/NOTICE.txt @@ -1,5 +1,5 @@ Apache Commons Compress -Copyright 2002-2020 The Apache Software Foundation +Copyright 2002-2021 The Apache Software Foundation This product includes software developed at The Apache Software Foundation (https://www.apache.org/). diff --git a/README.md b/README.md index 89ca9b4..8db17e9 100644 --- a/README.md +++ b/README.md @@ -46,7 +46,7 @@ Apache Commons Compress [![Build Status](https://travis-ci.org/apache/commons-compress.svg)](https://travis-ci.org/apache/commons-compress) [![Coverage Status](https://coveralls.io/repos/apache/commons-compress/badge.svg)](https://coveralls.io/r/apache/commons-compress) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.apache.commons/commons-compress/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.apache.commons/commons-compress/) -[![Javadocs](https://javadoc.io/badge/org.apache.commons/commons-compress/1.20.svg)](https://javadoc.io/doc/org.apache.commons/commons-compress/1.20) +[![Javadocs](https://javadoc.io/badge/org.apache.commons/commons-compress/1.21.svg)](https://javadoc.io/doc/org.apache.commons/commons-compress/1.21) [![Fuzzing Status](https://oss-fuzz-build-logs.storage.googleapis.com/badges/apache-commons.svg)](https://bugs.chromium.org/p/oss-fuzz/issues/list?sort=-opened=1=proj:apache-commons) **Note: Commons Compress currently doesn't build on JDK 14+, we 
will @@ -74,7 +74,7 @@ Alternatively you can pull it from the central Maven repositories: org.apache.commons commons-compress - 1.20 + 1.21 ``` diff --git a/pom.xml b/pom.xml index 2fa43c5..757312c 100644 --- a/pom.xml +++ b/pom.xml @@ -24,7 +24,7 @@ commons-compress - 1.21-SNAPSHOT + 1.21 Apache Commons Compress https://commons.apache.org/proper/commons-compress/ 2002 diff --git a/src/site/site.xml b/src/site/site.xml index e96859f..d76013b 100644 --- a/src/site/site.xml +++ b/src/site/site.xml @@ -38,6 +38,7 @@ + diff --git a/src/site/xdoc/download_compress.xml b/src/site/xdoc/download_compress.xml index aeb7043..67d7cf8 100644 --- a/src/site/xdoc/download_compress.xml +++ b/src/site/xdoc/download_compress.xml @@ -113,32 +113,32 @@ limitations under the License. - + - commons-compress-1.20-bin.tar.gz - https://www.apache.org/dist/commons/compress/binaries/commons-compress-1.20-bin.tar.gz.sha512;>sha512 - https://www.apache.org/dist/commons/compress/binaries/commons-compress-1.20-bin.tar.gz.asc;>pgp + commons-compress-1.21-bin.tar.gz + https://www.apache.org/dist/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.sha512;>sha512 + https://www.apache.org/dist/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.asc;>pgp - commons-compress-1.20-bin.zip - https://www.apache.org/dist/commons/compress/binaries/commons-compress-1.20-bin.zip.sha512;>sha512 - https://www.apache.org/dist/commons/compress/binaries/commons-compress-1.20-bin.zip.asc;>pgp + commons-compress-1.21-bin.zip + https://www.apache.org/dist/commons/compress/binaries/commons-compress-1.21-bin.zip.sha512;>sha512 + https://www.apache.org/dist/commons/compress/binaries/commons-compress-1.21-bin.zip.asc;>pgp - commons-compress-1.20-src.tar.gz - https://www.apache.org/dist/commons/compress/source/commons-compress-1.20-src.tar.gz.sha512;>sha512 - https://www.apache.org/dist/commons/compress/source/commons-compress-1.20-src.tar.gz.asc;>pgp + commons-compress-1.21-src.tar.gz + 
https://www.apache.org/dist/commons/compress/source/commons-compress-1.21-src.tar.gz.sha512;>sha512 + https://www.apache.org/dist/commons/compress/source/commons-compress-1.21-src.tar.gz.asc;>pgp - commons-compress-1.20-src.zip - https://www.apache.org/dist/commons/compress/source/commons-compress-1.20-src.zip
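The README diff above bumps the Maven coordinates from 1.20 to 1.21, but the surrounding XML tags were stripped from the quoted snippet. Reconstructed (standard Maven dependency markup; treat as a reconstruction, not the literal README contents), the declaration in a consumer's pom.xml reads:

```xml
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-compress</artifactId>
  <version>1.21</version>
</dependency>
```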
[commons-compress] annotated tag 1.21-RC1 created (now a599d93)
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a change to annotated tag 1.21-RC1
in repository https://gitbox.apache.org/repos/asf/commons-compress.git.

      at a599d93  (tag)
 tagging 60e3d9f6bef1e431f8738e881c051d706f81e6cf (commit)
replaces rel/1.20
      by Stefan Bodewig
      on Fri Jul 9 19:07:14 2021 +0200

- Log -----------------------------------------------------------------
RC1 of Compress 1.21
-----BEGIN PGP SIGNATURE-----

iHEEABECADEWIQTOgHWiUVR77iSbwVGiEVrhX2uLcgUCYOiCQhMcYm9kZXdpZ0Bh
cGFjaGUub3JnAAoJEKIRWuFfa4tyzrAAn2nwwKmjhjxcXN4abVqRxhXuNBrdAJ9h
24VAjoFYBp9p85oH6CAAN6Yh2Q==
=P0Wn
-----END PGP SIGNATURE-----
-----------------------------------------------------------------------

This annotated tag includes the following new commits:

     new 60e3d9f  prepare RC1 of Commons Compress 1.21

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
svn commit: r48755 - in /dev/commons/compress: ./ binaries/ source/
Author: bodewig
Date: Fri Jul 9 17:06:02 2021
New Revision: 48755

Log:
RC1 of Commons Compress 1.21

Added:
    dev/commons/compress/README.html
      - copied, changed from r41696, release/commons/compress/README.html
    dev/commons/compress/RELEASE-NOTES.txt   (with props)
    dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz   (with props)
    dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.asc
    dev/commons/compress/binaries/commons-compress-1.21-bin.tar.gz.sha512
    dev/commons/compress/binaries/commons-compress-1.21-bin.zip   (with props)
    dev/commons/compress/binaries/commons-compress-1.21-bin.zip.asc
    dev/commons/compress/binaries/commons-compress-1.21-bin.zip.sha512
    dev/commons/compress/source/commons-compress-1.21-src.tar.gz   (with props)
    dev/commons/compress/source/commons-compress-1.21-src.tar.gz.asc
    dev/commons/compress/source/commons-compress-1.21-src.tar.gz.sha512
    dev/commons/compress/source/commons-compress-1.21-src.zip   (with props)
    dev/commons/compress/source/commons-compress-1.21-src.zip.asc
    dev/commons/compress/source/commons-compress-1.21-src.zip.sha512

Copied: dev/commons/compress/README.html (from r41696, release/commons/compress/README.html)
==============================================================================
--- release/commons/compress/README.html (original)
+++ dev/commons/compress/README.html Fri Jul 9 17:06:02 2021
@@ -1,6 +1,6 @@
-Commons-Compress 1.20
+Commons-Compress 1.21
-This is the 1.20 release of commons-compress. It is available in both binary and source distributions.
+This is the 1.21 release of commons-compress. It is available in both binary and source distributions.

Note: The tar files in the distribution use GNU tar extensions
@@ -34,12 +34,12 @@ href="https://www.apache.org/dist/common
 Always test available signatures, e.g.,
 $ pgpk -a KEYS
-$ pgpv commons-compress-1.20-bin.tar.gz.asc
+$ pgpv commons-compress-1.21-bin.tar.gz.asc
 or,
 $ pgp -ka KEYS
-$ pgp commons-compress-1.20-bin.tar.gz.asc
+$ pgp commons-compress-1.21-bin.tar.gz.asc
 or,
 $ gpg --import KEYS
-$ gpg --verify commons-compress-1.20-bin.tar.gz.asc
+$ gpg --verify commons-compress-1.21-bin.tar.gz.asc

Added: dev/commons/compress/RELEASE-NOTES.txt
==============================================================================
--- dev/commons/compress/RELEASE-NOTES.txt (added)
+++ dev/commons/compress/RELEASE-NOTES.txt Fri Jul 9 17:06:02 2021
@@ -0,0 +1,1570 @@
+          Apache Commons Compress RELEASE NOTES
+
+Apache Commons Compress software defines an API for working with
+compression and archive formats. These include: bzip2, gzip, pack200,
+lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4,
+Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.
+
+Release 1.21
+
+
+Compress 1.20 now at least requires Java 8 to build and run.
+
+SevenZFileOptions has a new setting that needs to be enabled explicity
+if SevenZFile should try to recover broken archives - a feature
+introduced with Commons Compress 1.19. This is a breaking change if
+you relied on the recovery attempt.
+
+Several formats may now throw IOExceptions when reading broken
+archives or streams that would have caused arbitrary RuntimeExceptions
+in earlier versions of Compress.
+
+New features:
+o Add writePreamble to ZipArchiveInputStream. This method could
+  write raw data to zip archive before any entry was written to
+  the zip archive.
+  For most of the time, this is used to create self-extracting
+  zip.
+  Github Pull Request #127.
+  Issue: COMPRESS-550.
+  Thanks to Scott Frederick.
+o Added support for random access to the TAR packages.
+  Github Pull Request #113.
+  Issue: COMPRESS-540.
+  Thanks to Robin Schimpf.
+o Added support for BufferPool in ZstdCompressorInputStream.
+  Github Pull Request #165.
+  Issue: COMPRESS-565.
+  Thanks to Michael L Heuer.
+o Commons Compress cannot be built with JDK14 due to Pack200 removal.
+  Add Pack200 implementation from Apache Harmony.
+  Issue: COMPRESS-507.
+  Thanks to Gary Gregory, Apache Harmony.
+o Add a new AlwaysWithCompatibility in Zip64Mode, this is a
+  compromise for some libraries including 7z and Expand-Archive
+  Powershell utility(and likely Excel).
+
+  And we will encode both the LFH offset and Disk Number Start
+  in the ZIP64 Extended Information Extra Field - even if only
+  the disk number needs to be encoded.
+
+  Github Pull Request #169.
+  Issue: COMPRESS-565.
+  Thanks to Evgenii Bovykin.
+o gzip deflate buffer size is now configurable.
+  Issue: COMPRESS-566.
+  Thanks to Brett Okken.
+
+Fixed Bugs:
+o Fix bugs in random access of 7z. Problems may happen
+  in a mixture use of random access and sequential access
+  of 7z.
+  Github Pull Request #95.
+  Issue: COMPRESS-505.
+o Fix bugs in random access of 7z. Exceptions are thrown
+  when reading the first entry multiple times by random
+  access.
+  Issue: COMPRES
[commons-compress] branch master updated: create release notes for 1.21
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new dc5c8e4 create release notes for 1.21 dc5c8e4 is described below commit dc5c8e4eb406ed61fc8ec3076dda135b4bf66f63 Author: Stefan Bodewig AuthorDate: Wed Jul 7 19:06:11 2021 +0200 create release notes for 1.21 --- RELEASE-NOTES.txt | 318 ++ 1 file changed, 318 insertions(+) diff --git a/RELEASE-NOTES.txt b/RELEASE-NOTES.txt index 0ee5d5f..65c265b 100644 --- a/RELEASE-NOTES.txt +++ b/RELEASE-NOTES.txt @@ -5,6 +5,324 @@ compression and archive formats. These include: bzip2, gzip, pack200, lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4, Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. +Release 1.21 + + +Compress 1.20 now at least requires Java 8 to build and run. + +SevenZFileOptions has a new setting that needs to be enabled explicity +if SevenZFile should try to recover broken archives - a feature +introduced with Commons Compress 1.19. This is a breaking change if +you relied on the recovery attempt. + +Several formats may now throw IOExceptions when reading broken +archives or streams that would have caused arbitrary RuntimeExceptions +in earlier versions of Compress. + +New features: +o Add writePreamble to ZipArchiveInputStream. This method could + write raw data to zip archive before any entry was written to + the zip archive. + For most of the time, this is used to create self-extracting + zip. + Github Pull Request #127. + Issue: COMPRESS-550. + Thanks to Scott Frederick. +o Added support for random access to the TAR packages. + Github Pull Request #113. + Issue: COMPRESS-540. + Thanks to Robin Schimpf. +o Added support for BufferPool in ZstdCompressorInputStream. + Github Pull Request #165. + Issue: COMPRESS-565. + Thanks to Michael L Heuer. 
+o Commons Compress cannot be built with JDK14 due to Pack200 removal. + Add Pack200 implementation from Apache Harmony. + Issue: COMPRESS-507. + Thanks to Gary Gregory, Apache Harmony. +o Add a new AlwaysWithCompatibility in Zip64Mode, this is a + compromise for some libraries including 7z and Expand-Archive + Powershell utility(and likely Excel). + + And we will encode both the LFH offset and Disk Number Start + in the ZIP64 Extended Information Extra Field - even if only + the disk number needs to be encoded. + + Github Pull Request #169. + Issue: COMPRESS-565. + Thanks to Evgenii Bovykin. +o gzip deflate buffer size is now configurable. + Issue: COMPRESS-566. + Thanks to Brett Okken. + +Fixed Bugs: +o Fix bugs in random access of 7z. Problems may happen + in a mixture use of random access and sequential access + of 7z. + Github Pull Request #95. + Issue: COMPRESS-505. +o Fix bugs in random access of 7z. Exceptions are thrown + when reading the first entry multiple times by random + access. + Issue: COMPRESS-510. +o Add '/' to directories with long name in tar. This is to + resolve the ambiguous behavior of the TarArchiveEntry.getName() + method between directory with short name and long name. + Issue: COMPRESS-509. + Thanks to Petr Vasak. +o Removed the PowerMock dependency. + Issue: COMPRESS-520. + Thanks to Robin Schimpf. +o Added improved checks to detect corrupted bzip2 streams and + throw the expected IOException rather than obscure + RuntimeExceptions. + See also COMPRESS-519. + Issue: COMPRESS-516. +o Improved parsing of X5455_ExtendedTimestamp ZIP extra field. + Issue: COMPRESS-517. +o ZipArchiveInputStream and ZipFile will now throw an + IOException rather than a RuntimeException if the zip64 extra + field of an entry could not be parsed. + Issue: COMPRESS-518. +o Improved detection of corrupt ZIP archives in ZipArchiveInputStream. + Issue: COMPRESS-523. 
+o Added improved checks to detect corrupted deflate64 streams and + throw the expected IOException rather than obscure + RuntimeExceptions. + Issues: COMPRESS-521, COMPRESS-522, COMPRESS-525, COMPRESS-526, and COMPRESS-527. +o Add the archive name in the exception in the constructor of + ZipFile to make it a more specific exception. + Github Pull Request #102. + Issue: COMPRESS-515. + Thanks to ian-lavallee. +o Throw IOException when it encounters a non-number while parsing pax + header. + Issue: COMPRESS-530. +o Throw IOException when a a tar archive contains a PAX header + without any normal entry following it. + Issue: COMPRESS-531. +o Added improved checks to detect corrupted IMPLODED streams and + throw the expected IOException rather than obscure + RuntimeExceptions. + Issue: COMPRESS-532. +o Throw expected IOException instead of NumberFormatException if + it encounters non-numbers when parsing pax headers for tarball. + + Throw IllegalArgumentException
[commons-compress] branch master updated: some additional documentation updates
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 93a35f0 some additional documentation updates 93a35f0 is described below commit 93a35f0d49e25f4a841c2f5d8d75c4cbcb2552ea Author: Stefan Bodewig AuthorDate: Sun Jul 4 12:36:51 2021 +0200 some additional documentation updates --- src/site/xdoc/examples.xml| 32 ++-- src/site/xdoc/index.xml | 10 ++ src/site/xdoc/limitations.xml | 6 +++--- 3 files changed, 35 insertions(+), 13 deletions(-) diff --git a/src/site/xdoc/examples.xml b/src/site/xdoc/examples.xml index ce7e813..2ceac92 100644 --- a/src/site/xdoc/examples.xml +++ b/src/site/xdoc/examples.xml @@ -98,8 +98,9 @@ CompressorInputStream input = new CompressorStreamFactory() stream. As of 1.14 this setting only affects decompressing Z, XZ and LZMA compressed streams. Since Compress 1.19 SevenZFile also has an -optional constructor to pass an upper memory limit. Supported -are LZMA compressed streams. +optional constructor to pass an upper memory limit which is supported +are LZMA compressed streams. Since Compress 1.21 this setting +also is taken into account when reading the metadata of an archive. For the Snappy and LZ4 formats the amount of memory used during compression is directly proportional to the window size. @@ -239,7 +240,7 @@ try (InputStream fi = Files.newInputStream(Paths.get("my.tar.gz")); archive, you can first use createArchiveEntry for each file. In general this will set a few flags (usually the last modified time, the size and the information whether this -is a file or directory) based on the File +is a file or directory) based on the File or Path instance. Alternatively you can create the ArchiveEntry subclass corresponding to your format directly. 
Often you may want to set additional flags @@ -430,6 +431,23 @@ LOOP UNTIL entry.getSize() HAS BEEN READ { will likely be significantly slower than sequential access. +Recovering from Certain Broken 7z Archives + +Starting with Compress 1.19 SevenZFile tries +to recover archives that look as if they were part of a +multi-volume archive where the first volume has been removed +too early. + +Starting with Compress 1.21 this option has to be enabled +explicitly in SevenZFileOptions. The way recovery +works is by Compress scanning an archive from the end for +something that might look like valid 7z metadata and use that, +if it can successfully parse the block of data. When doing so +Compress may encounter blocks of metadata that look like the +metadata of very large archives which in turn may make +Compress allocate a lot of memory. Therefore we strongly +recommend you also set a memory limit inside the +SevenZFileOptions if you enable recovery. @@ -931,7 +949,7 @@ in.close(); Compress offers two different stream classes for reading or writing either format. -Uncompressing a given frame LZ4 file (you would +Uncompressing a given framed LZ4 file (you would certainly add exception handling and make sure all streams get closed properly):
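The 7z recovery behaviour documented above is controlled through `SevenZFileOptions`. A sketch of client code enabling it together with the recommended memory limit, assuming the 1.21 builder methods `withTryToRecoverBrokenArchives` and `withMaxMemoryLimitInKb` and requiring commons-compress 1.21 on the classpath:

```java
import java.io.File;
import java.io.IOException;
import org.apache.commons.compress.archivers.sevenz.SevenZArchiveEntry;
import org.apache.commons.compress.archivers.sevenz.SevenZFile;
import org.apache.commons.compress.archivers.sevenz.SevenZFileOptions;

public class RecoverBroken7z {
    public static void main(String[] args) throws IOException {
        SevenZFileOptions options = SevenZFileOptions.builder()
            // Recovery is no longer attempted by default since 1.21;
            // it has to be switched on explicitly.
            .withTryToRecoverBrokenArchives(true)
            // Cap memory used while parsing (possibly bogus) metadata,
            // as the documentation above strongly recommends.
            .withMaxMemoryLimitInKb(64 * 1024)
            .build();
        try (SevenZFile archive = new SevenZFile(new File(args[0]), options)) {
            SevenZArchiveEntry entry;
            while ((entry = archive.getNextEntry()) != null) {
                System.out.println(entry.getName());
            }
        }
    }
}
```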
[commons-compress] 02/03: update documentation
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit c832c8d07572e92e5be00f70509d1b911bc009f8 Author: Stefan Bodewig AuthorDate: Sat Jul 3 17:39:09 2021 +0200 update documentation --- src/changes/changes.xml | 15 --- src/site/xdoc/index.xml | 16 +++- src/site/xdoc/limitations.xml | 8 ++-- src/site/xdoc/tar.xml | 18 -- 4 files changed, 45 insertions(+), 12 deletions(-) diff --git a/src/changes/changes.xml b/src/changes/changes.xml index 5a1364a..35a8cc3 100644 --- a/src/changes/changes.xml +++ b/src/changes/changes.xml @@ -42,8 +42,14 @@ The type attribute can be add,update,fix,remove. Apache Commons Compress Release Notes - + Made sure ZstdCompressorOutputStream no longer used @@ -99,7 +105,7 @@ The type attribute can be add,update,fix,remove. Add the archive name in the exception in the constructor of -ZipFIle to make it a more specific exception. +ZipFile to make it a more specific exception. Github Pull Request #102. @@ -204,6 +210,9 @@ The type attribute can be add,update,fix,remove. Add a new maven profile in pom.xml for JDK14+ to ignore the failing tests about Pack200. + +This has later been superseded by adding the Apache Harmony +classes for Pack200 support. diff --git a/src/site/xdoc/index.xml b/src/site/xdoc/index.xml index b267cdc..ec333a2 100644 --- a/src/site/xdoc/index.xml +++ b/src/site/xdoc/index.xml @@ -52,12 +52,26 @@ - The current release is 1.20 and requires Java 7. + The current release is 1.21 and requires Java 8. Below we highlight some new features, for a full list of changes see the Changes Report. + + + A new class TarFile provides random + access to TAR archives. + Commons Compress now ships with a copy of the + Pack200 code of the retired Apache Harmony project. 
The + pack200 support in Commons Compress no longer uses the + implementation of the Java class library - and thus also + works for Java 14 and later. + Added new methods supporting + java.nio.Path as an alternative to + java.io.File to many classes. + + SevenZFile now supports random diff --git a/src/site/xdoc/limitations.xml b/src/site/xdoc/limitations.xml index ea72416..e1c62d1 100644 --- a/src/site/xdoc/limitations.xml +++ b/src/site/xdoc/limitations.xml @@ -161,10 +161,14 @@ - Pack200 support in Commons Compress relies on the + Pack200 support in Commons Compress prior to 1.21 relies on the Pack200 class of the Java classlib. Java 14 removed support and thus Pack200 will not work at all when - running on Java 14 or later. + running on Java 14 or later. + Starting with Commons Compress 1.21 the classlib + implementation is no longer used at all, instead Commons + Compress contains the pack200 code of the retired Apache + Harmony project. diff --git a/src/site/xdoc/tar.xml b/src/site/xdoc/tar.xml index c17ea8c..79a6943 100644 --- a/src/site/xdoc/tar.xml +++ b/src/site/xdoc/tar.xml @@ -126,13 +126,12 @@ -TarArchiveInputStream will recognize sparse +Prior to Commons Compress 1.20 TarArchiveInputStream would recognize sparse file entries stored using the "oldgnu" format -(--sparse-version=0.0 in GNU tar) but is not -able to extract them correctly. canReadEntryData will return false -on such entries. The other variants of sparse files can -currently not be detected at all. +(--sparse-version=0.0 in GNU tar) but was not +able to extract them correctly. Starting with Commons Compress 1.20 +all GNU and POSIX variants of sparse files are recognized and +can be read. @@ -223,6 +222,13 @@ + +Starting with Commons Compress 1.21 the tar package +contains a TarFile class that provides random +access to archives. Except for the ability to access entries +out of order TarFile is not superior to +TarArchiveInputStream. +
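The new `TarFile` class announced in the documentation above offers list-then-read random access, the same access pattern `java.util.zip.ZipFile` has long offered for zip archives. As a stand-alone illustration of that pattern (using the JDK's `ZipFile`, since `TarFile` itself requires commons-compress 1.21 on the classpath), the sketch below reads the second entry of an archive without streaming through the first; the file and entry names are made up for the example.

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class RandomAccessExample {

    // Write a small two-entry archive to demonstrate against.
    static void createArchive(File archive) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(archive.toPath()))) {
            zos.putNextEntry(new ZipEntry("first.txt"));
            zos.write("one".getBytes(StandardCharsets.UTF_8));
            zos.closeEntry();
            zos.putNextEntry(new ZipEntry("second.txt"));
            zos.write("two".getBytes(StandardCharsets.UTF_8));
            zos.closeEntry();
        }
    }

    // Jump straight to a named entry without streaming through the entries
    // before it -- the out-of-order access TarFile adds for tar in 1.21.
    static String readEntry(File archive, String name) throws IOException {
        try (ZipFile zf = new ZipFile(archive)) {
            ZipEntry entry = zf.getEntry(name);
            try (InputStream in = zf.getInputStream(entry)) {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                int b;
                while ((b = in.read()) != -1) {
                    out.write(b);
                }
                return out.toString("UTF-8");
            }
        }
    }

    public static void main(String[] args) throws IOException {
        File archive = File.createTempFile("random-access-demo", ".zip");
        archive.deleteOnExit();
        createArchive(archive);
        System.out.println(readEntry(archive, "second.txt")); // prints "two"
    }
}
```

A sequential stream class would have to read and discard "first.txt" before reaching "second.txt"; the random-access class resolves the entry from its index instead, which is what makes out-of-order extraction cheap.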
[commons-compress] 01/03: address some findings of static code analysis
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit cd831ba31944988d22b4536933e1c7127718551f Author: Stefan Bodewig AuthorDate: Sat Jul 3 17:14:15 2021 +0200 address some findings of static code analysis --- .../commons/compress/archivers/sevenz/SevenZFile.java| 16 .../commons/compress/archivers/tar/TarArchiveEntry.java | 9 - 2 files changed, 12 insertions(+), 13 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index 867ed18..7625c91 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -615,7 +615,7 @@ public class SevenZFile implements Closeable { int nid = getUnsignedByte(header); if (nid == NID.kArchiveProperties) { -sanityCheckArchiveProperties(header, stats); +sanityCheckArchiveProperties(header); nid = getUnsignedByte(header); } @@ -652,7 +652,7 @@ public class SevenZFile implements Closeable { } } -private void sanityCheckArchiveProperties(final ByteBuffer header, final ArchiveStatistics stats) +private void sanityCheckArchiveProperties(final ByteBuffer header) throws IOException { int nid = getUnsignedByte(header); while (nid != NID.kEnd) { @@ -878,7 +878,7 @@ public class SevenZFile implements Closeable { final int numFoldersInt = (int) readUint64(header); final Folder[] folders = new Folder[numFoldersInt]; archive.folders = folders; -final int external = getUnsignedByte(header); +/* final int external = */ getUnsignedByte(header); for (int i = 0; i < numFoldersInt; i++) { folders[i] = readFolder(header); } @@ -1391,7 +1391,7 @@ public class SevenZFile implements Closeable { break; } case NID.kName: { -final int external = getUnsignedByte(header); +/* final int external = */ 
getUnsignedByte(header); final byte[] names = new byte[(int) (size - 1)]; final int namesLength = names.length; get(header, names); @@ -1412,7 +1412,7 @@ public class SevenZFile implements Closeable { } case NID.kCTime: { final BitSet timesDefined = readAllOrBits(header, numFilesInt); -final int external = getUnsignedByte(header); +/* final int external = */ getUnsignedByte(header); for (int i = 0; i < numFilesInt; i++) { checkEntryIsInitialized(fileMap, i); final SevenZArchiveEntry entryAtIndex = fileMap.get(i); @@ -1425,7 +1425,7 @@ public class SevenZFile implements Closeable { } case NID.kATime: { final BitSet timesDefined = readAllOrBits(header, numFilesInt); -final int external = getUnsignedByte(header); +/* final int external = */ getUnsignedByte(header); for (int i = 0; i < numFilesInt; i++) { checkEntryIsInitialized(fileMap, i); final SevenZArchiveEntry entryAtIndex = fileMap.get(i); @@ -1438,7 +1438,7 @@ public class SevenZFile implements Closeable { } case NID.kMTime: { final BitSet timesDefined = readAllOrBits(header, numFilesInt); -final int external = getUnsignedByte(header); +/* final int external = */ getUnsignedByte(header); for (int i = 0; i < numFilesInt; i++) { checkEntryIsInitialized(fileMap, i); final SevenZArchiveEntry entryAtIndex = fileMap.get(i); @@ -1451,7 +1451,7 @@ public class SevenZFile implements Closeable { } case NID.kWinAttributes: { final BitSet attributesDefined = readAllOrBits(header, numFilesInt); -final int external = getUnsignedByte(header); +/* final int external = */ getUnsignedByte(header); for (int i = 0; i < numFilesInt; i++) { checkEntryIsInitialized(fileMap, i); final SevenZArchiveEntry entryAtIndex = fileMap.get(i); diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java index 44dcf54..b547e30 100644 --- a/src/main/java/org
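The static-analysis cleanup above repeatedly turns `final int external = getUnsignedByte(header);` into `/* final int external = */ getUnsignedByte(header);` — the assigned variable was unused, but the read itself must stay, because it advances the buffer position past a field the parser does not care about. A minimal self-contained sketch of that idiom (the helper mirrors the shape of `SevenZFile`'s `getUnsignedByte`, but the data here is invented):

```java
import java.nio.ByteBuffer;

public class SkipByteExample {

    // Consume one byte, widen it to an unsigned int, and advance the
    // buffer position -- same shape as SevenZFile's private helper.
    static int getUnsignedByte(ByteBuffer buf) {
        return buf.get() & 0xFF;
    }

    public static void main(String[] args) {
        ByteBuffer header = ByteBuffer.wrap(new byte[] {0x00, 0x2A});
        // The first byte (the "external" flag in this made-up header) is
        // not used, but it still has to be consumed so the next field
        // lines up correctly.
        /* final int external = */ getUnsignedByte(header);
        int next = getUnsignedByte(header);
        System.out.println(next); // prints 42
    }
}
```

Deleting the call entirely (rather than just the unused variable) would silently shift every following field by one byte, which is why the commit keeps the read and only comments out the assignment.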
[commons-compress] branch master updated (b1ecf0b -> 220bdd6)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git. from b1ecf0b copy Harmony's Pack200 tests new cd831ba address some findings of static code analysis new c832c8d update documentation new 220bdd6 made sure, all JIRAs resolved for 1.21 are included The 3 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: src/changes/changes.xml| 50 -- .../compress/archivers/sevenz/SevenZFile.java | 16 +++ .../compress/archivers/tar/TarArchiveEntry.java| 9 ++-- src/site/xdoc/index.xml| 16 ++- src/site/xdoc/limitations.xml | 8 +++- src/site/xdoc/tar.xml | 18 +--- 6 files changed, 92 insertions(+), 25 deletions(-)
[commons-compress] 03/03: made sure, all JIRAs resolved for 1.21 are included
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 220bdd6f1315081efe4fde0f8875bf794e84238b Author: Stefan Bodewig AuthorDate: Sat Jul 3 18:00:06 2021 +0200 made sure, all JIRAs resolved for 1.21 are included --- src/changes/changes.xml | 35 +++ 1 file changed, 35 insertions(+) diff --git a/src/changes/changes.xml b/src/changes/changes.xml index 35a8cc3..67603d5 100644 --- a/src/changes/changes.xml +++ b/src/changes/changes.xml @@ -133,6 +133,16 @@ you relied on the recovery attempt."> of a traditional tar header while bigNumberMode is BIGNUMBER_ERROR, and address this in java docs. + +Made an inner class static. +Github Pull Request #107. + + +Added an early exit to a loop in BZip2CompressorOutputStream. +Github Pull Request #106. + Update the class of variable file in TarArchiveEntry from java.io.File to java.nio.file.Path. Corresponding constructors @@ -224,6 +234,11 @@ you relied on the recovery attempt."> Throw a declared IOException if a null entry is met when reading a global pax header instead of a runtime NPE. + +ZIP extraction could lead to ArrayIndexOutOfBoundsExceptions +rather than the expected IOException. + Add asserts for Arrays.copyOf in X0017_StrongEncryptionHeader. @@ -389,6 +404,26 @@ you relied on the recovery attempt."> Github Pull Request #169. + +Some minor improvements. +Github Pull Request #193. + + +Java 8 improvements. +Github Pull Request #194. + + +Remove redundant local variable. +Github Pull Request #195. + + +Remove redundant operation. +Github Pull Request #196. + gzip deflate buffer size is now configurable.
[commons-compress] branch pack200-harmony-as-fallback updated (d358036 -> 4036af0)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch pack200-harmony-as-fallback in repository https://gitbox.apache.org/repos/asf/commons-compress.git. discard d358036 try to use java.util.jar.Pack200, only fall back to Harmony add a143029 re-enable Pack200 tests on Java 14+ add b1ecf0b copy Harmony's Pack200 tests new 4036af0 try to use java.util.jar.Pack200, only fall back to Harmony This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this: * -- * -- B -- O -- O -- O (d358036) \ N -- N -- N refs/heads/pack200-harmony-as-fallback (4036af0) You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. The 1 revision listed above as "new" is entirely new to this repository and will be described in a separate email. The revisions listed as "add" were already present in the repository and have only been added to this reference. 
Summary of changes: pom.xml| 20 - .../harmony/pack200/tests/ArchiveTest.java | 357 +++ .../harmony/pack200/tests/BHSDCodecTest.java | 82 +++ .../harmony/pack200/tests/CodecEncodingTest.java | 311 + .../compress/harmony/pack200/tests/CodecTest.java | 244 .../harmony/pack200/tests/HelloWorld.java} | 24 +- .../pack200/tests/NewAttributeBandsTest.java | 330 ++ .../harmony/pack200/tests/PackingOptionsTest.java | 696 + .../harmony/pack200/tests/PopulationCodecTest.java | 83 +++ .../harmony/pack200/tests/RunCodecTest.java| 166 + .../unpack200/tests/AbstractBandsTestCase.java | 83 +++ .../harmony/unpack200/tests/ArchiveTest.java | 346 ++ .../unpack200/tests/AttributeLayoutMapTest.java| 54 ++ .../unpack200/tests/AttributeLayoutTest.java | 142 + .../harmony/unpack200/tests/BandSetTest.java | 82 +++ .../harmony/unpack200/tests/BcBandsTest.java | 672 .../harmony/unpack200/tests/CPUTF8Test.java} | 29 +- .../harmony/unpack200/tests/ClassBandsTest.java| 179 ++ .../harmony/unpack200/tests/CodeAttributeTest.java | 203 ++ .../harmony/unpack200/tests/ICTupleTest.java | 64 ++ .../unpack200/tests/NewAttributeBandsTest.java | 256 .../tests/SegmentConstantPoolArrayCacheTest.java | 78 +++ .../unpack200/tests/SegmentConstantPoolTest.java | 121 .../unpack200/tests/SegmentOptionsTest.java} | 47 +- .../harmony/unpack200/tests/SegmentTest.java | 119 .../harmony/unpack200/tests/SegmentUtilsTest.java | 82 +++ .../unpack200/tests/bytecode/ByteCodeTest.java}| 24 +- .../tests/bytecode/ClassFileEntryTest.java | 108 .../unpack200/tests/bytecode/ConstantPoolTest.java | 69 ++ src/test/resources/pack200/HelloWorld.pack | Bin 0 -> 530 bytes src/test/resources/pack200/InterfaceOnly.pack | Bin 0 -> 137 bytes src/test/resources/pack200/JustResources.pack | Bin 0 -> 51 bytes src/test/resources/pack200/JustResources.pack.gz | Bin 0 -> 82 bytes src/test/resources/pack200/LargeClass.pack.gz | Bin 0 -> 752 bytes src/test/resources/pack200/annotations.jar | Bin 0 -> 3683 bytes 
src/test/resources/pack200/annotations.pack.gz | Bin 0 -> 712 bytes .../resources/pack200/annotations2unpacked.jar | Bin 0 -> 2919 bytes src/test/resources/pack200/annotationsRI.jar | Bin 0 -> 3539 bytes src/test/resources/pack200/annotationsRI.pack.gz | Bin 0 -> 1003 bytes src/test/resources/pack200/annotationsUnpacked.jar | Bin 0 -> 2267 bytes src/test/resources/pack200/hw.jar | Bin 0 -> 842 bytes src/test/resources/pack200/jars/ant.jar| Bin 0 -> 958858 bytes src/test/resources/pack200/jndi-e1.pack.gz | Bin 0 -> 75058 bytes src/test/resources/pack200/jndi.jar| Bin 0 -> 236738 bytes src/test/resources/pack200/jndiUnpacked.jar| Bin 0 -> 232433 bytes .../pack200/jndiWithUnknownAttributes.jar | Bin 0 -> 238035 bytes src/test/resources/pack200/largeClassUnpacked.jar | Bin 0 -> 10793 bytes .../pack200/p200WithUnknownAttributes.jar | Bin 0 -> 74529 bytes .../pack200/p200WithUnknownAttributes2.jar | Bin 0 -> 73139 bytes src/test/resources/pack200/pack200-e1.pack.gz | Bin 0 -> 14751 bytes src/test/resources/pack200/pac
[commons-compress] branch master updated: re-enable Pack200 tests on Java 14+
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new a143029 re-enable Pack200 tests on Java 14+ a143029 is described below commit a143029ae1171bb2093dc175c524f358ed5fede9 Author: Stefan Bodewig AuthorDate: Sat Jul 3 10:10:12 2021 +0200 re-enable Pack200 tests on Java 14+ --- pom.xml | 20 1 file changed, 20 deletions(-) diff --git a/pom.xml b/pom.xml index 1ec23b7..2fa43c5 100644 --- a/pom.xml +++ b/pom.xml @@ -552,26 +552,6 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. - java14+ - -[14,) - - - - -maven-surefire-plugin - - -**/DetectCompressorTestCase.java -**/Pack200TestCase.java -**/Pack200UtilsTest.java - - - - - - - java11+ [11,)
[commons-compress] branch pack200-harmony-as-fallback updated (fd3f09c -> d358036)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch pack200-harmony-as-fallback in repository https://gitbox.apache.org/repos/asf/commons-compress.git. omit fd3f09c adapt to changed test dependencies in master omit 3b2227a sync pom with parent, hoping to get github actions build to pass omit 18cb531 sync github actions config with master omit 58c250c run tests on Java 14+ in CI omit d3ef88f try to use java.util.jar.Pack200, only fall back to Harmony add 92d9df3 ignore Pack200 tests for jdk14+ (#129) add 4ede294 Trigger GitHub builds for PRs. add 08d754c COMPRESS-548 : throw exception if length of zip extra field is too short add a0ec219 Use US English spelling. add 319a848 COMPRESS-554 : throw IOException if error is met add 3b28227 COMPRESS-554 : update name of testcase add 4eb3bbe COMPRESS-547 : add asserts for Arrays.copyOf add a40c53d COMPRESS-550 : add writePreamble to ZipArchiveInputStream add b0d590d COMPRESS-550 : document in changes add db932dd Bump actions/setup-java from v1.4.1 to v1.4.2 (#133) add 876ee87 Update actions/setup-java from v1.4.1 to v1.4.2 #133. add 550e525 Update actions/setup-java from v1.4.1 to v1.4.2 #133. add cf2f6c6 Replace Java 14 with Java 15 as the latest Java version to test. Use Java 16 EA as the EA version to test. add 0628668 Use uppercase for long literal suffix. add 8a65cc9 Fix for CFH detect in ZipArchiveInputStream add 555daa4 COMPRESS-553 : fix for pax header of tar add 5c6f14c update CRLF to LF add 294f555 Add SECURITY.MD. add f56300c Use lambdas. add ef3cad0 Modify some calls of method Collection.toArray add 80057c3 Typo. add ea1cba2 Access static methods directly. add 82f7401 Use final. add 3c05472 Remove unused import. add cbdb1e2 No need to nest else clause. add d68da9b No need to throw UnsupportedEncodingException from this private method. add e61abc3 No need to throw IOException from this private method. add e0b11ec Use try-with-resources. 
add e21f1ac Fix Javadoc link. add 99cc3fb Use Objects.equals(). add 3a69e1c Bump junit from 4.13 to 4.13.1 (#143) add d8a0142 Bump actions/checkout from v2.3.2 to v2.3.3 (#139) add 7ceae5d Bump actions/setup-java from v1.4.2 to v1.4.3 (#141) add 16a7fbb Bump junit from 4.13 to 4.13.1 #143. Bump actions/checkout from v2.3.2 to v2.3.3 #139. Bump actions/setup-java from v1.4.2 to v1.4.3 #141. add b08186b Update com.github.luben:zstd-jni from 1.4.5-6 -> 1.4.5-12. add 176abe1 Replace Java 14 with 15 (latest) on Travis. add 9e3c2ae Bump memoryfilesystem from 1.3.0 to 2.1.0 #131. add 16391f3 Bump memoryfilesystem from 1.3.0 to 2.1.0 (#131) add 648602a Merge branch 'master' of https://gitbox.apache.org/repos/asf/commons-compress.git add bf2fbb6 Update Mockito 1.10.19 -> 3.6.0. add 498db43 Remove dead comments. add bc754a3 Bump actions/checkout from v2.3.3 to v2.3.4 (#150) add 4ded585 Bump actions/checkout from v2.3.3 to v2.3.4 #150. add 6c53849 COMPRESS-509 : document this change in changes.xml add 4f52e4a COMPRESS-558: Fix accidentally added / to file names add de1f857 document COMPRESS-558 in changes.xml add 64f01c6 Use a switch instead of a cascading if-else. add 4c73313 Use Objects.hash(). add 32012f9 Remove redundant calls to super(). add 5cacb5d No need to initialize to default. add 993d587 Use final. add d959766 Use try-with-resource. add ae5baa9 Remove redundant modifiers like private on enum constructors. add ef28595 Use blocks. add 37f193f Use final. add 50174a9 Use upper-case 'L' instead of 'l' for long literal suffix. add 0bff34c Redundant return. 
add b86feb6 COMPRESS-560: Do not return false if the entry is a tar sparse entry add 7d00426 COMPRESS-560: Use assumeTrue/assumeFalse to check if the test should be run on the current OS add 291b49d COMPRESS-560: Add message to assume check add 1258341 document pull request #153 add 05a995e do not use wildcard imports add 75a16c2 COMPRESS-559: Extract sparsefile-0.1 also on Linux add d98e849 COMPRESS-559: Skip extracting with GNU tar 1.28 add 4cf5b34 document COMPRESS-559 and make some comments add 1470c26 Fix spelling. add d462ac4 COMPRESS-561 - Minor improvement add ffc5c18 Add final add 57bcd9b document COMPRESS-561 add 81299d2 add more information about allowStoredEntriesWithDataDescriptor add ced31bd Add Java 17-EA to the GitHub build. add 505feaf Use String#isEmpty(). add b0df98a Breaking binary compatibility should break the build instead of generating a report one might skip reading, especially for PRs. add 4944b13 mockito.version
[commons-compress] branch master updated: remove pattern where we first allocate an array and then try to fill it
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 80124dd remove pattern where we first allocate an array and then try to fill it 80124dd is described below commit 80124dd9fe4b0a0b2e203ca19aacac8cd0afc96f Author: Stefan Bodewig AuthorDate: Fri Jul 2 20:00:47 2021 +0200 remove pattern where we first allocate an array and then try to fill it --- .../archivers/ar/ArArchiveInputStream.java | 16 ++-- .../archivers/arj/ArjArchiveInputStream.java | 29 --- .../archivers/cpio/CpioArchiveInputStream.java | 19 +++-- .../archivers/dump/DumpArchiveInputStream.java | 10 ++- .../compress/archivers/dump/TapeInputStream.java | 11 ++- .../compress/archivers/examples/Expander.java | 2 +- .../compress/archivers/sevenz/SevenZFile.java | 5 +- .../commons/compress/archivers/tar/TarUtils.java | 4 +- .../commons/compress/archivers/zip/BinaryTree.java | 5 +- .../zip/X0017_StrongEncryptionHeader.java | 2 +- .../archivers/zip/ZipArchiveInputStream.java | 27 +++--- .../commons/compress/archivers/zip/ZipFile.java| 24 -- .../org/apache/commons/compress/utils/IOUtils.java | 98 ++ .../apache/commons/compress/utils/IOUtilsTest.java | 84 +++ 14 files changed, 258 insertions(+), 78 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/ar/ArArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/ar/ArArchiveInputStream.java index 36ef33f..f30951d 100644 --- a/src/main/java/org/apache/commons/compress/archivers/ar/ArArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/ar/ArArchiveInputStream.java @@ -101,8 +101,8 @@ public class ArArchiveInputStream extends ArchiveInputStream { if (offset == 0) { final byte[] expected = ArchiveUtils.toAsciiBytes(ArArchiveEntry.HEADER); -final byte[] realized = new byte[expected.length]; -final 
int read = IOUtils.readFully(input, realized); +final byte[] realized = IOUtils.readRange(input, expected.length); +final int read = realized.length; trackReadBytes(read); if (read != expected.length) { throw new IOException("Failed to read header. Occurred at byte: " + getBytesRead()); } @@ -133,8 +133,8 @@ { final byte[] expected = ArchiveUtils.toAsciiBytes(ArArchiveEntry.TRAILER); -final byte[] realized = new byte[expected.length]; -final int read = IOUtils.readFully(input, realized); +final byte[] realized = IOUtils.readRange(input, expected.length); +final int read = realized.length; trackReadBytes(read); if (read != expected.length) { throw new IOException("Failed to read entry trailer. Occurred at byte: " + getBytesRead()); } @@ -340,8 +340,8 @@ private String getBSDLongName(final String bsdLongName) throws IOException { final int nameLen = Integer.parseInt(bsdLongName.substring(BSD_LONGNAME_PREFIX_LEN)); -final byte[] name = new byte[nameLen]; -final int read = IOUtils.readFully(input, name); +final byte[] name = IOUtils.readRange(input, nameLen); +final int read = name.length; trackReadBytes(read); if (read != nameLen) { throw new EOFException(); } @@ -386,8 +386,8 @@ */ private ArArchiveEntry readGNUStringTable(final byte[] length, final int offset, final int len) throws IOException { final int bufflen = asInt(length, offset, len); // Assume length will fit in an int -namebuffer = new byte[bufflen]; -final int read = IOUtils.readFully(input, namebuffer, 0, bufflen); +namebuffer = IOUtils.readRange(input, bufflen); +final int read = namebuffer.length; trackReadBytes(read); if (read != bufflen){ throw new IOException("Failed to read complete record: expected=" diff --git a/src/main/java/org/apache/commons/compress/archivers/arj/ArjArchiveInputStream.java 
b/src/main/java/org/apache/commons/compress/archivers/arj/ArjArchiveInputStream.java index b0c16b2..4b20c12 100644 --- a/src/main/java/org/apache/commons/compress/archivers/arj/ArjArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/arj/ArjArchiveInputStream.java @@ -20,6 +20,7 @@ package org.apache.commons.compress.archivers.arj; import java.io.ByteArrayInputStream; import java.io.
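The commit above replaces the "allocate an array of the claimed size, then try to fill it" pattern with `IOUtils.readRange`, so a lying length field in an archive header can no longer force a huge up-front allocation for data that is not actually there. Below is a minimal self-contained sketch of the idea behind `readRange` — my own simplified version for illustration, not the actual commons-compress implementation:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadRangeExample {

    /**
     * Read at most len bytes, growing the result as data actually arrives
     * instead of allocating a len-sized array up front. If the stream ends
     * early, the returned array is simply shorter than len.
     */
    static byte[] readRange(InputStream in, int len) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[Math.min(len, 8192)]; // bounded scratch buffer
        int remaining = len;
        while (remaining > 0) {
            int n = in.read(buffer, 0, Math.min(remaining, buffer.length));
            if (n == -1) {
                break; // stream ended before len bytes arrived
            }
            out.write(buffer, 0, n);
            remaining -= n;
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // The header claims 1 GiB, but only 5 bytes exist: we never hold
        // more than those 5 bytes (plus the small scratch buffer).
        InputStream in = new ByteArrayInputStream(new byte[] {1, 2, 3, 4, 5});
        byte[] read = readRange(in, 1 << 30);
        System.out.println(read.length); // prints 5
    }
}
```

Callers then compare the returned length against the expected one and throw, exactly as the `read != expected.length` checks in the diff above do — the difference is that the failure is detected *without* first committing memory for the claimed size.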
[commons-compress] branch master updated: sanity check entry sizes in TarFile
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new d80d76f sanity check entry sizes in TarFile d80d76f is described below commit d80d76f81b45655b0b1d234d25ee26f817f7b770 Author: Stefan Bodewig AuthorDate: Fri Jul 2 15:34:45 2021 +0200 sanity check entry sizes in TarFile Credit to OSS-Fuzz --- src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java | 5 - 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java index 70e314a..5491c8b 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java @@ -667,8 +667,11 @@ public class TarFile implements Closeable { private int currentSparseInputStreamIndex; -BoundedTarEntryInputStream(final TarArchiveEntry entry, final SeekableByteChannel channel) { +BoundedTarEntryInputStream(final TarArchiveEntry entry, final SeekableByteChannel channel) throws IOException { super(entry.getDataOffset(), entry.getRealSize()); +if (channel.size() - entry.getSize() < entry.getDataOffset()) { +throw new IOException("entry size exceeds archive size"); +} this.entry = entry; this.channel = channel; }
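The sanity check added above is written as `channel.size() - entry.getSize() < entry.getDataOffset()` rather than the more natural `dataOffset + size > channelSize`. Phrasing it as a subtraction matters: with an attacker-supplied size near `Long.MAX_VALUE`, the addition could overflow to a negative number and the check would pass. A self-contained sketch of the same guard (method and parameter names are mine, not from the commit):

```java
import java.io.IOException;

public class EntrySizeCheck {

    /**
     * Reject entries whose claimed payload cannot fit between its data
     * offset and the end of the archive. Using subtraction (as in the
     * TarFile commit above) keeps the comparison safe even when the
     * claimed size is absurdly large.
     */
    static void sanityCheck(long archiveSize, long entrySize, long dataOffset) throws IOException {
        if (archiveSize - entrySize < dataOffset) {
            throw new IOException("entry size exceeds archive size");
        }
    }

    public static void main(String[] args) {
        try {
            // A 10 KiB archive whose header claims a near-Long.MAX_VALUE entry.
            sanityCheck(10_240, Long.MAX_VALUE, 512);
            System.out.println("accepted");
        } catch (IOException e) {
            System.out.println(e.getMessage()); // prints "entry size exceeds archive size"
        }
    }
}
```

With the addition form, `512 + Long.MAX_VALUE` would wrap to a large negative value, compare as smaller than the archive size, and the lying entry would be accepted; the subtraction form stays within range for any non-negative inputs.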
[commons-compress] branch pack200-harmony-as-fallback updated: adapt to changed test dependencies in master
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch pack200-harmony-as-fallback in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/pack200-harmony-as-fallback by this push: new fd3f09c adapt to changed test dependencies in master fd3f09c is described below commit fd3f09c5b824d7f8be1b78bab6bd397d9c800587 Author: Stefan Bodewig AuthorDate: Fri Jul 2 17:04:55 2021 +0200 adapt to changed test dependencies in master --- pom.xml| 8 +- .../utils/FixedLengthBlockOutputStreamTest.java| 32 ++ 2 files changed, 21 insertions(+), 19 deletions(-) diff --git a/pom.xml b/pom.xml index 84657d6..d0225bc 100644 --- a/pom.xml +++ b/pom.xml @@ -117,7 +117,13 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. junit junit - 4.13 + 4.13.2 + test + + + org.hamcrest + hamcrest + 2.2 test diff --git a/src/test/java/org/apache/commons/compress/utils/FixedLengthBlockOutputStreamTest.java b/src/test/java/org/apache/commons/compress/utils/FixedLengthBlockOutputStreamTest.java index 5c01580..93b5cf5 100644 --- a/src/test/java/org/apache/commons/compress/utils/FixedLengthBlockOutputStreamTest.java +++ b/src/test/java/org/apache/commons/compress/utils/FixedLengthBlockOutputStreamTest.java @@ -18,16 +18,16 @@ */ package org.apache.commons.compress.utils; +import static org.hamcrest.MatcherAssert.assertThat; +import static org.hamcrest.Matchers.greaterThanOrEqualTo; import static org.junit.Assert.assertEquals; import static org.junit.Assert.assertFalse; -import static org.junit.Assert.assertThat; import static org.junit.Assert.assertTrue; import static org.junit.Assert.fail; import java.io.ByteArrayOutputStream; import java.io.DataInputStream; import java.io.DataOutputStream; -import java.io.FileOutputStream; import java.io.IOException; import java.io.OutputStream; import java.nio.ByteBuffer; @@ -37,9 +37,9 @@ import java.nio.charset.StandardCharsets; import 
java.nio.file.Files; import java.nio.file.Path; import java.util.concurrent.atomic.AtomicBoolean; + import org.hamcrest.core.IsInstanceOf; import org.junit.Test; -import org.mockito.internal.matchers.GreaterOrEqual; public class FixedLengthBlockOutputStreamTest { @@ -192,18 +192,15 @@ public class FixedLengthBlockOutputStreamTest { @Test public void testWithFileOutputStream() throws IOException { final Path tempFile = Files.createTempFile("xxx", "yyy"); -Runtime.getRuntime().addShutdownHook(new Thread() { -@Override -public void run() { -try { -Files.deleteIfExists(tempFile); -} catch (final IOException e) { -} +Runtime.getRuntime().addShutdownHook(new Thread(() -> { +try { +Files.deleteIfExists(tempFile); +} catch (final IOException e) { } -}); +})); final int blockSize = 512; final int reps = 1000; -final OutputStream os = new FileOutputStream(tempFile.toFile()); +final OutputStream os = Files.newOutputStream(tempFile.toFile().toPath()); try (FixedLengthBlockOutputStream out = new FixedLengthBlockOutputStream( os, blockSize)) { final DataOutputStream dos = new DataOutputStream(out); @@ -294,7 +291,7 @@ public class FixedLengthBlockOutputStreamTest { private static void assertContainsAtOffset(final String msg, final byte[] expected, final int offset, final byte[] actual) { -assertThat(actual.length, new GreaterOrEqual<>(offset + expected.length)); +assertThat(actual.length, greaterThanOrEqualTo(offset + expected.length)); for (int i = 0; i < expected.length; i++) { assertEquals(String.format("%s ([%d])", msg, i), expected[i], actual[i + offset]); } @@ -302,7 +299,7 @@ public class FixedLengthBlockOutputStreamTest { private static class MockOutputStream extends OutputStream { -ByteArrayOutputStream bos = new ByteArrayOutputStream(); +final ByteArrayOutputStream bos = new ByteArrayOutputStream(); private final int requiredWriteSize; private final boolean doPartialWrite; private final AtomicBoolean closed = new AtomicBoolean(); @@ -324,8 +321,7 @@ public class 
FixedLengthBlockOutputStreamTest { private void checkIsOpen() throws IOException { if (closed.get()) { -final IOException e = new IOException("Closed"); -throw e; +throw new IOException("Closed"); } } @@ -346,7 +342,7 @@ public class FixedLengthBlockOutputStreamTest {
[commons-compress] branch pack200-harmony-as-fallback updated: sync pom with parent, hoping to get github actions build to pass
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch pack200-harmony-as-fallback in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/pack200-harmony-as-fallback by this push: new 3b2227a sync pom with parent, hoping to get github actions build to pass 3b2227a is described below commit 3b2227a562500c52baa52799ea3618c72811708d Author: Stefan Bodewig AuthorDate: Fri Jul 2 16:56:59 2021 +0200 sync pom with parent, hoping to get github actions build to pass --- pom.xml | 21 ++--- 1 file changed, 10 insertions(+), 11 deletions(-) diff --git a/pom.xml b/pom.xml index 0940702..84657d6 100644 --- a/pom.xml +++ b/pom.xml @@ -40,8 +40,7 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. 1.8 1.8 - -5.1.1 +5.1.2 compress org.apache.commons.compress @@ -52,8 +51,8 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. RC1 1.20 -1.10.19 -3.13.0 +3.11.1 +3.14.0 ${project.build.outputDirectory}/META-INF ${commons.manifestlocation}/MANIFEST.MF @@ -69,9 +68,6 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. true - - false - https://svn.apache.org/repos/infra/websites/production/commons/content/proper/${project.artifactId} @@ -79,6 +75,9 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. 4.13.1 1.7.30 +0.8.7 +0.15.3 +3.3.0 @@ -90,7 +89,7 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. com.github.luben zstd-jni - 1.4.5-6 + 1.5.0-2 true @@ -102,7 +101,7 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. org.tukaani xz - 1.8 + 1.9 true @@ -130,7 +129,7 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. com.github.marschall memoryfilesystem - 1.3.0 + 2.1.0 test @@ -162,7 +161,7 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj. org.apache.felix org.apache.felix.framework - 6.0.3 + 7.0.0 test
[commons-compress] branch pack200-harmony-as-fallback updated: sync github actions config with master
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch pack200-harmony-as-fallback in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/pack200-harmony-as-fallback by this push: new 18cb531 sync github actions config with master 18cb531 is described below commit 18cb531d61feafdb3b8dcfce621115bc994d36d5 Author: Stefan Bodewig AuthorDate: Fri Jul 2 16:49:27 2021 +0200 sync github actions config with master --- .github/workflows/maven.yml | 14 +++--- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/.github/workflows/maven.yml b/.github/workflows/maven.yml index 699a76f..5b869d5 100644 --- a/.github/workflows/maven.yml +++ b/.github/workflows/maven.yml @@ -15,7 +15,7 @@ name: Java CI -on: [push] +on: [push, pull_request] jobs: build: @@ -29,20 +29,20 @@ jobs: experimental: [false] include: - java: 17-ea -experimental: true +experimental: true steps: -- uses: actions/checkout@v2.3.2 -- uses: actions/cache@v2 +- uses: actions/checkout@v2.3.4 +- uses: actions/cache@v2.1.6 with: path: ~/.m2/repository key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }} restore-keys: | ${{ runner.os }}-maven- - name: Set up JDK ${{ matrix.java }} - uses: actions/setup-java@v1.4.1 + uses: actions/setup-java@v2 with: +distribution: 'adopt' java-version: ${{ matrix.java }} - name: Build with Maven - # TEMP -Ddoclint=none - run: mvn -V --file pom.xml --no-transfer-progress -Ddoclint=none + run: mvn -V --file pom.xml --no-transfer-progress
[commons-compress] branch pack200-harmony-as-fallback updated: run tests on Java 14+ in CI
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch pack200-harmony-as-fallback in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/pack200-harmony-as-fallback by this push: new 58c250c run tests on Java 14+ in CI 58c250c is described below commit 58c250c3da9d09ced19ebdeab4ecaebe91a1dc43 Author: Stefan Bodewig AuthorDate: Thu Jul 1 22:17:20 2021 +0200 run tests on Java 14+ in CI --- .github/workflows/maven.yml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/workflows/maven.yml b/.github/workflows/maven.yml index 33afb05..699a76f 100644 --- a/.github/workflows/maven.yml +++ b/.github/workflows/maven.yml @@ -25,10 +25,10 @@ jobs: strategy: fail-fast: false matrix: -java: [ 8, 11, 14 ] +java: [ 8, 11, 16 ] experimental: [false] include: - - java: 15-ea + - java: 17-ea experimental: true steps:
[commons-compress] branch pack200-harmony-as-fallback created (now d3ef88f)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch pack200-harmony-as-fallback in repository https://gitbox.apache.org/repos/asf/commons-compress.git. at d3ef88f try to use java.util.jar.Pack200, only fall back to Harmony This branch includes the following new commits: new d3ef88f try to use java.util.jar.Pack200, only fall back to Harmony The 1 revision listed above as "new" is entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
[commons-compress] branch master updated: sanity check for link length in AsiExtraField
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new ef5d70b sanity check for link length in AsiExtraField ef5d70b is described below commit ef5d70b625000e38404194aaab311b771c44efda Author: Stefan Bodewig AuthorDate: Wed Jun 30 21:45:52 2021 +0200 sanity check for link length in AsiExtraField Credit to OSS-Fuzz --- .../apache/commons/compress/archivers/zip/AsiExtraField.java | 12 ++-- 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/zip/AsiExtraField.java b/src/main/java/org/apache/commons/compress/archivers/zip/AsiExtraField.java index fa6c864..bf82a3b 100644 --- a/src/main/java/org/apache/commons/compress/archivers/zip/AsiExtraField.java +++ b/src/main/java/org/apache/commons/compress/archivers/zip/AsiExtraField.java @@ -289,17 +289,17 @@ public class AsiExtraField implements ZipExtraField, UnixStat, Cloneable { final int newMode = ZipShort.getValue(tmp, 0); // CheckStyle:MagicNumber OFF -final byte[] linkArray = new byte[(int) ZipLong.getValue(tmp, 2)]; -final int linkArrayLength = linkArray.length; +final int linkArrayLength = (int) ZipLong.getValue(tmp, 2); +if (linkArrayLength < 0 || linkArrayLength > tmp.length - 10) { +throw new ZipException("Bad symbolic link name length " + linkArrayLength ++ " in ASI extra field"); +} uid = ZipShort.getValue(tmp, 6); gid = ZipShort.getValue(tmp, 8); - if (linkArrayLength == 0) { link = ""; -} else if (linkArrayLength > tmp.length - 10) { -throw new ZipException("Bad symbolic link name length " + linkArrayLength -+ " in ASI extra field"); } else { +final byte[] linkArray = new byte[linkArrayLength]; System.arraycopy(tmp, 10, linkArray, 0, linkArrayLength); link = new String(linkArray); // Uses default charset - see class Javadoc }
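The fix above moves the bounds check in front of the array allocation, so a hostile length field read from an ASI extra field can no longer trigger a huge (or negative) allocation. The pattern can be sketched standalone; the class and method names below are illustrative, not the actual AsiExtraField code:

```java
import java.util.zip.ZipException;

public class LinkLengthCheck {

    /**
     * Reads a link name whose length is declared inside untrusted data.
     * The length is validated *before* the byte array is allocated, so a
     * hostile value can neither trigger a huge allocation nor a negative
     * array size after the long-to-int cast.
     */
    static String readLink(byte[] tmp, long declaredLength) throws ZipException {
        final int linkArrayLength = (int) declaredLength;
        // the link name starts after a 10-byte fixed header in this sketch
        if (linkArrayLength < 0 || linkArrayLength > tmp.length - 10) {
            throw new ZipException("Bad symbolic link name length " + linkArrayLength);
        }
        if (linkArrayLength == 0) {
            return "";
        }
        final byte[] linkArray = new byte[linkArrayLength];
        System.arraycopy(tmp, 10, linkArray, 0, linkArrayLength);
        return new String(linkArray); // default charset, as in the real field
    }

    public static void main(String[] args) throws ZipException {
        byte[] field = new byte[12];
        field[10] = 'a';
        field[11] = 'b';
        System.out.println(readLink(field, 2)); // prints "ab"
    }
}
```

Validating before allocating matters because `new byte[n]` with attacker-controlled `n` is itself the denial-of-service primitive, regardless of what is copied afterwards.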
[commons-compress] branch master updated: potential integer overflow in check
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 7ce1b07 potential integer overflow in check 7ce1b07 is described below commit 7ce1b0796d6cbe1f41b969583bd49f33ae0efef0 Author: Stefan Bodewig AuthorDate: Wed Jun 30 22:01:22 2021 +0200 potential integer overflow in check --- .../java/org/apache/commons/compress/archivers/tar/TarUtils.java | 5 - 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java index d809125..ec12f17 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java @@ -741,13 +741,16 @@ public class TarUtils { while((ch = inputStream.read()) != -1) { read++; totalRead++; +if (totalRead < 0 || (headerSize >= 0 && totalRead >= headerSize)) { +break; +} if (ch == '='){ // end of keyword final String keyword = coll.toString(CharsetNames.UTF_8); // Get rest of entry final int restLen = len - read; if (restLen <= 1) { // only NL headers.remove(keyword); -} else if (headerSize >= 0 && totalRead + restLen > headerSize) { +} else if (headerSize >= 0 && restLen > headerSize - totalRead) { throw new IOException("Paxheader value size " + restLen + " exceeds size of header record"); } else {
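The rewritten guard matters because `totalRead + restLen > headerSize` can wrap around when the addends are large, letting a crafted PAX header slip past the check; rearranging to `restLen > headerSize - totalRead` keeps every intermediate value in range. The difference can be shown in isolation (illustrative names; the real TarUtils code operates on its own fields):

```java
public class OverflowCheck {

    // Naive form: the sum may overflow and become negative,
    // so the guard silently passes for huge values.
    static boolean naiveExceeds(long totalRead, long restLen, long headerSize) {
        return totalRead + restLen > headerSize;
    }

    // Safe form: as long as 0 <= totalRead <= headerSize, the
    // subtraction cannot overflow, so the comparison is exact.
    static boolean safeExceeds(long totalRead, long restLen, long headerSize) {
        return restLen > headerSize - totalRead;
    }

    public static void main(String[] args) {
        long headerSize = 512;
        long totalRead = 10;
        long restLen = Long.MAX_VALUE - 5; // hostile declared length
        System.out.println(naiveExceeds(totalRead, restLen, headerSize)); // false (overflow!)
        System.out.println(safeExceeds(totalRead, restLen, headerSize));  // true
    }
}
```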
[commons-compress] branch master updated (488425c -> d0af873)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git. from 488425c only try to recover from broken 7z archive if explicitly asked to new 3fe6b42 make sure coders are only used once in folder new d0af873 make sure PAX header value fits into the size of the current entry The 2 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: .../commons/compress/archivers/sevenz/Folder.java | 6 +++- .../archivers/tar/TarArchiveInputStream.java | 4 +-- .../commons/compress/archivers/tar/TarFile.java| 5 +-- .../commons/compress/archivers/tar/TarUtils.java | 42 +- 4 files changed, 51 insertions(+), 6 deletions(-)
[commons-compress] 02/02: make sure PAX header value fits into the size of the current entry
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit d0af873e77d16f41edfef7b69da5c8c35c96a650 Author: Stefan Bodewig AuthorDate: Sat Jun 5 21:01:22 2021 +0200 make sure PAX header value fits into the size of the current entry --- .../archivers/tar/TarArchiveInputStream.java | 4 +-- .../commons/compress/archivers/tar/TarFile.java| 5 +-- .../commons/compress/archivers/tar/TarUtils.java | 42 +- 3 files changed, 46 insertions(+), 5 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java index 7bf705e..7a07926 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java @@ -567,7 +567,7 @@ public class TarArchiveInputStream extends ArchiveInputStream { } private void readGlobalPaxHeaders() throws IOException { -globalPaxHeaders = TarUtils.parsePaxHeaders(this, globalSparseHeaders, globalPaxHeaders); +globalPaxHeaders = TarUtils.parsePaxHeaders(this, globalSparseHeaders, globalPaxHeaders, entrySize); getNextEntry(); // Get the actual file entry if (currEntry == null) { @@ -602,7 +602,7 @@ public class TarArchiveInputStream extends ArchiveInputStream { */ private void paxHeaders() throws IOException { List sparseHeaders = new ArrayList<>(); -final Map headers = TarUtils.parsePaxHeaders(this, sparseHeaders, globalPaxHeaders); +final Map headers = TarUtils.parsePaxHeaders(this, sparseHeaders, globalPaxHeaders, entrySize); // for 0.1 PAX Headers if (headers.containsKey("GNU.sparse.map")) { diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java index af87df5..70e314a 100644 --- 
a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java @@ -425,7 +425,7 @@ public class TarFile implements Closeable { List sparseHeaders = new ArrayList<>(); final Map headers; try (final InputStream input = getInputStream(currEntry)) { -headers = TarUtils.parsePaxHeaders(input, sparseHeaders, globalPaxHeaders); +headers = TarUtils.parsePaxHeaders(input, sparseHeaders, globalPaxHeaders, currEntry.getSize()); } // for 0.1 PAX Headers @@ -455,7 +455,8 @@ public class TarFile implements Closeable { private void readGlobalPaxHeaders() throws IOException { try (InputStream input = getInputStream(currEntry)) { -globalPaxHeaders = TarUtils.parsePaxHeaders(input, globalSparseHeaders, globalPaxHeaders); +globalPaxHeaders = TarUtils.parsePaxHeaders(input, globalSparseHeaders, globalPaxHeaders, +currEntry.getSize()); } getNextTarEntry(); // Get the actual file entry diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java index 1476bba..d809125 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java @@ -685,18 +685,53 @@ public class TarUtils { * @param globalPaxHeaders global PAX headers of the tar archive * @return map of PAX headers values found inside of the current (local or global) PAX headers tar entry. * @throws IOException if an I/O error occurs. 
+ * @deprecated use the four-arg version instead */ +@Deprecated protected static Map parsePaxHeaders(final InputStream inputStream, final List sparseHeaders, final Map globalPaxHeaders) throws IOException { +return parsePaxHeaders(inputStream, sparseHeaders, globalPaxHeaders, -1); +} + +/** + * For PAX Format 0.0, the sparse headers(GNU.sparse.offset and GNU.sparse.numbytes) + * may appear multi times, and they look like: + * + * GNU.sparse.size=size + * GNU.sparse.numblocks=numblocks + * repeat numblocks times + * GNU.sparse.offset=offset + * GNU.sparse.numbytes=numbytes + * end repeat + * + * For PAX Format 0.1, the sparse headers are stored in a single variable : GNU.sparse.map + * + * GNU.sparse.map + *Map of non-null data chunks. It is a string consisting of comma-separated values "offset,size[,offset-1,size-1...]" + * + * @param inputStream input stream to read keys and values + * @param sparseHeaders used in PAX Format 0.0 0.1,
[commons-compress] 01/02: make sure coders are only used once in folder
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 3fe6b42110dc56d0d6fe0aaf80cfecb8feea5321 Author: Stefan Bodewig AuthorDate: Sun Jun 6 08:09:38 2021 +0200 make sure coders are only used once in folder --- .../java/org/apache/commons/compress/archivers/sevenz/Folder.java | 6 +- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/Folder.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/Folder.java index 1725be0..82fe51f 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/Folder.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/Folder.java @@ -17,6 +17,7 @@ */ package org.apache.commons.compress.archivers.sevenz; +import java.io.IOException; import java.util.Collections; import java.util.LinkedList; @@ -53,13 +54,16 @@ class Folder { * only support single input stream decoders), the second reads * from the output of the first and so on. */ -Iterable getOrderedCoders() { +Iterable getOrderedCoders() throws IOException { if (packedStreams == null || coders == null || packedStreams.length == 0 || coders.length == 0) { return Collections.emptyList(); } final LinkedList l = new LinkedList<>(); int current = (int) packedStreams[0]; // more that 2^31 coders? while (current >= 0 && current < coders.length) { +if (l.contains(coders[current])) { +throw new IOException("folder uses the same coder more than once in coder chain"); +} l.addLast(coders[current]); final int pair = findBindPairForOutStream(current); current = pair != -1 ? (int) bindPairs[pair].inIndex : -1;
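The loop in `getOrderedCoders` walks the coder chain via bind pairs; without the new `l.contains(...)` guard, a crafted archive whose bind pairs form a cycle would spin forever while growing the list. The same guard can be sketched on a plain successor array (illustrative, not the Folder code):

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class ChainWalk {

    /**
     * Follows next[current] from start, collecting each node once.
     * A negative or out-of-range successor ends the chain; a repeated
     * node means the chain contains a cycle, which is rejected.
     */
    static List<Integer> orderedNodes(int[] next, int start) throws IOException {
        final List<Integer> l = new ArrayList<>();
        int current = start;
        while (current >= 0 && current < next.length) {
            if (l.contains(current)) {
                throw new IOException("chain uses the same node more than once");
            }
            l.add(current);
            current = next[current];
        }
        return l;
    }
}
```

Because each node is added at most once, the loop is now bounded by the number of coders even on hostile input.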
[commons-compress] branch master updated: only try to recover from broken 7z archive if explicitly asked to
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 488425c only try to recover from broken 7z archive if explicitly asked to 488425c is described below commit 488425c1b9fb8c8d0f1ef1ce7d665058880870e2 Author: Stefan Bodewig AuthorDate: Sun Jun 27 22:06:22 2021 +0200 only try to recover from broken 7z archive if explicitly asked to --- src/changes/changes.xml| 4 +++ .../compress/archivers/sevenz/SevenZFile.java | 7 +++- .../archivers/sevenz/SevenZFileOptions.java| 42 -- .../compress/archivers/sevenz/SevenZFileTest.java | 26 +++--- 4 files changed, 71 insertions(+), 8 deletions(-) diff --git a/src/changes/changes.xml b/src/changes/changes.xml index 6225284..9b6cdfb 100644 --- a/src/changes/changes.xml +++ b/src/changes/changes.xml @@ -191,6 +191,10 @@ The type attribute can be add,update,fix,remove. Also added sanity checks before even trying to parse an archive and made SevenZFileOptions' maxMemorySizeInKb apply to the stored metadata for an archive. + +And further added an option that needs to be enabled in order +to make SevenZFile try to recover a broken archive. This is a +backwards incompatible change. diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index 967d6e9..fde329e 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -474,7 +474,12 @@ public class SevenZFile implements Closeable { return initializeArchive(startHeader, password, true); } // No valid header found - probably first file of multipart archive was removed too early. Scan for end header. 
-return tryToLocateEndHeader(password); +if (options.getTryToRecoverBrokenArchives()) { +return tryToLocateEndHeader(password); +} +throw new IOException("archive seems to be invalid.\nYou may want to retry and enable the" ++ " tryToRecoverBrokenArchives if the archive could be a multi volume archive that has been closed" ++ " prematurely."); } private Archive tryToLocateEndHeader(final byte[] password) throws IOException { diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFileOptions.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFileOptions.java index ad920b5..d886091 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFileOptions.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFileOptions.java @@ -26,13 +26,17 @@ package org.apache.commons.compress.archivers.sevenz; public class SevenZFileOptions { private static final int DEFAUL_MEMORY_LIMIT_IN_KB = Integer.MAX_VALUE; private static final boolean DEFAULT_USE_DEFAULTNAME_FOR_UNNAMED_ENTRIES= false; +private static final boolean DEFAULT_TRY_TO_RECOVER_BROKEN_ARCHIVES = false; private final int maxMemoryLimitInKb; private final boolean useDefaultNameForUnnamedEntries; +private final boolean tryToRecoverBrokenArchives; -private SevenZFileOptions(final int maxMemoryLimitInKb, final boolean useDefaultNameForUnnamedEntries) { +private SevenZFileOptions(final int maxMemoryLimitInKb, final boolean useDefaultNameForUnnamedEntries, +final boolean tryToRecoverBrokenArchives) { this.maxMemoryLimitInKb = maxMemoryLimitInKb; this.useDefaultNameForUnnamedEntries = useDefaultNameForUnnamedEntries; +this.tryToRecoverBrokenArchives = tryToRecoverBrokenArchives; } /** @@ -44,7 +48,8 @@ public class SevenZFileOptions { * */ public static final SevenZFileOptions DEFAULT = new SevenZFileOptions(DEFAUL_MEMORY_LIMIT_IN_KB, -DEFAULT_USE_DEFAULTNAME_FOR_UNNAMED_ENTRIES); +DEFAULT_USE_DEFAULTNAME_FOR_UNNAMED_ENTRIES, 
+DEFAULT_TRY_TO_RECOVER_BROKEN_ARCHIVES); /** * Obtains a builder for SevenZFileOptions. @@ -78,6 +83,15 @@ public class SevenZFileOptions { } /** + * Whether {@link SevenZFile} shall try to recover from a certain type of broken archive. + * @return whether SevenZFile shall try to recover from a certain type of broken archive. + * @since 1.21 + */ +public boolean getTryToRecoverBrokenArchives() { +return tryToRecoverBrokenArchives; +} + +/** * Mutable builder for the immutable {@link SevenZFileOptions}. * * @since 1.19 @@ -85,6 +99,8 @@ public class SevenZFi
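Since recovery is now opt-in, callers who relied on the old scan-for-end-header behavior have to request it through the options builder. A hedged usage sketch, assuming the 1.21 `SevenZFileOptions` API shown in the diff above (`archive.7z` is a placeholder path, and a real caller should expect `getNextEntry()` to return null for an empty archive):

```java
import java.io.File;
import org.apache.commons.compress.archivers.sevenz.SevenZFile;
import org.apache.commons.compress.archivers.sevenz.SevenZFileOptions;

public class RecoverExample {
    public static void main(String[] args) throws Exception {
        SevenZFileOptions options = SevenZFileOptions.builder()
            .withTryToRecoverBrokenArchives(true) // new in 1.21, defaults to false
            .withMaxMemoryLimitInKb(64 * 1024)    // now also caps metadata parsing
            .build();
        try (SevenZFile sevenZ = new SevenZFile(new File("archive.7z"), options)) {
            System.out.println(sevenZ.getNextEntry().getName());
        }
    }
}
```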
[commons-compress] branch master updated: remove some redundant tests of the read phase that have been checked earlier
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new c24ac00 remove some redundant tests of the read phase that have been checked earlier c24ac00 is described below commit c24ac00d27f2347250fadf1db92ed516a6576e0d Author: Stefan Bodewig AuthorDate: Sun Jun 27 15:28:34 2021 +0200 remove some redundant tests of the read phase that have been checked earlier --- .../compress/archivers/sevenz/SevenZFile.java | 101 ++--- 1 file changed, 7 insertions(+), 94 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index d84ab5e..967d6e9 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -601,10 +601,6 @@ public class SevenZFile implements Closeable { readFilesInfo(header, archive); nid = getUnsignedByte(header); } - -if (nid != NID.kEnd) { -throw new IOException("Badly terminated header, found " + nid); -} } private ArchiveStatistics sanityCheckAndCollectStatistics(final ByteBuffer header) @@ -755,10 +751,6 @@ public class SevenZFile implements Closeable { readSubStreamsInfo(header, archive); nid = getUnsignedByte(header); } - -if (nid != NID.kEnd) { -throw new IOException("Badly terminated StreamsInfo"); -} } private void sanityCheckPackInfo(final ByteBuffer header, final ArchiveStatistics stats) throws IOException { @@ -801,9 +793,7 @@ public class SevenZFile implements Closeable { private void readPackInfo(final ByteBuffer header, final Archive archive) throws IOException { archive.packPos = readUint64(header); -final long numPackStreams = readUint64(header); -assertFitsIntoNonNegativeInt("numPackStreams", numPackStreams); -final int 
numPackStreamsInt = (int) numPackStreams; +final int numPackStreamsInt = (int) readUint64(header); int nid = getUnsignedByte(header); if (nid == NID.kSize) { archive.packSizes = new long[numPackStreamsInt]; @@ -824,10 +814,6 @@ public class SevenZFile implements Closeable { nid = getUnsignedByte(header); } - -if (nid != NID.kEnd) { -throw new IOException("Badly terminated PackInfo (" + nid + ")"); -} } private void sanityCheckUnpackInfo(final ByteBuffer header, final ArchiveStatistics stats) @@ -885,26 +871,15 @@ public class SevenZFile implements Closeable { private void readUnpackInfo(final ByteBuffer header, final Archive archive) throws IOException { int nid = getUnsignedByte(header); -if (nid != NID.kFolder) { -throw new IOException("Expected kFolder, got " + nid); -} -final long numFolders = readUint64(header); -assertFitsIntoNonNegativeInt("numFolders", numFolders); -final int numFoldersInt = (int) numFolders; +final int numFoldersInt = (int) readUint64(header); final Folder[] folders = new Folder[numFoldersInt]; archive.folders = folders; final int external = getUnsignedByte(header); -if (external != 0) { -throw new IOException("External unsupported"); -} for (int i = 0; i < numFoldersInt; i++) { folders[i] = readFolder(header); } nid = getUnsignedByte(header); -if (nid != NID.kCodersUnpackSize) { -throw new IOException("Expected kCodersUnpackSize, got " + nid); -} for (final Folder folder : folders) { assertFitsIntoNonNegativeInt("totalOutputStreams", folder.totalOutputStreams); folder.unpackSizes = new long[(int)folder.totalOutputStreams]; @@ -927,10 +902,6 @@ public class SevenZFile implements Closeable { nid = getUnsignedByte(header); } - -if (nid != NID.kEnd) { -throw new IOException("Badly terminated UnpackInfo"); -} } private void sanityCheckSubStreamsInfo(final ByteBuffer header, final ArchiveStatistics stats) throws IOException { @@ -954,13 +925,11 @@ public class SevenZFile implements Closeable { if (numUnpackSubStreams == 0) { continue; } -long 
sum = 0; for (int i = 0; i < numUnpackSubStreams - 1; i++) { final long size = readUint
[commons-compress] 01/02: simplify comparator
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 3af47ee06aab8fd715c304695d18ebc88eb9aa09 Author: Stefan Bodewig AuthorDate: Fri Jun 11 22:41:19 2021 +0200 simplify comparator --- .../commons/compress/archivers/zip/ZipFile.java| 25 ++ 1 file changed, 2 insertions(+), 23 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java b/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java index 8792563..3ae2ecf 100644 --- a/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java @@ -1390,29 +1390,8 @@ public class ZipFile implements Closeable { * @since 1.1 */ private final Comparator offsetComparator = -(e1, e2) -> { -if (e1 == e2) { -return 0; -} - -final Entry ent1 = e1 instanceof Entry ? (Entry) e1 : null; -final Entry ent2 = e2 instanceof Entry ? (Entry) e2 : null; -if (ent1 == null) { -return 1; -} -if (ent2 == null) { -return -1; -} - -// disk number is prior to relative offset -final long diskNumberStartVal = ent1.getDiskNumberStart() - ent2.getDiskNumberStart(); -if (diskNumberStartVal != 0) { -return diskNumberStartVal < 0 ? -1 : +1; -} -final long val = (ent1.getLocalHeaderOffset() -- ent2.getLocalHeaderOffset()); -return val == 0 ? 0 : val < 0 ? -1 : +1; -}; +Comparator.comparingLong(ZipArchiveEntry::getDiskNumberStart) +.thenComparingLong(ZipArchiveEntry::getLocalHeaderOffset); /** * Extends ZipArchiveEntry to store the offset within the archive.
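`Comparator.comparingLong(...).thenComparingLong(...)` replaces roughly twenty lines of hand-written comparison and sidesteps the subtract-and-test-sign idiom entirely. A standalone demonstration with a simplified entry type (names illustrative, not the ZipFile internals):

```java
import java.util.Arrays;
import java.util.Comparator;

public class ComparatorDemo {

    static final class Entry {
        final long disk;
        final long offset;
        Entry(long disk, long offset) { this.disk = disk; this.offset = offset; }
    }

    // Disk number takes priority; ties are broken by local header offset.
    static final Comparator<Entry> OFFSET_ORDER =
        Comparator.comparingLong((Entry e) -> e.disk)
                  .thenComparingLong(e -> e.offset);

    public static void main(String[] args) {
        Entry[] entries = { new Entry(1, 0), new Entry(0, 5), new Entry(0, 1) };
        Arrays.sort(entries, OFFSET_ORDER);
        for (Entry e : entries) {
            System.out.println(e.disk + "," + e.offset);
        }
    }
}
```

Note that the composed comparator orders non-`Entry` instances differently than the original lambda did (the original pushed them to the end); in `ZipFile` that case only arose for foreign `ZipArchiveEntry` subclasses.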
[commons-compress] 02/02: ZipFile could use some sanity checks as well
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 4eea95853e15e832104118b392a6a1cd0ce05841 Author: Stefan Bodewig AuthorDate: Fri Jun 11 23:08:42 2021 +0200 ZipFile could use some sanity checks as well --- .../commons/compress/archivers/zip/ZipFile.java| 59 +++--- .../commons/compress/archivers/ZipTestCase.java| 4 +- 2 files changed, 54 insertions(+), 9 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java b/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java index 3ae2ecf..0f2bb53 100644 --- a/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java @@ -160,6 +160,9 @@ public class ZipFile implements Closeable { private final ByteBuffer cfhBbuf = ByteBuffer.wrap(cfhBuf); private final ByteBuffer shortBbuf = ByteBuffer.wrap(shortBuf); +private long centralDirectoryStartDiskNumber, centralDirectoryStartRelativeOffset; +private long centralDirectoryStartOffset; + /** * Opens the given file for reading, assuming "UTF8" for file names. 
* @@ -708,6 +711,7 @@ public class ZipFile implements Closeable { new HashMap<>(); positionAtCentralDirectory(); +centralDirectoryStartOffset = archive.position(); wordBbuf.rewind(); IOUtils.readFully(archive, wordBbuf); @@ -791,12 +795,21 @@ public class ZipFile implements Closeable { final int fileNameLen = ZipShort.getValue(cfhBuf, off); off += SHORT; +if (fileNameLen < 0) { +throw new IOException("broken archive, entry with negative fileNameLen"); +} final int extraLen = ZipShort.getValue(cfhBuf, off); off += SHORT; +if (extraLen < 0) { +throw new IOException("broken archive, entry with negative extraLen"); +} final int commentLen = ZipShort.getValue(cfhBuf, off); off += SHORT; +if (commentLen < 0) { +throw new IOException("broken archive, entry with negative commentLen"); +} ze.setDiskNumberStart(ZipShort.getValue(cfhBuf, off)); off += SHORT; @@ -827,6 +840,7 @@ public class ZipFile implements Closeable { } setSizesAndOffsetFromZip64Extra(ze); +sanityCheckLFHOffset(ze); final byte[] comment = new byte[commentLen]; IOUtils.readFully(archive, ByteBuffer.wrap(comment)); @@ -839,6 +853,28 @@ public class ZipFile implements Closeable { ze.setStreamContiguous(true); } +private void sanityCheckLFHOffset(final ZipArchiveEntry ze) throws IOException { +if (ze.getDiskNumberStart() < 0) { +throw new IOException("broken archive, entry with negative disk number"); +} +if (ze.getLocalHeaderOffset() < 0) { +throw new IOException("broken archive, entry with negative local file header offset"); +} +if (isSplitZipArchive) { +if (ze.getDiskNumberStart() > centralDirectoryStartDiskNumber) { +throw new IOException("local file header for " + ze.getName() + " starts on a later disk than central directory"); +} +if (ze.getDiskNumberStart() == centralDirectoryStartDiskNumber +&& ze.getLocalHeaderOffset() > centralDirectoryStartRelativeOffset) { +throw new IOException("local file header for " + ze.getName() + " starts after central directory"); +} +} else { +if 
(ze.getLocalHeaderOffset() > centralDirectoryStartOffset) { +throw new IOException("local file header for " + ze.getName() + " starts after central directory"); +} +} +} + /** * If the entry holds a Zip64 extended information extra field, * read sizes from there if the entry's sizes are set to @@ -1115,21 +1151,23 @@ public class ZipFile implements Closeable { - WORD /* signature has already been read */); wordBbuf.rewind(); IOUtils.readFully(archive, wordBbuf); -final long diskNumberOfCFD = ZipLong.getValue(wordBuf); +centralDirectoryStartDiskNumber = ZipLong.getValue(wordBuf); skipBytes(ZIP64_EOCD_CFD_LOCATOR_RELATIVE_OFFSET); dwordBbuf.rewind(); IOUtils.readFully(archive, dwordBbuf); -final long relativeOffsetOfCFD = ZipEightByteInteger.getLongValue(dwordBuf); +centralDirectoryStartRelativeOffset = ZipEightByteInteger.getLongValue(dwordBuf); ((ZipSplitRead
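The invariant being enforced is that in a well-formed zip every local file header precedes the central directory. A minimal sketch of the non-split-archive case (illustrative signature; the real check lives inside `ZipFile` and reads the offsets from its own fields):

```java
import java.io.IOException;

public class LfhOffsetCheck {

    // Rejects entries whose claimed local file header position is
    // negative or lies beyond the start of the central directory.
    static void sanityCheck(long localHeaderOffset, long centralDirectoryStartOffset,
                            String name) throws IOException {
        if (localHeaderOffset < 0) {
            throw new IOException("broken archive, entry with negative local file header offset");
        }
        if (localHeaderOffset > centralDirectoryStartOffset) {
            throw new IOException("local file header for " + name + " starts after central directory");
        }
    }
}
```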
[commons-compress] branch master updated (e1e5635 -> 4eea958)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git. from e1e5635 Bump actions/cache from 2.1.4 to 2.1.6 (#200) new 3af47ee simplify comparator new 4eea958 ZipFile could use some sanity checks as well The 2 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: .../commons/compress/archivers/zip/ZipFile.java| 84 ++ .../commons/compress/archivers/ZipTestCase.java| 4 +- 2 files changed, 56 insertions(+), 32 deletions(-)
[commons-compress] branch master updated: COMPRESS-542 make memory limit configuration apply to 7z metadata as well
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new ae2b27c COMPRESS-542 make memory limit configuration apply to 7z metadata as well ae2b27c is described below commit ae2b27cc011f47f0289cb24a11f2d4f1db711f8a Author: Stefan Bodewig AuthorDate: Sun Jun 6 08:20:13 2021 +0200 COMPRESS-542 make memory limit configuration apply to 7z metadata as well --- src/changes/changes.xml | 4 .../commons/compress/archivers/sevenz/SevenZFile.java | 12 +--- .../compress/archivers/sevenz/SevenZFileOptions.java| 17 +++-- 3 files changed, 24 insertions(+), 9 deletions(-) diff --git a/src/changes/changes.xml b/src/changes/changes.xml index 439187b..d713d11 100644 --- a/src/changes/changes.xml +++ b/src/changes/changes.xml @@ -187,6 +187,10 @@ The type attribute can be add,update,fix,remove. Make the memory allocation in SevenZFile.readFilesInfo a lazy allocation to avoid OOM when dealing some giant 7z archives. Github Pull Request #120. + +Also added sanity checks before even trying to parse an +archive and made SevenZFileOptions' maxMemorySizeInKb apply to +the stored metadata for an archive. 
diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index f4bec6c..d84ab5e 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -45,6 +45,7 @@ import java.util.Map; import java.util.stream.Collectors; import java.util.zip.CRC32; +import org.apache.commons.compress.MemoryLimitException; import org.apache.commons.compress.utils.BoundedInputStream; import org.apache.commons.compress.utils.ByteUtils; import org.apache.commons.compress.utils.CRC32VerifyingInputStream; @@ -576,7 +577,7 @@ public class SevenZFile implements Closeable { private void readHeader(final ByteBuffer header, final Archive archive) throws IOException { final int pos = header.position(); final ArchiveStatistics stats = sanityCheckAndCollectStatistics(header); -stats.assertValidity(); +stats.assertValidity(options.getMaxMemoryLimitInKb()); header.position(pos); int nid = getUnsignedByte(header); @@ -669,7 +670,7 @@ public class SevenZFile implements Closeable { final int pos = header.position(); ArchiveStatistics stats = new ArchiveStatistics(); sanityCheckStreamsInfo(header, stats); -stats.assertValidity(); +stats.assertValidity(options.getMaxMemoryLimitInKb()); header.position(pos); readStreamsInfo(header, archive); @@ -2191,13 +2192,18 @@ public class SevenZFile implements Closeable { return 2 * lowerBound /* conservative guess */; } -void assertValidity() throws IOException { +void assertValidity(int maxMemoryLimitInKb) throws IOException { if (numberOfEntriesWithStream > 0 && numberOfFolders == 0) { throw new IOException("archive with entries but no folders"); } if (numberOfEntriesWithStream > numberOfUnpackSubStreams) { throw new IOException("archive doesn't contain enough substreams for entries"); } + +final long memoryNeededInKb = estimateSize() / 1024; +if 
(maxMemoryLimitInKb < memoryNeededInKb) { +throw new MemoryLimitException(memoryNeededInKb, maxMemoryLimitInKb); +} } private long folderSize() { diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFileOptions.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFileOptions.java index 4869225..ad920b5 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFileOptions.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFileOptions.java @@ -55,9 +55,12 @@ public class SevenZFileOptions { } /** - * Gets the maximum amount of memory to use for extraction. Not - * all codecs will honor this setting. Currently only lzma and - * lzma2 are supported. + * Gets the maximum amount of memory to use for parsing the + * archive and during extraction. + * + * Not all codecs will honor this setting. Currently only lzma + * and lzma2 are supported. + * * @return the maximum amount of memory to use for extraction */ public int getMaxMemoryLimitInKb() { @@ -83,9 +86,11 @@ public class SevenZFileOptions {
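The new check applied to the stored metadata can be sketched in isolation. The block below is a simplified stand-in for `ArchiveStatistics.assertValidity(int)`, not the real SevenZFile code: `estimateSizeBytes` and its per-entry constant are invented for illustration, and a plain `IOException` stands in for `MemoryLimitException`.

```java
import java.io.IOException;

public class MetadataLimitSketch {

    // Invented per-entry estimate; the real ArchiveStatistics derives a
    // conservative guess from the parsed header statistics.
    static long estimateSizeBytes(final long entries) {
        return entries * 128;
    }

    // Mirrors the shape of assertValidity(int maxMemoryLimitInKb): compare the
    // estimated metadata footprint against the limit before allocating anything.
    static void assertValidity(final long entries, final int maxMemoryLimitInKb)
            throws IOException {
        final long memoryNeededInKb = estimateSizeBytes(entries) / 1024;
        if (maxMemoryLimitInKb < memoryNeededInKb) {
            // the real code throws MemoryLimitException (an IOException subclass)
            throw new IOException(memoryNeededInKb + " KiB needed, limit is "
                + maxMemoryLimitInKb + " KiB");
        }
    }

    // Convenience wrapper so the outcome is easy to check without try/catch.
    static boolean withinLimit(final long entries, final int maxMemoryLimitInKb) {
        try {
            assertValidity(entries, maxMemoryLimitInKb);
            return true;
        } catch (final IOException ex) {
            return false;
        }
    }

    public static void main(final String[] args) {
        System.out.println(withinLimit(100, 1024));     // true: ~12 KiB vs 1 MiB limit
        System.out.println(withinLimit(1_000_000, 64)); // false: limit enforced
    }
}
```

The point of the design is that the check runs before the metadata arrays are allocated, so a crafted header can no longer force a large allocation just by declaring many entries.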
[commons-compress] branch catch-RuntimeExceptions created (now 6cb3167)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch catch-RuntimeExceptions in repository https://gitbox.apache.org/repos/asf/commons-compress.git. at 6cb3167 turn RuntimeExceptions into IOExceptions This branch includes the following new commits: new 6cb3167 turn RuntimeExceptions into IOExceptions The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
[commons-compress] 01/01: turn RuntimeExceptions into IOExceptions
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch catch-RuntimeExceptions in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 6cb3167e9bdc92b96eed309f938fa1ec76c19444 Author: Stefan Bodewig AuthorDate: Sun Jun 6 07:36:08 2021 +0200 turn RuntimeExceptions into IOExceptions --- .../commons/compress/UnhandledInputException.java | 59 ++ .../compress/archivers/sevenz/SevenZFile.java | 19 ++- .../commons/compress/archivers/tar/TarFile.java| 13 + .../commons/compress/archivers/zip/ZipFile.java| 14 - 4 files changed, 102 insertions(+), 3 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/UnhandledInputException.java b/src/main/java/org/apache/commons/compress/UnhandledInputException.java new file mode 100644 index 000..920c538 --- /dev/null +++ b/src/main/java/org/apache/commons/compress/UnhandledInputException.java @@ -0,0 +1,59 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.apache.commons.compress; + +import java.io.IOException; + +/** + * Thrown if reading from an archive or compressed stream results in a RuntimeException. 
+ * + * Usually this means the input has been corrupt in a way Compress' + * code didn't detect by itself. If the input is not corrupt then + * you've found a bug in Compress and we ask you to report it. + * + * @since 1.21 + */ +public class UnhandledInputException extends IOException { + +private static final long serialVersionUID = 1L; + +/** + * Wraps an unhandled RuntimeException for an input of unknown name. + */ +public UnhandledInputException(final RuntimeException ex) { +this(ex, null); +} + +/** + * Wraps an unhandled RuntimeException for an input with a known name. + * + * @param ex the unhandled exception + * @param inputName name of the input + */ +public UnhandledInputException(final RuntimeException ex, final String inputName) { +super(buildMessage(inputName), ex); +} + +private static String buildMessage(final String name) { +return "Either the input" ++ (name == null ? "" : " " + name) ++ " is corrupt or you have found a bug in Apache Commons Compress. Please report it at" ++ " https://issues.apache.org/jira/browse/COMPRESS if you think this is a bug."; +} +} diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index f4bec6c..351a858 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -45,6 +45,7 @@ import java.util.Map; import java.util.stream.Collectors; import java.util.zip.CRC32; +import org.apache.commons.compress.UnhandledInputException; import org.apache.commons.compress.utils.BoundedInputStream; import org.apache.commons.compress.utils.ByteUtils; import org.apache.commons.compress.utils.CRC32VerifyingInputStream; @@ -346,6 +347,8 @@ public class SevenZFile implements Closeable { this.password = null; } succeeded = true; +} catch (RuntimeException ex) { +throw new UnhandledInputException(ex, fileName); } finally { 
if (!succeeded && closeOnError) { this.channel.close(); @@ -410,7 +413,11 @@ public class SevenZFile implements Closeable { if (entry.getName() == null && options.getUseDefaultNameForUnnamedEntries()) { entry.setName(getDefaultName()); } -buildDecodingStream(currentEntryIndex, false); +try { +buildDecodingStream(currentEntryIndex, false); +} catch (RuntimeException ex) { +throw new UnhandledInputException(ex); +} uncompressedBytesReadFromCurrentEntry = compressedBytesReadFromCurrentEntry = 0; return entry; } @@ -510,7 +517,7 @@ public class SevenZFile implements C
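The catch-and-wrap pattern this branch applies can be shown with a self-contained sketch. `parseHeader` and the shortened message are hypothetical stand-ins for the real parsing code and `buildMessage`; the idea is simply that any `RuntimeException` escaping the parser is converted into the checked `IOException` the API already declares.

```java
import java.io.IOException;

public class WrapRuntimeSketch {

    // Shortened version of the message the real UnhandledInputException builds.
    static String buildMessage(final String name) {
        return "Either the input" + (name == null ? "" : " " + name)
            + " is corrupt or you have found a bug. Please report it.";
    }

    // Hypothetical stand-in parser that fails with an unchecked exception.
    static void parseHeader(final byte[] input) {
        if (input.length < 4) {
            throw new IllegalArgumentException("truncated header"); // simulated parser failure
        }
    }

    // Callers only ever see IOException; the RuntimeException rides along as the cause.
    static void readSafely(final byte[] input, final String name) throws IOException {
        try {
            parseHeader(input);
        } catch (final RuntimeException ex) {
            throw new IOException(buildMessage(name), ex);
        }
    }

    public static void main(final String[] args) throws IOException {
        readSafely(new byte[8], "good.7z"); // long enough, no exception
        try {
            readSafely(new byte[2], "bad.7z");
        } catch (final IOException expected) {
            System.out.println(expected.getCause() instanceof IllegalArgumentException); // true
        }
    }
}
```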
[commons-compress] branch master updated: properly document difference between tar getSize and getRealSize ...
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 9450bcc properly document difference between tar getSize and getRealSize ... 9450bcc is described below commit 9450bcc7eefa2d6acf1bdbf49740934a6426cf5e Author: Stefan Bodewig AuthorDate: Sat Jun 5 12:47:31 2021 +0200 properly document difference between tar getSize and getRealSize ... ... and simplify a few unnecessary isSparse branches --- .../commons/compress/archivers/tar/TarArchiveEntry.java | 8 +++- .../compress/archivers/tar/TarArchiveInputStream.java | 11 ++- .../org/apache/commons/compress/archivers/tar/TarFile.java | 13 +++-- 3 files changed, 12 insertions(+), 20 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java index ff10db2..44dcf54 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java @@ -905,6 +905,9 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO /** * Get this entry's file size. * + * This is the size the entry's data uses inside of the archive. Usually this is the same as {@link + * #getRealSize}, but it doesn't take the "holes" into account when the entry represents a sparse file. + * * @return This entry's file size. */ @Override @@ -1057,13 +1060,16 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO /** * Get this entry's real file size in case of a sparse file. + * + * This is the size a file would take on disk if the entry was expanded. + * * If the file is not a sparse file, return size instead of realSize. 
* * @return This entry's real file size, if the file is not a sparse file, return size instead of realSize. */ public long getRealSize() { if (!isSparse()) { -return size; +return getSize(); } return realSize; } diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java index 6311bd3..7bf705e 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java @@ -725,15 +725,8 @@ public class TarArchiveInputStream extends ArchiveInputStream { throw new IllegalStateException("No current tar entry"); } -if (!currEntry.isSparse()) { -if (entryOffset >= entrySize) { -return -1; -} -} else { -// for sparse entries, there are actually currEntry.getRealSize() bytes to read -if (entryOffset >= currEntry.getRealSize()) { -return -1; -} +if (entryOffset >= currEntry.getRealSize()) { +return -1; } numToRead = Math.min(numToRead, available()); diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java index 15adb18..e79d390 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java @@ -666,22 +666,15 @@ public class TarFile implements Closeable { private int currentSparseInputStreamIndex; BoundedTarEntryInputStream(final TarArchiveEntry entry, final SeekableByteChannel channel) { -super(entry.getDataOffset(), entry.isSparse() ? 
entry.getRealSize() : entry.getSize()); +super(entry.getDataOffset(), entry.getRealSize()); this.entry = entry; this.channel = channel; } @Override protected int read(final long pos, final ByteBuffer buf) throws IOException { -if (entry.isSparse()) { -// for sparse entries, there are actually currEntry.getRealSize() bytes to read -if (entryOffset >= entry.getRealSize()) { -return -1; -} -} else { -if (entryOffset >= entry.getSize()) { -return -1; -} +if (entryOffset >= entry.getRealSize()) { +return -1; } final int totalRead;
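The `getSize`/`getRealSize` contract documented above can be illustrated with a toy entry class (not the real `TarArchiveEntry`): for a sparse entry, `getSize()` is the hole-free payload stored inside the archive while `getRealSize()` is the expanded on-disk size; for everything else the two are equal.

```java
public class SparseSizeSketch {

    static final class Entry {
        final boolean sparse;
        final long size;      // bytes stored in the archive
        final long realSize;  // bytes after expanding holes (sparse entries only)

        Entry(final boolean sparse, final long size, final long realSize) {
            this.sparse = sparse;
            this.size = size;
            this.realSize = realSize;
        }

        long getSize() {
            return size;
        }

        // Same fallback as the commit: non-sparse entries report getSize().
        long getRealSize() {
            return sparse ? realSize : getSize();
        }
    }

    public static void main(final String[] args) {
        final Entry plain = new Entry(false, 1024, 0);
        final Entry sparse = new Entry(true, 1024, 1 << 20);
        System.out.println(plain.getRealSize());  // 1024: same as getSize()
        System.out.println(sparse.getRealSize()); // 1048576: expanded size
    }
}
```

This also explains the simplification in the commit: once `getRealSize()` falls back to `getSize()` for non-sparse entries, the `isSparse()` branches in the read loops collapse into a single comparison.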
[commons-compress] branch master updated: add TarFile special case to Lister and Expander
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new e0d4555 add TarFile special case to Lister and Expander e0d4555 is described below commit e0d455582500ad0ff70dbb89b02d875ebd70b110 Author: Stefan Bodewig AuthorDate: Sat Jun 5 08:57:41 2021 +0200 add TarFile special case to Lister and Expander --- .../apache/commons/compress/archivers/Lister.java | 14 + .../compress/archivers/examples/Expander.java | 31 ++-- .../commons/compress/archivers/tar/TarFile.java| 12 +++- .../compress/archivers/examples/ExpanderTest.java | 34 ++ 4 files changed, 88 insertions(+), 3 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/Lister.java b/src/main/java/org/apache/commons/compress/archivers/Lister.java index 0b65199..e50020f 100644 --- a/src/main/java/org/apache/commons/compress/archivers/Lister.java +++ b/src/main/java/org/apache/commons/compress/archivers/Lister.java @@ -25,6 +25,8 @@ import java.io.InputStream; import java.nio.file.Files; import java.util.Enumeration; import org.apache.commons.compress.archivers.sevenz.SevenZFile; +import org.apache.commons.compress.archivers.tar.TarArchiveEntry; +import org.apache.commons.compress.archivers.tar.TarFile; import org.apache.commons.compress.archivers.zip.ZipArchiveEntry; import org.apache.commons.compress.archivers.zip.ZipFile; @@ -55,6 +57,8 @@ public final class Lister { list7z(f); } else if ("zipfile".equals(format)) { listZipUsingZipFile(f); +} else if ("tarfile".equals(format)) { +listZipUsingTarFile(f); } else { listStream(f, args); } @@ -106,9 +110,19 @@ public final class Lister { } } +private static void listZipUsingTarFile(final File f) throws ArchiveException, IOException { +try (TarFile t = new TarFile(f)) { +System.out.println("Created " + t.toString()); +for (TarArchiveEntry en : 
t.getEntries()) { +System.out.println(en.getName()); +} +} +} + private static void usage() { System.out.println("Parameters: archive-name [archive-type]\n"); System.out.println("the magic archive-type 'zipfile' prefers ZipFile over ZipArchiveInputStream"); +System.out.println("the magic archive-type 'tarfile' prefers TarFile over TarArchiveInputStream"); } } diff --git a/src/main/java/org/apache/commons/compress/archivers/examples/Expander.java b/src/main/java/org/apache/commons/compress/archivers/examples/Expander.java index 1902a8b..bbbae00 100644 --- a/src/main/java/org/apache/commons/compress/archivers/examples/Expander.java +++ b/src/main/java/org/apache/commons/compress/archivers/examples/Expander.java @@ -29,12 +29,15 @@ import java.nio.channels.SeekableByteChannel; import java.nio.file.Files; import java.nio.file.StandardOpenOption; import java.util.Enumeration; +import java.util.Iterator; import org.apache.commons.compress.archivers.ArchiveEntry; import org.apache.commons.compress.archivers.ArchiveException; import org.apache.commons.compress.archivers.ArchiveInputStream; import org.apache.commons.compress.archivers.ArchiveStreamFactory; import org.apache.commons.compress.archivers.sevenz.SevenZFile; +import org.apache.commons.compress.archivers.tar.TarArchiveEntry; +import org.apache.commons.compress.archivers.tar.TarFile; import org.apache.commons.compress.archivers.zip.ZipArchiveEntry; import org.apache.commons.compress.archivers.zip.ZipFile; import org.apache.commons.compress.utils.IOUtils; @@ -237,12 +240,14 @@ public class Expander { try (CloseableConsumerAdapter c = new CloseableConsumerAdapter(closeableConsumer)) { if (!prefersSeekableByteChannel(format)) { expand(format, c.track(Channels.newInputStream(archive)), targetDirectory); +} else if (ArchiveStreamFactory.TAR.equalsIgnoreCase(format)) { +expand(c.track(new TarFile(archive)), targetDirectory); } else if (ArchiveStreamFactory.ZIP.equalsIgnoreCase(format)) { expand(c.track(new 
ZipFile(archive)), targetDirectory); } else if (ArchiveStreamFactory.SEVEN_Z.equalsIgnoreCase(format)) { expand(c.track(new SevenZFile(archive)), targetDirectory); } else { -// never reached as prefersSeekableByteChannel only returns true for ZIP and 7z +// never reached as prefersSeekableByteChannel only returns true for TAR, ZIP and 7z throw new ArchiveException("Don't know how to handle format " + format); } } @@ -274,6 +279,26 @@ public class Expander {
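The dispatch Lister and Expander now perform can be reduced to a small sketch. The returned strings are illustrative labels for which code path each magic format name selects; the real code opens the archive with the corresponding class.

```java
public class FormatDispatchSketch {

    // Magic names "zipfile"/"tarfile" pick the random-access implementations;
    // anything unrecognized falls through to the streaming code path.
    static String choosePath(final String format) {
        switch (format == null ? "" : format.toLowerCase()) {
            case "7z":      return "SevenZFile";
            case "zipfile": return "ZipFile";
            case "tarfile": return "TarFile";
            default:        return "ArchiveInputStream"; // streaming fallback
        }
    }

    public static void main(final String[] args) {
        System.out.println(choosePath("tarfile")); // TarFile
        System.out.println(choosePath("tar"));     // ArchiveInputStream
    }
}
```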
[commons-compress] branch master updated: avoid NPE when there is no GNU tar extended header where one should be
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new c5b60ca avoid NPE when there is no GNU tar extended header where one should be c5b60ca is described below commit c5b60cabd3ca50072f2eb1a9aec36c89babf Author: Stefan Bodewig AuthorDate: Sat Jun 5 08:50:00 2021 +0200 avoid NPE when there is no GNU tar extended header where one should be Credit to OSS-Fuzz --- src/changes/changes.xml | 5 + .../apache/commons/compress/archivers/tar/TarArchiveInputStream.java | 3 +-- src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java | 3 +-- 3 files changed, 7 insertions(+), 4 deletions(-) diff --git a/src/changes/changes.xml b/src/changes/changes.xml index 82dd735..439187b 100644 --- a/src/changes/changes.xml +++ b/src/changes/changes.xml @@ -356,6 +356,11 @@ The type attribute can be add,update,fix,remove. due-to="Brett Okken"> gzip deflate buffer size is now configurable. + +The parser for GNU sparse tar headers could throw a +NullPointerException rather than an IOException if the archive +ended while more sparse headers were expected. + diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java index 7ff96eb..6311bd3 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java @@ -643,8 +643,7 @@ public class TarArchiveInputStream extends ArchiveInputStream { do { final byte[] headerBuf = getRecord(); if (headerBuf == null) { -currEntry = null; -break; +throw new IOException("premature end of tar archive. 
Didn't find extended_header after header with extended flag."); } entry = new TarArchiveSparseEntry(headerBuf); currEntry.getSparseHeaders().addAll(entry.getSparseHeaders()); diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java index 34d9351..148d573 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java @@ -310,8 +310,7 @@ public class TarFile implements Closeable { do { final ByteBuffer headerBuf = getRecord(); if (headerBuf == null) { -currEntry = null; -break; +throw new IOException("premature end of tar archive. Didn't find extended_header after header with extended flag."); } entry = new TarArchiveSparseEntry(headerBuf.array()); currEntry.getSparseHeaders().addAll(entry.getSparseHeaders());
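The shape of the fix — fail fast with an `IOException` instead of leaving `currEntry` null and triggering an NPE later — can be sketched with a toy record source standing in for `getRecord()`. The two-record expectation is an invented condition, not the real sparse-header logic.

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;

public class SparseHeaderEndSketch {

    static String prematureEndMessage() {
        return "premature end of tar archive. Didn't find extended_header"
            + " after header with extended flag.";
    }

    // Consume extension records while more are expected; a missing record now
    // raises IOException instead of silently nulling the current entry.
    static int collectSparseHeaders(final Iterator<byte[]> records, boolean moreExpected)
            throws IOException {
        int collected = 0;
        while (moreExpected) {
            if (!records.hasNext()) {
                throw new IOException(prematureEndMessage());
            }
            records.next();
            collected++;
            moreExpected = collected < 2; // toy condition: expect two extension records
        }
        return collected;
    }

    public static void main(final String[] args) throws IOException {
        final List<byte[]> two = Arrays.asList(new byte[512], new byte[512]);
        System.out.println(collectSparseHeaders(two.iterator(), true)); // 2
        try {
            collectSparseHeaders(Collections.<byte[]>emptyIterator(), true);
        } catch (final IOException expected) {
            System.out.println("refused truncated archive");
        }
    }
}
```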
[commons-compress] branch master updated: minor cleanup of unused variables
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new f1218e2 minor cleanup of unused variables f1218e2 is described below commit f1218e2ce44ba7939ed07ed81e218bc97a8ee93c Author: Stefan Bodewig AuthorDate: Fri May 28 20:51:33 2021 +0200 minor cleanup of unused variables --- .../commons/compress/archivers/sevenz/SevenZFile.java | 14 +- 1 file changed, 5 insertions(+), 9 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index 2d9f593..f4bec6c 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -766,7 +766,6 @@ public class SevenZFile implements Closeable { || SIGNATURE_HEADER_SIZE + packPos < 0) { throw new IOException("packPos (" + packPos + ") is out of range"); } -stats.packPos = packPos; final long numPackStreams = readUint64(header); stats.numberOfPackedStreams = assertFitsIntoNonNegativeInt("numPackStreams", numPackStreams); int nid = getUnsignedByte(header); @@ -1292,8 +1291,6 @@ public class SevenZFile implements Closeable { stats.numberOfEntries = assertFitsIntoNonNegativeInt("numFiles", readUint64(header)); int emptyStreams = -1; -int emptyFiles = -1; -int antis = -1; while (true) { final int propertyType = getUnsignedByte(header); if (propertyType == 0) { @@ -1309,14 +1306,14 @@ public class SevenZFile implements Closeable { if (emptyStreams == -1) { throw new IOException("Header format error: kEmptyStream must appear before kEmptyFile"); } -emptyFiles = readBits(header, emptyStreams).cardinality(); +readBits(header, emptyStreams); break; } case NID.kAnti: { if (emptyStreams == -1) { throw new IOException("Header 
format error: kEmptyStream must appear before kAnti"); } -antis = readBits(header, emptyStreams).cardinality(); +readBits(header, emptyStreams); break; } case NID.kName: { @@ -1324,11 +1321,11 @@ public class SevenZFile implements Closeable { if (external != 0) { throw new IOException("Not implemented"); } -if (((size - 1) & 1) != 0) { -throw new IOException("File names length invalid"); -} final int namesLength = assertFitsIntoNonNegativeInt("file names length", size - 1); +if ((namesLength & 1) != 0) { +throw new IOException("File names length invalid"); +} int filesSeen = 0; for (int i = 0; i < namesLength; i += 2) { @@ -2164,7 +2161,6 @@ public class SevenZFile implements Closeable { } private static class ArchiveStatistics { -private long packPos; private int numberOfPackedStreams; private long numberOfCoders; private long numberOfOutStreams;
[commons-compress] branch master updated: record COMPRESS-566
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 6efdc66 record COMPRESS-566 6efdc66 is described below commit 6efdc66e84703ed2dcfd4a512992ad95a1e1a702 Author: Stefan Bodewig AuthorDate: Mon May 24 08:30:13 2021 +0200 record COMPRESS-566 --- src/changes/changes.xml | 4 1 file changed, 4 insertions(+) diff --git a/src/changes/changes.xml b/src/changes/changes.xml index dd85b43..82dd735 100644 --- a/src/changes/changes.xml +++ b/src/changes/changes.xml @@ -352,6 +352,10 @@ The type attribute can be add,update,fix,remove. Github Pull Request #169. + +gzip deflate buffer size is now configurable. +
[commons-compress] branch master updated (b1cbfdd -> 7928482)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git. from b1cbfdd COMPRESS-567 more structured way to deal with buffer underflow new 8797ec4 COMPRESS-566 allow gzip buffer size to be configured new 309681f COMPRESS-566 allow gzip buffer size to be configured new 7928482 Merge pull request #168 from bokken/COMPRESS-566_buf_size The 3 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: .../gzip/GzipCompressorOutputStream.java | 4 ++-- .../compress/compressors/gzip/GzipParameters.java | 26 + .../commons/compress/compressors/GZipTestCase.java | 27 ++ 3 files changed, 51 insertions(+), 6 deletions(-)
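COMPRESS-566 makes the gzip deflate buffer size configurable (via `GzipParameters`, per the file summary above). The same knob exists on the JDK's own gzip streams, used here so the sketch stays dependency-free: the buffer size is a speed/memory trade-off and never changes the decompressed payload.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipBufferSketch {

    // Compress with an explicit deflate buffer size (second constructor arg).
    static byte[] gzip(final byte[] data, final int bufferSize) throws IOException {
        final ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos, bufferSize)) {
            gz.write(data);
        }
        return bos.toByteArray();
    }

    static byte[] gunzip(final byte[] data) throws IOException {
        final ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(data))) {
            final byte[] buf = new byte[8192];
            int n;
            while ((n = gz.read(buf)) != -1) {
                bos.write(buf, 0, n);
            }
        }
        return bos.toByteArray();
    }

    // Round-trip check: different buffer sizes must yield the same payload.
    static boolean roundTrips(final String text, final int bufferSize) {
        try {
            final byte[] packed = gzip(text.getBytes(StandardCharsets.UTF_8), bufferSize);
            return text.equals(new String(gunzip(packed), StandardCharsets.UTF_8));
        } catch (final IOException ex) {
            return false;
        }
    }

    public static void main(final String[] args) {
        System.out.println(roundTrips("hello hello hello", 512));       // true
        System.out.println(roundTrips("hello hello hello", 64 * 1024)); // true
    }
}
```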
[commons-compress] branch master updated: COMPRESS-567 more structured way to deal with buffer underflow
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new b1cbfdd COMPRESS-567 more structured way to deal with buffer underflow b1cbfdd is described below commit b1cbfdd99eb96f1d9e87593747ea8b0fdbb2cce1 Author: Stefan Bodewig AuthorDate: Sun May 23 17:23:25 2021 +0200 COMPRESS-567 more structured way to deal with buffer underflow --- .../compress/archivers/sevenz/SevenZFile.java | 118 - 1 file changed, 46 insertions(+), 72 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index 7a936a7..2d9f593 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -26,7 +26,6 @@ import java.io.File; import java.io.FilterInputStream; import java.io.IOException; import java.io.InputStream; -import java.nio.BufferUnderflowException; import java.nio.ByteBuffer; import java.nio.ByteOrder; import java.nio.CharBuffer; @@ -647,11 +646,7 @@ public class SevenZFile implements Closeable { final long propertySize = readUint64(input); assertFitsIntoNonNegativeInt("propertySize", propertySize); final byte[] property = new byte[(int)propertySize]; -try { -input.get(property); -} catch (BufferUnderflowException ex) { -throw new IOException(ex); -} +get(input, property); nid = getUnsignedByte(input); } } @@ -823,11 +818,7 @@ public class SevenZFile implements Closeable { archive.packCrcs = new long[numPackStreamsInt]; for (int i = 0; i < numPackStreamsInt; i++) { if (archive.packCrcsDefined.get(i)) { -try { -archive.packCrcs[i] = 0xL & header.getInt(); -} catch (BufferUnderflowException ex) { -throw new IOException(ex); -} +archive.packCrcs[i] = 0xL & 
getInt(header); } } @@ -928,11 +919,7 @@ public class SevenZFile implements Closeable { for (int i = 0; i < numFoldersInt; i++) { if (crcsDefined.get(i)) { folders[i].hasCrc = true; -try { -folders[i].crc = 0xL & header.getInt(); -} catch (BufferUnderflowException ex) { -throw new IOException(ex); -} +folders[i].crc = 0xL & getInt(header); } else { folders[i].hasCrc = false; } @@ -1068,11 +1055,7 @@ public class SevenZFile implements Closeable { final long[] missingCrcs = new long[numDigests]; for (int i = 0; i < numDigests; i++) { if (hasMissingCrc.get(i)) { -try { -missingCrcs[i] = 0xL & header.getInt(); -} catch (BufferUnderflowException ex) { -throw new IOException(ex); -} +missingCrcs[i] = 0xL & getInt(header); } } int nextCrc = 0; @@ -1116,11 +1099,7 @@ public class SevenZFile implements Closeable { for (int i = 0; i < numCoders; i++) { final int bits = getUnsignedByte(header); final int idSize = bits & 0xf; -try { -header.get(new byte[idSize]); -} catch (BufferUnderflowException ex) { -throw new IOException(ex); -} +get(header, new byte[idSize]); final boolean isSimple = (bits & 0x10) == 0; final boolean hasAttributes = (bits & 0x20) != 0; @@ -1212,11 +1191,7 @@ public class SevenZFile implements Closeable { final boolean moreAlternativeMethods = (bits & 0x80) != 0; coders[i].decompressionMethodId = new byte[idSize]; -try { -header.get(coders[i].decompressionMethodId); -} catch (BufferUnderflowException ex) { -throw new IOException(ex); -} +get(header, coders[i].decompressionMethodId); if (isSimple) { coders[i].numInStreams = 1; coders[i].numOutStreams = 1; @@ -1230,11 +1205,7 @@ public class SevenZFile implements Closeable { final long propertiesSize = readUint64(header);
[commons-compress] branch master updated: COMPRESS-567 and even more 7z sanity checks
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new b05ed49 COMPRESS-567 and even more 7z sanity checks b05ed49 is described below commit b05ed497feadf0fc69f49eb23d6b37d0509d7855 Author: Stefan Bodewig AuthorDate: Sat May 22 21:08:32 2021 +0200 COMPRESS-567 and even more 7z sanity checks --- .../commons/compress/archivers/sevenz/Folder.java | 22 -- .../compress/archivers/sevenz/SevenZFile.java | 86 +- 2 files changed, 84 insertions(+), 24 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/Folder.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/Folder.java index dff9eea..1725be0 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/Folder.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/Folder.java @@ -17,6 +17,7 @@ */ package org.apache.commons.compress.archivers.sevenz; +import java.util.Collections; import java.util.LinkedList; /** @@ -53,9 +54,12 @@ class Folder { * from the output of the first and so on. */ Iterable getOrderedCoders() { +if (packedStreams == null || coders == null || packedStreams.length == 0 || coders.length == 0) { +return Collections.emptyList(); +} final LinkedList l = new LinkedList<>(); int current = (int) packedStreams[0]; // more that 2^31 coders? -while (current != -1) { +while (current >= 0 && current < coders.length) { l.addLast(coders[current]); final int pair = findBindPairForOutStream(current); current = pair != -1 ? 
(int) bindPairs[pair].inIndex : -1; @@ -64,18 +68,22 @@ class Folder { } int findBindPairForInStream(final int index) { -for (int i = 0; i < bindPairs.length; i++) { -if (bindPairs[i].inIndex == index) { -return i; +if (bindPairs != null) { +for (int i = 0; i < bindPairs.length; i++) { +if (bindPairs[i].inIndex == index) { +return i; +} } } return -1; } int findBindPairForOutStream(final int index) { -for (int i = 0; i < bindPairs.length; i++) { -if (bindPairs[i].outIndex == index) { -return i; +if (bindPairs != null) { +for (int i = 0; i < bindPairs.length; i++) { +if (bindPairs[i].outIndex == index) { +return i; +} } } return -1; diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index 1a1b029..7a936a7 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -589,7 +589,7 @@ public class SevenZFile implements Closeable { if (nid == NID.kAdditionalStreamsInfo) { throw new IOException("Additional streams unsupported"); -//nid = header.readUnsignedByte(); +//nid = getUnsignedByte(header); } if (nid == NID.kMainStreamsInfo) { @@ -620,7 +620,7 @@ public class SevenZFile implements Closeable { if (nid == NID.kAdditionalStreamsInfo) { throw new IOException("Additional streams unsupported"); -//nid = header.readUnsignedByte(); +//nid = getUnsignedByte(header); } if (nid == NID.kMainStreamsInfo) { @@ -647,7 +647,11 @@ public class SevenZFile implements Closeable { final long propertySize = readUint64(input); assertFitsIntoNonNegativeInt("propertySize", propertySize); final byte[] property = new byte[(int)propertySize]; -input.get(property); +try { +input.get(property); +} catch (BufferUnderflowException ex) { +throw new IOException(ex); +} nid = getUnsignedByte(input); } } @@ -819,7 +823,11 @@ public class SevenZFile implements Closeable { 
archive.packCrcs = new long[numPackStreamsInt]; for (int i = 0; i < numPackStreamsInt; i++) { if (archive.packCrcsDefined.get(i)) { -archive.packCrcs[i] = 0xL & header.getInt(); +try { +archive.packCrcs[i] = 0xL & header.getInt(); +} catch (BufferUnderflowException ex) { +throw new IOException(ex); +}
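The hardened coder-chain walk in `Folder.getOrderedCoders()` can be sketched as a bounds-checked traversal. `next[]` plays the role of the bind pairs, and the step cap is an extra safeguard of this sketch (the commit itself relies on the bounds checks alone): an index read from possibly corrupt input is validated before use, so an out-of-range or cyclic chain terminates instead of crashing or looping.

```java
import java.util.ArrayList;
import java.util.List;

public class BoundedChainSketch {

    // Follow a successor chain, treating any out-of-range index (including the
    // -1 "no successor" marker) as the end, and capping the number of steps so
    // a malicious cycle cannot loop forever.
    static List<Integer> walk(final int[] next, final int start) {
        final List<Integer> visited = new ArrayList<>();
        int current = start;
        int steps = 0;
        while (current >= 0 && current < next.length && steps++ < next.length) {
            visited.add(current);
            current = next[current];
        }
        return visited;
    }

    public static void main(final String[] args) {
        System.out.println(walk(new int[] {1, 2, -1}, 0)); // [0, 1, 2]
        // a cycle (0 -> 1 -> 0 -> ...) still terminates thanks to the step cap
        System.out.println(walk(new int[] {1, 0}, 0).size() <= 2); // true
    }
}
```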
[commons-compress] branch master updated: handle cases where more data attempted to be read than exists
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 8289193 handle cases where more data attempted to be read than exists 8289193 is described below commit 82891935174bd58b77c44fdea40580a1717a480b Author: Stefan Bodewig AuthorDate: Sat May 22 20:12:28 2021 +0200 handle cases where more data attempted to be read than exists Credit to OSS-Fuzz --- .../commons/compress/archivers/sevenz/SevenZFile.java | 14 +++--- 1 file changed, 11 insertions(+), 3 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index 87aa52a..1a1b029 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -21,10 +21,12 @@ import java.io.BufferedInputStream; import java.io.ByteArrayInputStream; import java.io.Closeable; import java.io.DataInputStream; +import java.io.EOFException; import java.io.File; import java.io.FilterInputStream; import java.io.IOException; import java.io.InputStream; +import java.nio.BufferUnderflowException; import java.nio.ByteBuffer; import java.nio.ByteOrder; import java.nio.CharBuffer; @@ -493,7 +495,9 @@ public class SevenZFile implements Closeable { pos--; channel.position(pos); nidBuf.rewind(); -channel.read(nidBuf); +if (channel.read(nidBuf) < 1) { +throw new EOFException(); +} final int nid = nidBuf.array()[0]; // First indicator: Byte equals one of these header identifiers if (nid == NID.kEncodedHeader || nid == NID.kHeader) { @@ -2026,8 +2030,12 @@ public class SevenZFile implements Closeable { return value; } -private static int getUnsignedByte(final ByteBuffer buf) { -return buf.get() & 0xff; +private static int 
getUnsignedByte(final ByteBuffer buf) throws IOException { +try { +return buf.get() & 0xff; +} catch (BufferUnderflowException ex) { +throw new IOException(ex); +} } /**
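The hardening pattern in this commit can be distilled into a small standalone sketch (class and message text ours, not the library's exact code): when parsing an untrusted archive, a short read or an exhausted buffer is converted into a checked `IOException`/`EOFException` instead of letting an unchecked `BufferUnderflowException` escape to the caller.

```java
import java.io.IOException;
import java.nio.BufferUnderflowException;
import java.nio.ByteBuffer;

public class SafeRead {
    // Convert the unchecked BufferUnderflowException into a checked
    // IOException so code parsing a corrupt archive fails like any other
    // I/O error rather than crashing with a runtime exception.
    public static int getUnsignedByte(final ByteBuffer buf) throws IOException {
        try {
            return buf.get() & 0xff;
        } catch (final BufferUnderflowException ex) {
            throw new IOException("premature end of header", ex);
        }
    }

    public static void main(final String[] args) throws IOException {
        final ByteBuffer buf = ByteBuffer.wrap(new byte[] {(byte) 0xfe});
        System.out.println(getUnsignedByte(buf)); // 254
        try {
            getUnsignedByte(buf); // buffer is now exhausted
        } catch (final IOException expected) {
            System.out.println("caught IOException");
        }
    }
}
```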
[commons-compress] branch master updated: COMPRESS-578 - Java 8 improvements
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 7072023 COMPRESS-578 - Java 8 improvements new 8225539 Merge pull request #194 from arturobernalg/feature/COMPRESS-578 7072023 is described below commit 70720232b973bce00d395e0c5449564aec1d8eea Author: Arturo Bernal AuthorDate: Sat May 22 14:05:27 2021 +0200 COMPRESS-578 - Java 8 improvements --- .../archivers/zip/ParallelScatterZipCreatorTest.java| 8 .../commons/compress/changes/ChangeSetTestCase.java | 17 +++-- .../utils/FixedLengthBlockOutputStreamTest.java | 13 + 3 files changed, 12 insertions(+), 26 deletions(-) diff --git a/src/test/java/org/apache/commons/compress/archivers/zip/ParallelScatterZipCreatorTest.java b/src/test/java/org/apache/commons/compress/archivers/zip/ParallelScatterZipCreatorTest.java index 0bf8f4d..588f5fc 100644 --- a/src/test/java/org/apache/commons/compress/archivers/zip/ParallelScatterZipCreatorTest.java +++ b/src/test/java/org/apache/commons/compress/archivers/zip/ParallelScatterZipCreatorTest.java @@ -79,13 +79,13 @@ public class ParallelScatterZipCreatorTest { @Test public void callableApiUsingSubmit() throws Exception { result = File.createTempFile("parallelScatterGather2", ""); -callableApi(zipCreator -> c -> zipCreator.submit(c)); +callableApi(zipCreator -> zipCreator::submit); } @Test public void callableApiUsingSubmitStreamAwareCallable() throws Exception { result = File.createTempFile("parallelScatterGather3", ""); -callableApi(zipCreator -> c -> zipCreator.submitStreamAwareCallable(c)); +callableApi(zipCreator -> zipCreator::submitStreamAwareCallable); } @Test(expected = IllegalArgumentException.class) @@ -109,13 +109,13 @@ public class ParallelScatterZipCreatorTest { @Test public void callableWithLowestLevelApiUsingSubmit() throws Exception { result = 
File.createTempFile("parallelScatterGather4", ""); -callableApiWithTestFiles(zipCreator -> c -> zipCreator.submit(c), Deflater.NO_COMPRESSION); +callableApiWithTestFiles(zipCreator -> zipCreator::submit, Deflater.NO_COMPRESSION); } @Test public void callableApiWithHighestLevelUsingSubmitStreamAwareCallable() throws Exception { result = File.createTempFile("parallelScatterGather5", ""); -callableApiWithTestFiles(zipCreator -> c -> zipCreator.submitStreamAwareCallable(c), Deflater.BEST_COMPRESSION); +callableApiWithTestFiles(zipCreator -> zipCreator::submitStreamAwareCallable, Deflater.BEST_COMPRESSION); } private void callableApi(final CallableConsumerSupplier consumerSupplier) throws Exception { diff --git a/src/test/java/org/apache/commons/compress/changes/ChangeSetTestCase.java b/src/test/java/org/apache/commons/compress/changes/ChangeSetTestCase.java index 0b3fb71..68be4b1 100644 --- a/src/test/java/org/apache/commons/compress/changes/ChangeSetTestCase.java +++ b/src/test/java/org/apache/commons/compress/changes/ChangeSetTestCase.java @@ -49,24 +49,13 @@ public final class ChangeSetTestCase extends AbstractTestCase { // Delete a directory tree private void archiveListDeleteDir(final String prefix){ -final Iterator it = archiveList.iterator(); -while(it.hasNext()){ -final String entry = it.next(); -if (entry.startsWith(prefix+"/")){ // TODO won't work with folders -it.remove(); -} -} +// TODO won't work with folders +archiveList.removeIf(entry -> entry.startsWith(prefix + "/")); } // Delete a single file private void archiveListDelete(final String prefix){ -final Iterator it = archiveList.iterator(); -while(it.hasNext()){ -final String entry = it.next(); -if (entry.equals(prefix)){ -it.remove(); -} -} +archiveList.removeIf(entry -> entry.equals(prefix)); } /** diff --git a/src/test/java/org/apache/commons/compress/utils/FixedLengthBlockOutputStreamTest.java b/src/test/java/org/apache/commons/compress/utils/FixedLengthBlockOutputStreamTest.java index 
6752ff3..4f83bc0 100644 --- a/src/test/java/org/apache/commons/compress/utils/FixedLengthBlockOutputStreamTest.java +++ b/src/test/java/org/apache/commons/compress/utils/FixedLengthBlockOutputStreamTest.java @@ -192,15 +192,12 @@ public class FixedLengthBlockOutputStreamTest { @Test public void testWithFileOutputStream() throws IOException { final Path tempFile = Files.
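The iterator-removal loops replaced in ChangeSetTestCase above reduce to a single `Collection.removeIf` call. A minimal sketch of the refactored shape (method and list names ours):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class RemoveIfDemo {
    // Equivalent of the refactored archiveListDeleteDir: drop every entry
    // under a directory prefix in one call instead of an explicit
    // Iterator loop with it.remove().
    public static void deleteDir(final List<String> archiveList, final String prefix) {
        archiveList.removeIf(entry -> entry.startsWith(prefix + "/"));
    }

    public static void main(final String[] args) {
        final List<String> list = new ArrayList<>(Arrays.asList("a/1", "a/2", "b/1"));
        deleteDir(list, "a");
        System.out.println(list); // [b/1]
    }
}
```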
[commons-compress] 02/02: extra guard against illegal sparse struct values
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit 1dde1a3e12934a904d84907ebb509cc73d58cc57
Author: Stefan Bodewig
AuthorDate: Sat May 22 17:28:50 2021 +0200

    extra guard against illegal sparse struct values
---
 .../commons/compress/archivers/tar/TarArchiveStructSparse.java | 6 ++
 1 file changed, 6 insertions(+)

diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveStructSparse.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveStructSparse.java
index 116a15f..f21e7e8 100644
--- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveStructSparse.java
+++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveStructSparse.java
@@ -37,6 +37,12 @@ public final class TarArchiveStructSparse {
     private final long numbytes;

     public TarArchiveStructSparse(final long offset, final long numbytes) {
+        if (offset < 0) {
+            throw new IllegalArgumentException("offset must not be negative");
+        }
+        if (numbytes < 0) {
+            throw new IllegalArgumentException("numbytes must not be negative");
+        }
         this.offset = offset;
         this.numbytes = numbytes;
     }
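The idea of the guard above is fail-fast construction: an object representing a sparse struct can never exist with illegal values. An illustrative miniature (class name is ours, not the library's):

```java
public final class SparseBlock {
    public final long offset;
    public final long numbytes;

    // Validate in the constructor so a corrupt archive is rejected the
    // moment its sparse headers are parsed, not later when the values are
    // used for seeking or sizing.
    public SparseBlock(final long offset, final long numbytes) {
        if (offset < 0) {
            throw new IllegalArgumentException("offset must not be negative");
        }
        if (numbytes < 0) {
            throw new IllegalArgumentException("numbytes must not be negative");
        }
        this.offset = offset;
        this.numbytes = numbytes;
    }

    public static void main(final String[] args) {
        final SparseBlock ok = new SparseBlock(0, 512);
        System.out.println(ok.offset + " " + ok.numbytes);
        try {
            new SparseBlock(-1, 0);
        } catch (final IllegalArgumentException expected) {
            System.out.println("rejected: " + expected.getMessage());
        }
    }
}
```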
[commons-compress] branch master updated (a3b3b82 -> 1dde1a3)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git.

    from a3b3b82 add back assertion that has been removed
     new 6a72242 tiny performance improvement
     new 1dde1a3 extra guard against illegal sparse struct values

The 2 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.

Summary of changes:
 .../org/apache/commons/compress/archivers/tar/TarArchiveEntry.java | 7 ---
 .../commons/compress/archivers/tar/TarArchiveStructSparse.java | 6 ++
 2 files changed, 10 insertions(+), 3 deletions(-)
[commons-compress] 01/02: tiny performance improvement
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit 6a72242a7b6089cad8f35eac21785399f29d8a8c
Author: Stefan Bodewig
AuthorDate: Sat May 22 17:14:02 2021 +0200

    tiny performance improvement
---
 .../org/apache/commons/compress/archivers/tar/TarArchiveEntry.java | 7 ---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
index 158c1d2..ff10db2 100644
--- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
+++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
@@ -947,9 +947,10 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO
             .sorted(Comparator.comparingLong(TarArchiveStructSparse::getOffset))
             .collect(Collectors.toList());

-        for (int i = 0; i < orderedAndFiltered.size(); i++) {
+        final int numberOfHeaders = orderedAndFiltered.size();
+        for (int i = 0; i < numberOfHeaders; i++) {
             final TarArchiveStructSparse str = orderedAndFiltered.get(i);
-            if (i + 1 < orderedAndFiltered.size()) {
+            if (i + 1 < numberOfHeaders) {
                 if (str.getOffset() + str.getNumbytes() > orderedAndFiltered.get(i + 1).getOffset()) {
                     throw new IOException("Corrupted TAR archive. Sparse blocks for " + getName()
                         + " overlap each other.");
@@ -962,7 +963,7 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO
             }
         }
         if (!orderedAndFiltered.isEmpty()) {
-            final TarArchiveStructSparse last = orderedAndFiltered.get(orderedAndFiltered.size() - 1);
+            final TarArchiveStructSparse last = orderedAndFiltered.get(numberOfHeaders - 1);
             if (last.getOffset() + last.getNumbytes() > getRealSize()) {
                 throw new IOException("Corrupted TAR archive. Sparse block extends beyond real size of the entry");
             }
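The validation loop being optimized above checks that sorted sparse blocks don't overlap, reading the list size once outside the loop. A simplified sketch of the same shape (method and message text ours, blocks modeled as `(offset, numbytes)` pairs):

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class OverlapCheck {
    // Blocks must be sorted by offset; each block's end must not extend
    // past the next block's start. The size is hoisted into a local, as
    // the commit does, instead of calling size() every iteration.
    public static void checkNoOverlap(final List<long[]> blocks) throws IOException {
        final int numberOfHeaders = blocks.size();
        for (int i = 0; i + 1 < numberOfHeaders; i++) {
            final long[] cur = blocks.get(i);
            if (cur[0] + cur[1] > blocks.get(i + 1)[0]) {
                throw new IOException("sparse blocks overlap");
            }
        }
    }

    public static void main(final String[] args) throws IOException {
        checkNoOverlap(Arrays.asList(new long[] {0, 10}, new long[] {10, 5})); // adjacent: fine
        try {
            checkNoOverlap(Arrays.asList(new long[] {0, 10}, new long[] {5, 5}));
        } catch (final IOException expected) {
            System.out.println("overlap detected");
        }
    }
}
```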
[commons-compress] branch master updated: add back assertion that has been removed
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new a3b3b82 add back assertion that has been removed a3b3b82 is described below commit a3b3b82506426708a7048405c2b77c1ebfdbf361 Author: Stefan Bodewig AuthorDate: Sat May 22 17:05:56 2021 +0200 add back assertion that has been removed --- .../org/apache/commons/compress/archivers/sevenz/SevenZFile.java | 8 +--- 1 file changed, 5 insertions(+), 3 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index 0fe9da9..87aa52a 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -996,20 +996,22 @@ public class SevenZFile implements Closeable { for (final Folder folder : archive.folders) { folder.numUnpackSubStreams = 1; } -int totalUnpackStreams = archive.folders.length; +long unpackStreamsCount = archive.folders.length; int nid = getUnsignedByte(header); if (nid == NID.kNumUnpackStream) { -totalUnpackStreams = 0; +unpackStreamsCount = 0; for (final Folder folder : archive.folders) { final long numStreams = readUint64(header); assertFitsIntoNonNegativeInt("numStreams", numStreams); folder.numUnpackSubStreams = (int)numStreams; -totalUnpackStreams += numStreams; +unpackStreamsCount += numStreams; } nid = getUnsignedByte(header); } +final int totalUnpackStreams = +assertFitsIntoNonNegativeInt("totalUnpackStreams", unpackStreamsCount); final SubStreamsInfo subStreamsInfo = new SubStreamsInfo(); subStreamsInfo.unpackSizes = new long[totalUnpackStreams]; subStreamsInfo.hasCrc = new BitSet(totalUnpackStreams);
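The fix above accumulates the stream count in a `long` and only narrows to `int` after an explicit range check, so counts read from a hostile archive cannot silently overflow. A hedged sketch of that pattern (helper name and message ours, standing in for the library's `assertFitsIntoNonNegativeInt`):

```java
import java.io.IOException;

public class SafeNarrow {
    // Narrow a long accumulated from untrusted input to int, rejecting
    // anything negative or larger than Integer.MAX_VALUE instead of
    // letting a plain (int) cast wrap around.
    public static int toNonNegativeInt(final String field, final long value) throws IOException {
        if (value < 0 || value > Integer.MAX_VALUE) {
            throw new IOException(field + " out of range: " + value);
        }
        return (int) value;
    }

    public static void main(final String[] args) throws IOException {
        final long hostile = (long) Integer.MAX_VALUE + 1; // would wrap as an int cast
        try {
            toNonNegativeInt("totalUnpackStreams", hostile);
        } catch (final IOException expected) {
            System.out.println("rejected oversized count");
        }
        System.out.println(toNonNegativeInt("totalUnpackStreams", 42)); // 42
    }
}
```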
[commons-compress] branch master updated: simplify readEncodedHeader a bit
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new f8a4a3e simplify readEncodedHeader a bit f8a4a3e is described below commit f8a4a3ef5c455e80a25d6d6e00904ae9a54546b2 Author: Stefan Bodewig AuthorDate: Sat May 22 16:48:40 2021 +0200 simplify readEncodedHeader a bit --- .../commons/compress/archivers/sevenz/SevenZFile.java | 13 +++-- 1 file changed, 7 insertions(+), 6 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index b0b5abf..0fe9da9 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -671,10 +671,10 @@ public class SevenZFile implements Closeable { readStreamsInfo(header, archive); -if (archive.folders.length == 0) { +if (archive.folders == null || archive.folders.length == 0) { throw new IOException("no folders, can't read encoded header"); } -if (archive.packSizes.length == 0) { +if (archive.packSizes == null || archive.packSizes.length == 0) { throw new IOException("no packed streams, can't read encoded header"); } @@ -698,11 +698,12 @@ public class SevenZFile implements Closeable { inputStreamStack = new CRC32VerifyingInputStream(inputStreamStack, folder.getUnpackSize(), folder.crc); } -assertFitsIntoNonNegativeInt("unpackSize", folder.getUnpackSize()); -final byte[] nextHeader = new byte[(int)folder.getUnpackSize()]; -try (DataInputStream nextHeaderInputStream = new DataInputStream(inputStreamStack)) { -nextHeaderInputStream.readFully(nextHeader); +final int unpackSize = assertFitsIntoNonNegativeInt("unpackSize", folder.getUnpackSize()); +final byte[] nextHeader = new byte[unpackSize]; +if 
(IOUtils.readFully(inputStreamStack, nextHeader) < unpackSize) { +throw new IOException("premature end of stream"); } +inputStreamStack.close(); return ByteBuffer.wrap(nextHeader).order(ByteOrder.LITTLE_ENDIAN); }
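The `IOUtils.readFully(...) < unpackSize` check above guards against a truncated archive handing back a partially filled header. A self-contained stand-in for that behavior (our own loop, not the library's `IOUtils`):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullyDemo {
    // Keep reading until the buffer is full; treat end-of-stream before
    // that point as a hard error rather than returning a short header.
    public static void readFully(final InputStream in, final byte[] buf) throws IOException {
        int off = 0;
        while (off < buf.length) {
            final int n = in.read(buf, off, buf.length - off);
            if (n < 0) {
                throw new IOException("premature end of stream");
            }
            off += n;
        }
    }

    public static void main(final String[] args) throws IOException {
        readFully(new ByteArrayInputStream(new byte[] {1, 2, 3, 4}), new byte[4]); // fills fine
        try {
            readFully(new ByteArrayInputStream(new byte[] {1, 2}), new byte[4]); // too short
        } catch (final IOException expected) {
            System.out.println("truncated input rejected");
        }
    }
}
```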
[commons-compress] branch master updated: COMPRESS-580 - Remove redundant operation
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new ce1c3f0 COMPRESS-580 - Remove redundant operation new 93aab64 Merge pull request #196 from arturobernalg/feature/COMPRESS-580 ce1c3f0 is described below commit ce1c3f0d6d3ef873d43fa1337821800de7cc77ce Author: Arturo Bernal AuthorDate: Sat May 22 14:36:16 2021 +0200 COMPRESS-580 - Remove redundant operation --- .../archivers/arj/ArjArchiveInputStream.java | 4 +-- .../commons/compress/archivers/sevenz/CLI.java | 2 +- .../compress/archivers/sevenz/SevenZFile.java | 4 +-- .../compress/archivers/zip/ExtraFieldUtils.java| 4 +-- .../commons/compress/archivers/zip/ZipFile.java| 2 +- .../apache/commons/compress/ArchiveUtilsTest.java | 2 +- .../compress/archivers/cpio/CpioArchiveTest.java | 6 ++-- .../lz4/BlockLZ4CompressorRoundtripTest.java | 6 ++-- .../lz4/FramedLZ4CompressorRoundtripTest.java | 36 ++ .../compressors/z/ZCompressorInputStreamTest.java | 2 +- 10 files changed, 31 insertions(+), 37 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/arj/ArjArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/arj/ArjArchiveInputStream.java index 8a420c0..b0c16b2 100644 --- a/src/main/java/org/apache/commons/compress/archivers/arj/ArjArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/arj/ArjArchiveInputStream.java @@ -115,10 +115,10 @@ public class ArjArchiveInputStream extends ArchiveInputStream { buffer.write(nextByte); } if (charsetName != null) { -return new String(buffer.toByteArray(), charsetName); +return buffer.toString(charsetName); } // intentionally using the default encoding as that's the contract for a null charsetName -return new String(buffer.toByteArray()); +return buffer.toString(); } } diff --git 
a/src/main/java/org/apache/commons/compress/archivers/sevenz/CLI.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/CLI.java index 0d1510d..dfa1c58 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/CLI.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/CLI.java @@ -42,7 +42,7 @@ public class CLI { if (!entry.isDirectory()) { System.out.println(" " + getContentMethods(entry)); } else { -System.out.println(""); +System.out.println(); } } diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index 39d8b82..b0b5abf 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -178,7 +178,7 @@ public class SevenZFile implements Closeable { * @since 1.19 */ public SevenZFile(final SeekableByteChannel channel, final SevenZFileOptions options) throws IOException { -this(channel, DEFAULT_FILE_NAME, (char[]) null, options); +this(channel, DEFAULT_FILE_NAME, null, options); } /** @@ -371,7 +371,7 @@ public class SevenZFile implements Closeable { * @since 1.19 */ public SevenZFile(final File fileName, final SevenZFileOptions options) throws IOException { -this(fileName, (char[]) null, options); +this(fileName, null, options); } /** diff --git a/src/main/java/org/apache/commons/compress/archivers/zip/ExtraFieldUtils.java b/src/main/java/org/apache/commons/compress/archivers/zip/ExtraFieldUtils.java index 84e769d..8908628 100644 --- a/src/main/java/org/apache/commons/compress/archivers/zip/ExtraFieldUtils.java +++ b/src/main/java/org/apache/commons/compress/archivers/zip/ExtraFieldUtils.java @@ -68,11 +68,11 @@ public class ExtraFieldUtils { final ZipExtraField ze = (ZipExtraField) c.newInstance(); implementations.put(ze.getHeaderId(), c); } catch (final ClassCastException cc) { // NOSONAR -throw new 
RuntimeException(c + " doesn\'t implement ZipExtraField"); //NOSONAR +throw new RuntimeException(c + " doesn't implement ZipExtraField"); //NOSONAR } catch (final InstantiationException ie) { // NOSONAR throw new RuntimeException(c + " is not a concrete class"); //NOSONAR } catch (final IllegalAccessException ie) { // NOSONAR -throw new RuntimeException(c + "\'s no-arg constructor is not public
[commons-compress] branch master updated: COMPRESS-579 - Remove redundant local variable
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 8dbf582 COMPRESS-579 - Remove redundant local variable new 8fc60bf Merge pull request #195 from arturobernalg/feature/COMPRESS-579 8dbf582 is described below commit 8dbf582b9e46573025fc80dc9a40b7d45bcf9f56 Author: Arturo Bernal AuthorDate: Sat May 22 14:14:29 2021 +0200 COMPRESS-579 - Remove redundant local variable --- src/test/java/org/apache/commons/compress/archivers/ArTestCase.java | 5 ++--- .../java/org/apache/commons/compress/archivers/CpioTestCase.java | 3 +-- src/test/java/org/apache/commons/compress/archivers/TarTestCase.java | 2 +- .../java/org/apache/commons/compress/archivers/tar/BigFilesIT.java | 2 +- .../commons/compress/archivers/tar/TarArchiveOutputStreamTest.java | 5 ++--- .../java/org/apache/commons/compress/compressors/BZip2TestCase.java | 3 +-- .../commons/compress/compressors/snappy/SnappyRoundtripTest.java | 3 +-- .../commons/compress/utils/FixedLengthBlockOutputStreamTest.java | 3 +-- 8 files changed, 10 insertions(+), 16 deletions(-) diff --git a/src/test/java/org/apache/commons/compress/archivers/ArTestCase.java b/src/test/java/org/apache/commons/compress/archivers/ArTestCase.java index 755a9ad..ab186db 100644 --- a/src/test/java/org/apache/commons/compress/archivers/ArTestCase.java +++ b/src/test/java/org/apache/commons/compress/archivers/ArTestCase.java @@ -81,9 +81,8 @@ public final class ArTestCase extends AbstractTestCase { } // UnArArchive Operation -final File input = output; -try (final InputStream is = Files.newInputStream(input.toPath()); -final ArchiveInputStream in = new ArchiveStreamFactory() +try (final InputStream is = Files.newInputStream(output.toPath()); + final ArchiveInputStream in = new ArchiveStreamFactory() .createArchiveInputStream(new BufferedInputStream(is))) { 
final ArArchiveEntry entry = (ArArchiveEntry) in.getNextEntry(); diff --git a/src/test/java/org/apache/commons/compress/archivers/CpioTestCase.java b/src/test/java/org/apache/commons/compress/archivers/CpioTestCase.java index c61c71f..32edacb 100644 --- a/src/test/java/org/apache/commons/compress/archivers/CpioTestCase.java +++ b/src/test/java/org/apache/commons/compress/archivers/CpioTestCase.java @@ -89,8 +89,7 @@ public final class CpioTestCase extends AbstractTestCase { } // Unarchive Operation -final File input = output; -final InputStream is = Files.newInputStream(input.toPath()); +final InputStream is = Files.newInputStream(output.toPath()); final ArchiveInputStream in = ArchiveStreamFactory.DEFAULT.createArchiveInputStream("cpio", is); diff --git a/src/test/java/org/apache/commons/compress/archivers/TarTestCase.java b/src/test/java/org/apache/commons/compress/archivers/TarTestCase.java index e787796..d75ef8f 100644 --- a/src/test/java/org/apache/commons/compress/archivers/TarTestCase.java +++ b/src/test/java/org/apache/commons/compress/archivers/TarTestCase.java @@ -569,7 +569,7 @@ public final class TarTestCase extends AbstractTestCase { } private String createLongName(final int nameLength) { -final StringBuffer buffer = new StringBuffer(); +final StringBuilder buffer = new StringBuilder(); for (int i = 0; i < nameLength; i++) { buffer.append('a'); } diff --git a/src/test/java/org/apache/commons/compress/archivers/tar/BigFilesIT.java b/src/test/java/org/apache/commons/compress/archivers/tar/BigFilesIT.java index 9c58a48..447262c 100644 --- a/src/test/java/org/apache/commons/compress/archivers/tar/BigFilesIT.java +++ b/src/test/java/org/apache/commons/compress/archivers/tar/BigFilesIT.java @@ -76,7 +76,7 @@ public class BigFilesIT extends AbstractTestCase { private void readFileBiggerThan8GByte(final String name) throws Exception { try (InputStream in = new BufferedInputStream(Files.newInputStream(getPath(name))); GzipCompressorInputStream gzin = new 
GzipCompressorInputStream(in); - TarArchiveInputStream tin = new TarArchiveInputStream(gzin);) { + TarArchiveInputStream tin = new TarArchiveInputStream(gzin)) { final TarArchiveEntry e = tin.getNextTarEntry(); assertNotNull(e); assertEquals(8200L * 1024 * 1024, e.getSize()); diff --git a/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveOutputStreamTest.java b/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveOutputStreamTest.java index 874a2ff..da21346 100644 --- a/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveOutputStreamTest.java +++ b/src/test/ja
[commons-compress] branch master updated: COMPRESS-577 - Simplify assertion
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new e893413 COMPRESS-577 - Simplify assertion new e404cbc Merge pull request #193 from arturobernalg/feature/COMPRESS-577 e893413 is described below commit e893413aed7433646989a3f2ef7d7cc9491d9d7b Author: Arturo Bernal AuthorDate: Sat May 22 14:01:11 2021 +0200 COMPRESS-577 - Simplify assertion --- .../apache/commons/compress/ArchiveUtilsTest.java | 3 +- .../compress/archivers/sevenz/SevenZFileTest.java | 3 +- .../archivers/sevenz/SevenZOutputFileTest.java | 2 +- .../archivers/tar/TarArchiveInputStreamTest.java | 7 ++-- .../compress/archivers/zip/AsiExtraFieldTest.java | 9 ++--- .../archivers/zip/GeneralPurposeBitTest.java | 13 --- .../compress/archivers/zip/X000A_NTFSTest.java | 3 +- .../archivers/zip/X5455_ExtendedTimestampTest.java | 20 +-- .../compress/archivers/zip/X7875_NewUnixTest.java | 42 +++--- .../archivers/zip/ZipArchiveEntryTest.java | 9 ++--- .../archivers/zip/ZipEightByteIntegerTest.java | 14 .../compress/archivers/zip/ZipLongTest.java| 14 .../compress/archivers/zip/ZipShortTest.java | 14 .../compress/archivers/zip/ZipUtilTest.java| 8 ++--- .../brotli/BrotliCompressorInputStreamTest.java| 2 +- .../Deflate64CompressorInputStreamTest.java| 7 ++-- .../utils/ChecksumCalculatingInputStreamTest.java | 4 +-- 17 files changed, 89 insertions(+), 85 deletions(-) diff --git a/src/test/java/org/apache/commons/compress/ArchiveUtilsTest.java b/src/test/java/org/apache/commons/compress/ArchiveUtilsTest.java index 5ee9a43..c361ebf 100644 --- a/src/test/java/org/apache/commons/compress/ArchiveUtilsTest.java +++ b/src/test/java/org/apache/commons/compress/ArchiveUtilsTest.java @@ -25,6 +25,7 @@ import org.junit.Test; import static org.junit.Assert.assertArrayEquals; import static org.junit.Assert.assertEquals; import 
static org.junit.Assert.assertFalse; +import static org.junit.Assert.assertNotEquals; import static org.junit.Assert.assertTrue; public class ArchiveUtilsTest extends AbstractTestCase { @@ -151,6 +152,6 @@ public class ArchiveUtilsTest extends AbstractTestCase { } private void asciiToByteAndBackFail(final String inputString) { - assertFalse(inputString.equals(ArchiveUtils.toAsciiString(ArchiveUtils.toAsciiBytes(inputString; +assertNotEquals(inputString, ArchiveUtils.toAsciiString(ArchiveUtils.toAsciiBytes(inputString))); } } diff --git a/src/test/java/org/apache/commons/compress/archivers/sevenz/SevenZFileTest.java b/src/test/java/org/apache/commons/compress/archivers/sevenz/SevenZFileTest.java index 7691a01..7dde40a 100644 --- a/src/test/java/org/apache/commons/compress/archivers/sevenz/SevenZFileTest.java +++ b/src/test/java/org/apache/commons/compress/archivers/sevenz/SevenZFileTest.java @@ -107,8 +107,7 @@ public class SevenZFileTest extends AbstractTestCase { if (entry.hasStream()) { assertTrue(entriesByName.containsKey(entry.getName())); final byte[] content = readFully(archive); -assertTrue("Content mismatch on: " + fileName + "!" + entry.getName(), -Arrays.equals(content, entriesByName.get(entry.getName(; +assertArrayEquals("Content mismatch on: " + fileName + "!" 
+ entry.getName(), content, entriesByName.get(entry.getName())); } } diff --git a/src/test/java/org/apache/commons/compress/archivers/sevenz/SevenZOutputFileTest.java b/src/test/java/org/apache/commons/compress/archivers/sevenz/SevenZOutputFileTest.java index 6ff0e20..43237ad 100644 --- a/src/test/java/org/apache/commons/compress/archivers/sevenz/SevenZOutputFileTest.java +++ b/src/test/java/org/apache/commons/compress/archivers/sevenz/SevenZOutputFileTest.java @@ -542,7 +542,7 @@ public class SevenZOutputFileTest extends AbstractTestCase { return null; } assertEquals("foo/" + index + ".txt", entry.getName()); -assertEquals(false, entry.isDirectory()); +assertFalse(entry.isDirectory()); if (entry.getSize() == 0) { return Boolean.FALSE; } diff --git a/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStreamTest.java b/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStreamTest.java index 03d1318..a49ade9 100644 --- a/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStreamTest.java +++ b/src/test/java/org/apache/commons/compress/archivers/
[commons-compress] 01/02: Fix typo
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit 06f1169984ad6dee94c3568029b3ea057dfe3533
Author: Helder Magalhães
AuthorDate: Fri May 21 23:01:01 2021 +0100

    Fix typo
---
 src/site/xdoc/examples.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/site/xdoc/examples.xml b/src/site/xdoc/examples.xml
index 957294e..ce7e813 100644
--- a/src/site/xdoc/examples.xml
+++ b/src/site/xdoc/examples.xml
@@ -117,7 +117,7 @@ CompressorInputStream input = new CompressorStreamFactory()
     interface can be used to track progress while extracting a stream or
     to detect potential
     <a href="https://en.wikipedia.org/wiki/Zip_bomb">zip bombs</a>
-    when the compression ration becomes suspiciously large.
+    when the compression ratio becomes suspiciously large.
[commons-compress] 02/02: Grammar glitch
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit 0f8a69c71b663404c592a938e2a04ff42a9ee682
Author: Helder Magalhães
AuthorDate: Fri May 21 23:10:13 2021 +0100

    Grammar glitch
---
 .../java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
index bdb3058..158c1d2 100644
--- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
+++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
@@ -166,7 +166,7 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO
     /**
      * Value used to indicate unknown mode, user/groupids, device numbers and modTime when parsing a file in lenient
-     * mode an the archive contains illegal fields.
+     * mode and the archive contains illegal fields.
      * @since 1.19
      */
     public static final long UNKNOWN = -1L;
[commons-compress] branch master updated (9bc02b2 -> 0f8a69c)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git.

    from 9bc02b2 simplify BoundedArchiveIS#read, add upper bounds check for offset
     new 06f1169 Fix typo
     new 0f8a69c Grammar glitch

Summary of changes:
 .../java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java | 2 +-
 src/site/xdoc/examples.xml | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)
[commons-compress] branch master updated: simplify BoundedArchiveIS#read, add upper bounds check for offset
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 9bc02b2 simplify BoundedArchiveIS#read, add upper bounds check for offset
9bc02b2 is described below

commit 9bc02b24a67db2783ca6817278e4b6b218677e26
Author: Stefan Bodewig
AuthorDate: Fri May 21 22:16:36 2021 +0200

    simplify BoundedArchiveIS#read, add upper bounds check for offset
---
 .../compress/utils/BoundedArchiveInputStream.java | 18 +++---
 1 file changed, 7 insertions(+), 11 deletions(-)

diff --git a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
index c6a5840..a72aa15 100644
--- a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
+++ b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
@@ -67,25 +67,21 @@ public abstract class BoundedArchiveInputStream extends InputStream {

     @Override
     public synchronized int read(final byte[] b, final int off, int len) throws IOException {
-        if (len <= 0) {
+        if (loc >= end) {
+            return -1;
+        }
+        final long maxLen = Math.min(len, end - loc);
+        if (maxLen <= 0) {
             return 0;
         }
-        if (off < 0 || len > b.length - off) {
+        if (off < 0 || off > b.length || maxLen > b.length - off) {
             throw new IndexOutOfBoundsException("offset or len are out of bounds");
         }
-        if (len > end - loc) {
-            if (loc >= end) {
-                return -1;
-            }
-            len = (int) (end - loc);
-        }
-
-        ByteBuffer buf = ByteBuffer.wrap(b, off, len);
+        ByteBuffer buf = ByteBuffer.wrap(b, off, (int) maxLen);
         int ret = read(loc, buf);
         if (ret > 0) {
             loc += ret;
-            return ret;
         }
         return ret;
     }
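The bounds logic of the rewritten `read()` can be distilled into a pure function (our own simplification, ignoring the actual I/O): report end-of-stream at the window boundary, an empty result for an empty request, and otherwise clamp the request to the bytes remaining in `[loc, end)`.

```java
public class BoundedRead {
    // Returns -1 once loc reaches end, 0 for a non-positive request, and
    // otherwise the requested length clamped to the window remainder.
    public static long clamp(final long loc, final long end, final int len) {
        if (loc >= end) {
            return -1;
        }
        final long maxLen = Math.min(len, end - loc);
        return Math.max(maxLen, 0);
    }

    public static void main(final String[] args) {
        System.out.println(clamp(10, 10, 5)); // at the bound -> -1
        System.out.println(clamp(8, 10, 5));  // only 2 bytes left -> 2
        System.out.println(clamp(0, 10, 0));  // empty request -> 0
    }
}
```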
[commons-compress] branch master updated: looks as if we may forget flipping the buffer for sparse entries
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 24788a9 looks as if we may forget flipping the buffer for sparse entries
24788a9 is described below

commit 24788a9c0616cb160c29657db499e9690528e36e
Author: Stefan Bodewig
AuthorDate: Fri May 21 22:05:03 2021 +0200

    looks as if we may forget flipping the buffer for sparse entries
---
 src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java | 5 ++---
 1 file changed, 2 insertions(+), 3 deletions(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java
index 1c25ca8..34d9351 100644
--- a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java
@@ -689,6 +689,7 @@ public class TarFile implements Closeable {
             setAtEOF(true);
         } else {
             entryOffset += totalRead;
+            buf.flip();
         }
         return totalRead;
     }
@@ -741,9 +742,7 @@ public class TarFile implements Closeable {
     private int readArchive(final long pos, final ByteBuffer buf) throws IOException {
         channel.position(pos);
-        final int read = channel.read(buf);
-        buf.flip();
-        return read;
+        return channel.read(buf);
     }
 }
}
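The bug this commit fixes hinges on `ByteBuffer.flip()`: after a channel fills a buffer, `flip()` sets the limit to the current position and resets the position to zero, switching the buffer from write mode to read mode. A buffer that is never flipped looks empty to readers. A minimal illustration:

```java
import java.nio.ByteBuffer;

public class FlipDemo {
    public static void main(final String[] args) {
        final ByteBuffer buf = ByteBuffer.allocate(8);
        buf.put((byte) 1).put((byte) 2);      // simulate a channel read of 2 bytes
        buf.flip();                           // limit = 2, position = 0
        System.out.println(buf.remaining());  // 2
        System.out.println(buf.get());        // 1
    }
}
```

Moving the `flip()` into the one place that knows a read succeeded means sparse-entry reads, which bypass `readArchive`, can no longer skip it.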
[commons-compress] branch master updated: COMPRESS-567 more uncaught runtime exceptions
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new d15c285  COMPRESS-567 more uncaught runtime exceptions
d15c285 is described below

commit d15c285941351958a902265aeacdc151fa98c127
Author: Stefan Bodewig
AuthorDate: Tue May 18 21:29:20 2021 +0200

    COMPRESS-567 more uncaught runtime exceptions

    Credit to OSS-Fuzz
---
 .../compress/archivers/tar/TarArchiveEntry.java | 18 +++---
 1 file changed, 15 insertions(+), 3 deletions(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
index e49e180..bdb3058 100644
--- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
+++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java
@@ -1392,16 +1392,28 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO
             setUserName(val);
             break;
         case "size":
-            setSize(Long.parseLong(val));
+            final long size = Long.parseLong(val);
+            if (size < 0) {
+                throw new IOException("Corrupted TAR archive. Entry size is negative");
+            }
+            setSize(size);
             break;
         case "mtime":
             setModTime((long) (Double.parseDouble(val) * 1000));
             break;
         case "SCHILY.devminor":
-            setDevMinor(Integer.parseInt(val));
+            final int devMinor = Integer.parseInt(val);
+            if (devMinor < 0) {
+                throw new IOException("Corrupted TAR archive. Dev-Minor is negative");
+            }
+            setDevMinor(devMinor);
             break;
         case "SCHILY.devmajor":
-            setDevMajor(Integer.parseInt(val));
+            final int devMajor = Integer.parseInt(val);
+            if (devMajor < 0) {
+                throw new IOException("Corrupted TAR archive. Dev-Major is negative");
+            }
+            setDevMajor(devMajor);
             break;
         case "GNU.sparse.size":
             fillGNUSparse0xData(headers);
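The pattern the commit above applies to several PAX fields can be isolated: a successful `Long.parseLong` is not enough for attacker-controlled metadata, because negative sizes and device numbers are still corruption. A stand-alone sketch (the class and method names are illustrative, not Commons Compress API):

```java
import java.io.IOException;

// Validate a numeric PAX header field: reject both non-numbers and
// negative values as archive corruption, surfacing IOException so
// callers see one consistent failure mode.
public class PaxFieldCheck {
    public static long parseSize(String val) throws IOException {
        final long size;
        try {
            size = Long.parseLong(val);
        } catch (NumberFormatException ex) {
            throw new IOException("Corrupted TAR archive. Entry size is not a number", ex);
        }
        if (size < 0) {
            throw new IOException("Corrupted TAR archive. Entry size is negative");
        }
        return size;
    }
}
```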
[commons-compress] branch master updated: COMPRESS-542 each folder requires at least one coder
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 5761493  COMPRESS-542 each folder requires at least one coder
5761493 is described below

commit 5761493cbaf7a7d608a3b68f4d61aaa822dbeb4f
Author: Stefan Bodewig
AuthorDate: Sun May 16 18:20:23 2021 +0200

    COMPRESS-542 each folder requires at least one coder
---
 .../java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
index e26cd96..39d8b82 100644
--- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
@@ -1083,6 +1083,9 @@ public class SevenZFile implements Closeable {
             throws IOException {
         final int numCoders = assertFitsIntoNonNegativeInt("numCoders", readUint64(header));
+        if (numCoders == 0) {
+            throw new IOException("Folder without coders");
+        }
         stats.numberOfCoders += numCoders;
 
         long totalOutStreams = 0;
[commons-compress] branch master updated: COMPRESS-542 hit commit to quickly
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new bf5a534  COMPRESS-542 hit commit to quickly
bf5a534 is described below

commit bf5a5346ae04b9d2a5b0356ca75f11dcc8d94789
Author: Stefan Bodewig
AuthorDate: Sun May 16 17:43:24 2021 +0200

    COMPRESS-542 hit commit to quickly
---
 .../java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
index 521aed8..e26cd96 100644
--- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
@@ -943,7 +943,7 @@ public class SevenZFile implements Closeable {
             stats.numberOfUnpackSubStreams = stats.numberOfFolders;
         }
 
-        assertFitsIntoNonNegativeInt(stats.numberOfUnpackSubStreams);
+        assertFitsIntoNonNegativeInt("totalUnpackStreams", stats.numberOfUnpackSubStreams);
 
         if (nid == NID.kSize) {
             for (final int numUnpackSubStreams : numUnpackSubStreamsPerFolder) {
[commons-compress] branch master updated: COMPRESS-542 guard against integer overflow
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 60d551a  COMPRESS-542 guard against integer overflow
60d551a is described below

commit 60d551a748236d7f4651a4ae88d5a351f7c5754b
Author: Stefan Bodewig
AuthorDate: Sun May 16 17:39:44 2021 +0200

    COMPRESS-542 guard against integer overflow
---
 .../java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
index 2d7bb77..521aed8 100644
--- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
@@ -943,6 +943,8 @@ public class SevenZFile implements Closeable {
             stats.numberOfUnpackSubStreams = stats.numberOfFolders;
         }
 
+        assertFitsIntoNonNegativeInt(stats.numberOfUnpackSubStreams);
+
         if (nid == NID.kSize) {
             for (final int numUnpackSubStreams : numUnpackSubStreamsPerFolder) {
                 if (numUnpackSubStreams == 0) {
[commons-compress] branch master updated: COMPRESS-552 check interfaces of superclasses as well
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 23b758f  COMPRESS-552 check interfaces of superclasses as well
23b758f is described below

commit 23b758fbe071f58eb2625e50166351bdb5983f13
Author: Stefan Bodewig
AuthorDate: Sun May 16 12:04:18 2021 +0200

    COMPRESS-552 check interfaces of superclasses as well
---
 src/main/java/org/apache/commons/compress/utils/OsgiUtils.java | 10 +-
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/src/main/java/org/apache/commons/compress/utils/OsgiUtils.java b/src/main/java/org/apache/commons/compress/utils/OsgiUtils.java
index 3ec7c4e..a4e84b5 100644
--- a/src/main/java/org/apache/commons/compress/utils/OsgiUtils.java
+++ b/src/main/java/org/apache/commons/compress/utils/OsgiUtils.java
@@ -38,12 +38,12 @@ public class OsgiUtils {
             if (c.getName().equals("org.osgi.framework.BundleReference")) {
                 return true;
             }
-            c = c.getSuperclass();
-        }
-        for (Class ifc : clazz.getInterfaces()) {
-            if (isBundleReference(ifc)) {
-                return true;
+            for (Class ifc : c.getInterfaces()) {
+                if (isBundleReference(ifc)) {
+                    return true;
+                }
             }
+            c = c.getSuperclass();
         }
         return false;
     }
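The fix above moves the interface check inside the superclass loop: before it, only the starting class's directly declared interfaces were inspected, so an interface inherited via a superclass was missed. The traversal can be sketched generically; `HierarchySearch` and the lookup by name are illustrative, with the OSGi-specific target replaced by a parameter:

```java
// Walk the superclass chain and, at each level, recurse into the
// directly declared interfaces (interfaces have no superclass, so the
// outer loop runs once per interface and the recursion covers
// super-interfaces). This mirrors the corrected OsgiUtils traversal.
public class HierarchySearch {
    public static boolean implementsNamed(Class<?> clazz, String name) {
        Class<?> c = clazz;
        while (c != null) {
            if (c.getName().equals(name)) {
                return true;
            }
            for (Class<?> ifc : c.getInterfaces()) {
                if (implementsNamed(ifc, name)) {
                    return true;
                }
            }
            c = c.getSuperclass();
        }
        return false;
    }
}
```

For example, `java.util.ArrayList` reaches `java.util.Collection` only through `List`, one of its declared interfaces, which the recursion covers.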
[commons-compress] branch master updated: COMPRESS-542 and some final sanity checks
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 0aba8b8  COMPRESS-542 and some final sanity checks
0aba8b8 is described below

commit 0aba8b8fd8053ae323f15d736d1762b2161c76a6
Author: Stefan Bodewig
AuthorDate: Sun May 16 11:00:49 2021 +0200

    COMPRESS-542 and some final sanity checks
---
 .../apache/commons/compress/archivers/sevenz/SevenZFile.java | 12 +++-
 1 file changed, 11 insertions(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
index a08c02a..2d7bb77 100644
--- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
@@ -41,6 +41,7 @@ import java.util.HashMap;
 import java.util.LinkedList;
 import java.util.List;
 import java.util.Map;
+import java.util.stream.Collectors;
 import java.util.zip.CRC32;
 
 import org.apache.commons.compress.utils.BoundedInputStream;
@@ -936,7 +937,10 @@ public class SevenZFile implements Closeable {
             for (int i = 0; i < stats.numberOfFolders; i++) {
                 numUnpackSubStreamsPerFolder.add(assertFitsIntoNonNegativeInt("numStreams", readUint64(header)));
             }
+            stats.numberOfUnpackSubStreams = numUnpackSubStreamsPerFolder.stream().collect(Collectors.summingLong(Integer::longValue));
             nid = getUnsignedByte(header);
+        } else {
+            stats.numberOfUnpackSubStreams = stats.numberOfFolders;
         }
 
         if (nid == NID.kSize) {
@@ -952,7 +956,6 @@ public class SevenZFile implements Closeable {
                     }
                     sum += size;
                 }
-                // TODO sum < folder.unpackSize
             }
             nid = getUnsignedByte(header);
         }
@@ -1022,6 +1025,9 @@ public class SevenZFile implements Closeable {
                         sum += size;
                     }
                 }
+                if (sum > folder.getUnpackSize()) {
+                    throw new IOException("sum of unpack sizes of folder exceeds total unpack size");
+                }
                 subStreamsInfo.unpackSizes[nextUnpackStream++] = folder.getUnpackSize() - sum;
             }
             if (nid == NID.kSize) {
@@ -2121,6 +2127,7 @@ public class SevenZFile implements Closeable {
         private long numberOfCoders;
         private long numberOfOutStreams;
         private long numberOfInStreams;
+        private long numberOfUnpackSubStreams;
         private int numberOfFolders;
         private BitSet folderHasCrc;
         private int numberOfEntries;
@@ -2150,6 +2157,9 @@ public class SevenZFile implements Closeable {
             if (numberOfEntriesWithStream > 0 && numberOfFolders == 0) {
                 throw new IOException("archive with entries but no folders");
             }
+            if (numberOfEntriesWithStream > numberOfUnpackSubStreams) {
+                throw new IOException("archive doesn't contain enough substreams for entries");
+            }
         }
 
         private long folderSize() {
[commons-compress] 02/02: COMPRESS-542 add some extra sanity checks
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit c51de6cfaec75b21566374158f25e1734c3a94cb
Author: Stefan Bodewig
AuthorDate: Sat May 15 22:36:02 2021 +0200

    COMPRESS-542 add some extra sanity checks
---
 .../compress/archivers/sevenz/SevenZFile.java | 88 --
 1 file changed, 83 insertions(+), 5 deletions(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
index 288cfd4..a08c02a 100644
--- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
@@ -503,7 +503,7 @@ public class SevenZFile implements Closeable {
                 startHeader.nextHeaderSize = channel.size() - pos;
                 final Archive result = initializeArchive(startHeader, password, false);
                 // Sanity check: There must be some data...
-                if (result.packSizes != null && result.files.length > 0) {
+                if (result.packSizes.length > 0 && result.files.length > 0) {
                     return result;
                 }
             } catch (final Exception ignore) {
@@ -572,6 +572,7 @@ public class SevenZFile implements Closeable {
     private void readHeader(final ByteBuffer header, final Archive archive) throws IOException {
         final int pos = header.position();
         final ArchiveStatistics stats = sanityCheckAndCollectStatistics(header);
+        stats.assertValidity();
         header.position(pos);
 
         int nid = getUnsignedByte(header);
@@ -662,10 +663,20 @@ public class SevenZFile implements Closeable {
     private ByteBuffer readEncodedHeader(final ByteBuffer header, final Archive archive,
                                          final byte[] password) throws IOException {
         final int pos = header.position();
-        sanityCheckStreamsInfo(header, new ArchiveStatistics());
+        ArchiveStatistics stats = new ArchiveStatistics();
+        sanityCheckStreamsInfo(header, stats);
+        stats.assertValidity();
         header.position(pos);
+
         readStreamsInfo(header, archive);
 
+        if (archive.folders.length == 0) {
+            throw new IOException("no folders, can't read encoded header");
+        }
+        if (archive.packSizes.length == 0) {
+            throw new IOException("no packed streams, can't read encoded header");
+        }
+
         // FIXME: merge with buildDecodingStream()/buildDecoderStack() at some stage?
         final Folder folder = archive.folders[0];
         final int firstPackStreamIndex = 0;
@@ -832,6 +843,12 @@ public class SevenZFile implements Closeable {
             numberOfOutputStreamsPerFolder.add(sanityCheckFolder(header, stats));
         }
 
+        final long totalNumberOfBindPairs = stats.numberOfOutStreams - stats.numberOfFolders;
+        final long packedStreamsRequiredByFolders = stats.numberOfInStreams - totalNumberOfBindPairs;
+        if (packedStreamsRequiredByFolders < stats.numberOfPackedStreams) {
+            throw new IOException("archive doesn't contain enough packed streams");
+        }
+
         nid = getUnsignedByte(header);
         if (nid != NID.kCodersUnpackSize) {
             throw new IOException("Expected kCodersUnpackSize, got " + nid);
         }
@@ -1058,6 +1075,7 @@ public class SevenZFile implements Closeable {
             throws IOException {
         final int numCoders = assertFitsIntoNonNegativeInt("numCoders", readUint64(header));
+        stats.numberOfCoders += numCoders;
 
         long totalOutStreams = 0;
         long totalInStreams = 0;
@@ -1094,6 +1112,8 @@ public class SevenZFile implements Closeable {
         }
         assertFitsIntoNonNegativeInt("totalInStreams", totalInStreams);
         assertFitsIntoNonNegativeInt("totalOutStreams", totalOutStreams);
+        stats.numberOfOutStreams += totalOutStreams;
+        stats.numberOfInStreams += totalInStreams;
 
         if (totalOutStreams == 0) {
             throw new IOException("Total output streams can't be 0");
         }
@@ -1119,6 +1139,7 @@ public class SevenZFile implements Closeable {
         final int numPackedStreams =
             assertFitsIntoNonNegativeInt("numPackedStreams", totalInStreams - numBindPairs);
+
         if (numPackedStreams == 1) {
             if (inStreamsBound.nextClearBit(0) == -1) {
                 throw new IOException("Couldn't find stream's bind pair index");
             }
@@ -1127,8 +1148,8 @@ public class SevenZFile implements Closeable {
         for (int i = 0; i < numPackedStreams; i++) {
[commons-compress] branch master updated (8a825f0 -> c51de6c)
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git.

    from 8a825f0  really fix single-byte read when nothing has been read
     new 8bae05c  COMPRESS-542 clear temporary storage after it is no longer needed
     new c51de6c  COMPRESS-542 add some extra sanity checks

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.

Summary of changes:
 .../compress/archivers/sevenz/SevenZFile.java | 89 --
 1 file changed, 84 insertions(+), 5 deletions(-)
[commons-compress] 01/02: COMPRESS-542 clear temporary storage after it is no longer needed
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit 8bae05c63549d086b80b81197f72afb290e63c39
Author: Stefan Bodewig
AuthorDate: Fri May 14 20:47:31 2021 +0200

    COMPRESS-542 clear temporary storage after it is no longer needed
---
 .../java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java | 1 +
 1 file changed, 1 insertion(+)

diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
index 9f5986f..288cfd4 100644
--- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
@@ -540,6 +540,7 @@ public class SevenZFile implements Closeable {
             throw new IOException("Broken or unsupported archive: no Header");
         }
         readHeader(buf, archive);
+        archive.subStreamsInfo = null;
         return archive;
     }
[commons-compress] branch master updated: really fix single-byte read when nothing has been read
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 8a825f0  really fix single-byte read when nothing has been read
8a825f0 is described below

commit 8a825f0c829471a5bac52dcd72bfa854dc1613a3
Author: Stefan Bodewig
AuthorDate: Fri May 14 18:11:17 2021 +0200

    really fix single-byte read when nothing has been read
---
 .../org/apache/commons/compress/utils/BoundedArchiveInputStream.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
index d685968..c6a5840 100644
--- a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
+++ b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
@@ -59,7 +59,7 @@ public abstract class BoundedArchiveInputStream extends InputStream {
         }
         int read = read(loc, singleByteBuffer);
         if (read < 1) {
-            return read;
+            return -1;
         }
         loc++;
         return singleByteBuffer.get() & 0xff;
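The two single-byte-read commits above converge on one contract point: the no-argument `InputStream.read()` must return either an unsigned byte value (0-255) or -1, never 0 for "no data", so a channel-style result of 0 must not be passed through. A minimal sketch of that mapping (class and method names are illustrative):

```java
// Map a channel-style read result (bytes read, possibly 0 or -1) into
// the single-byte InputStream.read() contract: 0..255 on success, -1
// otherwise.
public class SingleByteRead {
    public static int toSingleByteResult(int bytesRead, byte value) {
        if (bytesRead < 1) {
            return -1;          // both 0 and -1 from the channel mean EOF here
        }
        return value & 0xff;    // mask so bytes >= 0x80 don't come back negative
    }
}
```

The `& 0xff` is as important as the EOF mapping: without it, byte 0xFF would be returned as -1 and be indistinguishable from end of stream.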
[commons-compress] branch master updated (de39b85 -> 2d1392d)
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git.

    from de39b85  COMPRESS-567 turn possible RuntimeExceptions into IOExceptions
     new ed32eb9  actually, InputStream wants an IIOBE here
     new 2d1392d  fix single-byte read when nothing has been read

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.

Summary of changes:
 .../org/apache/commons/compress/utils/BoundedArchiveInputStream.java | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
[commons-compress] 01/02: actually, InputStream wants an IIOBE here
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit ed32eb968ef124b9c0e19ccdda2c9d60dbfe451e
Author: Stefan Bodewig
AuthorDate: Fri May 14 18:02:58 2021 +0200

    actually, InputStream wants an IIOBE here
---
 .../org/apache/commons/compress/utils/BoundedArchiveInputStream.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
index 4f7ec27..b8e351c 100644
--- a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
+++ b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
@@ -71,7 +71,7 @@ public abstract class BoundedArchiveInputStream extends InputStream {
             return 0;
         }
         if (off < 0 || len > b.length - off) {
-            throw new IllegalArgumentException("offset or len are out of bounds");
+            throw new IndexOutOfBoundsException("offset or len are out of bounds");
         }
 
         if (len > end - loc) {
[commons-compress] 02/02: fix single-byte read when nothing has been read
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit 2d1392d2544165df2ae6bdb73f7473fd30dad33f
Author: Stefan Bodewig
AuthorDate: Fri May 14 18:05:39 2021 +0200

    fix single-byte read when nothing has been read

    Credit to OSS-Fuzz
---
 .../org/apache/commons/compress/utils/BoundedArchiveInputStream.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
index b8e351c..d685968 100644
--- a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
+++ b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
@@ -58,7 +58,7 @@ public abstract class BoundedArchiveInputStream extends InputStream {
             singleByteBuffer.rewind();
         }
         int read = read(loc, singleByteBuffer);
-        if (read < 0) {
+        if (read < 1) {
             return read;
         }
         loc++;
[commons-compress] 02/03: properly fulfill InputStream's contract
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit 8046bd27c09651cfc6b5f153171a87ce48b009c7
Author: Stefan Bodewig
AuthorDate: Fri May 14 17:43:23 2021 +0200

    properly fulfill InputStream's contract
---
 .../org/apache/commons/compress/utils/BoundedArchiveInputStream.java | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
index db8d948..4f7ec27 100644
--- a/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
+++ b/src/main/java/org/apache/commons/compress/utils/BoundedArchiveInputStream.java
@@ -70,6 +70,9 @@ public abstract class BoundedArchiveInputStream extends InputStream {
         if (len <= 0) {
             return 0;
         }
+        if (off < 0 || len > b.length - off) {
+            throw new IllegalArgumentException("offset or len are out of bounds");
+        }
 
         if (len > end - loc) {
             if (loc >= end) {
[commons-compress] branch master updated (882c6dd -> de39b85)
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git.

    from 882c6dd  one more case where the JDK throwing RuntimeEx may hurt us
     new 190939b  handle integer overflow
     new 8046bd2  properly fulfill InputStream's contract
     new de39b85  COMPRESS-567 turn possible RuntimeExceptions into IOExceptions

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.

Summary of changes:
 .../java/org/apache/commons/compress/archivers/tar/TarFile.java  | 8 ++--
 .../apache/commons/compress/utils/BoundedArchiveInputStream.java | 3 +++
 src/main/java/org/apache/commons/compress/utils/IOUtils.java     | 2 +-
 3 files changed, 10 insertions(+), 3 deletions(-)
[commons-compress] 03/03: COMPRESS-567 turn possible RuntimeExceptions into IOExceptions
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit de39b85b6d74031fb3a5c269d80be1f1253d1c91
Author: Stefan Bodewig
AuthorDate: Fri May 14 17:49:34 2021 +0200

    COMPRESS-567 turn possible RuntimeExceptions into IOExceptions

    Credit to OSS-Fuzz
---
 .../java/org/apache/commons/compress/archivers/tar/TarFile.java | 8 ++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java
index 8de9260..1c25ca8 100644
--- a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java
@@ -633,8 +633,12 @@ public class TarFile implements Closeable {
      * @param entry Entry to get the input stream from
      * @return Input stream of the provided entry
      */
-    public InputStream getInputStream(final TarArchiveEntry entry) {
-        return new BoundedTarEntryInputStream(entry, archive);
+    public InputStream getInputStream(final TarArchiveEntry entry) throws IOException {
+        try {
+            return new BoundedTarEntryInputStream(entry, archive);
+        } catch (RuntimeException ex) {
+            throw new IOException("Corrupted TAR archive. Can't read entry", ex);
+        }
     }
 
     @Override
[commons-compress] 01/03: handle integer overflow
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

commit 190939b04c566e3285b15c109ff69887ecba919d
Author: Stefan Bodewig
AuthorDate: Fri May 14 17:03:05 2021 +0200

    handle integer overflow
---
 src/main/java/org/apache/commons/compress/utils/IOUtils.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/utils/IOUtils.java b/src/main/java/org/apache/commons/compress/utils/IOUtils.java
index 0b7df96..3587ced 100644
--- a/src/main/java/org/apache/commons/compress/utils/IOUtils.java
+++ b/src/main/java/org/apache/commons/compress/utils/IOUtils.java
@@ -189,7 +189,7 @@ public final class IOUtils {
      */
     public static int readFully(final InputStream input, final byte[] array, final int offset, final int len)
         throws IOException {
-        if (len < 0 || offset < 0 || len + offset > array.length) {
+        if (len < 0 || offset < 0 || len + offset > array.length || len + offset < 0) {
             throw new IndexOutOfBoundsException();
         }
         int count = 0, x = 0;
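The overflow the commit above guards against: with two large non-negative ints, `len + offset` wraps to a negative value, so the old `len + offset > array.length` check passes and the bad arguments slip through. The added `len + offset < 0` clause catches the wrap. A stand-alone sketch of the corrected predicate (the helper name is illustrative):

```java
// Bounds predicate for (offset, len) against an array of arrayLength
// bytes. The "len + offset >= 0" clause rejects int overflow, which
// would otherwise defeat the upper-bound comparison.
public class OverflowGuard {
    public static boolean inBounds(int offset, int len, int arrayLength) {
        return len >= 0
            && offset >= 0
            && len + offset >= 0            // rejects int overflow
            && len + offset <= arrayLength;
    }
}
```

An equivalent overflow-free formulation is `len <= arrayLength - offset` (after the sign checks), since subtracting a non-negative `offset` from a non-negative length cannot wrap.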
[commons-compress] branch master updated: one more case where the JDK throwing RuntimeEx may hurt us
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 882c6dd  one more case where the JDK throwing RuntimeEx may hurt us
882c6dd is described below

commit 882c6dd12473d7b615d503e08fd6b866d0f866d5
Author: Stefan Bodewig
AuthorDate: Thu May 13 20:50:41 2021 +0200

    one more case where the JDK throwing RuntimeEx may hurt us

    similar to 51265b23

    Credit to OSS-Fuzz
---
 .../java/org/apache/commons/compress/archivers/zip/ZipFile.java | 8 +++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java b/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java
index 17f340b..a3a65a4 100644
--- a/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java
@@ -818,7 +818,13 @@ public class ZipFile implements Closeable {
 
             final byte[] cdExtraData = new byte[extraLen];
             IOUtils.readFully(archive, ByteBuffer.wrap(cdExtraData));
-            ze.setCentralDirectoryExtra(cdExtraData);
+            try {
+                ze.setCentralDirectoryExtra(cdExtraData);
+            } catch (RuntimeException ex) {
+                final ZipException z = new ZipException("Invalid extra data in entry " + ze.getName());
+                z.initCause(ex);
+                throw z;
+            }
 
             setSizesAndOffsetFromZip64Extra(ze);
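The recurring pattern in this and the COMPRESS-567 commits: parsing attacker-controlled metadata can throw unchecked exceptions from deep inside the JDK, so they are caught at the API boundary and rethrown as the checked exception callers already handle. A self-contained sketch, with a `ByteBuffer.getShort` standing in for `setCentralDirectoryExtra` and the class and method names purely illustrative:

```java
import java.util.zip.ZipException;

// Wrap unchecked parse failures in the checked exception type the
// caller expects, preserving the original cause via initCause.
public class WrapUnchecked {
    public static int parseExtraLength(byte[] data, String entryName) throws ZipException {
        try {
            // stand-in for any parse step that may throw an unchecked
            // exception on malformed input (here: BufferUnderflowException)
            return java.nio.ByteBuffer.wrap(data).getShort() & 0xffff;
        } catch (RuntimeException ex) {
            final ZipException z = new ZipException("Invalid extra data in entry " + entryName);
            z.initCause(ex);
            throw z;
        }
    }
}
```

`ZipException` predates exception chaining in its constructors, which is presumably why the commit uses `initCause` instead of a cause-taking constructor.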
[commons-compress] branch master updated: COMPRESS-542 sanity check 7z metadata with minimizing allocations
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 26924e9  COMPRESS-542 sanity check 7z metadata with minimizing allocations
26924e9 is described below

commit 26924e96c7730db014c310757e11c9359db07f3e
Author: Stefan Bodewig
AuthorDate: Thu May 13 19:08:43 2021 +0200

    COMPRESS-542 sanity check 7z metadata with minimizing allocations
---
 .../compress/archivers/sevenz/SevenZFile.java | 453 -
 1 file changed, 452 insertions(+), 1 deletion(-)

diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
index 2f24547..9f5986f 100644
--- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
+++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java
@@ -550,13 +550,29 @@ public class SevenZFile implements Closeable {
         try (DataInputStream dataInputStream = new DataInputStream(new CRC32VerifyingInputStream(
                 new BoundedSeekableByteChannelInputStream(channel, 20), 20, startHeaderCrc))) {
             startHeader.nextHeaderOffset = Long.reverseBytes(dataInputStream.readLong());
+            if (startHeader.nextHeaderOffset < 0
+                    || startHeader.nextHeaderOffset + SIGNATURE_HEADER_SIZE > channel.size()) {
+                throw new IOException("nextHeaderOffset is out of bounds");
+            }
+
             startHeader.nextHeaderSize = Long.reverseBytes(dataInputStream.readLong());
+            final long nextHeaderEnd = startHeader.nextHeaderOffset + startHeader.nextHeaderSize;
+            if (nextHeaderEnd < startHeader.nextHeaderOffset
+                    || nextHeaderEnd + SIGNATURE_HEADER_SIZE > channel.size()) {
+                throw new IOException("nextHeaderSize is out of bounds");
+            }
+
             startHeader.nextHeaderCrc = 0xffffFFFFL & Integer.reverseBytes(dataInputStream.readInt());
+
             return startHeader;
         }
     }
 
     private void readHeader(final ByteBuffer header, final Archive archive) throws IOException {
+        final int pos = header.position();
+        final ArchiveStatistics stats = sanityCheckAndCollectStatistics(header);
+        header.position(pos);
+
         int nid = getUnsignedByte(header);
 
         if (nid == NID.kArchiveProperties) {
@@ -584,6 +600,39 @@ public class SevenZFile implements Closeable {
         }
     }
 
+    private ArchiveStatistics sanityCheckAndCollectStatistics(final ByteBuffer header)
+        throws IOException {
+        final ArchiveStatistics stats = new ArchiveStatistics();
+
+        int nid = getUnsignedByte(header);
+
+        if (nid == NID.kArchiveProperties) {
+            sanityCheckArchiveProperties(header, stats);
+            nid = getUnsignedByte(header);
+        }
+
+        if (nid == NID.kAdditionalStreamsInfo) {
+            throw new IOException("Additional streams unsupported");
+            //nid = header.readUnsignedByte();
+        }
+
+        if (nid == NID.kMainStreamsInfo) {
+            sanityCheckStreamsInfo(header, stats);
+            nid = getUnsignedByte(header);
+        }
+
+        if (nid == NID.kFilesInfo) {
+            sanityCheckFilesInfo(header, stats);
+            nid = getUnsignedByte(header);
+        }
+
+        if (nid != NID.kEnd) {
+            throw new IOException("Badly terminated header, found " + nid);
+        }
+
+        return stats;
+    }
+
     private void readArchiveProperties(final ByteBuffer input) throws IOException {
         // FIXME: the reference implementation just throws them away?
         int nid = getUnsignedByte(input);
@@ -596,8 +645,24 @@ public class SevenZFile implements Closeable {
         }
     }
 
+    private void sanityCheckArchiveProperties(final ByteBuffer header, final ArchiveStatistics stats)
+        throws IOException {
+        int nid = getUnsignedByte(header);
+        while (nid != NID.kEnd) {
+            final int propertySize =
+                assertFitsIntoNonNegativeInt("propertySize", readUint64(header));
+            if (skipBytesFully(header, propertySize) < propertySize) {
+                throw new IOException("invalid property size");
+            }
+            nid = getUnsignedByte(header);
+        }
+    }
+
     private ByteBuffer readEncodedHeader(final ByteBuffer header, final Archive archive,
                                          final byte[] password) throws IOException {
+        final int pos = header.position();
+        sanityCheckStreamsInfo(header, new ArchiveStatistics());
+        header.position(pos);
         readStreamsInfo(header, archive);
[commons-compress] branch master updated: update XZ for Java to get access to improved performance
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/commons-compress.git

The following commit(s) were added to refs/heads/master by this push:
     new 5672064  update XZ for Java to get access to improved performance
5672064 is described below

commit 56720646af04a7b2102c14a07708bd4ac72a9dc8
Author: Stefan Bodewig
AuthorDate: Wed May 12 21:26:09 2021 +0200

    update XZ for Java to get access to improved performance
---
 pom.xml                 | 2 +-
 src/changes/changes.xml | 3 +++
 2 files changed, 4 insertions(+), 1 deletion(-)

diff --git a/pom.xml b/pom.xml
index 7603e08..3230613 100644
--- a/pom.xml
+++ b/pom.xml
@@ -109,7 +109,7 @@ Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.
       org.tukaani
       xz
-      1.8
+      1.9
       true
 
diff --git a/src/changes/changes.xml b/src/changes/changes.xml
index d365a79..72c22b6 100644
--- a/src/changes/changes.xml
+++ b/src/changes/changes.xml
@@ -338,6 +338,9 @@ The type attribute can be add,update,fix,remove.
       Handling of sparse tar entries has been hardened to ensure bad
       inputs cause expected IOExceptions rather than RuntimeExceptions.
 
+
+      Update org.tukaani:xz from 1.8 to 1.9
+
[commons-compress] branch master updated: only update name when present in PAX header, deal with non-numbers in header
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 02dbe7b only update name, wnen present in PAX header, deal with non-numbers in header 02dbe7b is described below commit 02dbe7bb52d541271c3048728c156ca2f1034bc0 Author: Stefan Bodewig AuthorDate: Sun May 2 17:16:20 2021 +0200 only update name, wnen present in PAX header, deal with non-numbers in header --- .../compress/archivers/tar/TarArchiveEntry.java| 41 -- .../archivers/tar/TarArchiveInputStream.java | 3 +- .../commons/compress/archivers/tar/TarFile.java| 3 +- 3 files changed, 34 insertions(+), 13 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java index f2205c1..e49e180 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java @@ -1298,8 +1298,12 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO * @param value value of header. 
* @since 1.15 */ -public void addPaxHeader(final String name,final String value) { - processPaxHeader(name,value); +public void addPaxHeader(final String name, final String value) { +try { +processPaxHeader(name,value); +} catch (IOException ex) { +throw new IllegalArgumentException("Invalid input", ex); +} } /** @@ -1317,7 +1321,7 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO * @param headers * @since 1.15 */ -void updateEntryFromPaxHeaders(final Map headers) { +void updateEntryFromPaxHeaders(final Map headers) throws IOException { for (final Map.Entry ent : headers.entrySet()) { final String key = ent.getKey(); final String val = ent.getValue(); @@ -1332,8 +1336,8 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO * @param val * @since 1.15 */ -private void processPaxHeader(final String key, final String val) { -processPaxHeader(key,val,extraPaxHeaders); +private void processPaxHeader(final String key, final String val) throws IOException { +processPaxHeader(key, val, extraPaxHeaders); } /** @@ -1346,7 +1350,8 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO * @throws NumberFormatException if encountered errors when parsing the numbers * @since 1.15 */ -private void processPaxHeader(final String key, final String val, final Map headers) { +private void processPaxHeader(final String key, final String val, final Map headers) +throws IOException { /* * The following headers are defined for Pax. 
* atime, ctime, charset: cannot use these without changing TarArchiveEntry fields @@ -1749,17 +1754,31 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO } } -void fillGNUSparse1xData(final Map headers) { +void fillGNUSparse1xData(final Map headers) throws IOException { paxGNUSparse = true; paxGNU1XSparse = true; -realSize = Integer.parseInt(headers.get("GNU.sparse.realsize")); -name = headers.get("GNU.sparse.name"); +if (headers.containsKey("GNU.sparse.name")) { +name = headers.get("GNU.sparse.name"); +} +if (headers.containsKey("GNU.sparse.realsize")) { +try { +realSize = Integer.parseInt(headers.get("GNU.sparse.realsize")); +} catch (NumberFormatException ex) { +throw new IOException("Corrupted TAR archive. GNU.sparse.realsize header for " ++ name + " contains non-numeric value"); +} +} } -void fillStarSparseData(final Map headers) { +void fillStarSparseData(final Map headers) throws IOException { starSparse = true; if (headers.containsKey("SCHILY.realsize")) { -realSize = Long.parseLong(headers.get("SCHILY.realsize")); +try { +realSize = Long.parseLong(headers.get("SCHILY.realsize")); +} catch (NumberFormatException ex) { +throw new IOException("Corrupted TAR archive. SCHILY.realsize header for " ++ name + " contains non-numeric value"); +} } } } diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/Tar
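The commit above only reads `GNU.sparse.realsize` / `SCHILY.realsize` when the key is actually present, and converts a `NumberFormatException` into the `IOException` the tar parser promises its callers. A sketch of that guard (method name and message wording are illustrative, not the library's API):

```java
import java.io.IOException;
import java.util.Map;

public class PaxNumberParsing {
    // Parse a numeric PAX header value only if it is present; non-numeric
    // values become a checked IOException with the entry name for context.
    static long parseRealSize(Map<String, String> headers, String key, String entryName)
            throws IOException {
        String val = headers.get(key);
        if (val == null) {
            return -1; // header absent: leave the field untouched
        }
        try {
            return Long.parseLong(val);
        } catch (NumberFormatException ex) {
            throw new IOException("Corrupted TAR archive. " + key + " header for "
                    + entryName + " contains non-numeric value", ex);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(parseRealSize(
                Map.of("SCHILY.realsize", "201"), "SCHILY.realsize", "f")); // 201
        try {
            parseRealSize(Map.of("SCHILY.realsize", "20x1"), "SCHILY.realsize", "f");
        } catch (IOException expected) {
            System.out.println("rejected non-numeric value");
        }
    }
}
```

Checked exceptions matter here: callers of archive parsers handle `IOException`, while an unwrapped `NumberFormatException` would escape as a surprise `RuntimeException` on attacker-controlled input.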
[commons-compress] branch master updated: COMPRESS-567 parsing tar headers actually throws IllegalArgumentException
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 004e873 COMPRESS-567 parsing tar headers actually throws IllegalArgumentException 004e873 is described below commit 004e87375572d459ff51c19fe35aa83685cc0cd0 Author: Stefan Bodewig AuthorDate: Sun May 2 13:23:11 2021 +0200 COMPRESS-567 parsing tar headers actually throws IllegalArgumentException --- .../apache/commons/compress/archivers/tar/TarArchiveEntry.java | 10 ++ 1 file changed, 10 insertions(+) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java index 0bd8344..f2205c1 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java @@ -1568,6 +1568,16 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO private void parseTarHeader(final byte[] header, final ZipEncoding encoding, final boolean oldStyle, final boolean lenient) throws IOException { +try { +parseTarHeaderUnwrapped(header, encoding, oldStyle, lenient); +} catch (IllegalArgumentException ex) { +throw new IOException("Corrupted TAR archive.", ex); +} +} + +private void parseTarHeaderUnwrapped(final byte[] header, final ZipEncoding encoding, + final boolean oldStyle, final boolean lenient) +throws IOException { int offset = 0; name = oldStyle ? TarUtils.parseName(header, offset, NAMELEN)
[commons-compress] branch master updated: more strongly guard what is supposed to become an array size
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new cf4608b more stongly guard what is supposed to become an array size cf4608b is described below commit cf4608bc5752c066d6902d7eb075f6c6da57c397 Author: Stefan Bodewig AuthorDate: Sat May 1 18:46:09 2021 +0200 more stongly guard what is supposed to become an array size --- .../compress/archivers/sevenz/SevenZFile.java | 35 +++--- 1 file changed, 18 insertions(+), 17 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java index dcdb5e3..2f24547 100644 --- a/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/sevenz/SevenZFile.java @@ -515,7 +515,7 @@ public class SevenZFile implements Closeable { } private Archive initializeArchive(final StartHeader startHeader, final byte[] password, final boolean verifyCrc) throws IOException { -assertFitsIntoInt("nextHeaderSize", startHeader.nextHeaderSize); +assertFitsIntoNonNegativeInt("nextHeaderSize", startHeader.nextHeaderSize); final int nextHeaderSizeInt = (int) startHeader.nextHeaderSize; channel.position(SIGNATURE_HEADER_SIZE + startHeader.nextHeaderOffset); ByteBuffer buf = ByteBuffer.allocate(nextHeaderSizeInt).order(ByteOrder.LITTLE_ENDIAN); @@ -589,7 +589,7 @@ public class SevenZFile implements Closeable { int nid = getUnsignedByte(input); while (nid != NID.kEnd) { final long propertySize = readUint64(input); -assertFitsIntoInt("propertySize", propertySize); +assertFitsIntoNonNegativeInt("propertySize", propertySize); final byte[] property = new byte[(int)propertySize]; input.get(property); nid = getUnsignedByte(input); @@ -620,7 +620,7 @@ public class SevenZFile 
implements Closeable { inputStreamStack = new CRC32VerifyingInputStream(inputStreamStack, folder.getUnpackSize(), folder.crc); } -assertFitsIntoInt("unpackSize", folder.getUnpackSize()); +assertFitsIntoNonNegativeInt("unpackSize", folder.getUnpackSize()); final byte[] nextHeader = new byte[(int)folder.getUnpackSize()]; try (DataInputStream nextHeaderInputStream = new DataInputStream(inputStreamStack)) { nextHeaderInputStream.readFully(nextHeader); @@ -657,7 +657,7 @@ public class SevenZFile implements Closeable { private void readPackInfo(final ByteBuffer header, final Archive archive) throws IOException { archive.packPos = readUint64(header); final long numPackStreams = readUint64(header); -assertFitsIntoInt("numPackStreams", numPackStreams); +assertFitsIntoNonNegativeInt("numPackStreams", numPackStreams); final int numPackStreamsInt = (int) numPackStreams; int nid = getUnsignedByte(header); if (nid == NID.kSize) { @@ -691,7 +691,7 @@ public class SevenZFile implements Closeable { throw new IOException("Expected kFolder, got " + nid); } final long numFolders = readUint64(header); -assertFitsIntoInt("numFolders", numFolders); +assertFitsIntoNonNegativeInt("numFolders", numFolders); final int numFoldersInt = (int) numFolders; final Folder[] folders = new Folder[numFoldersInt]; archive.folders = folders; @@ -708,7 +708,7 @@ public class SevenZFile implements Closeable { throw new IOException("Expected kCodersUnpackSize, got " + nid); } for (final Folder folder : folders) { -assertFitsIntoInt("totalOutputStreams", folder.totalOutputStreams); +assertFitsIntoNonNegativeInt("totalOutputStreams", folder.totalOutputStreams); folder.unpackSizes = new long[(int)folder.totalOutputStreams]; for (int i = 0; i < folder.totalOutputStreams; i++) { folder.unpackSizes[i] = readUint64(header); @@ -746,7 +746,7 @@ public class SevenZFile implements Closeable { totalUnpackStreams = 0; for (final Folder folder : archive.folders) { final long numStreams = readUint64(header); 
-assertFitsIntoInt("numStreams", numStreams); +assertFitsIntoNonNegativeInt("numStreams", numStreams); folder.numUnpackSubStreams = (int)numStreams; totalUnpackStreams += numStreams; } @@ -785,6 +785,
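The rename from `assertFitsIntoInt` to `assertFitsIntoNonNegativeInt` reflects the tightened contract: a uint64 read from the archive must be both non-negative and small enough to be an array size. A minimal sketch of the check (a stand-in, not the library's code):

```java
import java.io.IOException;

public class SizeGuard {
    // Reject values that are negative or too large to be used as a Java
    // array size, before any allocation happens.
    static int toNonNegativeInt(String field, long value) throws IOException {
        if (value < 0 || value > Integer.MAX_VALUE) {
            throw new IOException("Cannot handle " + field + " " + value);
        }
        return (int) value;
    }

    public static void main(String[] args) throws IOException {
        byte[] property = new byte[toNonNegativeInt("propertySize", 16L)]; // safe allocation
        System.out.println(property.length); // 16
        try {
            // a "uint64" that wrapped negative when read into a signed long
            toNonNegativeInt("propertySize", -1L);
        } catch (IOException expected) {
            System.out.println("rejected");
        }
    }
}
```

Without the `< 0` half of the check, a wrapped negative value cast to `int` could still pass a plain upper-bound test and blow up later as a `NegativeArraySizeException`.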
[commons-compress] 02/02: JDK's ZipEntry#setExtra parses a few extra fields itself ...
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 51265b23722d9ce2262d68979ce7dbb79b94f430 Author: Stefan Bodewig AuthorDate: Sat May 1 18:31:34 2021 +0200 JDK's ZipEntry#setExtra parses a few extra fields itself ... ... and may throw RuntimeExceptions every now and then --- .../commons/compress/archivers/zip/ZipArchiveInputStream.java | 8 +++- .../java/org/apache/commons/compress/archivers/zip/ZipFile.java | 8 +++- 2 files changed, 14 insertions(+), 2 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/zip/ZipArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/zip/ZipArchiveInputStream.java index 2652294..af3d45f 100644 --- a/src/main/java/org/apache/commons/compress/archivers/zip/ZipArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/zip/ZipArchiveInputStream.java @@ -348,7 +348,13 @@ public class ZipArchiveInputStream extends ArchiveInputStream implements InputSt final byte[] extraData = new byte[extraLen]; readFully(extraData); -current.entry.setExtra(extraData); +try { +current.entry.setExtra(extraData); +} catch (RuntimeException ex) { +final ZipException z = new ZipException("Invalid extra data in entry " + current.entry.getName()); +z.initCause(ex); +throw z; +} if (!hasUTF8Flag && useUnicodeExtraFields) { ZipUtil.setNameAndCommentFromExtraFields(current.entry, fileName, null); diff --git a/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java b/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java index d3dd565..17f340b 100644 --- a/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/zip/ZipFile.java @@ -1262,7 +1262,13 @@ public class ZipFile implements Closeable { skipBytes(fileNameLen); final byte[] localExtraData = new 
byte[extraFieldLen]; IOUtils.readFully(archive, ByteBuffer.wrap(localExtraData)); -ze.setExtra(localExtraData); +try { +ze.setExtra(localExtraData); +} catch (RuntimeException ex) { +final ZipException z = new ZipException("Invalid extra data in entry " + ze.getName()); +z.initCause(ex); +throw z; +} if (entriesWithoutUTF8Flag.containsKey(ze)) { final NameAndComment nc = entriesWithoutUTF8Flag.get(ze);
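Both call sites above wrap `ZipEntry#setExtra` the same way: catch the `RuntimeException` the JDK's own extra-field parsing may throw and rethrow it as a checked `ZipException`, attaching the original via `initCause` because `ZipException` has no cause-taking constructor. A sketch of the pattern:

```java
import java.util.zip.ZipEntry;
import java.util.zip.ZipException;

public class ExtraFieldGuard {
    // Translate RuntimeExceptions from ZipEntry#setExtra into the checked
    // ZipException archive readers are expected to throw.
    static void setExtraSafely(ZipEntry entry, byte[] extra) throws ZipException {
        try {
            entry.setExtra(extra);
        } catch (RuntimeException ex) {
            ZipException z = new ZipException("Invalid extra data in entry " + entry.getName());
            z.initCause(ex); // ZipException(String) takes no cause argument
            throw z;
        }
    }

    public static void main(String[] args) {
        ZipEntry e = new ZipEntry("data.bin");
        try {
            // longer than the 0xFFFF bytes a zip extra field may hold, so
            // setExtra itself throws an IllegalArgumentException
            setExtraSafely(e, new byte[0x10000]);
        } catch (ZipException expected) {
            System.out.println(expected.getMessage()); // Invalid extra data in entry data.bin
        }
    }
}
```

The demo uses the oversized-field case because it is a documented way to make `setExtra` fail; the commit is guarding against the JDK additionally parsing certain extra-field IDs internally.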
[commons-compress] 01/02: AsiExtraField actually expects quite a few more bytes than it claims
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 96317153c10f6358877729672fc5a221055a0c31 Author: Stefan Bodewig AuthorDate: Sat May 1 18:26:09 2021 +0200 AsiExtraField actually expects quite a few more bytes than it claims --- .../org/apache/commons/compress/archivers/zip/AsiExtraField.java | 9 ++--- 1 file changed, 6 insertions(+), 3 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/zip/AsiExtraField.java b/src/main/java/org/apache/commons/compress/archivers/zip/AsiExtraField.java index 98194cd..fa6c864 100644 --- a/src/main/java/org/apache/commons/compress/archivers/zip/AsiExtraField.java +++ b/src/main/java/org/apache/commons/compress/archivers/zip/AsiExtraField.java @@ -21,6 +21,9 @@ package org.apache.commons.compress.archivers.zip; import java.util.zip.CRC32; import java.util.zip.ZipException; +import static org.apache.commons.compress.archivers.zip.ZipConstants.SHORT; +import static org.apache.commons.compress.archivers.zip.ZipConstants.WORD; + /** * Adds Unix file permission and UID/GID fields as well as symbolic * link handling. @@ -52,7 +55,7 @@ import java.util.zip.ZipException; public class AsiExtraField implements ZipExtraField, UnixStat, Cloneable { private static final ZipShort HEADER_ID = new ZipShort(0x756E); -private static final int WORD = 4; +private static final int MIN_SIZE = WORD + SHORT + WORD + SHORT + SHORT; /** * Standard Unix stat(2) file mode. 
*/ @@ -266,9 +269,9 @@ public class AsiExtraField implements ZipExtraField, UnixStat, Cloneable { @Override public void parseFromLocalFileData(final byte[] data, final int offset, final int length) throws ZipException { -if (length < WORD) { +if (length < MIN_SIZE) { throw new ZipException("The length is too short, only " -+ length + " bytes, expected at least " + WORD); ++ length + " bytes, expected at least " + MIN_SIZE); } final long givenChecksum = ZipLong.getValue(data, offset);
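The new `MIN_SIZE` constant spells out the fixed part of the ASi field layout the parser actually reads: CRC (4 bytes) + mode (2) + link-name length (4) + uid (2) + gid (2) = 14 bytes, rather than the 4 the old check demanded. A sketch of the corrected length check:

```java
import java.util.zip.ZipException;

public class AsiLengthCheck {
    private static final int WORD = 4;  // 32-bit field
    private static final int SHORT = 2; // 16-bit field
    // crc (WORD) + mode (SHORT) + link name length (WORD) + uid (SHORT) + gid (SHORT)
    static final int MIN_SIZE = WORD + SHORT + WORD + SHORT + SHORT; // = 14

    static void checkLength(int length) throws ZipException {
        if (length < MIN_SIZE) {
            throw new ZipException("The length is too short, only " + length
                    + " bytes, expected at least " + MIN_SIZE);
        }
    }

    public static void main(String[] args) throws ZipException {
        System.out.println(MIN_SIZE); // 14
        checkLength(14);              // fine
        try {
            checkLength(4);           // passed the old WORD-only check, rejected now
        } catch (ZipException expected) {
            System.out.println("rejected");
        }
    }
}
```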
[commons-compress] branch master updated (deabd92 -> 51265b2)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git. from deabd92 COMPRESS-575 document changes new 9631715 AsiExtraField actually expects quite a few more bytes than it claims new 51265b2 JDK's ZipEntry#setExtra parses a few extra fields itself ... The 2 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: .../org/apache/commons/compress/archivers/zip/AsiExtraField.java | 9 ++--- .../commons/compress/archivers/zip/ZipArchiveInputStream.java| 8 +++- .../java/org/apache/commons/compress/archivers/zip/ZipFile.java | 8 +++- 3 files changed, 20 insertions(+), 5 deletions(-)
[commons-compress] 04/05: COMPRESS-567 overlooked a RuntimeException in BoundedArchiveInputStream
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 30ff58d0f120b01e74db83eaee9f819fcf840843 Author: Stefan Bodewig AuthorDate: Sat May 1 15:43:37 2021 +0200 COMPRESS-567 overlooked a RuntimeException in BoundedArchiveInputStream unfortunately I cannot change the signature of BoundedArchiveInputStream's constructor as the way it is used in ZipFile doesn't allow it to throw an IOException without breaking backwards compatibility of ZipFile#getRawInputStream --- src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java | 4 1 file changed, 4 insertions(+) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java index 2ba4ee2..378d4a5 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java @@ -359,6 +359,10 @@ public class TarFile implements Closeable { if (sparseHeader.getNumbytes() > 0) { final long start = currEntry.getDataOffset() + sparseHeader.getOffset() - numberOfZeroBytesInSparseEntry; +if (start + sparseHeader.getNumbytes() < start) { +// possible integer overflow +throw new IOException("Unreadable TAR archive, sparse block offset or length too big"); +} streams.add(new BoundedSeekableByteChannelInputStream(start, sparseHeader.getNumbytes(), archive)); }
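The `start + sparseHeader.getNumbytes() < start` test above is the classic signed-overflow probe: for non-negative operands (which the sparse-header parser already guarantees), a Java `long` addition overflowed exactly when the sum wrapped around below one of the operands. A self-contained sketch:

```java
import java.io.IOException;

public class OverflowCheck {
    // Compute start + numbytes, rejecting long overflow. Only valid when
    // numbytes >= 0, which the hardened sparse-header parsing guarantees.
    static long checkedEnd(long start, long numbytes) throws IOException {
        if (start + numbytes < start) {
            throw new IOException(
                "Unreadable TAR archive, sparse block offset or length too big");
        }
        return start + numbytes;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(checkedEnd(100, 28)); // 128
        try {
            checkedEnd(Long.MAX_VALUE - 1, 2);   // wraps to a negative value
        } catch (IOException expected) {
            System.out.println("overflow rejected");
        }
    }
}
```

`Math.addExact` would detect the same condition via `ArithmeticException`; the explicit comparison is used here so the failure can surface as the `IOException` the API promises.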
[commons-compress] 03/05: COMPRESS-575 confine sparse entry to its claimed realSize
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit 864588382b21a5ea80d03e4e34fe6756ea8b131c Author: Stefan Bodewig AuthorDate: Sat May 1 15:41:02 2021 +0200 COMPRESS-575 confine sparse entry to its claimed realSize --- .../commons/compress/archivers/tar/TarArchiveEntry.java | 7 +++ .../commons/compress/archivers/tar/TarArchiveEntryTest.java | 13 + 2 files changed, 20 insertions(+) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java index de9f26f..0bd8344 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java @@ -961,6 +961,13 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO + getName() + " too large."); } } +if (!orderedAndFiltered.isEmpty()) { +final TarArchiveStructSparse last = orderedAndFiltered.get(orderedAndFiltered.size() - 1); +if (last.getOffset() + last.getNumbytes() > getRealSize()) { +throw new IOException("Corrupted TAR archive. 
Sparse block extends beyond real size of the entry"); +} +} + return orderedAndFiltered; } diff --git a/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveEntryTest.java b/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveEntryTest.java index 47195ac..c0b1e97 100644 --- a/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveEntryTest.java +++ b/src/test/java/org/apache/commons/compress/archivers/tar/TarArchiveEntryTest.java @@ -35,6 +35,7 @@ import java.io.IOException; import java.nio.charset.StandardCharsets; import java.nio.file.Files; import java.util.Arrays; +import java.util.Collections; import java.util.List; import java.util.Locale; import org.apache.commons.compress.AbstractTestCase; @@ -276,6 +277,8 @@ public class TarArchiveEntryTest implements TarConstants { @Test public void getOrderedSparseHeadersSortsAndFiltersSparseStructs() throws Exception { final TarArchiveEntry te = new TarArchiveEntry("test"); +// hacky way to set realSize +te.fillStarSparseData(Collections.singletonMap("SCHILY.realsize", "201")); te.setSparseHeaders(Arrays.asList(new TarArchiveStructSparse(10, 2), new TarArchiveStructSparse(20, 0), new TarArchiveStructSparse(15, 1), new TarArchiveStructSparse(0, 0))); final List strs = te.getOrderedSparseHeaders(); @@ -288,6 +291,7 @@ public class TarArchiveEntryTest implements TarConstants { @Test(expected = IOException.class) public void getOrderedSparseHeadersRejectsOverlappingStructs() throws Exception { final TarArchiveEntry te = new TarArchiveEntry("test"); +te.fillStarSparseData(Collections.singletonMap("SCHILY.realsize", "201")); te.setSparseHeaders(Arrays.asList(new TarArchiveStructSparse(10, 5), new TarArchiveStructSparse(12, 1))); te.getOrderedSparseHeaders(); } @@ -295,10 +299,19 @@ public class TarArchiveEntryTest implements TarConstants { @Test(expected = IOException.class) public void getOrderedSparseHeadersRejectsStructsWithReallyBigNumbers() throws Exception { final TarArchiveEntry te 
= new TarArchiveEntry("test"); +te.fillStarSparseData(Collections.singletonMap("SCHILY.realsize", String.valueOf(Long.MAX_VALUE))); te.setSparseHeaders(Arrays.asList(new TarArchiveStructSparse(Long.MAX_VALUE, 2))); te.getOrderedSparseHeaders(); } +@Test(expected = IOException.class) +public void getOrderedSparseHeadersRejectsStructsPointingBeyondOutputEntry() throws Exception { +final TarArchiveEntry te = new TarArchiveEntry("test"); +te.setSparseHeaders(Arrays.asList(new TarArchiveStructSparse(200, 2))); +te.fillStarSparseData(Collections.singletonMap("SCHILY.realsize", "201")); +te.getOrderedSparseHeaders(); +} + private void assertGnuMagic(final TarArchiveEntry t) { assertEquals(MAGIC_GNU + VERSION_GNU_SPACE, readMagic(t)); }
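The new check in `getOrderedSparseHeaders` only inspects the last block: because the list is sorted by offset and already verified overlap-free, the last block necessarily has the largest end offset, so comparing it against the entry's claimed real size confines every block. A sketch of that bound (with a minimal stand-in for `TarArchiveStructSparse`):

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class SparseBounds {
    // Minimal stand-in for TarArchiveStructSparse: an (offset, numbytes) pair.
    static final class Block {
        final long offset;
        final long numbytes;
        Block(long offset, long numbytes) { this.offset = offset; this.numbytes = numbytes; }
    }

    // Assumes blocks are sorted by offset and overlap-free, so checking the
    // last block's end against realSize confines all of them.
    static void checkWithinRealSize(List<Block> ordered, long realSize, String name)
            throws IOException {
        if (ordered.isEmpty()) {
            return;
        }
        Block last = ordered.get(ordered.size() - 1);
        if (last.offset + last.numbytes > realSize) {
            throw new IOException("Corrupted TAR archive. Sparse block extends"
                    + " beyond real size of the entry " + name);
        }
    }

    public static void main(String[] args) throws IOException {
        checkWithinRealSize(Arrays.asList(new Block(10, 2), new Block(15, 1)), 201, "test"); // fine
        try {
            checkWithinRealSize(Arrays.asList(new Block(200, 2)), 201, "test"); // ends at 202 > 201
        } catch (IOException expected) {
            System.out.println("rejected");
        }
    }
}
```

This is why the tests above have to set `SCHILY.realsize` first: without a real size, a block ending at offset 202 would have nothing to be checked against.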
[commons-compress] 05/05: COMPRESS-575 document changes
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit deabd925d789efcfaf677ec38fd74c13b6ddd04d Author: Stefan Bodewig AuthorDate: Sat May 1 15:48:08 2021 +0200 COMPRESS-575 document changes --- src/changes/changes.xml | 4 1 file changed, 4 insertions(+) diff --git a/src/changes/changes.xml b/src/changes/changes.xml index 67520d5..d365a79 100644 --- a/src/changes/changes.xml +++ b/src/changes/changes.xml @@ -334,6 +334,10 @@ The type attribute can be add,update,fix,remove. Update com.github.luben:zstd-jni from 1.4.8-7 to 1.4.9-1 #176. + +Handling of sparse tar entries has been hardened to ensure bad +inputs cause expected IOExceptions rather than RuntimeExceptions. +
[commons-compress] branch master updated (2887391 -> deabd92)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git. from 2887391 COMPRESS-575 harden parser for PAX 1.0 sparse struct blocks new f196253 COMPRESS-575 extract and harden sorting of sparse structs new ad40318 whitespace changes only new 8645883 COMPRESS-575 confine sparse entry to its claimed realSize new 30ff58d COMPRESS-567 overlooked a RuntimeException in BoundedArchiveInputStream new deabd92 COMPRESS-575 document changes The 5 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: src/changes/changes.xml| 4 ++ .../compress/archivers/tar/TarArchiveEntry.java| 42 ++ .../archivers/tar/TarArchiveInputStream.java | 55 +++--- .../archivers/tar/TarArchiveSparseEntry.java | 4 ++ .../commons/compress/archivers/tar/TarFile.java| 67 +- .../commons/compress/archivers/tar/TarUtils.java | 5 +- .../compress/archivers/tar/SparseFilesTest.java| 38 ++-- .../archivers/tar/TarArchiveEntryTest.java | 41 + 8 files changed, 175 insertions(+), 81 deletions(-)
[commons-compress] 02/05: whitespace changes only
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit ad403183376fdd12881cc8fd48eab73e56334afd Author: Stefan Bodewig AuthorDate: Sat May 1 15:29:57 2021 +0200 whitespace changes only --- .../archivers/tar/TarArchiveInputStream.java | 40 +- .../commons/compress/archivers/tar/TarFile.java| 48 +++--- 2 files changed, 44 insertions(+), 44 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java index 902af1f..63b9109 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java @@ -914,30 +914,30 @@ public class TarArchiveInputStream extends ArchiveInputStream { final List sparseHeaders = currEntry.getOrderedSparseHeaders(); -// Stream doesn't need to be closed at all as it doesn't use any resources -final InputStream zeroInputStream = new TarArchiveSparseZeroInputStream(); //NOSONAR -// logical offset into the extracted entry -long offset = 0; -for (final TarArchiveStructSparse sparseHeader : sparseHeaders) { -final long zeroBlockSize = sparseHeader.getOffset() - offset; -if (zeroBlockSize < 0) { -// sparse header says to move backwards inside of the extracted entry -throw new IOException("Corrupted struct sparse detected"); -} - -// only store the zero block if it is not empty -if (zeroBlockSize > 0) { -sparseInputStreams.add(new BoundedInputStream(zeroInputStream, sparseHeader.getOffset() - offset)); -} +// Stream doesn't need to be closed at all as it doesn't use any resources +final InputStream zeroInputStream = new TarArchiveSparseZeroInputStream(); //NOSONAR +// logical offset into the extracted entry +long offset = 0; +for (final TarArchiveStructSparse sparseHeader : 
sparseHeaders) { +final long zeroBlockSize = sparseHeader.getOffset() - offset; +if (zeroBlockSize < 0) { +// sparse header says to move backwards inside of the extracted entry +throw new IOException("Corrupted struct sparse detected"); +} -// only store the input streams with non-zero size -if (sparseHeader.getNumbytes() > 0) { -sparseInputStreams.add(new BoundedInputStream(inputStream, sparseHeader.getNumbytes())); -} +// only store the zero block if it is not empty +if (zeroBlockSize > 0) { +sparseInputStreams.add(new BoundedInputStream(zeroInputStream, sparseHeader.getOffset() - offset)); +} -offset = sparseHeader.getOffset() + sparseHeader.getNumbytes(); +// only store the input streams with non-zero size +if (sparseHeader.getNumbytes() > 0) { +sparseInputStreams.add(new BoundedInputStream(inputStream, sparseHeader.getNumbytes())); } +offset = sparseHeader.getOffset() + sparseHeader.getNumbytes(); +} + if (!sparseInputStreams.isEmpty()) { currentSparseInputStreamIndex = 0; } diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java index 6bcd701..2ba4ee2 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java @@ -337,34 +337,34 @@ public class TarFile implements Closeable { final List sparseHeaders = currEntry.getOrderedSparseHeaders(); -// Stream doesn't need to be closed at all as it doesn't use any resources -final InputStream zeroInputStream = new TarArchiveSparseZeroInputStream(); //NOSONAR -// logical offset into the extracted entry -long offset = 0; -long numberOfZeroBytesInSparseEntry = 0; -for (TarArchiveStructSparse sparseHeader : sparseHeaders) { -final long zeroBlockSize = sparseHeader.getOffset() - offset; -if (zeroBlockSize < 0) { -// sparse header says to move backwards inside of the extracted entry -throw new IOException("Corrupted struct sparse 
detected"); -} - -// only store the zero block if it is not empty -if (zeroBlockSize > 0) { -
[commons-compress] 01/05: COMPRESS-575 extract and harden sorting of sparse structs
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit f196253cc9794ac682679854e68c3184ab938618 Author: Stefan Bodewig AuthorDate: Sat May 1 15:27:50 2021 +0200 COMPRESS-575 extract and harden sorting of sparse structs --- .../compress/archivers/tar/TarArchiveEntry.java| 35 .../archivers/tar/TarArchiveInputStream.java | 27 --- .../archivers/tar/TarArchiveSparseEntry.java | 4 +++ .../commons/compress/archivers/tar/TarFile.java| 33 +-- .../commons/compress/archivers/tar/TarUtils.java | 5 +-- .../compress/archivers/tar/SparseFilesTest.java| 38 +++--- .../archivers/tar/TarArchiveEntryTest.java | 28 7 files changed, 118 insertions(+), 52 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java index fdeb0de..de9f26f 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java @@ -30,6 +30,7 @@ import java.nio.file.attribute.FileTime; import java.nio.file.attribute.PosixFileAttributes; import java.util.ArrayList; import java.util.Collections; +import java.util.Comparator; import java.util.Date; import java.util.HashMap; import java.util.Iterator; @@ -38,6 +39,7 @@ import java.util.Locale; import java.util.Map; import java.util.Set; import java.util.concurrent.TimeUnit; +import java.util.stream.Collectors; import org.apache.commons.compress.archivers.ArchiveEntry; import org.apache.commons.compress.archivers.EntryStreamOffsets; @@ -930,6 +932,39 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO } /** + * Get this entry's sparse headers ordered by offset with all empty sparse sections at the start filtered out. 
+ * + * @return immutable list of this entry's sparse headers, never null + * @since 1.21 + * @throws IOException if the list of sparse headers contains blocks that overlap + */ +public List getOrderedSparseHeaders() throws IOException { +if (sparseHeaders == null || sparseHeaders.isEmpty()) { +return Collections.emptyList(); +} +final List orderedAndFiltered = sparseHeaders.stream() +.filter(s -> s.getOffset() > 0 || s.getNumbytes() > 0) + .sorted(Comparator.comparingLong(TarArchiveStructSparse::getOffset)) +.collect(Collectors.toList()); + +for (int i = 0; i < orderedAndFiltered.size(); i++) { +final TarArchiveStructSparse str = orderedAndFiltered.get(i); +if (i + 1 < orderedAndFiltered.size()) { +if (str.getOffset() + str.getNumbytes() > orderedAndFiltered.get(i + 1).getOffset()) { +throw new IOException("Corrupted TAR archive. Sparse blocks for " ++ getName() + " overlap each other."); +} +} +if (str.getOffset() + str.getNumbytes() < 0) { +// integer overflow? +throw new IOException("Unreadable TAR archive. 
Offset and numbytes for sparse block in " ++ getName() + " too large."); +} +} +return orderedAndFiltered; +} + +/** * Get if this entry is a sparse file with 1.X PAX Format or not * * @return True if this entry is a sparse file with 1.X PAX Format diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java index 4aaeba9..902af1f 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java @@ -28,7 +28,6 @@ import java.io.FileInputStream; import java.io.IOException; import java.io.InputStream; import java.util.ArrayList; -import java.util.Comparator; import java.util.HashMap; import java.util.List; import java.util.Map; @@ -913,32 +912,21 @@ public class TarArchiveInputStream extends ArchiveInputStream { currentSparseInputStreamIndex = -1; sparseInputStreams = new ArrayList<>(); -final List sparseHeaders = currEntry.getSparseHeaders(); -// sort the sparse headers in case they are written in wrong order -if (sparseHeaders != null && sparseHeaders.size() > 1) { -final Comparator sparseHeaderComparator = (p, q) -> { -final Long pOffset = p.getOffset(); -fina
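The extracted `getOrderedSparseHeaders` replaces the ad-hoc comparator in `TarArchiveInputStream` with one shared, hardened pipeline: drop empty placeholder blocks, sort by offset, then reject overlapping neighbours and overflowing end offsets. A simplified sketch of that shape (class and method names illustrative):

```java
import java.io.IOException;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class SparseOrdering {
    static final class Block {
        final long offset;
        final long numbytes;
        Block(long offset, long numbytes) { this.offset = offset; this.numbytes = numbytes; }
    }

    // Filter out (0,0) placeholder blocks, order by offset, then validate
    // each block against its successor and against long overflow.
    static List<Block> orderAndValidate(List<Block> headers, String name) throws IOException {
        List<Block> ordered = headers.stream()
                .filter(s -> s.offset > 0 || s.numbytes > 0)
                .sorted(Comparator.comparingLong((Block s) -> s.offset))
                .collect(Collectors.toList());
        for (int i = 0; i < ordered.size(); i++) {
            Block b = ordered.get(i);
            if (i + 1 < ordered.size()
                    && b.offset + b.numbytes > ordered.get(i + 1).offset) {
                throw new IOException("Corrupted TAR archive. Sparse blocks for "
                        + name + " overlap each other.");
            }
            if (b.offset + b.numbytes < 0) { // long overflow
                throw new IOException("Unreadable TAR archive. Offset and numbytes"
                        + " for sparse block in " + name + " too large.");
            }
        }
        return ordered;
    }

    public static void main(String[] args) throws IOException {
        List<Block> ordered = orderAndValidate(List.of(
                new Block(10, 2), new Block(0, 0), new Block(15, 1)), "test");
        System.out.println(ordered.size());        // 2 -- the (0,0) placeholder is gone
        System.out.println(ordered.get(0).offset); // 10
    }
}
```

Centralizing this in `TarArchiveEntry` means `TarArchiveInputStream` and the new `TarFile` cannot drift apart in how they sanity-check sparse maps.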
[commons-compress] branch master updated: COMPRESS-575 harden parser for PAX 1.0 sparse struct blocks
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 2887391 COMPRESS-575 harden parser for PAX 1.0 sparse struct blocks 2887391 is described below commit 2887391296216e5196921f04bea8cd834843cfe2 Author: Stefan Bodewig AuthorDate: Sat May 1 13:47:25 2021 +0200 COMPRESS-575 harden parser for PAX 1.0 sparse struct blocks --- .../commons/compress/archivers/tar/TarUtils.java | 19 +++- .../compress/archivers/tar/TarUtilsTest.java | 119 + 2 files changed, 136 insertions(+), 2 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java index eee894e..3ba7a1b 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java @@ -882,14 +882,26 @@ public class TarUtils { long[] readResult = readLineOfNumberForPax1X(inputStream); long sparseHeadersCount = readResult[0]; +if (sparseHeadersCount < 0) { +// overflow while reading number? +throw new IOException("Corrupted TAR archive. Negative value in sparse headers block"); +} bytesRead += readResult[1]; while (sparseHeadersCount-- > 0) { readResult = readLineOfNumberForPax1X(inputStream); -long sparseOffset = readResult[0]; +final long sparseOffset = readResult[0]; +if (sparseOffset < 0) { +throw new IOException("Corrupted TAR archive." ++ " Sparse header block offset contains negative value"); +} bytesRead += readResult[1]; readResult = readLineOfNumberForPax1X(inputStream); -long sparseNumbytes = readResult[0]; +final long sparseNumbytes = readResult[0]; +if (sparseNumbytes < 0) { +throw new IOException("Corrupted TAR archive." 
++ " Sparse header block numbytes contains negative value"); +} bytesRead += readResult[1]; sparseHeaders.add(new TarArchiveStructSparse(sparseOffset, sparseNumbytes)); } @@ -918,6 +930,9 @@ public class TarUtils { if (number == -1) { throw new IOException("Unexpected EOF when reading parse information of 1.X PAX format"); } +if (number < '0' || number > '9') { +throw new IOException("Corrupted TAR archive. Non-numeric value in sparse headers block"); +} result = result * 10 + (number - '0'); } bytesRead += 1; diff --git a/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java b/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java index 0bca2f0..adb2408 100644 --- a/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java +++ b/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java @@ -672,4 +672,123 @@ public class TarUtilsTest { TarUtils.parsePAX01SparseHeaders(map); } +@Test +public void parsePAX1XSparseHeaders() throws Exception { +final byte[] header = ("1\n" ++ "0\n" ++ "20\n") +.getBytes(); +final byte[] block = new byte[512]; +System.arraycopy(header, 0, block, 0, header.length); +try (ByteArrayInputStream in = new ByteArrayInputStream(block)) { +final List sparse = TarUtils.parsePAX1XSparseHeaders(in, 512); +assertEquals(1, sparse.size()); +assertEquals(0, sparse.get(0).getOffset()); +assertEquals(20, sparse.get(0).getNumbytes()); +assertEquals(-1, in.read()); +} +} + +@Test +public void parsePAX1XSparseHeadersRejectsIncompleteLastLine() throws Exception { +thrown.expect(IOException.class); +thrown.expectMessage(startsWith("Unexpected EOF")); +final byte[] header = ("1\n" ++ "0\n" ++ "20") +.getBytes(); +try (ByteArrayInputStream in = new ByteArrayInputStream(header)) { +TarUtils.parsePAX1XSparseHeaders(in, 512); +} +} + +@Test +public void parsePAX1XSparseHeadersRejectsNonNumericNumberOfEntries() throws Exception { +thrown.expect(IOException.class); 
+thrown.expectMessage(startsWith("Corrupted TAR archive.")); +final byte[] header = ("x\n" ++ "0\n" +
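The PAX 1.0 hardening above rejects EOF and non-digit bytes while reading the newline-terminated decimal lines of the sparse block. A rough standalone sketch of that reader; `readDecimalLine` is a hypothetical stand-in for TarUtils.readLineOfNumberForPax1X:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class Pax1XLineReader {

    /**
     * Reads one newline-terminated decimal number and returns { value, bytesRead }.
     * Hardened as in COMPRESS-575: premature EOF and non-digit bytes are rejected.
     */
    static long[] readDecimalLine(final InputStream in) throws IOException {
        long result = 0;
        long bytesRead = 0;
        int b;
        while ((b = in.read()) != '\n') {
            if (b == -1) {
                throw new IOException("Unexpected EOF when reading sparse information");
            }
            if (b < '0' || b > '9') { // e.g. the "x\n" line from the test above
                throw new IOException("Corrupted TAR archive. Non-numeric value in sparse headers block");
            }
            result = result * 10 + (b - '0');
            bytesRead++;
        }
        return new long[] { result, bytesRead + 1 }; // +1 for the trailing newline
    }

    public static void main(final String[] args) throws IOException {
        final long[] r = readDecimalLine(new ByteArrayInputStream("20\n".getBytes()));
        System.out.println(r[0] + " " + r[1]); // prints "20 3"
    }
}
```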
[commons-compress] 02/02: COMPRESS-575 forgot tests for parseSparseStructs
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit c7c1a8b561bb94dc1154ac654cb957e807a21d77 Author: Stefan Bodewig AuthorDate: Sat May 1 12:43:25 2021 +0200 COMPRESS-575 forgot tests for parseSparseStructs --- .../compress/archivers/tar/TarUtilsTest.java | 63 ++ 1 file changed, 63 insertions(+) diff --git a/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java b/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java index 1727ecd..0bca2f0 100644 --- a/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java +++ b/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java @@ -549,6 +549,69 @@ public class TarUtilsTest { } @Test +public void readSparseStructsOctal() throws Exception { +final byte[] header = "000 007 ".getBytes(); +assertEquals(24, header.length); +final List sparse = TarUtils.readSparseStructs(header, 0, 1); +assertEquals(1, sparse.size()); +assertEquals(0, sparse.get(0).getOffset()); +assertEquals(7, sparse.get(0).getNumbytes()); +} + +@Test +public void readSparseStructsBinary() throws Exception { +final byte[] header = { +(byte) 0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, +(byte) 0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 7, +}; +assertEquals(24, header.length); +final List sparse = TarUtils.readSparseStructs(header, 0, 1); +assertEquals(1, sparse.size()); +assertEquals(0, sparse.get(0).getOffset()); +assertEquals(7, sparse.get(0).getNumbytes()); +} + +@Test +public void readSparseStructsRejectsNonNumericOffset() throws Exception { +thrown.expect(IOException.class); +thrown.expectMessage(startsWith("Corrupted TAR archive")); +final byte[] header = "00x 007 ".getBytes(); +TarUtils.readSparseStructs(header, 0, 1); +} + +@Test +public void readSparseStructsRejectsNegativeOffset() throws Exception { +thrown.expect(IOException.class); 
+thrown.expectMessage(startsWith("Corrupted TAR archive")); +final byte[] header = { +(byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, +(byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, +(byte) 0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 7, +}; +TarUtils.readSparseStructs(header, 0, 1); +} + +@Test +public void readSparseStructsRejectsNonNumericNumbytes() throws Exception { +thrown.expect(IOException.class); +thrown.expectMessage(startsWith("Corrupted TAR archive")); +final byte[] header = "000 00x ".getBytes(); +TarUtils.readSparseStructs(header, 0, 1); +} + +@Test +public void readSparseStructsRejectsNegativeNumbytes() throws Exception { +thrown.expect(IOException.class); +thrown.expectMessage(startsWith("Corrupted TAR archive")); +final byte[] header = { +(byte) 0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, +(byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, +(byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, (byte) 0xff, +}; +TarUtils.readSparseStructs(header, 0, 1); +} + +@Test public void parseFromPAX01SparseHeaders() throws Exception { final String map = "0,10,20,0,20,5"; final List sparse = TarUtils.parseFromPAX01SparseHeaders(map);
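The tests above exercise both encodings of an old-GNU sparse struct: each 12-byte field holds either octal ASCII or a base-256 binary value whose first byte has the top bit (0x80) set. A simplified sketch of such a field parser; `parseNumeric` and `SparseStructDemo` are illustrative names only, and unlike the real TarUtils code this sketch omits the negative-value checks:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class SparseStructDemo {

    /**
     * Parses one 12-byte numeric field of an old-GNU sparse struct: octal ASCII
     * padded with spaces/NULs, or big-endian base-256 binary when the first
     * byte carries the 0x80 marker bit.
     */
    static long parseNumeric(final byte[] buf, final int off, final int len) throws IOException {
        if ((buf[off] & 0x80) != 0) { // base-256: binary payload after the marker bit
            long value = buf[off] & 0x7f;
            for (int i = 1; i < len; i++) {
                value = (value << 8) | (buf[off + i] & 0xff);
            }
            return value;
        }
        final String text = new String(buf, off, len, StandardCharsets.US_ASCII)
            .replace('\0', ' ').trim();
        if (!text.matches("[0-7]+")) { // e.g. the "00x" field from the tests above
            throw new IOException("Corrupted TAR archive. Non-octal value in sparse struct");
        }
        return Long.parseLong(text, 8);
    }

    public static void main(final String[] args) throws IOException {
        // 24-byte struct: a 12-byte offset field followed by a 12-byte numbytes field.
        final byte[] octal = "00000000000 00000000007 ".getBytes(StandardCharsets.US_ASCII);
        System.out.println(parseNumeric(octal, 0, 12) + "," + parseNumeric(octal, 12, 12)); // prints "0,7"
        final byte[] binary = { (byte) 0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 7 };
        System.out.println(parseNumeric(binary, 0, 12)); // prints "7"
    }
}
```

The all-0xff field in the negative-numbytes test above decodes, after the marker bit, to a value whose sign bit is set, which is why the hardened readSparseStructs rejects it.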
[commons-compress] 01/02: COMPRESS-575 harden parser of PAX 0.1 GNU sparse map
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git commit ecf514b46342c783586c2703a5b0f371092933c3 Author: Stefan Bodewig AuthorDate: Sat May 1 11:29:35 2021 +0200 COMPRESS-575 harden parser of PAX 0.1 GNU sparse map --- .../archivers/tar/TarArchiveInputStream.java | 2 +- .../commons/compress/archivers/tar/TarFile.java| 2 +- .../commons/compress/archivers/tar/TarUtils.java | 51 -- .../compress/archivers/tar/TarUtilsTest.java | 61 ++ 4 files changed, 111 insertions(+), 5 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java index e67f1d7..4aaeba9 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveInputStream.java @@ -607,7 +607,7 @@ public class TarArchiveInputStream extends ArchiveInputStream { // for 0.1 PAX Headers if (headers.containsKey("GNU.sparse.map")) { -sparseHeaders = TarUtils.parsePAX01SparseHeaders(headers.get("GNU.sparse.map")); +sparseHeaders = new ArrayList<>(TarUtils.parseFromPAX01SparseHeaders(headers.get("GNU.sparse.map"))); } getNextEntry(); // Get the actual file entry if (currEntry == null) { diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java index 6d4a4a8..9b82025 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarFile.java @@ -431,7 +431,7 @@ public class TarFile implements Closeable { // for 0.1 PAX Headers if (headers.containsKey("GNU.sparse.map")) { -sparseHeaders = TarUtils.parsePAX01SparseHeaders(headers.get("GNU.sparse.map")); +sparseHeaders = new 
ArrayList<>(TarUtils.parseFromPAX01SparseHeaders(headers.get("GNU.sparse.map")));
         }
         getNextTarEntry(); // Get the actual file entry
         if (currEntry == null) {

diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java
index fe6fe4d..eee894e 100644
--- a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java
+++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java
@@ -803,20 +803,65 @@ public class TarUtils {
      * GNU.sparse.map
      *    Map of non-null data chunks. It is a string consisting of comma-separated values "offset,size[,offset-1,size-1...]"
      *
+     * Will internally invoke {@link #parseFromPAX01SparseHeaders} and map IOExceptions to a RuntimeException. You
+     * should use {@link #parseFromPAX01SparseHeaders} directly instead.
+     *
      * @param sparseMap the sparse map string consisting of comma-separated values "offset,size[,offset-1,size-1...]"
      * @return sparse headers parsed from sparse map
+     * @deprecated use #parseFromPAX01SparseHeaders instead
      */
     protected static List&lt;TarArchiveStructSparse&gt; parsePAX01SparseHeaders(String sparseMap) {
+        try {
+            return parseFromPAX01SparseHeaders(sparseMap);
+        } catch (IOException ex) {
+            throw new RuntimeException(ex.getMessage(), ex);
+        }
+    }
+
+    /**
+     * For PAX Format 0.1, the sparse headers are stored in a single variable : GNU.sparse.map
+     * GNU.sparse.map
+     *    Map of non-null data chunks.
It is a string consisting of comma-separated values "offset,size[,offset-1,size-1...]" + * + * @param sparseMap the sparse map string consisting of comma-separated values "offset,size[,offset-1,size-1...]" + * @return unmodifiable list of sparse headers parsed from sparse map + * @since 1.21 + */ +protected static List parseFromPAX01SparseHeaders(String sparseMap) +throws IOException { List sparseHeaders = new ArrayList<>(); String[] sparseHeaderStrings = sparseMap.split(","); +if (sparseHeaderStrings.length % 2 == 1) { +throw new IOException("Corrupted TAR archive. Bad format in GNU.sparse.map PAX Header"); +} for (int i = 0; i < sparseHeaderStrings.length; i += 2) { -long sparseOffset = Long.parseLong(sparseHeaderStrings[i]); -long sparseNumbytes = Long.parseLong(sparseHeaderStrings[i + 1]); +long sparseOffset; +try { +sparseOffset = Long.parseLong(sparseHeaderStrings[i]); +} cat
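The hardened PAX 0.1 parsing above enforces three things on the GNU.sparse.map value: an even number of comma-separated fields, numeric values only, and no negatives. A standalone sketch; the class and method names here are illustrative, the library method is TarUtils.parseFromPAX01SparseHeaders:

```java
import java.io.IOException;
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class Pax01SparseMap {

    /**
     * Parses a PAX 0.1 GNU.sparse.map value, i.e. comma-separated
     * "offset,size[,offset-1,size-1...]", into (offset, numbytes) pairs.
     */
    static List<Map.Entry<Long, Long>> parseSparseMap(final String sparseMap) throws IOException {
        final String[] parts = sparseMap.split(",");
        if (parts.length % 2 == 1) { // offsets and sizes must come in pairs
            throw new IOException("Corrupted TAR archive. Bad format in GNU.sparse.map PAX Header");
        }
        final List<Map.Entry<Long, Long>> result = new ArrayList<>();
        for (int i = 0; i < parts.length; i += 2) {
            result.add(new SimpleEntry<>(parseNonNegative(parts[i]), parseNonNegative(parts[i + 1])));
        }
        return result;
    }

    static long parseNonNegative(final String s) throws IOException {
        final long v;
        try {
            v = Long.parseLong(s);
        } catch (final NumberFormatException ex) {
            throw new IOException("Corrupted TAR archive. Non-numeric value in GNU.sparse.map", ex);
        }
        if (v < 0) {
            throw new IOException("Corrupted TAR archive. Negative value in GNU.sparse.map");
        }
        return v;
    }

    public static void main(final String[] args) throws IOException {
        final List<Map.Entry<Long, Long>> sparse = parseSparseMap("0,10,20,5");
        System.out.println(sparse.size() + " " + sparse.get(1).getKey()); // prints "2 20"
    }
}
```

Mapping the checked IOException through the deprecated parsePAX01SparseHeaders wrapper (as the diff above does) preserves binary compatibility for subclasses that override the old protected method.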
[commons-compress] branch master updated (ef765af -> c7c1a8b)
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git. from ef765af COMPRESS-575 extract and harden parsing of GNU stuct_sparse entries new ecf514b COMPRESS-575 harden parser of PAX 0.1 GNU sparse map new c7c1a8b COMPRESS-575 forgot tests for parseSparseStructs The 2 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: .../archivers/tar/TarArchiveInputStream.java | 2 +- .../commons/compress/archivers/tar/TarFile.java| 2 +- .../commons/compress/archivers/tar/TarUtils.java | 51 - .../compress/archivers/tar/TarUtilsTest.java | 124 + 4 files changed, 174 insertions(+), 5 deletions(-)
[commons-compress] branch master updated: COMPRESS-575 extract and harden parsing of GNU stuct_sparse entries
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new ef765af COMPRESS-575 extract and harden parsing of GNU stuct_sparse entries ef765af is described below commit ef765af60786caf58be3efcf1fe84620bf4f2872 Author: Stefan Bodewig AuthorDate: Sat May 1 11:01:13 2021 +0200 COMPRESS-575 extract and harden parsing of GNU stuct_sparse entries I believe the code is wrong and we should drop empty structs even if their offset is non-0 but this makes tests fail in non-obvious ways, will revisit it later. --- .../compress/archivers/tar/TarArchiveEntry.java| 12 ++--- .../archivers/tar/TarArchiveSparseEntry.java | 12 + .../commons/compress/archivers/tar/TarUtils.java | 30 ++ 3 files changed, 33 insertions(+), 21 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java index 12da7bb..fdeb0de 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveEntry.java @@ -1577,16 +1577,8 @@ public class TarArchiveEntry implements ArchiveEntry, TarConstants, EntryStreamO offset += OFFSETLEN_GNU; offset += LONGNAMESLEN_GNU; offset += PAD2LEN_GNU; -sparseHeaders = new ArrayList<>(); -for (int i = 0; i < SPARSE_HEADERS_IN_OLDGNU_HEADER; i++) { -final TarArchiveStructSparse sparseHeader = TarUtils.parseSparse(header, -offset + i * (SPARSE_OFFSET_LEN + SPARSE_NUMBYTES_LEN)); - -// some sparse headers are empty, we need to skip these sparse headers -if(sparseHeader.getOffset() > 0 || sparseHeader.getNumbytes() > 0) { -sparseHeaders.add(sparseHeader); -} -} +sparseHeaders = +new ArrayList<>(TarUtils.readSparseStructs(header, offset, SPARSE_HEADERS_IN_OLDGNU_HEADER)); offset 
+= SPARSELEN_GNU; isExtended = TarUtils.parseBoolean(header, offset); offset += ISEXTENDEDLEN_GNU; diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveSparseEntry.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveSparseEntry.java index b738de1..6973194 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveSparseEntry.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarArchiveSparseEntry.java @@ -57,17 +57,7 @@ public class TarArchiveSparseEntry implements TarConstants { */ public TarArchiveSparseEntry(final byte[] headerBuf) throws IOException { int offset = 0; -sparseHeaders = new ArrayList<>(); -for(int i = 0; i < SPARSE_HEADERS_IN_EXTENSION_HEADER;i++) { -final TarArchiveStructSparse sparseHeader = TarUtils.parseSparse(headerBuf, -offset + i * (SPARSE_OFFSET_LEN + SPARSE_NUMBYTES_LEN)); - -// some sparse headers are empty, we need to skip these sparse headers -if(sparseHeader.getOffset() > 0 || sparseHeader.getNumbytes() > 0) { -sparseHeaders.add(sparseHeader); -} -} - +sparseHeaders = new ArrayList<>(TarUtils.readSparseStructs(headerBuf, 0, SPARSE_HEADERS_IN_EXTENSION_HEADER)); offset += SPARSELEN_GNU_SPARSE; isExtended = TarUtils.parseBoolean(headerBuf, offset); } diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java index ea4741a..fe6fe4d 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java @@ -25,6 +25,7 @@ import java.math.BigInteger; import java.nio.ByteBuffer; import java.nio.charset.StandardCharsets; import java.util.ArrayList; +import java.util.Collections; import java.util.HashMap; import java.util.List; import java.util.Map; @@ -327,6 +328,35 @@ public class TarUtils { } /** + * @since 1.21 + */ +static List readSparseStructs(final byte[] buffer, final int 
offset, final int entries) +throws IOException { +final List sparseHeaders = new ArrayList<>(); +for (int i = 0; i < entries; i++) { +try { +final TarArchiveStructSparse sparseHeader = +parseSparse(buffer, offset + i * (SPARSE_OFFSET_LEN + SPARSE_NUMBYTES_LEN)); + +if (sparseHeader.getOffset() < 0) { +
[commons-compress] branch master updated: COMPRESS-575 harden parser for PAX 0.0 sparse headers
This is an automated email from the ASF dual-hosted git repository. bodewig pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git The following commit(s) were added to refs/heads/master by this push: new 61615b3 COMPRESS-575 harden parser for PAX 0.0 sparse headers 61615b3 is described below commit 61615b31c57fd96e122a70bd823e155117422328 Author: Stefan Bodewig AuthorDate: Sat May 1 10:11:57 2021 +0200 COMPRESS-575 harden parser for PAX 0.0 sparse headers --- .../commons/compress/archivers/tar/TarUtils.java | 28 ++- .../compress/archivers/tar/TarUtilsTest.java | 92 +- 2 files changed, 117 insertions(+), 3 deletions(-) diff --git a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java index 5b7fb28..ea4741a 100644 --- a/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java +++ b/src/main/java/org/apache/commons/compress/archivers/tar/TarUtils.java @@ -695,6 +695,10 @@ public class TarUtils { + got); } // Drop trailing NL +if (rest[rest.length - 1] != '\n') { +throw new IOException("Failed to read Paxheader." + + "Value should end with a newline"); +} final String value = new String(rest, 0, restLen - 1, StandardCharsets.UTF_8); headers.put(keyword, value); @@ -705,7 +709,16 @@ public class TarUtils { // previous GNU.sparse.offset header but but no numBytes sparseHeaders.add(new TarArchiveStructSparse(offset, 0)); } -offset = Long.valueOf(value); +try { +offset = Long.valueOf(value); +} catch (NumberFormatException ex) { +throw new IOException("Failed to read Paxheader." ++ "GNU.sparse.offset contains a non-numeric value"); +} +if (offset < 0) { +throw new IOException("Failed to read Paxheader." ++ "GNU.sparse.offset contains negative value"); +} } // for 0.0 PAX Headers @@ -714,7 +727,18 @@ public class TarUtils { throw new IOException("Failed to read Paxheader." 
+ "GNU.sparse.offset is expected before GNU.sparse.numbytes shows up."); } -sparseHeaders.add(new TarArchiveStructSparse(offset, Long.parseLong(value))); +long numbytes; +try { +numbytes = Long.parseLong(value); +} catch (NumberFormatException ex) { +throw new IOException("Failed to read Paxheader." ++ "GNU.sparse.numbytes contains a non-numeric value."); +} +if (numbytes < 0) { +throw new IOException("Failed to read Paxheader." ++ "GNU.sparse.numbytes contains negative value"); +} +sparseHeaders.add(new TarArchiveStructSparse(offset, numbytes)); offset = null; } } diff --git a/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java b/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java index 446a059..4e04f06 100644 --- a/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java +++ b/src/test/java/org/apache/commons/compress/archivers/tar/TarUtilsTest.java @@ -23,7 +23,10 @@ import java.io.IOException; import java.io.InputStream; import java.nio.charset.StandardCharsets; import java.nio.file.Files; +import java.util.ArrayList; +import java.util.Collections; import java.util.HashMap; +
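In PAX 0.0 format, each sparse region arrives as a pair of separate extended-header records, GNU.sparse.offset followed by GNU.sparse.numbytes, so the parser above is a small state machine: it remembers the last offset and rejects a numbytes with no preceding offset, plus non-numeric or negative values. A standalone sketch under that reading; all names below are hypothetical stand-ins for the TarUtils logic:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class Pax00SparsePairing {

    /** Simplified stand-in for TarArchiveStructSparse (hypothetical). */
    static final class Region {
        final long offset;
        final long numbytes;
        Region(final long offset, final long numbytes) {
            this.offset = offset;
            this.numbytes = numbytes;
        }
    }

    /** Pairs alternating GNU.sparse.offset / GNU.sparse.numbytes records in order. */
    static List<Region> pairSparseHeaders(final List<String[]> keyValues) throws IOException {
        final List<Region> sparse = new ArrayList<>();
        Long offset = null; // last seen, not yet consumed, GNU.sparse.offset
        for (final String[] kv : keyValues) {
            if ("GNU.sparse.offset".equals(kv[0])) {
                offset = parseNonNegative(kv[1], "GNU.sparse.offset");
            } else if ("GNU.sparse.numbytes".equals(kv[0])) {
                if (offset == null) {
                    throw new IOException("Failed to read Paxheader. "
                        + "GNU.sparse.offset is expected before GNU.sparse.numbytes shows up.");
                }
                sparse.add(new Region(offset, parseNonNegative(kv[1], "GNU.sparse.numbytes")));
                offset = null;
            }
        }
        return sparse;
    }

    static long parseNonNegative(final String value, final String name) throws IOException {
        final long v;
        try {
            v = Long.parseLong(value);
        } catch (final NumberFormatException ex) {
            throw new IOException("Failed to read Paxheader. " + name + " contains a non-numeric value");
        }
        if (v < 0) {
            throw new IOException("Failed to read Paxheader. " + name + " contains negative value");
        }
        return v;
    }

    public static void main(final String[] args) throws IOException {
        final List<String[]> headers = new ArrayList<>();
        headers.add(new String[] { "GNU.sparse.offset", "0" });
        headers.add(new String[] { "GNU.sparse.numbytes", "10" });
        headers.add(new String[] { "GNU.sparse.offset", "20" });
        headers.add(new String[] { "GNU.sparse.numbytes", "5" });
        System.out.println(pairSparseHeaders(headers).size()); // prints "2"
    }
}
```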
[commons-compress] branch master updated (33789f6 -> 84b8b7a)
This is an automated email from the ASF dual-hosted git repository.

bodewig pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/commons-compress.git.

    from 33789f6  * Remove redundant initializer * Remove not use return value
     new 40fcdf6  Replace construction of FileInputStream and FileOutputStream objects with Files NIO APIs.
     new 735e99a  Change test to use NIO Api's
     new 84b8b7a  Merge pull request #186 from arturobernalg/feature/filesNioApi

The 3 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.

Summary of changes:
 .../archivers/zip/ZipSplitOutputStream.java        |   6 +-
 .../compress/compressors/pack200/Pack200Utils.java |   9 +-
 .../org/apache/commons/compress/utils/IOUtils.java |   3 +-
 .../apache/commons/compress/AbstractTestCase.java  |  14 +-
 .../apache/commons/compress/ArchiveReadTest.java   |   4 +-
 .../apache/commons/compress/ChainingTestCase.java  |   6 +-
 .../commons/compress/DetectArchiverTestCase.java   |   7 +-
 .../org/apache/commons/compress/IOMethodsTest.java |   8 +-
 .../commons/compress/archivers/ArTestCase.java     |  63 +
 .../archivers/ArchiveOutputStreamTest.java         |   8 +-
 .../archivers/ArchiveStreamFactoryTest.java        |  24 ++--
 .../commons/compress/archivers/CpioTestCase.java   |  43 +++
 .../commons/compress/archivers/DumpTestCase.java   |  11 +-
 .../commons/compress/archivers/JarTestCase.java    |  21 ++-
 .../commons/compress/archivers/LongPathTest.java   |   7 +-
 .../compress/archivers/LongSymLinkTest.java        |   7 +-
 .../commons/compress/archivers/SevenZTestCase.java |   5 +-
 .../commons/compress/archivers/TarTestCase.java    |  65 +-
 .../commons/compress/archivers/ZipTestCase.java    |  43 +++
 .../archivers/ar/ArArchiveInputStreamTest.java     |  16 +--
 .../archivers/ar/ArArchiveOutputStreamTest.java    |   7 +-
 .../archivers/arj/ArjArchiveInputStreamTest.java   |  13 +-
 .../archivers/cpio/CpioArchiveInputStreamTest.java |  13 +-
 .../cpio/CpioArchiveOutputStreamTest.java          |  10 +-
 .../archivers/dump/DumpArchiveInputStreamTest.java |  10 +-
 .../archivers/jar/JarArchiveOutputStreamTest.java  |   4 +-
 .../compress/archivers/sevenz/SevenZFileTest.java  |   3 +-
 .../compress/archivers/tar/SparseFilesTest.java    |  21 ++-
 .../archivers/tar/TarArchiveEntryTest.java         |   7 +-
 .../archivers/tar/TarArchiveInputStreamTest.java   |  34 ++---
 .../archivers/tar/TarArchiveOutputStreamTest.java  |  10 +-
 .../commons/compress/archivers/tar/TarLister.java  |   4 +-
 .../compress/archivers/tar/TarUtilsTest.java       |   5 +-
 .../compress/archivers/zip/DataDescriptorTest.java |   9 +-
 .../archivers/zip/EncryptedArchiveTest.java        |   4 +-
 .../compress/archivers/zip/ExplodeSupportTest.java |   4 +-
 .../commons/compress/archivers/zip/Lister.java     |   8 +-
 .../archivers/zip/Maven221MultiVolumeTest.java     |   7 +-
 .../zip/ParallelScatterZipCreatorTest.java         |   9 +-
 .../compress/archivers/zip/UTF8ZipFilesTest.java   |  18 +--
 .../archivers/zip/X5455_ExtendedTimestampTest.java |   6 +-
 .../compress/archivers/zip/Zip64SupportIT.java     |   8 +-
 .../archivers/zip/ZipArchiveInputStreamTest.java   |  58 -
 .../zip/ZipFileIgnoringLocalFileHeaderTest.java    |   4 +-
 .../compress/archivers/zip/ZipFileTest.java        |  22 ++--
 .../archivers/zip/ZipSplitOutputStreamTest.java    |   6 +-
 .../compress/changes/ChangeSetTestCase.java        | 142 ++---
 .../compress/compressors/BZip2TestCase.java        |  21 ++-
 .../compress/compressors/DeflateTestCase.java      |  19 ++-
 .../compressors/DetectCompressorTestCase.java      |  14 +-
 .../compress/compressors/FramedSnappyTestCase.java |  31 ++---
 .../commons/compress/compressors/GZipTestCase.java |  31 +++--
 .../commons/compress/compressors/LZMATestCase.java |  21 ++-
 .../compress/compressors/Pack200TestCase.java      |  23 ++--
 .../commons/compress/compressors/XZTestCase.java   |  15 +--
 .../commons/compress/compressors/ZTestCase.java    |   8 +-
 .../brotli/BrotliCompressorInputStreamTest.java    |  20 +--
 .../bzip2/BZip2CompressorInputStreamTest.java      |   9 +-
 .../bzip2/BZip2NSelectorsOverflowTest.java         |   4 +-
 .../deflate/DeflateCompressorInputStreamTest.java  |  12 +-
 .../lz4/BlockLZ4CompressorInputStreamTest.java     |  11 +-
 .../lz4/BlockLZ4CompressorRoundtripTest.java       |  13 +-
 .../compress/compressors/lz4/FactoryTest.java      |  11 +-
 .../lz4/FramedLZ4CompressorInputStreamTest.java    |  38 +++---
 .../lz4/FramedLZ4CompressorRoundtripTest.java      |   5 +-
 .../compress/compressors/lz4/XXHash32Test.java     |   5 +-
 .../compressors/pack200/Pack200UtilsTest.java      |  16 +--
 .../FramedSnappyCompressorInputStrea
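The FileInputStream-to-NIO migration summarized above follows one mechanical pattern: replace `new FileInputStream(file)` / `new FileOutputStream(file)` with `Files.newInputStream(path)` / `Files.newOutputStream(path)`. A minimal before/after sketch using a temp file (not taken from the commit itself):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class NioMigration {
    public static void main(final String[] args) throws IOException {
        final Path p = Files.createTempFile("compress-demo", ".txt");
        Files.write(p, "hello".getBytes(StandardCharsets.UTF_8));
        // Before: try (InputStream in = new FileInputStream(p.toFile())) { ... }
        // After:  Files.newInputStream avoids the File detour and reports a missing
        //         file as java.nio.file.NoSuchFileException (an IOException subclass).
        try (InputStream in = Files.newInputStream(p)) {
            System.out.println((char) in.read()); // prints "h"
        }
        Files.delete(p);
    }
}
```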