On Mon, 21 Nov 2011 14:28:55 -0600
Bruce Dubbs <[email protected]> wrote:

> I've been working a bit with tex-live. They now use xz to compress their 
> very large files (126M and 1.2G).  I note that it takes quite a bit more 
> time to extract these files than bz2.  The tradeoff may be worth it, but 
> the better compression is not free.
> 
> time tar -xf texlive-20110705-texmf.tar.xz
> 
> real    2m3.082s
> 
> This is on a reasonably fast system.

I don't think you can blame xz for this. 1.2GB is a huge file. I get:

andy@eccles:~$ time tar -xf texlive-20110705-texmf.tar.xz

real    2m0.428s
user    1m30.334s
sys     0m5.243s

So far, the same as you. But look at this: if I just use xz to
decompress it, it still takes one and a half minutes:

andy@eccles:~$ rm -rf texlive-20110705-texmf
andy@eccles:~$ time xz -d texlive-20110705-texmf.tar.xz

real    1m33.782s
user    1m31.334s
sys     0m1.523s

However, using tar alone on the decompressed archive, it's back to two
minutes again:

andy@eccles:~$ time tar -xf texlive-20110705-texmf.tar

real    2m0.242s
user    0m0.370s
sys     0m4.563s

It decompresses to 2.3GB, which thrashes the disk a lot. I suspect that
much of the time is spent on disk I/O.
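One way to check that (just a sketch with a small sample file, not from
the timings above) is to stream the decompressed data to /dev/null so
nothing is ever written to disk, isolating the CPU cost of decompression:

```shell
# Sketch: separate decompression CPU time from disk I/O by sending
# the output to /dev/null instead of a file.
printf 'hello xz\n' > sample.txt
xz -kf sample.txt                        # makes sample.txt.xz, keeps the original
time xz -dc sample.txt.xz > /dev/null    # decompress to nowhere: CPU only
```

On the real 1.2GB tarball, the gap between this time and the full
`tar -xf` time would be the disk-bound portion.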
As I understand it, the LZMA2 algorithm is slow to compress (and uses a
lot of RAM) but is moderately quick (and uses a moderate amount of RAM)
to decompress. The smaller file size for downloads makes it worth the
extra time it takes to compress the archive in the first place.
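For what it's worth, that tradeoff is tunable: the xz preset (-0 through
-9) controls the dictionary size and effort, and the dictionary chosen at
compression time also sets how much RAM decompression needs. A small
sketch (filenames are just examples):

```shell
# Sketch: higher presets cost more time/RAM when compressing; the larger
# dictionary they pick also raises the RAM needed to decompress.
head -c 100000 /dev/zero > data.bin      # sample input (highly compressible)
xz -0 -c data.bin > data.fast.xz         # fast preset, small dictionary
xz -9 -c data.bin > data.best.xz         # slow preset, 64 MiB dictionary
xz -l data.best.xz                       # list compressed/uncompressed sizes
```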
FWIW, if I'm backing up a partition I don't compress it at all, as I'm
not short of disk space; I just use tar to make it into a single file.
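That uncompressed-tar approach looks something like this (the paths and
names here are only placeholders):

```shell
# Sketch: plain tar backup with no compression; restore with tar -xf.
mkdir -p demo/etc
echo 'config' > demo/etc/app.conf
tar -cf backup.tar demo                  # single uncompressed archive file
tar -tf backup.tar                       # list contents to verify
```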

Andy
-- 
http://linuxfromscratch.org/mailman/listinfo/blfs-dev
FAQ: http://www.linuxfromscratch.org/blfs/faq.html
Unsubscribe: See the above information page