Looking at that graph, of the tools available as Debian/Ubuntu packages, I think I'd go with lzop. An informal test on my laptop seems to suggest the same.
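For what it's worth, here is roughly the shape of that informal test, written up as a small Python sketch rather than the exact commands I ran. Python's standard library has no lzop or lz4 bindings, so this only exercises the gzip/bzip2/xz codecs it ships with; lzop would have to be timed separately from the command line, but the measurement itself (compression ratio plus compress/decompress wall-clock time on one sample file) is the same kind of comparison. The script name and the sample-file argument are illustrative.

#!/usr/bin/env python3
# Rough timing harness in the spirit of an "informal test": compress and
# decompress one file with the codecs Python ships and report the ratio
# plus wall-clock time for each direction.
import bz2, gzip, lzma, sys, time

def measure(name, compress, decompress, data):
    t0 = time.time()
    packed = compress(data)
    t1 = time.time()
    unpacked = decompress(packed)
    t2 = time.time()
    assert unpacked == data  # sanity check: round-trip must be lossless
    print("%-6s ratio %.2f  compress %.2fs  decompress %.2fs"
          % (name, len(packed) / len(data), t1 - t0, t2 - t1))

data = open(sys.argv[1], "rb").read()
measure("gzip", gzip.compress, gzip.decompress, data)
measure("bzip2", bz2.compress, bz2.decompress, data)
measure("xz", lzma.compress, lzma.decompress, data)

Run it as "python3 compare.py <some large file>". On text-heavy input I'd expect xz to win on ratio and lose badly on time, which is exactly the trade-off being discussed below.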
Don

On Sat, Jul 9, 2011 at 8:06 AM, iosif <iosif.neit...@gmail.com> wrote:
> http://www.linuxjournal.com/article/8051
>
> On Fri, Jul 8, 2011 at 20:59, Huan Truong <hnt7...@truman.edu> wrote:
> > On Fri, 08 Jul 2011 19:19 -0500, "iosif" <iosif.neit...@gmail.com> wrote:
> >> Not all compressions are created equal on all formats:
> >> http://sourceforge.net/projects/boost/files/boost/1.46.1/ ... 7zip
> >> wins here.
> >
> > And that is exactly the point: in many cases it's not about the file
> > size, it's about how much you gain.
> >
> > In the past, when we had to use dial-up, it may have been wise to use
> > an insane compression algorithm to turn a 180MB file into a 40MB one,
> > even if decompression took an hour, because you would end up spending
> > less time downloading + extracting overall. Now it might be a better
> > choice to use another program that produces a 60MB file with a
> > 30-second extraction time; the extra 20MB saved isn't really "worth it."
> >
> > Trading so much time (which is not free) compressing and decompressing,
> > in cases where bandwidth and storage are essentially free, is not a
> > good choice.
> >
> > It's hard to find where the "sweet spot" is, and it varies from case
> > to case.
> >
> > Dr. Bindner, I hadn't thought of the disk cache; I will try again
> > multiple times to make sure I get it right. The "std" for "-" is a
> > "technical difficulty" caused by the way parameters are handled in the
> > demo lz4 program. I should have rewritten the whole thing.
> > --
> > Huan Truong
> > 600-988-9066
> > http://tnhh.net/
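To put very rough numbers on the "sweet spot" point above: the total cost is download time plus extraction time, so the better choice flips with bandwidth. The sketch below reuses the 40MB/1-hour and 60MB/30-second figures from Huan's example; the link speeds (about 5 KB/s for dial-up, about 10 Mbit/s for a modern connection) are assumptions of mine, just to show how the arithmetic turns out.

# Total time to get a usable file: download time plus extraction time.
# File sizes and extraction times come from the example in the quoted
# mail; the link speeds are assumed for illustration only.
def total_seconds(size_mb, link_kb_per_s, extract_s):
    return size_mb * 1024 / link_kb_per_s + extract_s

for link, kb_per_s in [("dial-up", 5), ("broadband", 1250)]:
    heavy = total_seconds(40, kb_per_s, 3600)  # 40MB archive, 1 hour to extract
    light = total_seconds(60, kb_per_s, 30)    # 60MB archive, 30 s to extract
    print("%-9s  heavy: %6.0f s   light: %6.0f s" % (link, heavy, light))

On the slow link the heavy compression comes out slightly ahead; on the fast link it loses by a factor of roughly forty-five, which is why spending an hour decompressing makes no sense when bandwidth and storage are essentially free.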