Including the manual...

On Wed, 24 Sep 2014 01:31:28 -0300
"Matias A. Fonzo" <[email protected]> wrote:

> I'm sorry. I forgot to add the following notes:
> 
> On Wed, 24 Sep 2014 01:18:54 -0300
> "Matias A. Fonzo" <[email protected]> wrote:
> 
> > On Tue, 23 Sep 2014 02:23:43 -0300
> > Alexandre Oliva <[email protected]> wrote:
> > 
> > > Hello, Matias,
> > > 
> > > On Sep 18, 2014, "Matias A. Fonzo" <[email protected]> wrote:
> > > 
> > > > Have you considered using SourceForge as an alternative? It has
> > > > mirrors around the world.
> > > 
> > > Not really. But how would it have helped? I surely wouldn't
> > > entrust the primary copy of the source tarballs to it, so I'd have
> > > to keep them on our server anyway...
> > 
> > Of course not. The good thing is that SourceForge has a network of
> > mirrors, which means availability around the world. Even if
> > SourceForge is down, or is blocking access from some countries, you
> > would still have a copy somewhere.
> > 
> > > > Why? Lzip can compress more than xz with a bit of tuning via
> > > > --options.
> > > 
> > > Maybe it can, but when I compared the sizes of the files to decide
> > > which one to keep, .xz files were consistently (if slightly)
> > > smaller than .lz ones.
> > > 
> > > Maybe I'm not using the best options to compress tarballs, vcdiffs
> > > and xdeltas with lzip. Suggestions are certainly welcome.
> > 
> > Probably `lzip -m64 -s64MiB' should be enough. This is equal to the
> > values used by xz (match length and dictionary size). Of course, this
> > will increase memory usage during decompression, because the
> > dictionary size has been increased. In any case, it consumes less
> > memory than xz:
> > 
> > http://mattmahoney.net/dc/text.html (See the Ranking)
> 
> If time is a concern (compressing/decompressing), there is a version
> of lzip written in C, called clzip[1]; it may be a little faster since
> it does not link against libstdc++. There is also plzip[2], "a
> multi-threaded compressor using the lzip file format".
> 
> [1] http://lzip.nongnu.org/clzip.html
> [2] http://lzip.nongnu.org/plzip.html
> 
> > > > And it is not forcing users to possess a lot of memory to
> > > > decompress .xz.
> > > 
> > > That is a point I was not aware of, so I had not taken it into
> > > account.
> > > 
> > > > Lzip was designed for long-term archiving, having a
> > > > tool to recover corrupt files.
> > > 
> > > I very much doubt it could recover corrupt files to the point that
> > > the original signature would match, because that would require a
> > > lot of redundancy to be added, which is the opposite of what a
> > > compressor is supposed to do. And if the original signature
> > > doesn't match, I wouldn't trust the result, especially given that
> > > we have alternate paths to obtain the tarballs.
> > 
> > I'm not the right person to answer. I think Antonio is subscribed to
> > the list; he can give the details.
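For reference, a rough sketch of the kind of recovery workflow the
lziprecover manual (linked just below) describes. The file names here are
made up, and the option spellings should be checked against the manual,
since they may vary between lziprecover versions:

    # Test the integrity of a compressed tarball (lzip itself can do this).
    lzip -t linux-libre.tar.lz

    # With two or more damaged copies of the same file (e.g. fetched from
    # different mirrors), lziprecover can try to merge them into a good copy.
    lziprecover --merge copy1/linux-libre.tar.lz copy2/linux-libre.tar.lz

    # With a single copy containing a small amount of damage, it can
    # attempt a repair instead of a merge.
    lziprecover --repair linux-libre.tar.lz

    # Either way, verify the recovered file against the published signature
    # before trusting it (the signature file name here is hypothetical).
    gpg --verify linux-libre.tar.lz.sig linux-libre.tar.lz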
http://lzip.nongnu.org/manual/lziprecover_manual.html

> > > > Ideal for linux-libre, especially because
> > > > it is under the GPL.
> > > 
> > > Yup, lzip remains the promoted compression format in the GNU
> > > Linux-libre Free Software Directory page.
> > > 
> > > Thanks for your feedback and for educating me on another aspect in
> > > which lzip is superior,
> > > 
> > > > Thanks for linux-libre.
> > 
> > Take care,
> > Matias
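As a rough illustration of the size comparison discussed earlier in the
thread (the `lzip -m64 -s64MiB' tuning against xz): the tarball name below
is just a placeholder, and the xz invocation is a plain -9 preset rather
than whatever was actually used for the linux-libre files.

    # Keep the originals (-k) and compress the same tarball both ways.
    xz -9 -k linux-libre.tar
    lzip -m64 -s64MiB -k linux-libre.tar

    # Compare the resulting file sizes.
    ls -l linux-libre.tar.xz linux-libre.tar.lz

    # Note (from the thread): the 64 MiB dictionary raises the memory
    # needed to decompress the .lz file, though reportedly still less
    # than xz requires.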
