> Which is why you tar them up first (and there's no reason why it shouldn't
> be done in one go, but there is a reason why it should be done first
> rather than second: compressing the whole tar as one stream gives better
> compression).  Anyway, why is a single compressed file any more losable
> than any other kind of file?
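
A quick illustration of that ratio point, sketched in Python with invented
file contents (my own example, not anything from the original post):
compressing everything as one stream lets later files reuse matches against
earlier ones, while compressing each file separately starts every file with
an empty history.

    import gzip

    # Three small "files" that share most of their text with each other.
    # The contents are made up purely for illustration.
    files = [b"shared boilerplate header\n" * 50 + b"unique part %d\n" % i
             for i in range(3)]

    # One stream, compressed once (roughly what gzip-of-a-tar does).
    together = len(gzip.compress(b"".join(files)))

    # Each file compressed on its own, as a compress-then-archive scheme would.
    separate = sum(len(gzip.compress(f)) for f in files)

    print("compressed together:  ", together, "bytes")
    print("compressed separately:", separate, "bytes")
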
> 
> > LHarc format is a generic PD archive envelope definition that is supported 
> > on CP/M, BBC-B, IBM-CLONE, QL, and generic Unix!
> 
> That still doesn't tell me what kind of compression it uses.

LZH.... (LZSS dictionary matching followed by Huffman coding, as far as I
recall.)

> > NO LHarc method beats the ZIP deflate algorithm on achieved ratios, BUT
> > they ALL require much LESS working RAM to compress and de-compress
> 
> The thing about gzip is that it uses very little RAM to decompress (apart
> from having to store each block of the file, which I would have thought
> was a necessity for most systems to be reasonably efficient).  It is rather
> resource-consuming during compression, but then I probably don't care that
> much because it's the decompression that matters most.

It does? It has to build exactly the same tables for decompression as for
compression, otherwise it wouldn't be able to decompress... or does it? Hmm...
I can't remember much now.
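
If I remember right, deflate writes the Huffman code descriptions into each
block header, so the decompressor just reads them back; it never needs the
match-finding structures the compressor used, only the 32 KB back-reference
window. Something like this (my own sketch, file name invented) reads a gzip
stream with bounded memory:

    import zlib

    def gunzip_stream(path, chunk_size=8192):
        """Decompress a .gz file a chunk at a time with bounded memory."""
        # wbits = MAX_WBITS | 16 tells zlib to expect a gzip header/trailer.
        decomp = zlib.decompressobj(zlib.MAX_WBITS | 16)
        with open(path, "rb") as f:
            while True:
                chunk = f.read(chunk_size)
                if not chunk:
                    break
                yield decomp.decompress(chunk)
        yield decomp.flush()

    # e.g. total = sum(len(block) for block in gunzip_stream("archive.tar.gz"))

Nothing about the whole file has to be held in RAM at once.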

> imc
> 


-- 
* Frode Tennebo                         * It's better to live life in     *
* email: [EMAIL PROTECTED]              * wealth and die poor, than live  * 
* phone: +47 712 57716                  * life in poverty and die rich.   *
* snail: Parkv. 31, 6400 Molde, NORWAY  *                   -Frode Tennebo*
