On 03 Nov 94 04:35:00 +0000, Johnathan Taylor said:
> A gzip'd tar archive requires complete decompression BEFORE you can even 
> examine its directory

That's true.  It all depends on what you want it for.
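To illustrate the point (a toy sketch in Python, not a claim about any particular implementation): listing the members of a .tar.gz forces a sequential decompression of the whole stream, because there is no central directory to seek to, whereas a zip keeps one at the end of the file.

```python
import io
import tarfile
import zipfile

# Build a small .tar.gz in memory (hypothetical example data).
buf = io.BytesIO()
data = b"hello" * 1000
with tarfile.open(fileobj=buf, mode="w:gz") as tf:
    info = tarfile.TarInfo(name="a.txt")
    info.size = len(data)
    tf.addfile(info, io.BytesIO(data))

# Listing the members decompresses the entire stream: each tar header
# sits after the previous member's (compressed) data.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r:gz") as tf:
    names = tf.getnames()
print(names)

# A zip archive, by contrast, has a central directory at the end, so
# member names can be read without decompressing the members themselves.
zbuf = io.BytesIO()
with zipfile.ZipFile(zbuf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("a.txt", data)
print(zipfile.ZipFile(zbuf).namelist())
```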

>  >> Anyway, why is a single compressed file any more lose-able than any other
>  >> kind of file?

> A gzip'd tar must first be decompressed in its entirety.

Your answer is quite correct, but it answers a different question...  It
is true that if a gzip file gets corrupted then that's it.  However I
think catering specially for corrupted files is not the main aim of a
compression/archive system.

Anyway, the explanation that I actually wanted was of the following sentence:

jet>                                                            Plus I'd want
jet> to be able to combine associated files into a single archive not a separate
jet> lose-able bunch of separately compressed files!

So why is a single compressed file any more lose-able than any other kind of
file?

>  Fr> LZH....

> Or to expand upon that a bit it's based on LZ repeated string encoding but 
> Huffman encodes the length & position as well as unique data using either 
> dynamic or static Huffman coding depending on the particular method.

How is this different from LZ77?
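For what it's worth, the "Huffman encodes the length & position" part can be sketched like this (a toy illustration only, not any particular LZH implementation; the frequency table is made up):

```python
import heapq
from collections import Counter

def huffman_lengths(freqs):
    """Return a {symbol: code_length_in_bits} map for the given frequencies."""
    # Heap entries carry a unique tie-break counter so tuples never
    # compare on the symbol lists.
    heap = [(f, i, [s]) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    depth = {s: 0 for s in freqs}
    tick = len(heap)
    while len(heap) > 1:
        f1, _, s1 = heapq.heappop(heap)
        f2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:        # every merge pushes these symbols one level deeper
            depth[s] += 1
        heapq.heappush(heap, (f1 + f2, tick, s1 + s2))
        tick += 1
    return depth

# Plain LZ77 emits match lengths and positions as fixed-width fields;
# LZH Huffman-codes them instead, so common short matches cost fewer bits.
match_lengths = Counter({3: 50, 4: 30, 5: 12, 6: 5, 7: 2, 8: 1})
print(huffman_lengths(match_lengths))
```

The difference from plain LZ77, then, is not the string matching but the entropy coding layered on top of the match tokens.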

>                                                                          These
> ARE ALL Documented in various places on the nets, so those with direct access 
> to all those net tools should be able to locate them themselves!

There are quite a lot of things on the net actually, so you will have to
narrow down the search a little more than that.

>  >> The thing about gzip is that it uses
>  >> very little RAM to decompress (apart
>  >> from having to store each block of the
>  >> file

> I don't know WHO wrote the above paragraph imc maybe... but GET REAL! gzip 
> and 
> PKUNZIP2.04g etc are required to keep a running 32K ring-buffer OR rely on 
> flawless random file access of the output stream in order to get at the 32k 
> sliding dictionary!

Which is implied by "storing each block of the file" (if the blocks are 32K).
Note: I did _not_ mean disk blocks.
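The 32K ring buffer in question works roughly like this (a minimal sketch of LZ-style back-reference decoding, assuming a made-up token stream; real gzip/PKUNZIP code is of course more involved):

```python
WINDOW = 32 * 1024  # the 32K sliding dictionary used by deflate/PKZIP

def lz_decode(tokens):
    """Decode literal-byte and (distance, length) tokens via a ring buffer.

    Only the most recent 32K of output needs to stay in RAM; this is the
    "running 32K ring-buffer" referred to above.
    """
    ring = bytearray(WINDOW)
    pos = 0                     # total bytes written so far
    out = bytearray()
    for tok in tokens:
        if isinstance(tok, int):            # literal byte
            ring[pos % WINDOW] = tok
            out.append(tok)
            pos += 1
        else:                               # (distance, length) back-reference
            dist, length = tok
            for _ in range(length):
                b = ring[(pos - dist) % WINDOW]  # read from the sliding window
                ring[pos % WINDOW] = b           # copied bytes re-enter the window
                out.append(b)
                pos += 1
    return bytes(out)

# "abc" as literals, then a self-overlapping match reproducing it twice more.
print(lz_decode([97, 98, 99, (3, 6)]))
```

The alternative mentioned in the quote is to skip the ring buffer and read back-references directly from the already-written output file, which only works if random access to the output stream is reliable.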

> Simple way for those that believe that gzip is perfect for the SAM is WRITE 
> IT! Don't sit there making unfounded claims about it! *PROVE US WRONG!*

Do you remember how this started?  By me saying I might write one.

> As if deflate was the BEST lossless compression method... Ever heard of RAR?

No I haven't.

> Oh btw LZHuff can compress some LZW encoded stuff a bit further!

Perhaps, but you are better off decompressing the LZW data first before
trying another compression method.

imc
