Pretty late, but my 2 bits ...

Dwivedi Ajay kumar wrote:

>         Compression techniques are usually general-purpose, and the degree of
> compression depends on the content and, generally, on the length of the file.
> Take an example.
> The stats for the HTML files in my /home/httpd/html/manual/mod/mod_php3
> directory are:
>         command                 usage (blocks)  size (% of original)
>         *.* (uncompressed)      4885            100%
>         gzip *.*                1965             40.2%
>         gunzip *.*              (restore the originals before tarring)
>         tar *.* >/tmp/a.tar     5158            105.5%
>         gzip a.tar               609             12.5%
> 
>         Your HTML files must be really small, and hence no general-purpose
> compression algorithm will compress them much individually. The only way is
> to tar -zc them.

tar is basically an archiving program; on its own it actually increases the
total size slightly, because it adds header information for each file it puts
into the archive.
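
A quick way to see that overhead for yourself (the file name below is just
illustrative, and the exact figure depends on your tar's blocking factor):

        $ echo hello > note.txt          # a file of a few bytes
        $ tar cf note.tar note.txt
        $ ls -l note.txt note.tar        # note.tar is far larger: each member
                                         # gets a 512-byte header, file data is
                                         # padded to 512-byte blocks, and GNU tar
                                         # pads the archive to its default
                                         # 10240-byte record size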

The reason that tar followed by compression achieves better results than a
set of individually compressed files tarred together is that in the former
case the compression algorithm can exploit the redundancy across the various
files in the archive.
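
Easy enough to verify on a scratch copy of the same files (the paths and
names below are only an example):

        $ cd /tmp/htmlcopy               # a throwaway copy of the HTML files
        $ tar cf together.tar *.html     # archive first ...
        $ gzip together.tar              # ... then compress: gzip sees one long
                                         # stream and can reuse matches across
                                         # file boundaries
        $ gzip *.html                    # compress each file in isolation ...
        $ tar cf separate.tar *.html.gz  # ... then archive the .gz files
        $ du -k together.tar.gz separate.tar
                                         # together.tar.gz should come out
                                         # noticeably smaller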

regards,

sachin

