Karl Cunningham wrote:
I need to know how much space would be taken up if all the files in several directories (and their subdirectories) were gzipped. This must be done on the fly, as there isn't enough space to store gzipped versions of all the files.

There's got to be some `find ... | gzip | wc -c` magic that can be done here but I haven't found it as yet.

gzip actually has to compress each file to find out how much smaller it gets; there's no cheaper way to get the figure without doing the work. You probably have to do something like:

find . -type f -exec gzip -c {} \; | wc -c

That way gzip writes the compressed data to stdout and wc -c counts the bytes, without ever storing the compressed files on disk.
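If you also want to see the savings, you can put the uncompressed total next to it. This is just a sketch, not from the original post: it assumes GNU find (for -printf '%s') plus awk and gzip on the PATH, and uses the current directory as an example.

    #!/bin/sh
    # Sum the sizes of the regular files as they are now (GNU find -printf).
    orig=$(find . -type f -printf '%s\n' | awk '{ s += $1 } END { print s + 0 }')
    # Compress every file to stdout and count the bytes; nothing is written to disk.
    gz=$(find . -type f -exec gzip -c {} + | wc -c)
    printf 'uncompressed: %s bytes\ngzipped:      %s bytes\n' "$orig" "$gz"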

However, that's going to take quite a while, since every file really does get compressed. The pipe won't overflow (gzip just blocks until wc catches up, and nothing accumulates in memory or on disk), but on a large tree the CPU time adds up.
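If the tree is big and you have spare cores, spreading the compression over several processes speeds it up. A rough sketch assuming GNU xargs (the batch size of 64 files and the 4 parallel jobs are just example numbers):

    # Compress files in parallel batches, print one byte count per batch,
    # then sum the counts with awk.
    find . -type f -print0 |
      xargs -0 -n 64 -P 4 sh -c 'gzip -c "$@" | wc -c' _ |
      awk '{ total += $1 } END { print total " gzipped bytes" }'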

-a


