On Sat, Apr 28, 2001 at 07:34:18PM -0500, Scott G. Miller wrote:
> > This is a special case - text or HTML of a reasonable size that should be
> > compressed to save space and hence increase its likelihood of survival on
> > the network. It would not be sensible for it to happen automatically,
> > except perhaps for particular MIME types, but decoding support in
> > freenet_request and fproxy might be worthwhile. Medium-sized HTML pages are
> > quite compressible, as evidenced by the W3C performance paper. And it's not
> > just about bandwidth: sticking stuff in ZIPs is not a good answer for pages
> > which people would like to refer to and browse individually. The size bias
> > means that compression will significantly improve short-term survival of
> > pages.
>
> Again. If the problem is that data is falling out, the solution is not to
> dance around that with compression. FIX THE REAL ISSUE. Also, any gains
> from compression are a drop in the bucket compared to the latency of doing
> a Freenet search... which compression does nothing for.
Compression increases the utility of the network by letting nodes hold more
useful data in the same amount of disk. And data falling out is here to stay.
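
To make it concrete, here's a quick sketch using java.util.zip from the
standard library. The MIME check and the sample page are purely illustrative,
not proposed fproxy code:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.zip.GZIPInputStream;
    import java.util.zip.GZIPOutputStream;

    public class CompressDemo {
        // Only text-like types benefit; images and archives are
        // already compressed and would just waste CPU.
        static boolean isCompressible(String mimeType) {
            return mimeType.startsWith("text/");
        }

        static byte[] gzip(byte[] data) throws IOException {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            GZIPOutputStream gz = new GZIPOutputStream(buf);
            gz.write(data);
            gz.close();
            return buf.toByteArray();
        }

        static byte[] gunzip(byte[] data) throws IOException {
            GZIPInputStream gz =
                new GZIPInputStream(new ByteArrayInputStream(data));
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            byte[] chunk = new byte[4096];
            int n;
            while ((n = gz.read(chunk)) != -1)
                buf.write(chunk, 0, n);
            return buf.toByteArray();
        }

        public static void main(String[] args) throws IOException {
            // Repetitive markup stands in for a medium-sized HTML page.
            StringBuilder page = new StringBuilder("<html><body>\n");
            for (int i = 0; i < 500; i++)
                page.append("<p class=\"entry\">Entry number ")
                    .append(i).append("</p>\n");
            page.append("</body></html>\n");
            byte[] original = page.toString().getBytes("UTF-8");

            if (isCompressible("text/html")) {
                byte[] packed = gzip(original);
                System.out.println("original:   " + original.length + " bytes");
                System.out.println("compressed: " + packed.length + " bytes");
                // Round-trip: the decoding side (what fproxy would need)
                // must restore the exact original bytes.
                byte[] restored = gunzip(packed);
                System.out.println("round-trip ok: " +
                    java.util.Arrays.equals(original, restored));
            }
        }
    }

The exact ratio obviously depends on the page, but markup this repetitive
compresses heavily with gzip, which is the point the W3C numbers make.
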
--
Always hardwire the explosives
-- Fiona Dexter quoting Monkey, J. Gregory Keyes, Dark Genesis