> This is a special case - text or HTML of a reasonable size that should be
> compressed to save space and hence increase its likelihood of survival on
> the network. It would not be sensible for it to occur automatically, except
> perhaps for particular MIME types, but decoding support in freenet_request
> and fproxy might be worthwhile. Medium-sized HTML pages are quite
> compressible, as evidenced by the W3C performance paper. And it's not just
> about bandwidth. Sticking stuff in ZIPs is not a good answer for pages
> which people would like to refer to and browse individually. The size bias
> means that it will significantly improve short-term survival of pages.

Again.  If the problem is that data is falling out, the solution is not to
dance around that with compression.  FIX THE REAL ISSUE.  Also, any gains
from compression are a drop in the bucket compared to the latency of doing
a Freenet search... which compression does nothing for.
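
(For reference, the MIME-type gating proposed in the quoted text could look
something like the sketch below - a minimal Java illustration, not code from
the Freenet tree; the class and method names are made up. The idea is to
gzip only compressible text types before insert, fall back to the raw bytes
when compression does not actually shrink the data, and have fproxy /
freenet_request undo the transform on the way out.)

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.zip.GZIPOutputStream;

    public class MimeGatedCompressor {
        // Assumption: only text-like types are worth compressing.
        static boolean isCompressible(String mimeType) {
            return mimeType.startsWith("text/")
                || mimeType.equals("application/xhtml+xml");
        }

        // Gzip the payload when the MIME type qualifies; keep the
        // original bytes if compression does not make them smaller.
        static byte[] maybeCompress(byte[] data, String mimeType)
                throws IOException {
            if (!isCompressible(mimeType))
                return data;
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            GZIPOutputStream gz = new GZIPOutputStream(buf);
            gz.write(data);
            gz.close();
            byte[] compressed = buf.toByteArray();
            return compressed.length < data.length ? compressed : data;
        }
    }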
