>
> I disagree. I think the way things are set up now, a significant
> portion of a Freenet user's time is spent reading HTML and other
> compressible formats in search of media files. zlib compression is
> computationally very inexpensive, especially on human-readable file
> formats, and it can compress text files by upwards of 80-90% in many
> cases. This data does not make up a large part of Freenet measured
> purely by size, but it does make up a significant percentage of the
> files users actually request. Compressing text and HTML files will
> certainly not take longer than sending them over the wire raw. Even
> when they are sent over local loopback with HTTP/1.1 compression, the
> increase in retrieval time is only about 6-7%; the reduced I/O time
> nearly makes up for the time spent compressing, and that is without
> any network lag at all. I definitely think this would help the
> perceived performance of surfing Freenet websites. Remember that
> most people on the edges of the network have 56k modems, which don't
> even approach 56k in rural areas with older phone switching
> equipment.
>
No, it really wouldn't. The poor perceived performance of surfing
Freenet websites is caused by the latency of traversing the network to
search for the data. Compressing the data wouldn't help with that at
all.
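
As an aside on the numbers in the quoted post: the 80-90% figure for
text is easy to reproduce with java.util.zip, which wraps zlib. This is
only a rough sketch; the sample text, compression level, and class name
are made up for illustration and say nothing about real Freesite data.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.Deflater;
import java.util.zip.DeflaterOutputStream;

// Compress a chunk of repetitive HTML-like text with zlib (deflate)
// and print the size reduction. Hypothetical demo, not Freenet code.
public class ZlibRatioDemo {
    public static void main(String[] args) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 200; i++) {
            sb.append("<p>Lorem ipsum dolor sit amet, consectetur ")
              .append("adipiscing elit. Markup and prose compress well.</p>\n");
        }
        byte[] raw = sb.toString().getBytes(StandardCharsets.UTF_8);

        // BEST_SPEED is the cheapest setting; still plenty for text.
        Deflater deflater = new Deflater(Deflater.BEST_SPEED);
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (DeflaterOutputStream out = new DeflaterOutputStream(buf, deflater)) {
            out.write(raw);
        }
        deflater.end();
        byte[] compressed = buf.toByteArray();

        double saved = 100.0 * (1.0 - (double) compressed.length / raw.length);
        System.out.printf("raw: %d bytes, compressed: %d bytes (%.1f%% smaller)%n",
                raw.length, compressed.length, saved);
    }
}

None of which changes the point above: saving bytes on the last hop
does nothing about the routing latency that dominates retrieval time.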