On Tue, 4 Feb 2003, Todd Walton wrote:

> 
> >Has compression of files in Freenet been considered?
> 
> In a way, yes.  There's another benefit of the compression scheme you 
> suggest.  If you're compressing you may as well archive, too.

Unlike compression, I am not sure archiving is a good idea. See below 
for an explanation.

> The thing that came up previously was allowing freesite authors the ability 
> to put all of their files into one tarball (or .jar or .zip or 
> whatever).  That way, if you retrieve the freesite you have the whole 
> thing.  If you go to retrieve something else on that freesite then 
> *bickity-bam*, it's there.  The sound effects are open for discussion, but 
> that's the general idea.  The unit of selection competing for datastore 
> space becomes whole freesites instead of just parts and the browsing 
> experience (it is supposed) will improve.

I disagree. There is a huge leap between the two concepts.

Compression could be applied selectively to the current design as it 
stands. Decompression could also be selectively left to the browser, for 
the extra speed (a C implementation vs. Java).

OTOH, if you want to do archiving as well, you have several major 
drawbacks:

1) A site can be big. It would be unusual, granted, but it could be 
hundreds of megabytes. Having to retrieve the entire huge archive before 
you can view the site would clearly be ridiculous.

2) Even for a small site, not all of its parts/pages will be equally 
popular. It would therefore make sense to let each page fend for itself 
when competing for space in the node's datastore.

3) A compromise solution, archiving each page into a tarball together 
with its images, would be even worse. It would kill the automatic 
duplicate data removal that CHKs allow (see the sketch below), and it 
would remove the ability to re-use linked content on other pages. For 
example, you might use the same linking banner on multiple sites.
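
To spell the CHK point out: a CHK is derived from the content itself, 
so identical data always maps to the same key and is stored only once, 
however many pages link to it. A rough sketch of the idea, with a plain 
SHA-1 hash standing in for the real CHK computation (which also 
involves encryption):

  import java.security.MessageDigest;
  import java.security.NoSuchAlgorithmException;

  public class ChkSketch {
      // Identical bytes always hash to the identical key, so a banner
      // image linked from ten freesites is stored once. Wrap it into a
      // tarball with each page, and every copy hashes differently.
      public static String contentKey(byte[] data)
              throws NoSuchAlgorithmException {
          byte[] hash = MessageDigest.getInstance("SHA-1").digest(data);
          StringBuilder sb = new StringBuilder();
          for (byte b : hash) {
              sb.append(String.format("%02x", b));
          }
          return sb.toString();
      }
  }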

I think archiving is a really bad idea.

> >2.1) We care about supporting browsers that don't support gzip
> 
> >2.2) We DON'T care about browsers that don't support the gzip encoding.
> 
> It'd probably be chaos and unusability to leave it up to the browser.  You 
> never know (literally) where Freenet will be used.  It'd be nice to know 
> that if you needed Freenet and you had only a weak browser that you could 
> still do it.

Indeed. This is why a module for decompressing gzip content would be 
useful. For non-supporting browsers you could decompress the data, while 
leaving it compressed for browsers that claim to support it. This 
probably means it should be done in fproxy. Any code talking to the node 
directly should probably implement its own handling anyway.
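
As a sketch of what that fproxy module might look like (the class and 
method names, and the way the browser's capability is passed in, are my 
inventions; java.util.zip does the actual work):

  import java.io.ByteArrayInputStream;
  import java.io.ByteArrayOutputStream;
  import java.io.IOException;
  import java.util.zip.GZIPInputStream;

  public class GzipHandler {
      /** If the browser claims gzip support, hand the data over as-is
          (the caller sets Content-Encoding: gzip); otherwise
          decompress it here before serving. */
      public static byte[] prepareForBrowser(byte[] stored, boolean gzipOk)
              throws IOException {
          if (gzipOk) {
              return stored; // browser does the decompression
          }
          GZIPInputStream in =
              new GZIPInputStream(new ByteArrayInputStream(stored));
          ByteArrayOutputStream out = new ByteArrayOutputStream();
          byte[] buf = new byte[4096];
          int n;
          while ((n = in.read(buf)) != -1) {
              out.write(buf, 0, n);
          }
          return out.toByteArray();
      }
  }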

Alternatively, it could all be done completely transparently in the node 
itself, before the data gets as far as fproxy. That way the decompression 
would be done in one place. Or fproxy could set a flag depending on what 
the browser says, and request the content compressed or uncompressed 
according to the browser's capabilities.
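
Setting that flag could be as simple as looking at the request's 
Accept-Encoding header. A deliberately naive sketch (it ignores 
q-values, so "gzip;q=0" would be misread; a real parser would have to 
handle that):

  public class BrowserCaps {
      /** Does the Accept-Encoding header mention gzip at all? */
      public static boolean supportsGzip(String acceptEncoding) {
          return acceptEncoding != null
              && acceptEncoding.toLowerCase().indexOf("gzip") != -1;
      }
  }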

Gordan

