On Thursday 24 July 2003 15:41, Michael Schierl wrote:
> Gordan schrieb:
> > What happens when the same
> > files are linked from multiple pages, e.g. active links?
>
> Adding active links to other .zip manifests is simply broken. They
> should show that the content is still there - so bundle them with the
> html file.

With each HTML file?

How long, exactly, would you expect a large site that links to a lot of other 
sites to take to load if it has to download a complete 1 MB archive for each 
active link it contains? It would take forever. I do not believe that is workable.

> IMO containers are a better approach than creating huge sites (like TFE
> or nubile) or using "images" linking to HTML for preloading sites - or
> providing a compressed version separately (like TFEE), which can hardly
> be retrieved.

This has all been solved before. If the goal is pre-caching, then there are 
better ways to achieve it than making each download 1 MB. That is just 
ridiculous. Instead, it would probably be better to implement a limited-depth 
web crawler in fproxy that downloads pages up to 1 or 2 hops away from the 
page being visited, with a cap on how many downloads run simultaneously.
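For illustration only, here is a minimal sketch of what such a bounded pre-cacher could look like. This is not fproxy code; the names `fetch` and `extract_links` are hypothetical stand-ins for whatever the node uses to retrieve a key and to pull links out of a page, and the concurrency cap is modelled with a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def precache(start, fetch, extract_links, max_depth=2, max_workers=4):
    """Breadth-first pre-cache of pages up to max_depth hops from start.

    fetch(key) retrieves a page; extract_links(page) yields the keys it
    links to. At most max_workers fetches run simultaneously (hypothetical
    stand-in for the node's download limit).
    """
    seen = {start}
    frontier = [start]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        for _ in range(max_depth):
            # Fetch the whole frontier, at most max_workers at a time.
            pages = list(pool.map(fetch, frontier))
            next_frontier = []
            for page in pages:
                for link in extract_links(page):
                    if link not in seen:       # never fetch a key twice
                        seen.add(link)
                        next_frontier.append(link)
            frontier = next_frontier
            if not frontier:
                break
    return seen
```

With `max_depth=2`, a page three hops away is never fetched, so the cost per visited page stays bounded regardless of how big the surrounding sites are.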

That way, it can still be handled in the node, it will help site propagation, 
and it will speed things up. And best of all, it will not require the 
horrible, horrible kludge of using archives to transfer entire sites.

Purely client-side solutions already exist. I am sure I saw a piece of 
software years ago that interfaced with IE and tried to pre-cache things for 
you, so that when you clicked on a link, the chances were that the next page 
was already cached.

Gordan
_______________________________________________
devl mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl