> Actually, this brings me to something I was thinking about
> a while back.  Right now, a person downloading a freesite must
> make a separate request for every image on the site.  This could
> be improved by putting an entire freesite (or a chunk of one,
> if it's really big) in a .tar.gz, with files being extracted from
> the tarball as needed, instead of going out and requesting something
> new each time.
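
For concreteness, a rough sketch of the container idea: pull a single
requested file out of a .tar.gz on demand.  This assumes Apache Commons
Compress for the tar handling, which Freenet does not necessarily ship
with; the class and method names here are purely illustrative, not an
actual Freenet API.

import java.io.*;
import java.util.zip.GZIPInputStream;
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;

/** Scans a .tar.gz site container and returns one member file, or null. */
public class SiteContainer {
    public static byte[] extract(File tarGz, String wantedName) throws IOException {
        try (TarArchiveInputStream tar = new TarArchiveInputStream(
                new GZIPInputStream(new BufferedInputStream(new FileInputStream(tarGz))))) {
            TarArchiveEntry entry;
            while ((entry = tar.getNextTarEntry()) != null) {
                if (!entry.isDirectory() && entry.getName().equals(wantedName)) {
                    // Copy just this entry's bytes out of the stream.
                    ByteArrayOutputStream out = new ByteArrayOutputStream();
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = tar.read(buf)) != -1) {
                        out.write(buf, 0, n);
                    }
                    return out.toByteArray();
                }
            }
        }
        return null; // requested file is not in this container
    }
}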

The same effect could be achieved by adding prefetching to FProxy.
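
As a minimal sketch of what that prefetching could look like: scan a
fetched page for <img> references and kick off background requests for
them before the browser asks.  The Fetcher hook below is hypothetical,
not an actual FProxy interface.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Naive prefetcher: find image references in a page and fetch them early. */
public class Prefetcher {
    // Hypothetical hook into FProxy's fetch path.
    interface Fetcher { void fetch(String key); }

    private static final Pattern IMG_SRC =
            Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"']", Pattern.CASE_INSENSITIVE);

    public static void prefetch(String html, final Fetcher fetcher) {
        Matcher m = IMG_SRC.matcher(html);
        while (m.find()) {
            final String src = m.group(1);
            // Fire each request on its own thread so the page itself is served immediately.
            new Thread(new Runnable() {
                public void run() { fetcher.fetch(src); }
            }).start();
        }
    }
}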


