On Sunday 01 June 2003 14:09, Beni Cherniavsky wrote:
> Lacking that (mozilla, konq, etc. won't complete this in one week,
> right? ;-), I'm probably looking for some kind of transparent caching
> proxy setup.  The transparent part I can read in a HOWTO I once saw.
> Now for the caching - I want to be able to force it to download and
> not to erase specific sites.  The download part can be perfectly done
> with wget, leveraging its recursion controls.  All that's left is a
> caching proxy that can respect local files.  A clean design would also
> allow me to plug in site tarballs obtained through other means,
> including the ability to provide installable packages (rpm -i RFCs
> anybody?), combining system-wide and per-user files.
>
> So does anybody know ready tools for such a setup?

wwwoffled (WWW Offline Daemon) from http://www.gedanken.demon.co.uk/ is a nice 
transparent caching proxy. It has a powerful config file that lets you force 
a site (or any wildcard expression matched against the URL) to stay in cache 
for X days or indefinitely, etc. It also provides an interface for 
configuration and for searching the cache via HTTP. And it accepts CLI 
commands to fetch pages or whole sites, or you could just tell wget to use it 
as a proxy and send wget's output to /dev/null.
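As a sketch of that last approach (the Purge directives below are from memory and should be checked against the wwwoffle.conf documentation; the port 8080, the config path, and www.example.org are placeholder assumptions):

```shell
# Keep one site in wwwoffled's cache indefinitely, then pre-fetch it
# with wget through the proxy.
#
# In /etc/wwwoffle/wwwoffle.conf, a Purge section something like
# (directive names are an assumption -- verify against the docs):
#
#   Purge
#   {
#    age = 14                              # default: purge after 14 days
#    <http://www.example.org/*> age = -1   # this site: never purge
#   }
#
# Then crawl the site through the proxy.  wget honours the http_proxy
# environment variable, and --delete-after discards the local copies,
# leaving the pages only in the proxy's cache.
http_proxy=http://localhost:8080/ \
  wget --recursive --level=2 --delete-after http://www.example.org/
```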

However, it stores cached pages under hashed filenames, so accessing them 
without going through wwwoffle is hard. I don't know if you can disable that - 
its documentation goes into more detail.

As for plugging in cache pieces obtained elsewhere: since it hashes the 
filenames, you'd need to prepare them with another wwwoffle installation. So 
it's not clean in that regard. Still, maybe you'll find it useful.

-- 
Dan Armak
Matan, Israel
Public GPG key: http://cvs.gentoo.org/~danarmak/danarmak-gpg-public.key
