I'm frequently downloading (e.g. with wget)  sites that I wish to view
offline, especially various documentation and standards.  Some I put
in my home area, some under /usr/share/doc.  I've also heard that
somewhere under /var/ is the correct place...  The problem is that
browsing them sometimes from file:// URLs and sometimes from the
on-line pages is inconvenient.  Currently I just bookmark everything I
have locally and manually use the bookmarks as needed but this list is
growing fast.
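For concreteness, this is the kind of wget invocation I mean (a
sketch, assuming GNU wget; the ~/mirrors root and the depth limit are
just my own choices):

```shell
# Sketch: mirror a site for offline viewing (GNU wget assumed;
# ~/mirrors and the depth limit of 5 are arbitrary choices).
mirror_site() {
    wget --recursive --level=5 \
         --convert-links \
         --page-requisites \
         --no-parent \
         --directory-prefix="$HOME/mirrors" \
         "$1"
}
# e.g.: mirror_site http://www.example.org/docs/
```

--convert-links does the link relativisation and --no-parent keeps the
recursion from escaping the area I asked for.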

Which leads me to think about the browser's cache and offline mode.  I
can transparently see pages that happen to be in the cache - but I
can't make sure that specific areas are there all the time.  I can't
even see which files are there, except through the browser, because
some genius decided that the files should have obfuscated names.
Something like wget's default www.site.name/path layout would be much
better.
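With that layout, finding a page on disk would be a trivial string
mapping rather than a trip through the browser's cache viewer (a
sketch, assuming a hypothetical ~/mirrors root):

```shell
# Sketch: where a page would live under a wget-style mirror tree
# (the ~/mirrors root is a hypothetical choice).
url_to_local() {
    local url="$1"
    # strip the scheme; wget's default layout is host/path
    url="${url#http://}"
    printf '%s/%s\n' "$HOME/mirrors" "$url"
}
# url_to_local http://www.site.name/path/index.html
#   -> $HOME/mirrors/www.site.name/path/index.html
```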

So I'm looking for some user-visible web cache.  It would be close to
ideal if all browsers would standardize on non-obfuscated cache path
names and would support a way to plug my own static versions into this
cache (not to talk about simply *sharing* the cache between them).  A
wget-like interface to this from the browser (including recursion,
link relativisation and other wget goodies) would complete the
paradise.  Sounds almost like Windows' Offline Files!  I never tried
it, but if it works and is documented so that third-party programs
can co-operate with it, MS can finally score a feature I like (I guess
they fail the second part ;).

Lacking that (mozilla, konq, etc. won't complete this in one week,
right? ;-), I'm probably looking for some kind of transparent caching
proxy setup.  The transparent part I can read in a HOWTO I once saw.
Now for the caching - I want to be able to force it to download and
not to erase specific sites.  The download part can be perfectly done
with wget, leveraging its recursion controls.  All that's left is a
caching proxy that can respect local files.  A clean design would also
allow me to plug in site tarballs obtained through other means,
including the ability to provide installable packages (rpm -i RFCs
anybody?), combining system-wide and per-user files.

So does anybody know ready tools for such a setup?

-- 
Beni Cherniavsky <[EMAIL PROTECTED]>

The Three Laws of Copy-Protechnics:
http://www.technion.ac.il/~cben/threelaws.html
