wget info page:
`--delete-after'
This option tells Wget to delete every single file it downloads,
_after_ having done so. It is useful for pre-fetching popular
pages through a proxy, e.g.:
wget -r -nd --delete-after http://whatever.com/~popular/page/
The `-r' option is to retrieve recursively, and `-nd' to not
create directories.
Note that `--delete-after' deletes files on the local machine. It
does not issue the `DELE' command to remote FTP sites, for
instance. Also note that when `--delete-after' is specified,
`--convert-links' is ignored, so `.orig' files are simply not
created in the first place.
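To do the every-10-minutes refresh Tony asked about, you can wrap this in a
small script and run it from cron. A minimal sketch, assuming squid listens
on localhost:3128 (adjust the proxy address and the URL list to taste); the
loop only echoes the wget command so you can eyeball it first -- drop the
`echo' to fetch for real:

```shell
#!/bin/sh
# prefetch.sh -- keep a few pages warm in the squid cache.
# Assumption: squid is listening on localhost:3128.
export http_proxy="http://localhost:3128/"

for url in \
    http://www.smh.com.au/ \
    http://news.bbc.co.uk/
do
    # -r: recurse, -l 1: one level deep (so images on the page come too),
    # -nd: no directories, --delete-after: discard the local copies once
    # fetched -- squid keeps the cached versions.
    echo wget -r -l 1 -nd --delete-after "$url"  # drop 'echo' to run for real
done
```

Then a crontab entry along the lines of `*/10 * * * * /path/to/prefetch.sh'
runs it every 10 minutes (the `*/10' step syntax needs Vixie cron; on other
crons spell out `0,10,20,30,40,50').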
HTH.
--James
> -----Original Message-----
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] Behalf Of
> t
> Sent: Saturday, 22 March 2003 10:23 PM
> To: [EMAIL PROTECTED]
> Subject: [SLUG] Want squid to automatically update certain sites every
> 10 mins
>
>
> Hi Folks
>
> There are a few sites which I look at a lot, and what I want to do is
> make sure the cached version is updated on a regular basis even if I do
> not explicitly look at the site with a browser. So I want squid to look
> up the site every 10 mins. I was thinking of simply doing something like
> this in a script file
>
> lynx http://www.smh.com.au
> lynx http://www.theaustralian.news.com.au/
> lynx http://news.bbc.co.uk
>
> and just making the script file run every 10 mins. But doing this looks
> a bit dodgy, and lynx will not download the pictures (??). Is there a
> better way?
>
> Thanks
>
> Tony
>
> --
> SLUG - Sydney Linux User's Group - http://slug.org.au/
> More Info: http://lists.slug.org.au/listinfo/slug
>