On Mon, 2001-12-10 at 09:20, Colin McKinnon wrote:
> Hi all.
>
> Problem - certain $%^&ing **&@~s put bugs on their web pages which demand
> to be refreshed every 10 seconds (or so). I have a network connected up via
> demand dial / ISDN. Users pull up a page with bugs on it and go away, leaving
> their computers switched on - result - the link stays up even when nobody
> is using it.
>
> We run squid on the proxy server. Short of petrol bombing the offending
> institutions, is there any way of stopping this while still being able to
> access dynamic content on the internet?

I seem to remember there being an option in squid to set how careful it is
about its cache - refresh_pattern in squid.conf, I think. It lets you set how
often squid will go back to the source page and refresh its copy, so a page
that's being reloaded every 10 seconds or so will only be truly refreshed
every, say, 10 minutes. It's keyed on a regex, so you may even be able to do
this for specific sites. I don't have access to the machine I set up to do a
similar job to check the config, but I know it's possible. (The office staff
had to use a .gov website which had a lot of static "dynamic" content over
one ISDN line, and it caused problems every time you went to fill in the
forms, because it had to download every image/frame/etc. each time.)
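Something along these lines in squid.conf might do the job - this is a sketch
from memory and untested, the hostname is just a placeholder, and
reload_into_ims needs a reasonably recent squid, so check your
squid.conf.default before relying on it:

    # Treat pages from the offending site as fresh for at least 10
    # minutes when they don't carry their own expiry info.  Format is:
    #   refresh_pattern [-i] regex min percent max
    # (min and max are in minutes; the hostname is a made-up example)
    refresh_pattern -i ^http://www\.buggy-site\.example/ 10 50% 10

    # Turn client-forced reloads into If-Modified-Since requests, which
    # squid can answer from its cache while the object is still fresh -
    # so the bug's constant refreshing needn't bring the line up at all.
    reload_into_ims on

Note reload_into_ims technically violates HTTP, so only turn it on if you're
happy for users to see slightly stale pages in exchange for a quiet line.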
> (I've already exercised the "educating the users" option)

Forget the education, start the punishment.

Sorry this is a bit vague - it's Monday, and it's morning.

D

--------------------------------------------------------------------
http://www.lug.org.uk          http://www.linuxportal.co.uk
http://www.linuxjob.co.uk      http://www.linuxshop.co.uk
--------------------------------------------------------------------
