Not quite what you're asking for, but something like Squirm [1] can do regex matches on URLs -- though you would need to know in advance which files you want to redirect. Combined with a simple Perl script, this could get you the automation you want (i.e., grep the Squid logs for large objects, copy them to the local webserver, and add an entry to Squirm's pattern file).
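A rough sketch of that glue script (in Python rather than Perl; the 100 MB threshold, log path, and mirror hostname are made-up examples, and the Squirm pattern syntax should be double-checked against its docs -- this only generates the rules, the actual copy to the mirror is left as a comment):

```python
import re

SIZE_THRESHOLD = 100 * 1024 * 1024        # 100 MB, per the original criteria
MIRROR = "http://mirror.example.lan"      # hypothetical local webserver

def squirm_rules(log_lines):
    """Scan Squid access.log lines (native format) and return Squirm
    redirect rules for objects larger than the size threshold."""
    rules = []
    seen = set()
    for line in log_lines:
        fields = line.split()
        if len(fields) < 7:
            continue
        # Native access.log format: time elapsed client code/status bytes method URL ...
        try:
            size = int(fields[4])
        except ValueError:
            continue
        url = fields[6]
        if size >= SIZE_THRESHOLD and url not in seen:
            seen.add(url)
            # Here you would also fetch the object onto the mirror,
            # e.g. shell out to wget/rsync, before the rule goes live.
            path = re.sub(r'^[a-z]+://[^/]+', '', url)
            rules.append(f"regexi ^{re.escape(url)}$ {MIRROR}{path}")
    return rules
```

Run it from cron over the day's access.log and append the output to Squirm's pattern file, reloading Squid afterwards so the redirector picks up the new rules.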
Stan Brinkerhoff

[1]: http://squirm.foote.com.au/

On Sun, Jan 4, 2009 at 11:04 PM, Kendrick <[email protected]> wrote:
> I was wondering if anyone had ever heard of or made a Squid setup, or knows
> of a plugin, that allows specific file types to be saved to an HTTP server
> and redirected there.
>
> I want to set it up so that things like ISOs, zips, media files, and the
> like that are over 100 MB and/or used very commonly get redirected to a
> separate local webserver. It would be great to have a system in place that
> could automatically move all files falling into a set of criteria and/or
> of specific file types. I know tools like DansGuardian can do AV scanning
> and content blocking. Due to space and network config, the webserver is
> not part of the cache server and needs to be separate. It also makes admin
> of the files a bit easier that way, as I can have scripts set to purge
> unused files on the webserver and make backups of the files.
>
> Thanks
> Kendrick
