Adam,

Thank you for replying. Here is my second delay pool attempt. Do you think it will serve the intended purpose, slowing down robots while allowing humans full-speed access?
Does using buckets have any detrimental impact on the Squid machine's load? My overall goal is to minimize the robots' impact on machine load on BOTH the Squid server machine and the back-end webservers it is accelerating. Are any special build configuration parameters required to use "browser"?

# Common browsers
acl humans browser Explorer Netscape Mozilla Firefox Navigator Communicator Opera Safari Shiira Konqueror Amaya AOL Camino Chimera Mosaic OmniWeb wKiosk KidsBrowser Firebird

# Delay Pools
delay_pools 2                          # 2 delay pools
delay_class 1 2                        # pool 1 is a class 2 pool for humans
delay_class 2 2                        # pool 2 is a class 2 pool for robots

delay_access 1 allow humans
delay_access 1 deny all

delay_parameters 1 -1/-1 64000/64000
delay_parameters 2 -1/-1 7000/8000     # Non-humans get this slow bucket

Thank you,
John Kent
Webmaster
NRL Monterey
http://www.nrlmry.navy.mil/sat_products.html

-----Original Message-----
From: news [mailto:[EMAIL PROTECTED]] On Behalf Of Adam Aube
Sent: Tuesday, December 21, 2004 5:40 PM
To: [EMAIL PROTECTED]
Subject: [squid-users] Re: Delay Pools for Robots

Kent, Mr. John (Contractor) wrote:

> Have an image intensive website (satellite weather photos).
> Using Squid as an accelerator.
>
> Want to slow down robots and spiders while basically not
> affecting human users who access the web pages.
>
> Would the following delay_pool parameters be correct for this purpose
> or would other values be better?
>
> delay_pools 1 # 1 delay pools
> delay_class 1 2 # pool 1 is a class 2 pool
> delay_parameters 1 -1/-1 32000/64000

This makes no distinction between robots and normal visitors. For that you
can use the browser acl (which matches on the User-Agent string the client
sends), then use different delay pools for the common browsers and robots.

Adam
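
One detail worth flagging in the config above: pool 2 has delay_parameters but no delay_access lines, and a pool with no delay_access rules normally matches no requests, so nothing would ever land in the slow bucket. Below is a minimal sketch of how both pools might be attached, reusing the acl name and rates from above; the delay_access 2 line is an illustrative addition rather than part of the original attempt, and it assumes Squid was built with delay pool support (e.g. --enable-delay-pools on the 2.x configure line).

# Pool 1: requests from recognized browsers, effectively unrestricted
delay_access 1 allow humans
delay_access 1 deny all
delay_parameters 1 -1/-1 64000/64000

# Pool 2: everything that fell through pool 1 (robots, scripts, unknown agents)
delay_access 2 allow all
delay_parameters 2 -1/-1 7000/8000    # Non-humans get this slow bucket

Squid checks the pools in order and places a request in the first pool whose delay_access rules allow it, so requests matching the humans acl should never reach pool 2.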
