On 3/26/07, Henrik Nordstrom <[EMAIL PROTECTED]> wrote:
> One way is to set up a separate set of cache_peer for these robots,
> using the no-cache cache_peer option to avoid having that traffic
> cached. Then use cache_peer_access with suitable acls to route the
> robot requests via these peers and deny them from the other, normal
> set of peers.
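
A rough squid.conf sketch of that scheme, for reference. The peer
hostnames and the robot User-Agent patterns below are made up; also,
the stock cache_peer option I know of that keeps a peer's responses
out of the local store is proxy-only, which I take to be what the
no-cache above refers to. Assumes the usual predefined "all" acl.

    # Normal parent used by ordinary clients
    cache_peer parent.example.com parent 3128 3130

    # Separate parent for robot traffic; proxy-only means objects
    # fetched through this peer are not stored in the local cache
    cache_peer robots.example.com parent 3128 3130 proxy-only

    # Identify robots by User-Agent (the "browser" acl type matches
    # a regex against that header); patterns here are examples only
    acl robots browser -i googlebot|slurp|msnbot

    # Route robot requests via the robots-only peer and keep them
    # off the normal peer
    cache_peer_access robots.example.com allow robots
    cache_peer_access robots.example.com deny all
    cache_peer_access parent.example.com deny robots
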

AFAICS, it won't solve the problem, as the robots still won't be able
to access the "global" cache read-only.

--
Guillaume
