acl image_leechers browser Wget
acl has_user_agent browser ^.+$
http_access deny !has_user_agent
http_access deny image_leechers
I promise not to make a habit of just conversing with myself on this list...
2008/10/20 James Cohen [EMAIL PROTECTED]:
Hi,
I think I've found a bug but first wanted to double-check I wasn't
doing anything dumb.
2008/10/20 Amos Jeffries [EMAIL PROTECTED]:
It's not so much an empty string. As a completely missing header.
Squid can only test what it has against what it checks. If you get my
meaning.
I haven't tested it, but you might have better luck if you invert the
test so that you explicitly allow the okay browsers instead.
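A minimal sketch of that inverted form, reusing the ACL names from the
config above (untested, just to illustrate the suggestion):

# Invert the logic: block the known leecher first, then only allow
# requests that actually present a User-Agent header. A request with
# a completely missing header matches neither ACL and falls through
# to the final deny.
acl image_leechers browser Wget
acl has_user_agent browser ^.+$

http_access deny image_leechers
http_access allow has_user_agent
http_access deny all

The difference is that a missing header can never satisfy an "allow"
rule, whereas "deny !has_user_agent" depends on how Squid evaluates a
negated match against a header it doesn't have.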
Hi,
I think I've found a bug but first wanted to double-check I wasn't
doing anything dumb.
In our reverse proxy setup we want to block people from leeching the
images using Wget or similar applications. To do this we want to block
user agents that match Wget and, because lots of people use cURL
Henrik/Amos,
Thanks for the replies. You're 100% correct in suggesting that we are
using proxy-only.
Thinking a little more now about the resilience we want to put in
place, and the impact of one of the cache servers going down, I can
see that running without proxy-only could be a great
Hi,
I have two reverse proxy servers using each other as neighbours. The
proxy servers are load balanced (using a least connections
algorithm) by a Netscaler upstream of them.
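For context, a sibling pair like this is typically wired together with
cache_peer lines; a minimal sketch, assuming hypothetical hostnames and
the standard HTTP/ICP ports (3128/3130), which are not from the thread:

# squid.conf on proxy-a (mirror it on proxy-b); hostnames are placeholders
cache_peer proxy-b.example.com sibling 3128 3130 proxy-only
# 'proxy-only' stops this box from keeping a local copy of objects it
# fetches from the sibling; dropping it lets each box cache its own copy,
# at the cost of duplicated storage.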
A small number of URLs accounts for around 50% or so of the requests.
At the moment there's some imbalance in the hit