I have gotten squidGuard 1.2.0 set up and running under squid 2.4.
However, squid doesn't seem to be passing URLs to squidGuard for
checking, so all sites remain accessible.

When I do a:
echo "http://ibm.com 10.0.0.1/- - GET" | /usr/local/bin/squidGuard -d

(and ibm.com is in my 'bad guys' list ;-), it comes back with the
correct redirect to Google, as configured in my .conf file, and I get
the blank line for 'good' sites.

I've read the FAQ and checked the directory permissions on my
blacklists and on squidGuard itself; all have read access. And squid
actually does start the 5 squidGuard processes. The last thing worth
mentioning is that squid is running as a transparent proxy.
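For comparison, here is a minimal sketch of the squid.conf lines a
squid 2.4 + squidGuard setup of this kind normally uses; the paths
are assumptions, so adjust them to your install. Since squid is
starting the 5 child processes, the redirect_program line is
presumably already in place; with a transparent proxy the usual
gotcha is the httpd_accel_* block, in particular
httpd_accel_uses_host_header, since without it intercepted requests
may not be reconstructed into full URLs that squidGuard can match:

```
# Hedged example squid.conf fragment -- paths are assumptions
redirect_program /usr/local/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
redirect_children 5

# Transparent-proxy settings for squid 2.4, so intercepted requests
# carry a full URL for the redirector to check
httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
```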

Help!
