src lansource within workhours {
#iplist lansource/lan
ip 10.0.0.26
}
Nope - src definitions can't include time restrictions.
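Split it into a plain src block and a separate time block instead. Something like this should work (the workhours schedule here is only an example - adjust the days and hours to yours):

src lansource {
#iplist lansource/lan
ip 10.0.0.26
}

time workhours {
weekly mtwhf 08:00-17:00
}

The time rule is then referenced from the acl with "within workhours", as shown further down.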
dest ads {
domainlist ads/domains
urllist ads/urls
redirect http://www.yahoo.com
}
I suggest you redirect advertisements to
redirect http://local.web.server/blank.gif
where blank.gif is a 1x1 transparent image.
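That is, something like this (local.web.server is a placeholder - use your own web server's hostname, with blank.gif placed in its document root):

dest ads {
domainlist ads/domains
urllist ads/urls
redirect http://local.web.server/blank.gif
}

That way blocked ad slots render as an invisible dot instead of bouncing the browser to an external site.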
# ACLs
acl {
lansource {
pass !adult !audio-video !forums !hacking !redirector !warez !ads !aggressive !drugs !gambling !publicite !violence !banneddestination !advertising all
redirect http://127.0.0.1/cgi-bin/squidGuard.cgi?clientaddr=%a&srcclass=%s&targetclass=%t&url=%u
}
default {
pass none
redirect http://127.0.0.1/cgi-bin/squidGuard.cgi?clientaddr=%a&srcclass=%s&targetclass=%t&url=%u
}
}
Remember that the redirect URL is fetched by the client's browser, so 127.0.0.1 is the local address of the client, not the server. I suggest you use the IP of your Linux box there.
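For example, if your Linux box were at 192.168.0.1 (a made-up address - substitute your server's real IP), the redirect line would read:

redirect http://192.168.0.1/cgi-bin/squidGuard.cgi?clientaddr=%a&srcclass=%s&targetclass=%t&url=%u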
You should use time restrictions inside the ACLs, like this
acl {
lansource within workhours {
pass !adult !audio-video !forums !hacking !redirector !warez !ads !aggressive !drugs !gambling !publicite !violence !banneddestination !advertising all
redirect http://127.0.0.1/cgi-bin/squidGuard.cgi?clientaddr=%a&srcclass=%s&targetclass=%t&url=%u
} else {
pass all
}
}
Also check the file /var/log/squidGuard/squidGuard.log and see what's going
on there. If it says "emergency pass-all mode" then there's a problem with
your db files.
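You can also test squidGuard by hand, feeding it one request line on stdin in Squid's redirector format (URL client-ip/fqdn ident method). If the URL is blocked it prints the rewritten redirect URL; if allowed it prints an empty line. The config path below is a guess - point -c at wherever your squidGuard.conf actually lives:

echo "http://www.playboy.com/ 10.0.0.26/- - GET" | squidGuard -c /etc/squid/squidGuard.conf -d

The -d flag sends the debug output to stderr so you can see it decide, without touching the logfile.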
> ----------
> From: Stephen Torri[SMTP:[EMAIL PROTECTED]]
> Sent: Tuesday, December 18, 2001 10:43 AM
> To: Craig Falconer
> Cc: Squidguard Mailing List
> Subject: RE: Blacklist updating
>
> <<File: squidGuard.conf.txt>>
> On Tue, 18 Dec 2001, Craig Falconer wrote:
>
> > One of the problems with squidguard is how it is awkward to do an
> > update of the diff files. I use this little script
> > ( /usr/local/squidGuard/db/reload ) to recreate db files and kick
> > squid over.
> >
> > My editor makes a backup file with a ~ appended to the end, so the
> > script looks for that file and regenerates the db as necessary. I do
> > not use the supplied black-lists, however periodically (once in 6-12
> > months) I simply append the new pornography domains file to mine...
> > then the uniq sorts out double-ups.
> >
> > I also use categories that are not in the supplied blacklists:
> > approved a list of sites that are always allowed, for when
> > we're using severe time-based restrictions.
> > excessive-volume sites that seem to create a lot of
> > traffic... we have a cap of 5 Gb a month.
>
> Right now I have a more pressing problem than updating the blacklist.
>
> The present databases that come with squidGuard are apparently not
> being used. I set up squid and apache for a basic setup. I then
> configured squidGuard. The problem I found was that in a test to see
> if I could load a porn site (www.playboy.com), the porn rules were not
> tripped. I used the
> standard RedHat 7.2 install with a few slight modifications. If I point
> Mozilla to use HTTP traffic on port 3128 (squid's default) I can surf the
> web. I see squid's cache fill up but I am not seeing the squidGuard being
> used. How can I test to verify that squidGuard is being used to check all
> web traffic?
>
> My squidGuard configuration file is attached.
>
> Stephen
>