I have been dealing with web proxying for a while. I am using squid
2.3-stable3, and I have used jesred as a filter for some time.
Having read some documentation about squidGuard's efficiency, I am in the
middle of running some tests on it.
I use Red Hat Linux 7.1 on a PC with 128 MB RAM, 256 MB swap and 10 GB of
disk space.
I tested the blacklist from the squidGuard site, essentially trying to
filter out porn sites, but many of the pages that get filtered seem to have
no relationship with the entries in the domain and URL lists. So, as some
other webmasters also recommended, I switched to using only the regular
expressions.
I have used jesred (which relies only on regular expressions) for a while,
and what happens is that, if you filter out every URL that contains the
string 'xxx', you can also filter out something like
'www.exxxeptionalHardwarePrices.com', which could be interesting to look at.
So, with jesred, you put an ACL before the one that denies URLs containing
the string 'xxx', saying that all entries containing the string
'exxxeptionalHardware' are allowed.
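To make the overlap concrete, here is a minimal sketch in Python (not
jesred itself; the allow/deny patterns are just the ones from my example),
showing why rule order matters: 'exxxeptional' contains 'xxx', so the allow
rule has to be checked first.

```python
import re

# Hypothetical filter mimicking the jesred ACL ordering: the allow
# pattern is consulted before the deny pattern.
allow = re.compile(r'exxxeptionalHardware', re.IGNORECASE)
deny = re.compile(r'xxx', re.IGNORECASE)

def blocked(url):
    if allow.search(url):        # allow rule checked first
        return False
    return bool(deny.search(url))

print(blocked('http://www.exxxeptionalHardwarePrices.com/'))  # False: allowed
print(blocked('http://www.xxx.example.com/'))                 # True: denied
```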
I thought I could do the same with squidGuard, so I tried:
:::::::::::::::::::::::
dest noporn {
        urllist noporn/npurls
}

dest porn {
        expressionlist  porn/expressions
        log             porn.log
}

acl {
        default {
                pass noporn !porn all
                redirect http://www.usl6.toscana.it/oslissdenied.html
        }

}
:::::::::::::::::::::::
the file porn/expressions is:
:::::::::::::::::::::::
^http://.*sex.*\..*
:::::::::::::::::::::::
the file noporn/npurls is:
:::::::::::::::::::::::
sussex.com
:::::::::::::::::::::::
When I try to access www.sussex.com I get denied.
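For what it's worth, the deny expression by itself really does match that
URL, since 'sussex' contains 'sex'. A quick check in Python (treating the
squidGuard expression as an ordinary regex):

```python
import re

# The single line from porn/expressions, used as a plain regex.
expr = re.compile(r'^http://.*sex.*\..*')

print(bool(expr.search('http://www.sussex.com/')))  # True: the expression matches
```

So the question is really whether the noporn urllist, listed first in the
pass line, should take precedence over the porn expressionlist.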

Has anybody seriously tried to use the 'pass' directive with a positive
dest before a negative one?
Thanks in advance

---------------------------------------------
Dr. Andrea Barghigiani
AUSL6 - U.O.Te.P.I.
Via di Monterotondo, 49
57128 Livorno
Tel. 0586-223843 Fax 0586-223842
---------------------------------------------