Yes, it works. Still, there are a few points:
a) I don't understand how the arguments following the 'pass' tag in the
acl specification are processed. What I expected is that they are processed in
the usual way for acls, from left to right, top to bottom, with the first
matching entry being the one that counts: the '!' should specify whether the
match is positive or negative, so my 'pass noporn !porn all' should
say: if it matches noporn, go ahead; if it matches porn, redirect; otherwise
go ahead.
b) If squidGuard's behaviour in case a) is as I imagined, I don't see any
reason why it should make a difference whether expressions, domains or urls are used.
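To make point a) concrete, here is a small sketch of the first-match semantics I expected 'pass noporn !porn all' to have. This is a hypothetical model written for illustration, not squidGuard's actual code; the list contents are taken from my npurls and expressions files below.

```python
import re

# Hypothetical model of left-to-right, first-match 'pass' evaluation.
# 'pass noporn !porn all' -> pass if noporn matches, redirect if porn
# matches, otherwise pass ('all').
NOPORN_URLS = ["sussex.com"]                 # contents of noporn/npurls
PORN_EXPRESSIONS = [r"^http://.*sex.*\..*"]  # contents of porn/expressions

def pass_check(url):
    """Return True (pass) or False (redirect) under first-match rules."""
    # Positive dest: a match passes the request immediately.
    if any(entry in url for entry in NOPORN_URLS):
        return True
    # Negated dest (!porn): a match means redirect.
    if any(re.match(expr, url) for expr in PORN_EXPRESSIONS):
        return False
    # 'all': everything else passes.
    return True

print(pass_check("http://www.sussex.com/"))   # True: whitelisted first
print(pass_check("http://www.sexsite.com/"))  # False: matches the expression
```

Under these semantics www.sussex.com would be allowed even though it matches the '.*sex.*' expression, because the positive dest is checked first; that is exactly the behaviour I am not seeing.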
c) When I realized that I had excluded the urls that contain the string 'sussex',
I just wanted to force an OK for any such url; but, for now, the only
way to enforce an OK before the deny list is with a domain.
d) Also, I tried to build a database for the expressions, domains or
anything else with 'squidGuard -C <filename>': it seems happy to do it, and the exit
value is 0, but it does nothing at all.
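For reference, the workaround suggested in the reply below, with a domain list checked before the expression list, would look roughly like this. This is a sketch; the dbhome path is an assumption, not taken from a real install:

```
dbhome /usr/local/squidGuard/db

dest noporn {
        domainlist      noporn/domains   # contains: sussex.com
        urllist         noporn/urls
}

dest porn {
        expressionlist  porn/expressions
        log             porn.log
}

acl {
        default {
                pass noporn !porn all
                redirect http://www.usl6.toscana.it/oslissdenied.html
        }
}
```

Regarding point d): as far as I understand, 'squidGuard -C' only compiles domainlist and urllist files into .db format; expressionlist files are plain regex text and are never compiled, which may explain why the command appears to do nothing for them.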



---------------------------------------------
Dr. Andrea Barghigiani
AUSL6 - U.O.Te.P.I.
Via di Monterotondo, 49
57128 Livorno
Tel. 0586-223843 Fax 0586-223842
---------------------------------------------

> -----Messaggio originale-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
> Sent: Tuesday, 18 September 2001 15:06
> To: [EMAIL PROTECTED]
> Subject: RE: multiple destinations in pass record
>
>
> try also adding a domain list with the specified domain (sussex.com)
>
> e.g.
>
> dest noporn {
>       domainlist      noporn/domains
>       urllist noporn/urls
> }
> where the domains file has sussex.com in it.
>
> have a go with this
>
> -----Original Message-----
> From: Andrea Barghigiani [mailto:[EMAIL PROTECTED]]
> Sent: 18 September 2001 12:30
> To: [EMAIL PROTECTED]
> Subject: multiple destinations in pass record
>
>
> I have been dealing with web proxying for a while. I am using squid
> 2.3-stable3 and I have used jesred as a filter for some time.
> Having read some documentation about squidGuard's efficiency, I am in the
> middle of running some tests on it.
> I use Linux RedHat 7.1 on a PC with 128 MB RAM, 256 MB swap and 10 GB disk
> space.
> I tested the blacklist from the squidGuard site, essentially trying to
> filter out porn sites, but a lot of the pages that get filtered seem
> to have no relationship with the ones listed in the domainlists and urls,
> so I switched, as also recommended by some other webmasters, to using
> only the regular expressions.
> I used jesred for a while (which relies only on regular expressions),
> and what happens is that, if you filter out every url that contains the
> string 'xxx', you can filter out something like
> 'www.exxxeptionalHardwarePrices.com', which could be interesting to look at.
> So, with jesred, you put an acl before the one that denies urls with the
> string 'xxx', saying that all entries that contain the string
> 'exxxeptionalHardware' are allowed.
> I thought to do the same with squidguard and I tried:
> :::::::::::::::::::::::
> dest noporn {
>         urllist noporn/npurls
> }
>
> dest porn {
>         expressionlist  porn/expressions
>         log             porn.log
> }
>
> acl {
>         default {
>                 pass noporn !porn all
>                 redirect http://www.usl6.toscana.it/oslissdenied.html
>         }
>
> }
> :::::::::::::::::::::::
> the file porn/expressions is:
> :::::::::::::::::::::::
> ^http://.*sex.*\..*
> :::::::::::::::::::::::
> the file noporn/npurls is:
> :::::::::::::::::::::::
> sussex.com
> :::::::::::::::::::::::
> When I try to access www.sussex.com I get denied.
>
> Has anybody seriously tried to use the 'pass' record with a positive
> dest before a negative one?
> Thanks in advance
>
> ---------------------------------------------
> Dr. Andrea Barghigiani
> AUSL6 - U.O.Te.P.I.
> Via di Monterotondo, 49
> 57128 Livorno
> Tel. 0586-223843 Fax 0586-223842
> ---------------------------------------------
>
>
