Thanks for your help... But this way I need to create one rule for each website permitted to each group of users, instead of a single rule per group.
Nowadays we are using rules like the one below (two text files, where one lists the IP addresses and the other the websites). Right now we have 36 groups using squid (+- 3000 users).

acl general url_regex -i "/etc/squid/data/txtgeneral.txt"
http_access allow txtlan general !download

----- Original Message -----
From: "Andreas Pettersson" <[EMAIL PROTECTED]>
To: "Christian Ricardo dos Santos" <[EMAIL PROTECTED]>
Cc: "squid-users" <[EMAIL PROTECTED]>
Sent: Thursday, September 30, 2004 4:20 PM
Subject: Re: [squid-users] Blocking mixed URLs

> Everybody can access the site www.telefutura.com.br, but nobody can
> access the website www.uol.com.br.
>
> Now if any user types one of the three strings below, access to this
> blocked website is granted (although you can only read its text;
> all the other links are broken).
>
> www.uol.com.br/telefutura.com.br
> www.uol.com.br/?telefutura.com.br
> www.uol.com.br/$telefutura.com.br
>
> What can I do to avoid it?

You must be more specific in the url_regex acl. For example:

acl Good url_regex ^www\.telefutura\.com\.br

An even better way is to use the dstdomain acl instead.

/Andreas
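For the per-group concern: dstdomain acls can still be driven by one file per group, just like the url_regex setup, so the group count does not multiply. A sketch, with file names invented for illustration (only the acl types and the allow-line shape come from the thread):

```
# Hypothetical per-group pair of files (names are illustrative):
# one lists source IPs, the other lists destination domains.
# dstdomain matches only the request's hostname, so
# www.uol.com.br/telefutura.com.br can no longer slip through.
acl lan_general src "/etc/squid/data/ips_general.txt"
acl sites_general dstdomain "/etc/squid/data/sites_general.txt"
http_access allow lan_general sites_general !download
```

Each of the 36 groups would get one such pair of lines, the same rule-per-group ratio as the current url_regex setup.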
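The bypass Andreas describes can be reproduced outside of squid: url_regex patterns are ordinary regular expressions, and an unanchored pattern matches anywhere in the URL, including in the path of a blocked site. A small grep check (illustrative only, not part of the original post) shows the difference anchoring makes:

```shell
# An unanchored pattern (like one line of txtgeneral.txt) matches the
# allowed domain even when it only appears in the *path* of a blocked site:
echo "www.uol.com.br/telefutura.com.br" | grep -Ec 'telefutura\.com\.br'
# prints 1 (match -> squid would allow the request)

# Anchoring the pattern at the start of the URL defeats the trick,
# because the URL no longer begins with the allowed hostname:
echo "www.uol.com.br/telefutura.com.br" | grep -Ec '^www\.telefutura\.com\.br'
# prints 0 (no match -> squid would fall through to the deny rules)
```

This is why Andreas's anchored example, ^www\.telefutura\.com\.br, closes the hole while the original unanchored entries do not.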
