Hello,

You could try the following; note that this is my first attempt at a regular expression.

acl badurl url_regex -i .*\.(com|net|biz|org|ca|us|br).*\/.*\.(com|net|biz|org|ca|us|br)

http_access deny badurl

This pattern should match any URL string that contains *.domainsuffix.*/*.domainsuffix.

So the deny rule would go at the top of your http_access list.
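If it helps, here is a rough sketch of how the ordering could look in squid.conf. The "allowedsites" name and the file path are only placeholders for whatever ACL you already use for your list of ~30 permitted sites, and the last line assumes the usual "all" ACL is defined:

# placeholder name/path for your existing whitelist of permitted sites
acl allowedsites dstdomain "/etc/squid/allowed_sites.txt"
# catch URLs whose path contains a second .domainsuffix after a "/"
acl badurl url_regex -i .*\.(com|net|biz|org|ca|us|br).*\/.*\.(com|net|biz|org|ca|us|br)

http_access deny badurl          # deny the mixed URLs first
http_access allow allowedsites   # then allow the whitelist
http_access deny all             # everything else stays blocked

Squid stops at the first http_access rule that matches, so the deny has to come before whatever rule currently allows those sites.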

Michael.


Christian Ricardo dos Santos wrote:
Hello,

I REALLY need some help here.

We are currently using ACLs to prevent users from accessing certain websites.

Those users can only access a limited list of sites (around 30); anything outside this list is blocked. I don't know how or when, but somebody discovered a way to cheat those ACLs.

Here is what's happening:

Everybody can access the site www.telefutura.com.br, but nobody can access the website www.uol.com.br.

Now if any user types one of the three strings below, access to this blocked website is granted (although you can only read the text of the page; all the other links are broken).

www.uol.com.br/telefutura.com.br
www.uol.com.br/?telefutura.com.br
www.uol.com.br/$telefutura.com.br

What can I do to prevent this?

I already have some ACLs in place to block downloads and access to some types of files (e.g. .*\.mp3($|\/?)), but I still don't know how to handle those mixed URL requests.

Sorry for my bad English.

--
Michael Gale
LAN Administrator
Utilitran Corp.

Help feed the penguin !!
