Dear Marcus,

Thanks for your reply, but it's not working for me. The thing is, my ACL will 
not block www.example.com; it only blocks www.example.com/something.com, 
because I am using urlpath_regex instead of url_regex in the ACL declaration.

Then I tried your regex as well, but it does not solve my problem either.
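To show what I mean, here is a small illustration (Python, not part of my squid.conf) of why your revised pattern still matches: urlpath_regex is tested against the URL path only, and the path of the redirect URL below itself ends in .com, so adding the slash does not help in my case.

```python
import re

# Illustration only: Squid's urlpath_regex is applied to the URL path,
# i.e. the part after the host name.
revised = re.compile(r'.*\..*/.*\.com$', re.IGNORECASE)

# Path taken from the redirect URL in my example below:
path = '/oplatum.plmna/sites=www.someother.com'

# The path contains a dot, then a slash, then ends in ".com",
# so the revised pattern still matches and the request is still denied.
print(bool(revised.search(path)))  # True
```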

My situation is:

sensex.com works, but when that site redirects to 
http://landing.domainsponsor.com/oplatum.plmna/sites=www.someother.com , the 
redirect is blocked, because that URL path ends with .com.
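For clarity, here is how the two requests differ (again just a Python illustration; the pattern is the .com line from my extension file):

```python
import re

# The ".com" rule from my dangerous_extension.squid file (urlpath_regex,
# so it is matched against the URL path only):
rule = re.compile(r'.*\.com$', re.IGNORECASE)

# Plain site: the path is just "/", so the rule does not match -- allowed.
print(bool(rule.search('/')))                                       # False

# Redirect target: the path ends in ".com", so the rule matches -- blocked.
print(bool(rule.search('/oplatum.plmna/sites=www.someother.com')))  # True
```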

Is there any workaround for this?

I hope my question is clearer now.



-
--
---
Always try to find truth!!!

------------***---------------***--------------***------------

It's always nice to know that people with no understanding of technologies want 
to evaluate technical professionals based on their own lack of knowledge

------------***---------------***--------------***------------


--- On Sat, 3/28/09, Marcus Kool <[email protected]> wrote:

> From: Marcus Kool <[email protected]>
> Subject: Re: [squid-users] .com extension blocking causing blocking of 
> redirecting URL's
> To: "Truth Seeker" <[email protected]>
> Cc: "Squid maillist" <[email protected]>
> Date: Saturday, March 28, 2009, 1:53 PM
> The ACL blocks URLs that end with .com, i.e. it blocks a URL such as
> www.example.com while it does not block www.example.com/index.html.
> 
> If you change the patterns to include a slash you are fine:
> the slash prevents domains ending in .com from being matched.
> e.g.
> .*\.com$  becomes  .*\..*/.*\.com$
> 
> Marcus
> 
> 
> Truth Seeker wrote:
> > Hi Techies,
> > 
> > I have an ACL which blocks downloads of files with harmful extensions,
> > like .exe, .bat, .com, etc. This rule is working fine; the details are
> > below:
> > 
> > ### Blocking of Dangerous extensions to certain groups
> > acl dangerous_extension urlpath_regex -i "/etc/squid/include-files/dangerous_extension.squid"
> > http_access allow vip_acl dangerous_extension
> > http_access allow power_acl dangerous_extension
> > http_access allow ultimate_acl dangerous_extension
> > http_access allow download_surfers_acl dangerous_extension
> > http_access deny dangerous_extension
> > deny_info ERR_DANGEROUS_ESTENSIONS dangerous_extension
> > 
> > # cat /etc/squid/include-files/dangerous_extension.squid
> > .*\.exe$
> > .*\.com$
> > .*\.vb$
> > .*\.vbs$
> > .*\.vbe$
> > .*\.cmd$
> > .*\.bat$
> > .*\.ws$
> > .*\.wsf$
> > .*\.scr$
> > .*\.shs$
> > .*\.pif$
> > .*\.hta$
> > .*\.jar$
> > .*\.js$
> > .*\.jse$
> > .*\.lnk$
> > .*\.mov$
> > .*\.3gp$
> > .*\.avi$
> > .*\.rar$
> > .*\.zip$
> > 
> > 
> > 
> > If there is a site which redirects traffic to another .com site, it will
> > trigger the above rule, which results in the failure of a legitimate
> > request. How can I work around this issue?
> > 
> > Thanks in advance...
> > 
> > 



