Hi Amos,
Thanks for the reply. It is now working like a charm.

I hope that my CPU is not going to jump to 100% usage anymore :-).
I'll let you know how things go!

Thank you again.

Julien

On Thu, Apr 16, 2009 at 8:32 PM, Amos Jeffries <[email protected]> wrote:
>
> Chris Robertson wrote:
>>
>> Julien P. wrote:
>>>
>>> Hi everybody,
>>> I am trying to use an external_acl_type to be able to filter internet
>>> traffic according to specific User-agent headers and destination
>>> (let's say you have the right to browse facebook only by using
>>> Firefox).
>>>
>>> this is my external acl:
>>>
>>> external_acl_type getheaders %{User-Agent} %DST /etc/squid3/getheaders
>>> acl myacl external getheaders
>>> http_access allow myacl
>>>
>>>
>>> this is my getheaders program:
>>> (I ran it, and there are no permission problems)
>>>
>>> #!/bin/sh
>>>
>>
>> while [ 1 ]
>> do
>>>
>>> read agent
>>> read DST
>
> On Debian I'd do that as:
>
> while read agent dst ;
> do
>
> or even better to protect from whitespace errors:
>
> while read dst agent ;
> do
> ... with the matching arg reversal in the squid.conf format.
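
[Editor's note: putting the thread's suggestions together, a hypothetical rewrite of the getheaders helper might look like the sketch below: one long-running while/read loop instead of a new process per request, and "dst first" so whitespace in the User-Agent cannot shift fields. The log path comes from the thread; the LOG override and the helper_loop function wrapper are additions for illustration only.]

```shell
#!/bin/sh
# Sketch of the corrected helper. Assumes squid.conf passes
# %DST before %{User-Agent}, i.e.:
#   external_acl_type getheaders %DST %{User-Agent} /etc/squid3/getheaders
LOG="${LOG:-/var/log/squid3/headers.log}"   # assumed path from the thread

helper_loop() {
    # read puts the first word in dst and the remainder (the
    # possibly space-containing User-Agent) in agent.
    while read dst agent
    do
        stamp=`date`
        echo "$stamp $agent $dst" >> "$LOG"
        echo "OK"   # Squid must get exactly one reply line per lookup
    done
}

# Process requests on stdin until Squid closes the pipe:
helper_loop
```

Run standalone, it can be exercised by piping a line such as `www.facebook.com Mozilla/5.0 ...` to it; it should answer `OK` and append to the log.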
>
>>> date=`date`
>>> echo "$date $agent" >> /var/log/squid3/headers.log
>>> echo "$DST" >> /var/log/squid3/headers.log
>>> echo "OK"
>>>
>>
>> done
>>>
>>> exit 1
>>>
>>
>> That way you aren't kicking off a new helper for each request.
>>
>>> and this is what I get in the debug when I try to access facebook:
>>> 2009/04/16 21:17:16.481| aclMatchExternal: acl="getheaders"
>>> 2009/04/16 21:17:16.481| aclMatchExternal:
>>> getheaders("Mozilla/5.0%20...............0Version/4.0%20Safari/528.16
>>> www.facebook.com") = lookup needed
>>>
>>
>> This just means that we don't have a cached entry for the query 
>> "Mozilla/5.0...blah...blah www.facebook.com", and we have to ask the 
>> external helper.
>>>
>>> 2009/04/16 21:17:16.481| externalAclLookup: lookup in 'getheaders' for
>>> 'Mozilla/5.0%20(Macintosh;%20U;%20In...........Version/4.0%20Safari/528.16
>>> www.facebook.com'
>>> 2009/04/16 21:17:16.481| externalAclLookup: looking up for
>>> 'Mozilla/5.0%20(Macintosh;%20U;%20..............)%20Version/4.0%20Safari/528.16
>>> www.facebook.com' in 'getheaders'.
>>> 2009/04/16 21:17:16.481| helperDispatch: Request sent to getheaders
>>> #1, 167 bytes
>>> 2009/04/16 21:17:16.482| externalAclLookup: will wait for the result
>>> of 'Mozilla/5.0%20(Macintosh...........0Safari/528.16
>>> www.facebook.com' in 'getheaders' (ch=0x85a4760).
>>>
>>> Apparently squid is waiting for a domain lookup that my getheaders
>>> program should do.
>>>
>>
>> Squid is waiting for a reply from your helper actually.  The mystery is 
>> why...
>>
>>> I am a bit lost as I thought that the only reply options are OK/ERR
>>>
>>
>> With optional tags...
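
[Editor's note: for completeness, the reply line is not limited to a bare OK/ERR; external ACL helpers may append optional key=value pairs (such as user=, message=, log=) that Squid can record or show on error pages. A sketch with made-up values; check the external_acl_type documentation for the exact keywords your Squid release supports:]

```
OK user=julien
ERR message=Wrong%20browser%20for%20this%20site
```
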
>>
>>> As I didn't find anything on google, if anybody has a clue, I would
>>> appreciate the share! :-)
>>>
>>
>> You state that you ran the script, and there were no permission problems.  
>> Who did you run it as?  Did you give it input (and receive output in return)? 
>>  Does the file "/var/log/squid3/headers.log" exist, and does the Squid user 
>> have permission to write to it?  Is there any change if you specify (in the 
>> script) the full path to "echo"?
>>
>>> I am running the latest squid3 on debian
>
> The question that might be asked is: Debian what?
>  oldstable, stable, unstable, testing, experimental?
> though I don't think that matters here.
>
> Amos
> --
> Please be using
>  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE14
>  Current Beta Squid 3.1.0.7