On 09/19/2012 02:12 AM, Amos Jeffries wrote:
On 19/09/2012 9:10 a.m., Eliezer Croitoru wrote:
On 9/18/2012 6:01 PM, Silamael wrote:
refresh_pattern foo.example.org 0 0% 0
refresh_pattern . 0 20% 14400
Now, if i fetch something from foo.example.org i get a
TCP_CLIENT_REFRESH_MISS/200
The
Amos Jeffries wrote:
On 19/09/2012 10:15 a.m., Eliezer Croitoru wrote:
I was thinking about squidguard and url_rewrite.
Since we have external_acl and deny_info, I really don't see why to use squidGuard at
url_rewrite?
Right. That was the intention. URL-rewrite is a hack and a direct violation of
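As a hedged sketch of that external_acl + deny_info approach (the helper path, ACL name and block-page URL below are all illustrative, not from this thread):

```
# Ask an external helper whether the URL is blacklisted:
external_acl_type bl_lookup ttl=300 %URI /usr/local/bin/bl_helper
acl blacklisted external bl_lookup
# Deny and redirect to a friendly block page instead of rewriting the URL:
http_access deny blacklisted
deny_info http://blockpage.example.net/denied.html blacklisted
```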
Hi friends,
I have a weird problem of saturation due to a broken client and I don't
know how to fix it (I can force the user to disable the app that causes the
problem, but I think there should be a solution to prevent a bad
client from overloading the server and affecting the proxy service by itself).
I have
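One standard squid.conf way to keep a single broken client from saturating the service is a maxconn ACL; a minimal sketch, where the limit of 20 concurrent connections per client IP is illustrative:

```
# Refuse new connections from any client that already holds 20:
acl manyconn maxconn 20
http_access deny manyconn
```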
On 19/09/2012 6:51 p.m., Silamael wrote:
Ok, so if a response contains valid headers concerning caching, these
are taken instead of using the matching refresh_pattern? So, if I want
some URLs served completely without caching, I have to use cache
deny, right?
Yes. Exactly so.
Amos
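A minimal squid.conf sketch of that cache deny approach, reusing the foo.example.org pattern from the quoted config:

```
# Never store responses for this domain, whatever the headers say:
acl nocachedomain dstdomain foo.example.org
cache deny nocachedomain
```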
Hi Amos,
You are right, I didn't explain myself properly. We use ident to identify our
users. One user with IE or Firefox wants to go to Facebook. He receives a
wonderful deny message saying that he is not allowed. The same user with Chrome
does the same and is able to access Facebook.
On 09/19/2012 10:12 AM, Amos Jeffries wrote:
On 19/09/2012 6:51 p.m., Silamael wrote:
Ok, so if a response contains valid headers concerning caching, these
are taken instead of using the matching refresh_pattern? So, if i want
some URLs being served completely without caching, i have to use
Hi,
I have been using Squid squid-3.2.0.17-20120527-r11561 on an Ubuntu
Server
12.04 for some time with ssl-bump, without problems for a year; the CA cert expired
some days ago and with the new CA cert I installed squid 3.2.1 stable.
Now the proxy exits every time 10 or more users use https
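When the signing CA changes under ssl-bump, the ssl_crtd generated-certificates database built with the old CA typically needs to be recreated; a hedged sketch, assuming Ubuntu's default squid3 paths and proxy user (adjust to the actual install):

```
# Stop squid, wipe the cert database generated under the old CA,
# recreate it, and hand it back to the proxy user:
squid3 -k shutdown
rm -rf /var/lib/ssl_db
/usr/lib/squid3/ssl_crtd -c -s /var/lib/ssl_db
chown -R proxy: /var/lib/ssl_db
service squid3 start
```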
On 9/19/2012 10:00 AM, Jose-Marcio Martins da Cruz wrote:
Amos Jeffries wrote:
On 19/09/2012 10:15 a.m., Eliezer Croitoru wrote:
I was thinking about squidguard and url_rewrite.
Since we have external_acl and deny_info, I really don't see why to use
squidGuard at url_rewrite?
Right. That was the
I do not see any facebook stuff in access.log.
I'm using squid in transparent mode.
If I connect my pc directly to the modem then facebook works.
On 18/09/2012 17:08, Eliezer Croitoru wrote:
On 9/17/2012 8:20 PM, Wilson Hernandez wrote:
Hello.
As of a couple of days ago I've been
I'm using squid in transparent mode.
make sure that you aren't using https to access it. If you are, that may be
the issue. Do test with http://www.facebook.com
2012/9/19 Wilson Hernandez wil...@optimumwireless.com:
I do not see any facebook stuff in access.log.
I'm using squid in
Eliezer Croitoru wrote:
On 9/19/2012 10:00 AM, Jose-Marcio Martins da Cruz wrote:
The response for the url should be OK or ERR so the code should be changed.
I have written a helper that works with squidguard blacklists for external_acl.
It's pretty basic so it's for one big blacklist
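A minimal sketch of such an external_acl helper (not the helper from this thread), written here in Python and assuming one blacklisted domain per line in the blacklist file; it answers OK when the URL's host or any parent domain is listed, ERR otherwise:

```python
import sys
from urllib.parse import urlparse

def load_blacklist(path):
    """Read one blacklisted domain per line into a set."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def check(url, blacklist):
    """Answer 'OK' if the URL's host or any parent domain is blacklisted, else 'ERR'."""
    host = (urlparse(url).hostname or "").lower()
    parts = host.split(".")
    # Try the full host, then each parent suffix (a.b.c -> b.c -> c):
    for i in range(len(parts)):
        if ".".join(parts[i:]) in blacklist:
            return "OK"
    return "ERR"

def main(path):
    blacklist = load_blacklist(path)
    # Squid sends one request per line (here just the %URI token);
    # answers must be flushed immediately or Squid will hang waiting.
    for line in sys.stdin:
        fields = line.split()
        print(check(fields[0], blacklist) if fields else "ERR", flush=True)

if __name__ == "__main__" and len(sys.argv) > 1:
    main(sys.argv[1])
```

It would be wired up in squid.conf with something like `external_acl_type bl_lookup %URI /path/to/helper blacklist.txt` (paths illustrative).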
On 9/19/2012 3:28 PM, Wilson Hernandez wrote:
I do not see any facebook stuff in access.log.
I'm using squid in transparent mode.
If I connect my pc directly to the modem then facebook works.
What are you trying to do with squid?
If it works for other sites but not for Facebook, check if you
On Sep 19, 2012, at 5:44 AM, Linos i...@linos.es wrote:
Hi,
i have been using Squid squid-3.2.0.17-20120527-r11561 in an Ubuntu
Server
12.04 some time with ssl-bump without problems for a year, the ca cert expired
some days ago and with the new ca cert i installed squid 3.2.1 stable.
This was working right for months and all of a sudden there's this
problem. We haven't changed anything on our server.
On 19/09/2012 9:25, Eliezer Croitoru wrote:
On 9/19/2012 3:28 PM, Wilson Hernandez wrote:
I do not see any facebook stuff in access.log.
I'm using squid in transparent
On 9/19/2012 4:41 PM, Wilson Hernandez wrote:
This was working right for months and all of a sudden there's this
problem. We haven't changed anything on our server.
Go for routing... NAT... firewall.
If you have the output of iptables, squid.conf, sysctl, route and the needed
IP data, throw it here
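For reference, a typical transparent-intercept NAT rule looks like this (interface and ports are illustrative; the actual rules from the setup in question are what matters):

```
# Redirect client port-80 traffic on the LAN interface to Squid:
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128
# Note: port 443 is not intercepted here, which is why testing with
# plain http://www.facebook.com was suggested earlier in the thread.
```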
On 9/19/2012 4:36 PM, Stephane CHAZELAS wrote:
But only when providing with a User-Agent oddly enough (and also
when inserting a 10 second delay between the two requests).
Then, I tried to do the same query from a different client and
since then, I cannot reproduce it anymore (even if I do a
On Sep 19, 2012, at 9:03 AM, Linos i...@linos.es wrote:
On 19/09/12 15:30, Guy Helmer wrote:
On Sep 19, 2012, at 5:44 AM, Linos i...@linos.es wrote:
Hi,
i have been using Squid squid-3.2.0.17-20120527-r11561 in an Ubuntu
Server
12.04 some time with ssl-bump without problems for a
On 9/19/2012 1:44 PM, Linos wrote:
Hi,
i have been using Squid squid-3.2.0.17-20120527-r11561 in an Ubuntu
Server
12.04 some time with ssl-bump without problems for a year, the ca cert expired
some days ago and with the new ca cert i installed squid 3.2.1 stable.
Now the proxy exits
On 19/09/12 16:46, Guy Helmer wrote:
On Sep 19, 2012, at 9:03 AM, Linos i...@linos.es wrote:
On 19/09/12 15:30, Guy Helmer wrote:
On Sep 19, 2012, at 5:44 AM, Linos i...@linos.es wrote:
Hi,
i have been using Squid squid-3.2.0.17-20120527-r11561 in an Ubuntu
Server
12.04 some time
On 19/09/12 17:26, Eliezer Croitoru wrote:
On 9/19/2012 1:44 PM, Linos wrote:
Hi,
i have been using Squid squid-3.2.0.17-20120527-r11561 in an Ubuntu
Server
12.04 some time with ssl-bump without problems for a year, the ca cert
expired
some days ago and with the new ca cert i
Hi,
In our product we're running the Squid 2.7 and Apache http server on a single
machine.
The Apache server can originate content and is configured as Squid's
cache_peer sibling to be queried via ICP.
We can run the Cache Manager script and access it, if the Apache server is
reconfigured as
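A hedged sketch of how such a same-machine sibling is usually declared in squid.conf (the HTTP port is illustrative; 3130 is the default ICP port):

```
# Apache on the same box, queried via ICP, never used as a default route:
cache_peer 127.0.0.1 sibling 8080 3130 proxy-only
```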
On 9/19/2012 10:26 PM, Andrew Krupiczka wrote:
In our product we're running the Squid 2.7 and Apache http server on a single
machine.
The Apache server can originate content and is configured as Squid's
cache_peer sibling to be queried via ICP.
We can run the Cache Manager script and access
On 20/09/2012 7:46 a.m., Eliezer Croitoru wrote:
On 9/19/2012 10:26 PM, Andrew Krupiczka wrote:
In our product we're running the Squid 2.7 and Apache http server on
a single machine.
The Apache server can originate content and is configured as
Squid's cache_peer sibling to be queried via
On 20/09/2012 1:25 a.m., Eliezer Croitoru wrote:
On 9/19/2012 3:28 PM, Wilson Hernandez wrote:
I do not see any facebook stuff in access.log.
I'm using squid in transparent mode.
If I connect my pc directly to the modem then facebook works.
what are you trying to do with squid?
if it works
Hello all,
I'm running Squid 3.2.1, with a cache_peer configured to point at
another local Squid instance as a parent proxy. The parent proxy is
not getting reliably hit, for the *exact* same url.
acl objectcache url_regex -i /path/to/regexes
cache_peer localhost parent 60084 0 proxy-only
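A hedged guess at the unreliable-hit behaviour: unless told otherwise, Squid may decide to go direct instead of using the parent. A sketch extending the quoted config (peer and ACL names taken from it):

```
# Route matching requests through the parent and forbid going direct:
cache_peer_access localhost allow objectcache
cache_peer_access localhost deny all
never_direct allow objectcache
```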
On 19/09/2012 7:04 p.m., Fran Márquez wrote:
Hi friends,
I have a weird problem of saturation due to a broken client and I don't
know how to fix it (I can force the user to disable the app that causes the
problem, but I think there should be a solution to prevent a bad
client from overloading the server
On 19/09/2012 4:40 a.m., Holmes, Michael A (Mike) wrote:
More solutions to problems seen when adding the workers setting.
To fix this:
2012/09/18 10:05:20 kid5| commBind: Cannot bind socket FD 12 to [::]: (13)
Permission denied
Do this:
#chown squid -R .../var/run/squid/
There is something
On 20/09/2012 3:53 p.m., Nathan Hoad wrote:
Hello all,
I'm running Squid 3.2.1, with a cache_peer configured to point at
another local Squid instance as a parent proxy. The parent proxy is
not getting reliably hit, for the *exact* same url.
acl objectcache url_regex -i /path/to/regexes