Hi,
We use squid together with a content scanner connected as parent proxy
(cache_peer parent) with none of them caching any content. When
upgrading from squid 2.7 to 3.1, we observed an increased number of TCP
connections between squid and its parent. I analysed the traffic between
squid
On 27/04/2011 23:05, Sheridan Dan Small wrote:
Thanks for your reply again Amos,
I am familiar with wget and curl for fetching headers and files. I
think I can come up with a solution using either wget or curl to
create a static cache. I will write a simple server application to
serve the
Hi,
We have a proxy chain of 3 squids (v2.7, RHEL5).
In my opinion, only the last one really needs to resolve the URL hostname in order
to
send the request directly to the web server's IP address on the Internet.
Would it also work if the first two proxies didn't resolve?
And how can that be
Hello *,
For WLAN clients (untrusted), I need to set up a second squid proxy, since those
clients are not able/allowed
to connect to the main proxy directly (local 172.168.x.x network for WLAN).
It works well with HTTP, but HTTPS target URLs do not work (server not
reachable).
Setup:
I currently have a squid proxy using ncsa auth.
I would also like to add IP-address-based access for those users that cannot
enter usernames and passwords (some iPhones, DVD players, etc.).
Can I just add an acl like this:
acl external_IP src 200.123.45.23
http_access allow external_IP
Will this mess
On 28/04/2011 12:25, Jannis Kafkoulas wrote:
Hi,
We have a proxy chain of 3 squids (v2.7, RHEL5).
In my opinion, only the last one really needs to resolve the URL hostname in order
to
send the request directly to the web server's IP address on the Internet.
Would it also work if the first two
On 28/04/11 17:56, Morgan Storey wrote:
Hi Everyone,
Playing around with reverse proxying with cache, and I have a bit of a
problem: some of the pages are using 302 redirects, so I can't cache
them. If the webserver goes down it will return an HTTP 400 error. Can
I customise an error page in:
It looks like you're running squid on the same machine you are browsing
from, correct?
I don't see any ACLs referencing the CONNECT method, so it should be
unrestricted, unless this is not the complete config file.
If it is, I'm lost.
Alex
On 27/04/11 20:53, Oscar Andrés Eraso Moncayo wrote:
Or is your proxy a transparent one? If so, you should not be seeing
CONNECT requests, as these suggest you have configured a proxy for HTTPS
in the browser.
Alex
On 27/04/2011 22:53, Oscar Andrés Eraso Moncayo wrote:
Hi,
squid.conf:
**
http_port 127.0.0.1:3030
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
On 28/04/11 17:56, Supratik Goswami wrote:
@Amos
Thanks for your reply.
Currently I am using an acl to filter the file extensions .exe, .iso and .zip,
and with tcp_outgoing_address
I am able to change the source IP; it is working fine with
source-based routing.
I want to filter by size (Ex.
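The setup described above could look something like this in squid.conf (a sketch only; the regex and the outgoing address 192.0.2.10 are made-up placeholders, not from the thread):

```
# Match requests for the listed file extensions (case-insensitive).
acl downloads urlpath_regex -i \.(exe|iso|zip)$
# Send matching traffic out via a specific source address,
# for source-based routing (placeholder address).
tcp_outgoing_address 192.0.2.10 downloads
```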
On 28/04/11 20:19, Mathias Fischer wrote:
Hi,
We use squid together with a content scanner connected as parent proxy
(cache_peer parent) with none of them caching any content. When
upgrading from squid 2.7 to 3.1, we observed an increased number of TCP
connections between squid and its parent.
On 28/04/11 21:25, Jannis Kafkoulas wrote:
Hi,
We have a proxy chain of 3 squids (v2.7, RHEL5).
In my opinion, only the last one really needs to resolve the URL hostname in order
to
send the request directly to the web server's IP address on the Internet.
Sort of, only the proxy going DIRECT
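For the intermediate proxies, a minimal sketch (the peer address is an assumed placeholder) would be to force everything through the parent, so the target hostname never needs a DNS lookup on that hop:

```
# Relay everything to the parent; with never_direct in effect this
# Squid never contacts origin servers, so it only needs to reach
# the peer (given here as an IP address, avoiding DNS entirely).
cache_peer 10.0.0.2 parent 3128 0 no-query default
never_direct allow all
```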
@Amos
Thanks for the information.
There is still one point of confusion in my mind: how is
reply_body_max_size able to detect it?
In the Squid documentation it says:
This size is checked twice. First when we get the reply headers,
we check the content-length value. If the content length value exists
and
On 28/04/11 22:06, Michael Arndt wrote:
Hello *,
For WLAN clients (untrusted), I need to set up a second squid proxy, since those
clients are not able/allowed
to connect to the main proxy directly (local 172.168.x.x network for WLAN).
It works well with HTTP, but HTTPS target URLs do not work
On 28/04/11 23:37, J Webster wrote:
I currently have a squid proxy using ncsa auth.
I would also like to add IP-address-based access for those users that cannot
enter usernames and passwords (some iPhones, DVD players, etc.).
Can I just add an acl like this:
acl external_IP src 200.123.45.23
With the
Of course, Eliezer, thanks a lot!
Yes, of course, by "resolve" I mean DNS lookup.
(It has been set up by an external company)
The chain is very simple, just one after the other:
clients (FF) --- Squid1 (LAN) --- Squid2 (somewhere in between) --- Squid3
(at the Internet)
This chain is being
On 04/27/2011 06:16 PM, Amos Jeffries wrote:
On Wed, 27 Apr 2011 12:04:23 -0500, Sam Klinger wrote:
Steps to reproduce:
1. Go to
http://sourceforge.net/projects/sarg/files/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz/download
2. Attempt to download
3. Squid will display error page saying The
On 29/04/11 01:38, Supratik Goswami wrote:
@Amos
Thanks for the information.
There is still one point of confusion in my mind: how is
reply_body_max_size able to detect it?
In the Squid documentation it says:
This size is checked twice. First when we get the reply headers,
Note at this point the TCP
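A minimal example of the directive under discussion (assuming Squid 3.x syntax; the 100 MB limit is an arbitrary illustration):

```
# Deny reply bodies larger than 100 MB. The Content-Length header
# is checked first; if the header is absent or wrong, the
# accumulated byte count is checked as the body streams through.
reply_body_max_size 100 MB
```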
On 29/04/11 01:56, Jannis Kafkoulas wrote:
Of course, Eliezer, thanks a lot!
Yes, of course, by "resolve" I mean DNS lookup.
(It has been set up by an external company)
The chain is very simple, just one after the other:
clients (FF) --- Squid1 (LAN) --- Squid2 (somewhere in between) ---
On 29/04/11 00:39, Alex Crow wrote:
Or is your proxy a transparent one? If so, you should not be seeing
CONNECT requests, as these suggest you have configured a proxy for HTTPS
in the browser.
Alex
It appears not, there is no sign of NAT or TPROXY interception in that
config.
Amos
--
On 29/04/11 00:49, Eliezer Croitoru wrote:
On 27/04/2011 22:53, Oscar Andrés Eraso Moncayo wrote:
Hi,
squid.conf:
**
http_port 127.0.0.1:3030
hierarchy_stoplist cgi-bin ?
acl QUERY
On 28/04/2011 18:05, Amos Jeffries wrote:
On 29/04/11 00:49, Eliezer Croitoru wrote:
On 27/04/2011 22:53, Oscar Andrés Eraso Moncayo wrote:
Hi,
squid.conf:
**
http_port
For those testing this release and having new trouble with ssl-bump
feature please be aware of a patch:
http://bugs.squid-cache.org/show_bug.cgi?id=3205
Amos
--
Please be using
Current Stable Squid 2.7.STABLE9 or 3.1.12
Beta testers wanted for 3.2.0.7 and 3.1.12.1
Will this mess with the ncsa auth?
It will.
You have already said they cannot enter usernames and passwords. So
interference, in the form of not being asked for a username/password,
seems to be exactly what you want.
I would extend that a bit and maybe check for User-Agent
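One way to combine the two, as a sketch (the IP is the one from the question; the acl names are made up), is to allow the listed address before the proxy_auth check, so those clients are never challenged:

```
acl external_IP src 200.123.45.23
acl authed proxy_auth REQUIRED
# Order matters: the IP-based clients match the first rule and
# never receive a 407 authentication challenge.
http_access allow external_IP
http_access allow authed
http_access deny all
```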
Thanks, fixed my problem, even on FreeBSD.
Ming
-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz]
Sent: Thursday, April 28, 2011 12:10 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Squid 3.2.0.7 beta is available
For those testing this release and
It seems to me that ACL SRC is NEVER checked when going to a Peer.
WHAT I WANT TO DO:
acl OFFICE src 1.1.1.1
request_header_access User-Agent allow OFFICE
request_header_access User-Agent deny all
request_header_replace User-Agent BOGUS AGENT
[OFFICE UA should not be modified
Date: Fri, 29 Apr 2011 01:12:55 +1200
From: squ...@treenet.co.nz
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Persistent Connections to Parent Proxy
On 28/04/11 20:19, Mathias Fischer wrote:
Hi,
We use squid together with a
Hi again :)
I am trying to redirect users denied by my own external acl helper to my
custom page.
I use kerb_auth, so I pass the %LOGIN variable to my helper:
external_acl_type testacl %LOGIN /tmp/login.sh (login.sh will return
OK or ERR; it works).
Now, in the 'ERR' case I have to redirect my client
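Assuming the goal is a redirect rather than Squid's stock error page, deny_info with a URL is the usual mechanism (a sketch built around the helper line above; the destination URL is a placeholder):

```
external_acl_type testacl %LOGIN /tmp/login.sh
acl allowed_user external testacl
# When the deny rule below fires on !allowed_user, Squid sends the
# client a redirect to this URL instead of an error page.
deny_info http://intranet.example.com/denied.html allowed_user
http_access deny !allowed_user
```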
Thanks for replying so soon.
The problem was that some option was not accepted. In my case:
--disable-http-violations
Afterwards it compiled without problems.
--- On Mon, 25-Apr-11, Amos Jeffries squ...@treenet.co.nz wrote:
From: Amos Jeffries squ...@treenet.co.nz
Subject: Re: [squid-users]
On 28/04/2011 17:18, Amos Jeffries wrote:
proxy was psychic
"my proxy was psychic" would make a good name for a TV show :)
Hi List
I recently installed the Mysar utility for generating reports from Squid
logs, but the importer.php script only imports logs in native format
into the database.
I modified the configuration of my squid to generate logs in native
mode, and up to this point there are no problems,
On 29/04/11 07:28, Jenny Lee wrote:
Date: Fri, 29 Apr 2011 01:12:55 +1200
From: squ...@treenet.co.nz
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Persistent Connections to Parent Proxy
On 28/04/11 20:19, Mathias Fischer wrote:
Hi,
We
On 29/04/11 16:03, mic...@casa.co.cu wrote:
Hi List
I recently installed the Mysar utility for generating reports from Squid
logs, but the importer.php script only imports logs in native format
into the database.
I modified the configuration of my squid to generate logs in native
mode,
Hi all
I'm having trouble getting squid to do what I need.
I'm in a test network within a corporate environment.
What I want is this:
1) squid needs to use the corporate proxy for Internet stuff
2) squid should cache
3) squid should reverse proxy several servers (but for now only one
actually
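A rough sketch covering the three requirements (all hostnames, ports and the cache_dir path are invented placeholders, and the routing would need tuning for a real deployment):

```
# 3) reverse proxy one internal server
http_port 80 accel defaultsite=www.example.internal
cache_peer www.example.internal parent 80 0 no-query originserver name=origin
acl oursite dstdomain www.example.internal
cache_peer_access origin allow oursite

# 1) everything else goes via the corporate proxy, never direct
cache_peer corpproxy.example.com parent 8080 0 no-query default name=upstream
cache_peer_access upstream deny oursite
never_direct allow all

# 2) enable a disk cache
cache_dir ufs /var/spool/squid 1024 16 256
```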