Mon 2010-03-08 at 20:40 +, J. Webster wrote:
Is NCSA auth case sensitive for the login name?
Yes. The NCSA auth helper is case sensitive.
The rest as Amos said.
And a small clarification regarding the auth_param basic casesensitive
option. When this is set to off then all usernames
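For reference, a minimal squid.conf sketch of the directive being discussed (the helper and password file paths are assumptions for a typical install):

```
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic casesensitive off
```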
Hi all,
We are currently using Oracle Web Cache on quite a big site. We have 18 cache
servers, and caching is done via the reverse-proxy method. There are approx.
5 HTTP req/s across our cache servers combined. The system runs Linux.
Because Oracle licenses are quite expensive, we would like to know
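For the record, a minimal Squid reverse-proxy (accel) sketch of the kind of setup that could replace this; the site name and origin address are hypothetical:

```
http_port 80 accel defaultsite=www.example.com
cache_peer 192.0.2.10 parent 80 0 no-query originserver name=origin
cache_peer_access origin allow all
```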
Hi All,
We are using Squid + TPROXY in bridging mode. Is there any way
to bypass traffic when Squid stops, or if Squid is unable to handle
requests? We are using a bridge since we were not able to configure TPROXY
with WCCP. Kindly help.
Thanks and Regards
Hi,
Can we redirect with Squid after a cache miss occurs? Also, I want to ask:
can we deliver local files using Squid redirection? For example, suppose I
have a file corresponding to google.com; can I use Squid redirection to
deliver it to the client?
H.Päiväniemi wrote:
Hi all,
We are currently using Oracle Web Cache on quite a big site. We have 18 cache servers, and caching is done via
the reverse-proxy method. There are approx. 5 HTTP req/s across our cache servers combined. The system runs Linux.
Because Oracle licenses are quite expensive,
senthilkumaar2021 wrote:
Hi All
We are using Squid + TPROXY in bridging mode. Is there any way
to bypass traffic when Squid stops, or if Squid is unable to handle
requests? We are using a bridge since we were not able to configure TPROXY
with WCCP. Kindly help.
Thanks and Regards
jayesh chavan wrote:
Hi,
Can we redirect with Squid after a cache miss occurs? Also, I want to ask:
can we deliver local files using Squid redirection? For example, suppose I
Squid can fetch from a local web server.
Squid can bounce via 3xx deny_info to a local web server.
have one file corresponding to
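The deny_info bounce Amos mentions could be sketched like this (the domain, local URL, and ACL name are hypothetical; Squid answers the denied request with an HTTP redirect to the given URL):

```
acl google dstdomain .google.com
deny_info http://local.server.example/google-copy.html google
http_access deny google
```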
Hello all,
First, thanks for your time in reading this, I really appreciate it!
I've been working with Squid for a while and it's an awesome tool, but I
am trying to figure out a way to do some more customizations, and one
of them has me at a roadblock. I've searched the FAQs/Google/mailing
Hi,
Squid has always been working fine. All websites, except Google, still
work fine! As far as I know nothing changed on my part, except the
weekly Fedora updates.
My internal users get this "(101) Network is unreachable" error message
when they go through the proxy. My iptables allows ALL
Sorry, the carriage returns in the squid config were lost. Here is a more
readable format of my config:
Config:
===
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl localhost src ::1/128
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
acl to_localhost dst ::1/128
acl localnet src
hi,
I have problems with the ping command behind a Squid proxy server:
ping to the local area network succeeds,
but ping to www.google.de doesn't work.
Do I have to configure something in squid.conf to make it work, or do I have
to update Squid?
Squid version 3.1.0.16,
Debian Lenny 5.
Thanks, greets, onkel
On Tue, Mar 9, 2010 at 4:58 PM, Jan Houtsma l...@houtsma.net wrote:
Hi,
Squid has always been working fine. All websites, except Google, still
work fine! As far as I know nothing changed on my part, except the
weekly Fedora updates.
My internal users get this (101) Network is unreachable
Hi,
I'm having some difficulty trying to set up my squid servers to use
cache-digests.
First of all, in order to use cache-digests, must I also use ICP?
I do not get any errors, but I do not know even if it's working. What
should I see in the access.log?
My setup is made up of 3
On 9-3-2010 19:03, Kinkie wrote:
On Tue, Mar 9, 2010 at 4:58 PM, Jan Houtsma l...@houtsma.net wrote:
Hi,
Squid has always been working fine. All websites, except Google, still
work fine! As far as I know nothing changed on my part, except the
weekly Fedora updates.
My internal users
Every time a user tries to access an HTTPS web site they get an error about
the certificate not being issued by a certificate authority. Removing the proxy
from the Internet settings, I got rid of these warnings. I have Squid 2.6
STABLE16 with squidGuard.
Tried with 3.1.0.12 and got the same thing.
Anybody have
Tue 2010-03-09 at 18:05 +0530, senthilkumaar2021 wrote:
Hi All,
We are using Squid + TPROXY in bridging mode. Is there any way
to bypass traffic when Squid stops, or if Squid is unable to handle
requests? We are using a bridge since we were not able to configure TPROXY
with WCCP
Tue 2010-03-09 at 19:07 +0530, jayesh chavan wrote:
Hi,
Can we redirect with Squid after a cache miss occurs? Also, I want to ask:
can we deliver local files using Squid redirection? Suppose I have a local
copy of a file corresponding to google.com; can I use it to deliver
it to the client using Squid
Tue 2010-03-09 at 10:47 -0500, rascal wrote:
Here is my first question. I would like to present the user with a
disclaimer prompt when they attempt to go to the Internet. Currently
they get the typical "Squid requires your username/password to proceed"
prompt, and I would like to know if I
Tue 2010-03-09 at 19:49 +0100, Jan Houtsma wrote:
Yes. The wget was from the Squid server itself; using the proxy it
fails, and using a direct Internet connection it works.
What does access.log say when it fails? Does the reported server address
match what you expect it to be for the
On 9-3-2010 21:37, Henrik Nordström wrote:
Tue 2010-03-09 at 19:49 +0100, Jan Houtsma wrote:
Yes. The wget was from the Squid server itself; using the proxy it
fails, and using a direct Internet connection it works.
What does access.log say when it fails? Does the reported
Tue 2010-03-09 at 20:13 +0200, Giannis Fotopoulos wrote:
Hi,
I'm having some difficulty trying to set up my squid servers to use
cache-digests.
Should be automatic unless you explicitly disable it.
First of all, in order to use cache-digests, must I also use ICP?
No.
I do not get
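Henrik's points can be sketched in squid.conf: digests need the feature compiled in, and ICP can be switched off per peer. The sibling address and ports below are assumptions:

```
# Squid must be built with --enable-cache-digests.
# no-query (with ICP port 0) disables ICP queries to this peer;
# the sibling is then selected via its fetched cache digest.
cache_peer 10.95.4.50 sibling 3128 0 no-query
```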
On 9-3-2010 21:42, Jan Houtsma wrote:
On 9-3-2010 21:37, Henrik Nordström wrote:
Tue 2010-03-09 at 19:49 +0100, Jan Houtsma wrote:
Yes. The wget was from the Squid server itself; using the proxy it
fails, and using a direct Internet connection it works.
Does squid keep an internal counter of requests (HTTP, etc) per second?
All I see from 'squidclient mgr:info' is a requests per minute counter
for HTTP requests:
Number of HTTP requests received: 92
Average HTTP requests per minute since start: 26.7
Thanks,
Josh
Hello!
I have CentOS 5.4 with Squid version 3.0.STABLE24 and Samba version
3.0.33-3.14.el5.
I have configured Squid authentication with NTLM and it works fine
when the user has access allowed by a Squid ACL.
But when the user has access denied by a Squid ACL, the browser
(Internet
Thank you very much Henrik for your reply!
Everything is working. :)
A piece of my logfile for others:
1268176051.830 7 10.95.4.46 TCP_MISS/200 1315 GET
http://www.the-west.gr/images/index/mini.png - CD_SIBLING_HIT/10.95.4.50
image/png
By the way, the load balancing method I am using is
Tue 2010-03-09 at 16:42 -0600, Baird, Josh wrote:
Does squid keep an internal counter of requests (HTTP, etc) per second?
Kind of. Squid keeps a counter of the total number of requests. You can
easily poll this via SNMP and derive the requests/second (or whatever
interval) from it. (previous value
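A sketch of Henrik's suggestion, assuming Squid's SNMP agent is enabled (snmp_port 3401) and a permissive snmp_access rule; the OID is Squid's client HTTP request counter from its MIB (verify against the mib.txt shipped with your Squid), and the two counter samples below are hypothetical:

```shell
# Poll the counter twice, some interval apart, e.g.:
#   snmpget -v2c -c public localhost:3401 1.3.6.1.4.1.3495.1.3.2.1.1.0
# then derive the rate from the delta. With two hypothetical samples 60 s apart:
prev=12040
curr=12640
interval=60
rate=$(( (curr - prev) / interval ))
echo "$rate req/s"
```

The same delta-over-interval arithmetic is what MRTG/Cacti-style pollers do for you automatically when graphing a counter OID.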
Tue 2010-03-09 at 23:58 +0100, alvaro perera wrote:
But when the user has access denied by a Squid ACL, the browser
(Internet Explorer and Mozilla) shows a login pop-up window.
This depends on how you build your http_access rules: if the last ACL on
an http_access deny ... line is
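Squid's documented behavior is that a deny line whose last ACL is authentication-related triggers a 407 challenge (hence the pop-up); ending the line with a non-auth ACL avoids it. A sketch with assumed ACL names:

```
acl staff proxy_auth REQUIRED
acl blocked dstdomain .example.com
# "all" is last, so this deny does not trigger a 407/login pop-up:
http_access deny blocked all
http_access allow staff
http_access deny all
```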
On Tue, 09 Mar 2010 21:42:42 +0100, Jan Houtsma l...@houtsma.net wrote:
On 9-3-2010 21:37, Henrik Nordström wrote:
Tue 2010-03-09 at 19:49 +0100, Jan Houtsma wrote:
Yes. The wget was from the Squid server itself; using the proxy
it fails, and using a direct Internet connection it
On Wed, 10 Mar 2010 01:33:44 +0100, Henrik Nordström
hen...@henriknordstrom.net wrote:
Tue 2010-03-09 at 23:58 +0100, alvaro perera wrote:
But when the user has access denied by a Squid ACL, the browser
(Internet Explorer and Mozilla) shows a login pop-up window.
This depends on
On Tue, 09 Mar 2010 17:39:09 +0100, da...@lafourmi.de
da...@lafourmi.de
wrote:
hi,
I have problems with the ping command behind a Squid proxy server:
ping to the local area network succeeds,
but ping to www.google.de doesn't work.
Do I have to configure something in squid.conf to make it work, or do I have to update
Hi,
I have written a redirect program which is not working. The program
is as follows:
#!c:/perl/bin/perl.exe
$| = 1;                        # unbuffered output, required for Squid redirectors
while (<STDIN>) {
    my @X = split;             # redirector input: URL ip/fqdn ident method
    my $url = $X[0];
    if ($url !~ /^http:\/\/www\.hostname\.com/) {
        $_ = $url;
        # graft the captured path onto www.hostname.com
        s/^http:\/\/(.*)\/(.*)/http:\/\/www.hostname.com\/$2/;
        print "301:$_\n";      # tell Squid to issue a 301 redirect
    } else {
        print "\n";            # empty line = leave the URL unchanged
    }
}
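For completeness, a redirector of this kind is hooked in via squid.conf; the interpreter and script paths below are assumptions matching the Windows install implied by the shebang:

```
url_rewrite_program c:/perl/bin/perl.exe c:/squid/etc/redirect.pl
url_rewrite_children 5
```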
boipie01 wrote:
Every time a user tries to access an HTTPS web site they get an error about
the certificate not being issued by a certificate authority. Removing the proxy
from the Internet settings, I got rid of these warnings. I have Squid 2.6
STABLE16 with squidGuard.
Tried with 3.1.0.12 and got the same
Hi,
I have Squid in front of Tomcat servers as a reverse proxy. The origin
servers return some files gzipped. I can confirm this by going to them
directly with the header
Accept-Encoding: gzip,deflate
Origin server returns:
HTTP/1.1 200 OK
Server: Apache-Coyote/1.1
Cache-Control: max-age=1801
Elli Albek wrote:
Hi,
I have Squid in front of Tomcat servers as a reverse proxy. The origin
servers return some files gzipped. I can confirm this by going to them
directly with the header
Accept-Encoding: gzip,deflate
Origin server returns:
HTTP/1.1 200 OK
Server: Apache-Coyote/1.1
Cache-Control:
http://redbot.org/ should be able to confirm if this is a problem...
On 10/03/2010, at 4:56 PM, Amos Jeffries wrote:
Elli Albek wrote:
Hi,
I have Squid in front of Tomcat servers as a reverse proxy. The origin
servers return some files gzipped. I can confirm this by going to them
directly with
On 10-3-2010 1:44, Amos Jeffries wrote:
On Tue, 09 Mar 2010 21:42:42 +0100, Jan Houtsma l...@houtsma.net wrote:
On 9-3-2010 21:37, Henrik Nordström wrote:
Tue 2010-03-09 at 19:49 +0100, Jan Houtsma wrote:
Yes. The wget was from the Squid server itself; using