On Tue, Dec 23, 2008 at 4:19 AM, howard chen howac...@gmail.com wrote:
I am using Squid as reverse proxy to web server.
Sometimes (not always), when client POST something to my server, error
will be shown:
=====
ERROR
The requested URL could not be retrieved
=====
Hi,
Is it normal that squid+c-icap+clamav does not trigger an alert when
I upload a virus?
I am using squid as a reverse proxy to accelerate a dynamic
user-generated content website, and so it does not prevent
uploading malware to the website, only downloading it
Hi,
After setting up squid3+c-icap+clamav in a reverse proxy
configuration, I found it quite unreliable from a user
experience point of view. It appeared to me as being very
unreliable, as one page in ten would always seem unreachable and
report an error, despite squid alone would
On Sunday 21 December 2008 10:52:42 Imri Zvik wrote:
Hi,
On Thursday 18 December 2008 21:57:22 Adrian Chadd wrote:
Nope, I don't think the storeurl-rewriter stuff was ever integrated into
ICP.
I think someone posted a patch to the squid bugzilla to implement this.
If you can point me
On 17.12.08 20:14, Jevos, Peter wrote:
Is this OK with all these zeros in my output?
Cache information for squid:
Request Hit Ratios: 5min: 0.0%, 60min: 0.0%
Byte Hit Ratios: 5min: 87.9%, 60min: 42.4%
Request Memory Hit Ratios: 5min: 0.0%, 60min: 0.0%
Hi All,
any links on how to configure load balancing of squid
Regards,
Mario
Hi All,
any links on how to configure load balancing of squid
See the default squid.conf, :)
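One common reading of the question is balancing requests across several parent caches, which the stock squid.conf documents under cache_peer. A minimal sketch, assuming two hypothetical parent proxies (the hostnames and ports are placeholders, not from the thread):

```
# spread requests evenly over two parent caches (hypothetical hosts)
cache_peer peer1.example.com parent 3128 3130 round-robin
cache_peer peer2.example.com parent 3128 3130 round-robin
# force all traffic through the parents instead of going direct
never_direct allow all
```

The round-robin option makes squid alternate between the parents; weighted-round-robin is also available when the parents have unequal capacity.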
Hi,
I have two imageservers behind a squid.
My issue is that my imageservers are not sending any Expires headers,
but I would like to attach one from the squid,
so by the time the image reaches the browser I have an Expires header
in it.
if there is neither Expires nor max-age
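When the origin sends neither Expires nor Cache-Control: max-age, squid falls back to its refresh_pattern heuristics, which is the usual way to make such images cacheable without touching the headers. A hedged sketch (the extensions and lifetimes are illustrative, not from the thread):

```
# cache images for at least 1 day (1440 min) and at most a week,
# allowing 80% of the object's age since Last-Modified as freshness
refresh_pattern -i \.(gif|jpe?g|png)$ 1440 80% 10080
```

This controls squid's own caching decision; it does not by itself add an Expires header to the response sent to the browser.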
Hi there.
I'm planning to build a new dedicated Squid-box, with amd64 and 4 gigs
of RAM, with two cache_dir's on two separate harddisks and Squid-3
doing
application level striping, all servicing around 6k users. Will two
recent IDE disks of 7200 rpm suffice, or I'm better off getting two
The squid is adding the max-age header but not the expires. So it caches them.
I was looking at the methods that are available and I think I will just modify
the code and add a hardcoded expires header ... and then compile the whole
thing ...
Alin Bugeag
Tel +1 905 761
The squid is adding the max-age header but not the expires. So it
caches them.
Are you sure? I remember Squid adds an Age header, not a max-age header.
But maybe I'm wrong.
Yes, you are right, it's the Age header ... :)
But I did some tests and it caches them ...
Alin Bugeag
Tel +1 905 761 5301 ext 231
Home +1 416 623 9253
-Original Message-
From: Ken Peng [mailto:kenp...@rambler.ru]
Sent: Tuesday, December 23, 2008
To a degree I agree with Matus in that the type of load is important. It is
also important to keep in mind how you plan to set up cache dirs and cache
replacement. If you configure squid to cache most stuff to RAM, then disks are
not as important as RAM, although RAM is really always the most
Ken Peng wrote:
Hi there.
I'm planning to build a new dedicated Squid-box, with amd64 and 4 gigs
of RAM, with two cache_dir's on two separate harddisks and Squid-3
doing
application level striping, all servicing around 6k users. Will two
recent IDE disks of 7200 rpm suffice, or I'm
Hi there:
I'm running Squid to block multimedia online using something like this:
acl multimedia rep_mime_type -i /etc/squid/multimedia.txt
http_reply_access deny multimedia-online
/etc/squid/acl/multimedia.txt has these lines inside:
^application/vnd.ms.wms-hdr.asfv1$
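Note that in the config as quoted, the ACL is defined as "multimedia" but denied as "multimedia-online", and two different paths are given for the list file; squid refuses to start when an access rule names an undefined ACL. A consistent minimal sketch (paths and allow-all fallback are assumptions):

```
# block replies whose Content-Type matches any pattern in the file
acl multimedia rep_mime_type -i "/etc/squid/acl/multimedia.txt"
http_reply_access deny multimedia
http_reply_access allow all
```

The patterns in the file are regular expressions matched against the reply's Content-Type, e.g. ^application/vnd.ms.wms-hdr.asfv1$ as quoted above.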
Hi!
I have a question about delay_pools: If I make a time-based acl with a
delay-pool, does it refill in the time the acl is inactive or is the
amount stopped and continued when the acl starts again?
Like, if I have a pool acl going from 9:00 till 20:00 with a size of 3GB
and a rate of 1200
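A time-bounded pool along the lines the question describes could be sketched like this; the acl name, rate, and bucket size are illustrative values, not a definitive reading of "1200":

```
# aggregate class-1 pool active only during working hours
acl workhours time 09:00-20:00
delay_pools 1
delay_class 1 1
delay_access 1 allow workhours
delay_access 1 deny all
# refill at 150000 bytes/s into a 3 GB bucket
# (32-bit squid builds may cap bucket sizes below this)
delay_parameters 1 150000/3000000000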
How does one deal with this scenario? It seems that when we encounter websites
that toggle between http/s the connection is broken. I can see why this
logically happens, but I am unable to work out a solution for it. Anyone have
experience with a scenario such as this?
Thanks!
jlc
Hello,
I've got a squid 3.0 proxy that I'm trying to force to use an upstream
proxy for a specific domain to get around a path MTU problem that's
proving difficult to fix.
I have the following in my squid.conf :
cache_peer proxy.xxx.yyy.zz parent 8080 7 no-query
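To send only the problem domain via the parent and let everything else go direct, the cache_peer line is usually paired with a dstdomain ACL and cache_peer_access rules. A sketch mirroring the quoted line (the domain is a placeholder):

```
cache_peer proxy.xxx.yyy.zz parent 8080 7 no-query
# hypothetical domain behind the broken path MTU
acl mtu_broken dstdomain .example.com
cache_peer_access proxy.xxx.yyy.zz allow mtu_broken
cache_peer_access proxy.xxx.yyy.zz deny all
# never contact that domain directly
never_direct allow mtu_broken
```

Without the never_direct rule squid may still try to go direct when the peer is slow to answer.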
Carl Brewer wrote:
Hello,
I've got a squid 3.0 proxy that I'm trying to force to use an upstream
proxy for a specific domain to get around a path MTU problem that's
proving difficult to fix.
I have the following in my squid.conf :
cache_peer proxy.xxx.yyy.zz parent 8080 7
Yes, you are right, it's the Age header ... :)
But I did some tests and it caches them ...
That's because images have a Last-Modified header; squid calculates
freshness based on that.
You can't force squid to insert a max-age or Expires header in the
response.
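The heuristic alluded to here (last-modified freshness, as tuned by the percent field of refresh_pattern) can be sketched roughly as follows. This is an illustrative model, not squid's exact code; the 20% factor and function name are assumptions:

```python
def heuristic_freshness(age_s, last_modified_age_s, lm_factor=0.2):
    """Estimate whether a cached response without Expires/max-age is
    still fresh: allow a fraction (lm_factor) of the time elapsed
    since Last-Modified as the freshness lifetime."""
    lifetime_s = last_modified_age_s * lm_factor
    return age_s < lifetime_s

# an object last modified 10 days ago, fetched 1 day ago:
# lifetime = 2 days, age = 1 day, so it is still fresh
print(heuristic_freshness(86400, 10 * 86400))
```

The intuition is that objects which have not changed for a long time are unlikely to change soon, so squid serves them from cache without revalidating until the allowed fraction of their age has passed.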
Johannes Buchner wrote:
Hi!
I have a question about delay_pools: If I make a time-based acl with a
delay-pool, does it refill in the time the acl is inactive or is the
amount stopped and continued when the acl starts again?
Pools refill at a constant rate unless they are full or
Joseph L. Casale wrote:
How does one deal with this scenario? It seems that when we encounter websites
that toggle between http/s the connection is broken. I can see why this
logically happens, but I am unable to work out a solution for it. Anyone have
experience with a scenario such as this?
I saw this in squid.conf:
# TAG: read_timeout	time-units
# The read_timeout is applied on server-side connections. After
# each successful read(), the timeout will be extended by this
# amount. If no data is read again after this amount of time,
# the request is
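Lengthening this timeout is sometimes tried for slow upstream servers; a minimal sketch of the directive in squid.conf (the value is illustrative, the default is 15 minutes):

```
# abort a server-side connection if no data is read for 5 minutes
read_timeout 5 minutes
```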
Define 'connection'. I suspect what you think of as a connection is not
related to HTTP connections.
Amos,
Appreciate your help here; the reason I theorized "connection" was because of
what happens when an SSL session is started versus a simple HTTP session. This
is all related to our users getting yahoo
Joseph L. Casale wrote:
Define 'connection'. I suspect what you think of as a connection is not
related to HTTP connections.
Amos,
Appreciate your help here; the reason I theorized "connection" was because of
what happens when an SSL session is started versus a simple HTTP session. This
is all related to
The Squid HTTP Proxy team is pleased to announce the
availability of the Squid-3.0.STABLE11 release!
The previous RC release has now completed its mandatory 14 days without
new bugs being detected, or bad reports against the tested patches. As
such the 3.0 code is once again considered stable
Hi ,
I have an interesting problem while logging in to a web site. I am
using squid 2.6 with OpenBSD 4.3 stable. The web site opens without any
problem, but when I enter the username and password it waits for some
seconds and then I get a timeout error from the remote server. I am looking
at the squid logs