On Mon, 2004-12-13 at 07:06, Daniel Graupner wrote:
Henrik Nordstrom wrote:
This is very silly behaviour; why does Squid crop URLs?
There are in fact very smart reasons behind this behaviour of Squid, and
it relates to many things, not just question marks. As you have only told
Squid you
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
So, in effect, I shouldn't even be thinking about running it on the
Squid box?
Correct.
Hmm, interesting. If it comes as a Live-CD, it would be even better.
It is not a Live-CD, just a very small FreeBSD which is the version used
at the latest cache
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
ps: do you prefer to have list mails (replies) sent directly to you or
only to the list?
On replies which are directed to me and where the discussion is ongoing,
I prefer to be addressed directly with the list as CC.
In all other cases the message
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
Is running squid w/ anti-virus scanning actually recommended? It could
really bog down accesses.
Most I have spoken to in corporate environments have the following
conclusions in this matter:
Virus scanning on the proxy does not add much in terms of virus
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
So essentially this means that whatever's being transferred from the
client (via HTTPS), once it reaches the squid box, it will be sent
un-encrypted to the server?
Let's put it this way:
any request accepted by the https_port directive is decrypted by Squid.
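For reference, a minimal sketch of such a setup in squid.conf, assuming Squid was built with --enable-ssl and that the certificate paths are placeholders:

```
# Terminate SSL/TLS at Squid (paths are assumptions)
https_port 443 cert=/usr/local/squid/etc/server.pem key=/usr/local/squid/etc/server.key
```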
On Mon, 13 Dec 2004, Joost de Heer wrote:
Hello,
I have two problems with a Linux Squid machine (Squid 2.5STABLE7, Red Hat
Enterprise Linux ES release 3 (Taroon Update 1))
Problem 1: Filedescriptors.
above 1024. I've added the following lines to /etc/security/limits.conf:
squid hard nofile 16384
On Mon, 2004-12-13 at 18:11, Henrik Nordstrom wrote:
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
So essentially this means that whatever's being transferred from the
client (via HTTPS), once it reaches the squid box, it will be sent
un-encrypted to the server?
Let's put it this way:
any
Henrik Nordstrom wrote:
On Mon, 13 Dec 2004, Daniel Graupner wrote:
See Squid FAQ on how to use Squid inside a firewall.
I did, but in my testing environment there is no firewall at all.
There is no firewall between the cache, the peer and the hosts. Please give me more hints.
So your Squid which reported
On Mon, 2004-12-13 at 18:27, Henrik Nordstrom wrote:
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
On Thu, 2004-12-09 at 06:47, Henrik Nordstrom wrote:
This question is more of a glibc question than a kernel one, and no, you do not
need to edit this file any more with Squid-2.5, as Squid now
Hi all,
I have an internet cafe connected via a not-so-fast leased line (64k).
I definitely need to use a caching proxy. I currently use squid, and
it works fine. However if one of the terminals has a 'power surfer',
they tend to use all of the bandwidth leaving not much for the other
terminals.
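Squid's delay pools are the usual answer for the "power surfer" problem. A minimal squid.conf sketch (the 8 KB/s figure is an arbitrary assumption) that caps each client IP individually:

```
# One class-2 pool: no aggregate cap, ~8 KB/s per client IP
delay_pools 1
delay_class 1 2
delay_access 1 allow all
delay_parameters 1 -1/-1 8000/8000
```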
On Mon, 13 Dec 2004, Daniel Graupner wrote:
See Squid FAQ on how to use Squid inside a firewall.
I did, but in my testing environment there is no firewall at all. There is
no firewall between the cache, the peer and the hosts. Please give me more hints.
So your Squid which reported Network unreachable should be
hi
anyone have an ACL that enables connection from msn messenger clients
where proxy auth is in use?
The messenger client does not seem to be proxy auth aware, so it will
only work if the client has previously entered their credentials
through having requested some other site via a
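One common workaround is to exempt the Messenger gateway from authentication entirely. A hedged squid.conf sketch (the destination domain and acl names are assumptions, not the poster's config):

```
# Let Messenger traffic through without credentials (hypothetical domain)
acl msn dstdomain .gateway.messenger.hotmail.com
http_access allow msn
# Everyone else still has to authenticate
acl authed proxy_auth REQUIRED
http_access allow authed
http_access deny all
```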
Can anyone explain how Squid's httpd accelerator is supposed to work?
How does Squid's cache get updated if a web page is updated? And how do I
tell whether pages are being served from the original web page or from
cache?
--
John
Can anyone explain how Squid's httpd accelerator is supposed to work?
http://www.squid-cache.org/Doc/FAQ/FAQ-20.html#what-is-httpd-accelerator
How does Squid's cache get updated if a web page is updated?
Squid takes into account freshness info about objects, given
by the
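For Squid 2.5, a basic accelerator setup looks roughly like this in squid.conf (the backend hostname is a placeholder):

```
# Forward requests arriving on port 80 to a single backend web server
http_port 80
httpd_accel_host backend.example.com
httpd_accel_port 80
httpd_accel_single_host on
httpd_accel_with_proxy off
```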
Hello,
I have a central proxy BlueCoat and many peripheral squid proxies
the proxy chaining is the following:
Client-squid 2.4stable7-Bluecoat SG-internet
The bluecoat is authenticating users with LDAP and I'm using a cookie to
authenticate users.
The trouble is that a first client
Hi,
I have Squid 2.5.STABLE4 and I have some problems using the http_header_access
command in my squid.conf.
I searched for documentation on Google or the squid-cache site without
success.
Is there documentation about http_header_access implementation?
If you have some links about it, can you
Hi,
I have Squid 2.5.STABLE4 and I have some problems using the
http_header_access command in my squid.conf.
I searched for documentation on Google or the squid-cache
site without success.
Is there documentation about http_header_access implementation?
If you have some links
I have Squid 2.5.STABLE4 and I have some problems using the http_header_access
command in my squid.conf.
I searched for documentation on Google or the squid-cache site without
success.
You can use the squid.conf.default configuration file for information on the
header_access and header_replace tags.
Sk,
We have the same problem here.
The problem with clients typing a proxy server in manually, is that if
that proxy goes offline, all browsing stops - it's much better to use
the WPAD standard (draft-ietf-wrec-wpad-01.txt) to push a proxy.pac
script
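A proxy.pac script is ordinary JavaScript exposing a FindProxyForURL function. A minimal sketch (the proxy hostname and the dot-free-means-intranet rule are assumptions):

```javascript
// Browsers that discovered the PAC via WPAD call this for every request.
function FindProxyForURL(url, host) {
  // Assumption: hostnames without a dot are intranet hosts; go direct.
  if (host.indexOf(".") === -1) {
    return "DIRECT";
  }
  // Try the proxy first; fall back to a direct connection if it is down.
  return "PROXY proxy.example.internal:3128; DIRECT";
}
```

Because the browser honours the fallback list, a dead proxy no longer stops all browsing, which is exactly the failure mode described above.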
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
Right, exactly as I thought. Hence, I presume, with the SSL update,
squid can actually use the generated server-side cert and encrypt the
request to be forwarded to the backend server.
Yes, but you still won't be able to use (browser) client
You must define your HTTP scanning box (the PC with InterScan VirusWall, for
example) as a parent proxy of
your Squid box,
with this tag in squid.conf:
cache_peer your.antivirus.box parent 80 7 default no-query
(80 is the port where your viruswall listens;
7 for no ICP dialog)
And this tag to
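The directive the message breaks off before is presumably never_direct; together the pair would look like this (a sketch, not the poster's exact config):

```
cache_peer your.antivirus.box parent 80 7 default no-query
# Force all requests through the scanning parent, never directly out
never_direct allow all
```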
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
Hmmm.. I didn't know that. Do any open-source ones exist? E.g.
ClamAV?
A quick Internet search on Squid ClamAV gives the following references:
http://viralator.sourceforge.net/
http://www.jackal-net.at/
Thanks Henrik
you are right, I tried reverse with SSL between client and reverse
proxy and it is working, but if I need a certificate to authenticate to
the backend servers it is not working.
Have you another solution ?
Regards
On Mon, 13 Dec 2004 17:11:39 +0100 (CET), Henrik Nordstrom
[EMAIL
On Mon, 13 Dec 2004, Daniel Graupner wrote:
No, my squid is inside a local network so it can only reach webservers inside
this network. To access the internet (e.g. ibm.com) it has to use a peer
which is also inside the network.
Then you are by definition inside a firewall and the FAQ applies
On Mon, 13 Dec 2004, Damian-Grint Philip wrote:
If you can't get around the MTU/DF problem, you can always force the DF
bit off in a particular direction using route maps
This is more or less the same what the patch in Bug #1154 tries to do
automatically on intercepted connections, without
Please reply to the list, and not to me directly.
TopGun Technician wrote:
Adam Aube wrote:
TopGun Technician wrote:
I have spent over 30 hours reading and trying various solutions from the
documentation, FAQ's and the mail archives. No matter what I try, I am
getting access denied.
Post
Hello Teo Ma,
Please note, the updates for Ad-Aware are not at www.lavasoft.com. The
update server is located in the .lavasoft.de.edgesuite.net DNS domain.
If you do not open the domain .lavasoft.de.edgesuite.net the Ad-Aware
update will not work. The Ad-Aware update proxy client has no
On Mon, 13 Dec 2004, David Delamarre wrote:
you are right, I tried reverse with SSL between client and reverse
proxy and it is working, but if I need a certificate to authenticate to
the backend servers it is not working.
I am starting to feel like a parrot now.
If you need a personal client
On Mon, Dec 13, 2004 at 05:48:36PM +0100, Henrik Nordstrom wrote:
On Mon, 13 Dec 2004, John Poltorak wrote:
Port 80
BindAddress 127.0.0.1
What is BindAddress? Can't seem to find this in my Apache documentation..
Is this perhaps an Apache 1.x thingy? Only have Apache 2.X
We currently run 20K users a day on a Sun proxy. The machine is a Solaris 8
3800 with 8 CPUs and 8 GB of RAM. We have 1000 proxy processes running.
Throughput is a little slow, and I'd sure like to know why. But, we are
considering changing over to Squid, and wonder about the performance of
After getting Squid set up on the same host as Zope, I thought I was
getting Squid to serve out cached Zope pages, but it looks as though it is
simply acting as a redirector... Here are a couple of access.log entries:-
1102960638.320 37210 127.0.0.1 TCP_MISS/200 1566 GET
Hi All,
I have a bit odd question :-)
Is it possible to remotely list URLs of cached objects or somehow query
squid about some urls using wildcards ?
I'd like to invalidate some objects in cache but I don't know their urls.
Regards
Konrad.
I tracked the site down by looking in the Squid access logs.
Essentially I would try to update Ad-Aware and watch the proxy server's
access log for the PC's IP address.
Tim
---
Timothy E. Neto
Computer Systems Engineer Komatsu
On Mon, 13 Dec 2004, Romuald Guillou wrote:
The trouble is that a first client authenticates, through a pop-up, with user
name and password user1; he has privilege to surf the internet. A second client,
user2, wants to go to internet sites, bluecoat returns the authentication popup,
the user clicks cancel
Hello,
I have asked this question before, but I still can't figure it out!
When a file is written to cache, where does Squid figure out which cache
directory to store the file in?
I see in store_swapout.c where e->swap_filen and e->swap_dirn are
assigned, but if I write these out with a debug
The Suse firewall was blocking Squid requests.
Now I get the standard "page cannot be displayed" when trying to connect to
a web page.
Kurt
Sorry about that, I responded last night but it didn't go through.
Adam Aube wrote:
Please reply to the list, and not to me directly.
TopGun Technician wrote:
On Mon, 13 Dec 2004, Konrad wrote:
Is it possible to remotely list URLs of cached objects or somehow query squid
about some urls using wildcards ?
I'd like to invalidate some objects in cache but I don't know their urls.
See the purge script, linked in related software.
Regards
Henrik
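For single, known URLs, squidclient can also issue PURGE requests once squid.conf permits the method; a sketch assuming purging is allowed only from localhost:

```
# squid.conf: allow the PURGE method from the local machine only
acl purge method PURGE
http_access allow purge localhost
http_access deny purge

# then, from the shell on the Squid box:
#   squidclient -m PURGE http://www.example.com/object
```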
On Tue, 14 Dec 2004, Glenn Baptista wrote:
Hello Henrik,
Thanks very much for your help. I was not successful in being able to do
digest authentication. Following are details of what I did. Can you please
help me overcome the problem which is reported by squid as a 'Parsing error'
[EMAIL
On 13.12 13:13, Masterson, Patrick wrote:
please configure your mailer to wrap long lines.
We currently run 20K users a day on a Sun proxy. The machine is a
Solaris 8 3800 with 8 CPUs and 8 GB of RAM. We have 1000 proxy
processes running. Throughput is a little slow, and I'd sure like to
On Tue, 2004-12-14 at 00:41, Henrik Nordstrom wrote:
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
Hmmm.. I didn't know that. Do any open-source ones exist? E.g.
ClamAV?
A quick Internet search on Squid ClamAV gives the following references:
http://viralator.sourceforge.net/
On Tue, 2004-12-14 at 00:48, Henrik Nordstrom wrote:
On Mon, 13 Dec 2004, John Poltorak wrote:
Port 80
BindAddress 127.0.0.1
What is BindAddress? Can't seem to find this in my Apache documentation..
Is this perhaps an Apache 1.x thingy? Only have Apache 2.X around..
This
Hello,
How do I make MSN work through Squid?
Thanks
Varun
Hi John,
Squid will only output fatal messages to Syslog. If you want access/cache log
messages sent to syslog, you'll need to do this another way.
If you're doing this on non-windows, non-linux, then please mail me at ssha at
henleycol dot ac.uk. I've done a lot of work with Squid on this
Hello,
We're using a Squid 2.5-STABLE7
(./configure --enable-underscores --enable-gnuregex --disable-ident-lookups
--enable-snmp --enable-err-languages=English --with-pthreads
--mandir=/usr/share/man --enable-storeio=diskd,ufs --enable-auth=ntlm,basic --enable-ex
Hello,
How do I make MSN work through Squid?
http://www.squid-cache.org/Doc/FAQ/FAQ-1.html#ss1.1
If the application can use an HTTP proxy through its proxy
settings, then it may work.
M.
Hello,
We're using a Squid 2.5-STABLE7
(./configure --enable-underscores --enable-gnuregex --disable-ident-lookups
--enable-snmp --enable-err-languages=English --with-pthreads
--mandir=/usr/share/man --enable-storeio=diskd,ufs --enable-auth=ntlm,basic --enable-ex
Hi all,
we are running Squid on a dual Xeon LAMP machine (so it's an LSAMP ;-)
as an accelerator for the web page on the machine.
So far so good, but traffic will grow in the future, so what can I
do to increase the performance or reduce the CPU usage of Squid?
Ok, I can move MySQL and
On Mon, 13 Dec 2004, TopGun Technician wrote:
Does someone out there have the answer?
I am still getting access denied when trying to use Squid cache. I have added
my network 10.10.30.0/24 to the acl and added the lines to allow access.
I have spent hours on this already and have tried
On Tue, 14 Dec 2004, Rodrigo A B Freire wrote:
[2004/12/14 02:51:40, 1] libsmb/ntlmssp.c:ntlmssp_server_auth(549)
ntlmssp_server_auth: failed to parse NTLMSSP:
[2004/12/14 02:51:41, 1] libsmb/ntlmssp.c:ntlmssp_server_auth(549)
ntlmssp_server_auth: failed to parse NTLMSSP:
[2004/12/14 02:51:44,
Hi John,
Sorry to send this reply back to the mailing list, but I can't seem to send
messages to your private address.
NSLOOKUP for warpix.org fails..., and I can't send an attachment to this list
either
Unfortunately, Squid won't send activity output to syslog on its own. There are
other
Hello,
I have two problems with a Linux Squid machine (Squid 2.5STABLE7, Red Hat
Enterprise Linux ES release 3 (Taroon Update 1))
Problem 1: Filedescriptors.
I have reconfigured Squid to use 16384 file descriptors (it's a fairly
busy proxy, highest peak I've seen so far was around 7000
On Mon, 13 Dec 2004, Ow Mun Heng wrote:
On Thu, 2004-12-09 at 06:47, Henrik Nordstrom wrote:
This question is more of a glibc question than a kernel one, and no, you do not
need to edit this file any more with Squid-2.5 as Squid now automatically
works around the glibc limit.
Then should the FAQ be
According to:-
http://www.squid-cache.org/Doc/FAQ/FAQ.html#toc20.3
One way is to leave your httpd running on port 80, but bind the httpd
socket to a specific interface, namely the loopback interface. With Apache
you can do it like this in httpd.conf:
Port 80
BindAddress
Sorry, I had forgotten it ...
All the information is in the default squid.conf.
Thanks all.
Philippe M.
-Original Message-
From: Elsen Marc [mailto:[EMAIL PROTECTED]
Sent: Monday, 13 December 2004 15:23
To: EXT / FOCAL MARTINET Philippe (DSIT-XA/I); [EMAIL PROTECTED]
Subject: RE:
On Mon, 13 Dec 2004, John Poltorak wrote:
Port 80
BindAddress 127.0.0.1
What is BindAddress? Can't seem to find this in my Apache documentation..
Is this perhaps an Apache 1.x thingy? Only have Apache 2.X around..
Is this correct?
Any method which makes Apache listen only on
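BindAddress is indeed an Apache 1.3 directive; in Apache 2.x the address goes into Listen instead, e.g.:

```
# httpd.conf (Apache 2.x): bind only to the loopback interface
Listen 127.0.0.1:80
```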
Sorry about that, I responded last night but it didn't go through.
Adam Aube wrote:
Please reply to the list, and not to me directly.
TopGun Technician wrote:
Adam Aube wrote:
TopGun Technician wrote:
I have spent over 30 hours reading and trying various solutions from the
OK, Tim, now all does work; the problem seemed to be the destination:
.lavasoft.de.edgesuite.net
How did you find it?
Thanks, a lot...
loop.-
- Original Message -
From: Tim Neto [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Cc: loop [EMAIL PROTECTED]
Sent: Monday, December 13,
Squid 2.5-STABLE7: why are my store.log files not being deleted after they
are compressed? I have not edited anything beside having changed the
'rotate' to 30
Thanks.
---
cat /etc/logrotate.d/squid
/var/log/squid/access.log {
daily
rotate 30
copytruncate
compress
notifempty
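For comparison, a complete stanza would need a closing brace and, usually, a postrotate hook so Squid reopens its logs; a sketch with assumed paths:

```
/var/log/squid/store.log {
    daily
    rotate 30
    compress
    notifempty
    postrotate
        /usr/sbin/squid -k rotate
    endscript
}
```

Note also that Squid's internal logfile_rotate setting rotates the same files on its own; running both mechanisms at once can leave stale numbered copies behind.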
On Mon, 13 Dec 2004, John Poltorak wrote:
[crit] (98)Address already in use: make_sock: could not bind to address 127.0.0.1 port 80
Then you have either
- Some other software listening to 127.0.0.1 port 80
- Some other software listening to port 80 (unspecified address)
netstat -an will tell
On Mon, 13 Dec 2004, John Poltorak wrote:
Do I need to configure Squid (or Zope) in some way to keep some pages in
cache? The cache is completely empty.
Try the cacheability engine. It gives reasonably good descriptions of why/why
not some content is cacheable.
Regards
Henrik
Does someone out there have the answer?
I am still getting access denied when trying to use Squid cache. I have
added my network 10.10.30.0/24 to the acl and added the lines to allow
access.
I have spent hours on this already and have tried all suggested from
this forum. No luck yet.
On Mon, 13 Dec 2004, OTR Comm wrote:
When a file is written to cache, where does Squid figure out which cache
directory to store the file in?
By cache directory, do you refer to which cache_dir (when you have
multiple in squid.conf) or which file number within the cache_dir?
If the first then
kurt,
Is there any error in your cache.log? access.log? How
about when restarting squid? What's the result of
squid -k parse
From your previous post, it seems that your acl rule
is blocking your access (access denied error). It has
something to do with your acl but I don't see it from
your
I wish to implement a speed limiter for users in the NT group SlowInternet.
All other users have full speed connections. I do not notice any difference
in speed of the Internet connection using my test SlowInternet account and a
normal account in side by side comparisons in spite of what I
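One way to attach a delay pool to an NT group is via an external acl helper; a squid.conf sketch (the helper path and the 16 KB/s figure are assumptions):

```
# Map the user's NT group membership to an acl (assumed helper path)
external_acl_type nt_group %LOGIN /usr/lib/squid/wbinfo_group.pl
acl slow external nt_group SlowInternet
# One aggregate pool shared by the whole group
delay_pools 1
delay_class 1 1
delay_access 1 allow slow
delay_access 1 deny all
delay_parameters 1 16000/16000
```

The delay_access rules matter: if the test account never matches the acl (for instance because the helper sees a different login format), the pool is simply skipped, which would explain seeing no speed difference at all.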
On Mon, 13 Dec 2004, John Poltorak wrote:
What do I need to do to create such headers?
See your web server documentation / FAQs etc.
Regards
Henrik
Hello Henrik,
Thanks very much for your help. I was not successful in being able to do
digest authentication. Following are details of what I did. Can you
please help me overcome the problem which is reported by squid as a
'Parsing error'
[EMAIL PROTECTED] sbin]# ./squid
2004/12/14 11:10:34|
On Tue, 2004-12-14 at 01:02, Henrik Nordstrom wrote:
On Mon, 13 Dec 2004, David Delamarre wrote:
you are right, I tried reverse with SSL between client and reverse
proxy and it is working, but if I need a certificate to authenticate to
the backend servers it is not working.
I am