2011/6/3 Amos Jeffries squ...@treenet.co.nz:
On 02/06/11 23:10, E.S. Rosenberg wrote:
Hi all,
Does having different maximum sizes for objects in memory and objects
saved on the disk cache have a negative influence on performance or is
using squid that way good/recommended?
Only indirectly.
On Friday 03 June 2011, Amos Jeffries wrote:
On 03/06/11 10:46, E.S. Rosenberg wrote:
If you want them to have a direct connection to the internet you could
use always_direct (or never_direct) (which also exists in squid 2.x).
Something like this:
acl servers src [ips/fqdns]
acl
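The truncated ACL above might be fleshed out like the sketch below, assuming the goal is to let a group of internal servers fetch directly from the internet rather than through a cache_peer (the acl name and the subnet are placeholders, not from the original mail):

```
# hypothetical subnet; replace with your servers' addresses
acl servers src 192.0.2.0/24

# requests from these clients go straight to the origin,
# never through a parent/sibling cache
always_direct allow servers
```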
Hello,
I found a lot of UDP connections coming in to my proxy servers.
I can't find the cause of this one-way traffic to my servers.
(grammar: "connections coming in to my proxy servers")
The sample UDP traffic is as :-
14:00:07.506612 IP 41.209.69.146.10027 > x.x.x.x.65453: UDP, length 30
14:00:07.518118 IP 121.218.37.254.41597
Check the hostname of these IP addresses. They could be DNS replies,
using random ports for source/destinations. Squid can generate tons of
DNS traffic.
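Following the suggestion above, a quick way to check is a reverse lookup on the sending addresses, or watching for port 53 traffic (a command sketch using standard tools; the IPs are the ones from the tcpdump sample):

```
# reverse-resolve the senders; resolvers usually have PTR records
host 41.209.69.146
host 121.218.37.254

# confirm whether the traffic to/from that host is DNS
tcpdump -n udp and host 41.209.69.146
```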
Bal Krishna Adhikari balkris...@subisu.net.np 6/3/2011 6:13 AM
Hello,
I found a lot of UDP connections coming in to my proxy servers.
Regarding the discussion at
http://www.mail-archive.com/squid-users@squid-cache.org/msg79611.html
The ipv6 issues you are seeing are related to the hardware load
balancers in place that govern the response to www.carfax.com. They
are not aware of, and do not respond correctly (per RFC) to ipv6
Amos, is there a way to tell Squid to stop asking for IPv6 (AAAA) records?
We are having problems with other sites not working in the same way.
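The usual answers to the question above depend on the Squid version; a hedged sketch (dns_v4_first is, as far as I know, only available from Squid 3.2 onward, not in 3.1):

```
# Squid 3.2+ only (an assumption for your build):
# prefer IPv4 (A) results over IPv6 (AAAA) when connecting
dns_v4_first on

# Older builds can only avoid IPv6 lookups entirely by being
# compiled with:  ./configure --disable-ipv6
```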
On Thu, Jun 2, 2011 at 2:53 AM, Amos Jeffries squ...@treenet.co.nz wrote:
On 02/06/11 10:07, William Bakken wrote:
The second log line on the last email
On Thursday, June 02, 2011 01:03:06 AM Amos Jeffries wrote:
On 02/06/11 19:41, errno wrote:
Just to confirm:
If I have multiple IP aliases assigned to the same physical NIC, will
there still be port conflicts on an IP-alias-based multi-instance
squid server?
There is rarely a
My log rotation, done by logrotate, doesn't work anymore...
/var/log/squid/*.log {
weekly
rotate 52
size 100M
compress
notifempty
missingok
sharedscripts
postrotate
# Asks squid to reopen its logs. (logfile_rotate 0 is set in squid.conf)
# errors
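A typical postrotate body for this setup might look like the sketch below, assuming squid is in /usr/sbin and that with logfile_rotate 0 in squid.conf, `squid -k rotate` simply closes and reopens the log files:

```
/var/log/squid/*.log {
    weekly
    rotate 52
    size 100M
    compress
    notifempty
    missingok
    sharedscripts
    postrotate
        # ask squid to close and reopen its log files
        /usr/sbin/squid -k rotate
    endscript
}
```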
Setup: Gentoo linux OS on squid and privoxy home lan server
Squid-3.1.12
privoxy-3.0.17
I'm not running an HTTP server, just trying to use squid and privoxy
for my own browsing.
I'm attempting to get started with squid and privoxy. So far using
nearly original config files in both
anyone getting my mails?
OK, I've had squid3 running rock solid for months. I recently migrated from
Ubuntu 9 to 10.04 and now Squid is clearly not caching, though traffic IS
passing through it. My conf is the same as it was before, but now I'm getting
an error in cache.log every time squid gets a request. Any help would be
On Friday, June 03, 2011 02:10:09 PM MrNicholsB wrote:
anyone getting my mails?
affirmative
Setup: Gentoo linux OS on squid and privoxy home lan server
Squid-3.1.12
privoxy-3.0.17
I'm not running an HTTP server, just trying to use squid and privoxy
for my own browsing.
Why not use the ICAP or URL-rewriter functionality built into Squid to
achieve the same results?
On 04/06/2011 02:05, sichent wrote:
Setup: Gentoo linux OS on squid and privoxy home lan server
Squid-3.1.12
privoxy-3.0.17
I'm not running an HTTP server, just trying to use squid and privoxy
for my own browsing.
Why not use the ICAP or URL rewriter functionality built
Well, you've given a nice cache log...
What version of squid are you using? From the Ubuntu repos?
Also, did you try checking the access.log (for HITs)?
I have never seen these errors in squid's cache.log,
but with the extra info I asked for, I think we will manage
to get somewhere...
Regards
The answer is to disable IPv6 in squid, and on the Linux machine and
software.
But we do not know that this is the case...
Do you have a local DNS server on the machine for caching and forwarding?
You can set up squid to use the local DNS server, and on the DNS
server set up specific
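A minimal sketch of the first half of that suggestion, assuming a caching/forwarding resolver is already listening on the loopback address:

```
# squid.conf: query the local caching DNS server
# instead of the ones in /etc/resolv.conf
dns_nameservers 127.0.0.1
```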
Well, if you do want to push an object into the cache, you can do it in a
more elegant way:
export http_proxy=http://localhost:3128 ; wget http://fqdn/object
You can also use it on a big site with a recursive download, onto a RAM drive.
Another tip is to use --delete-after;
this will pull the file into the squid cache.
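Put together, the cache-priming trick above might look like this sketch (the URL placeholder fqdn is from the original mail; squid is assumed to be listening on localhost:3128):

```
# route wget through the local squid instance
export http_proxy=http://localhost:3128

# recursively fetch the site so squid stores each object,
# discarding the local copies once downloaded
wget --recursive --delete-after http://fqdn/object
```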