Thanks Chris! This helped me a lot!

Regards,

Pablo Romero
RE: [squid-users] Squid slows down when a file with more than 25000 URLs to block is loaded



From: Chris Robertson <[EMAIL PROTECTED]>
Date: Fri, 28 Jan 2005 14:34:49 -0900

-----Original Message-----
From: Pablo Romero [mailto:[EMAIL PROTECTED]]
Sent: Friday, January 28, 2005 2:20 PM
To: [email protected]
Subject: [squid-users] Squid slows down when a file with more than 25000 URLs to block is loaded


Hello

I am running Squid 2.5 STABLE6 on a Pentium 4 1.8 GHz with 256 MB of RAM. I
am hoping to put this proxy into production soon, and Squid is performing
all of its tasks just fine. However, when I began testing the ACLs, it
slowed down dramatically as soon as a blacklist file was loaded. The
configuration is the following:

acl denegar url_regex "/opt/squid/blacklist"


This blacklist file has 25000 sites in it, and it seems to take Squid down.
Can you give me some tips for tuning my proxy? In particular, am I using
the wrong hardware, do I need more RAM, or do I just have to change
something in squid.conf?


I'd appreciate your help.

Regards,

Pablo Romero

url_regex is a last resort, and very CPU intensive. Use dstdomain instead in any instance that you can (i.e. instead of "url_regex site\.domain" use "dstdomain .site.domain") and you'll find performance improves greatly.
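
For example, here is a minimal sketch of the dstdomain version, assuming the
blacklist is rewritten as one domain per line (the file name and the deny
rule below are illustrative, not taken from Pablo's config):

# /opt/squid/blacklist_domains -- one domain per line, e.g.:
#   .badsite.example
#   .ads.example.net
acl denegar dstdomain "/opt/squid/blacklist_domains"
http_access deny denegar

The leading dot makes each entry match the domain and all of its subdomains,
and dstdomain matching is a single lookup per request rather than 25000
regex evaluations, which is why it scales so much better.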

Chris
