On Sun, 9 Jan 2005, Daniel Navarro wrote:
> It is working, I just wonder how many lines it can support, because the big file is not supported.
How large an acl list can be depends very much on the type of the acl and on how you specify the data.
Common mistakes you should stay away from:
1. Very large regex based lists (url_regex, urlpath_regex etc). These are quite expensive, foremost in CPU usage, which is linear in the number of entries in the list, but also in memory usage, which is rather high per entry. In addition, regex patterns are very hard to get correct in most real-life situations.
2. Specifying IP based ACLs by name (src, dst). If you specify IP addresses by name then Squid needs to make a DNS lookup on each name while parsing the configuration, and this takes quite some time if the list is large, probably longer than anyone is willing to wait. A configuration sketch of both mistakes, with cheaper alternatives, follows below.
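To illustrate both mistakes, a minimal squid.conf sketch (the file names and addresses are made up for the example):

    # 1. Expensive: every request is matched against each pattern in turn,
    #    so CPU cost grows with the number of patterns.
    acl badurls url_regex -i "/etc/squid/badurls.regex"

    # 2. Slow to parse: each name costs a DNS lookup at startup.
    acl partners dst partner1.example.com partner2.example.com

    # Cheaper: plain IP addresses parse without any DNS lookups.
    acl partners dst 192.0.2.10 192.0.2.11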
The main acl types in Squid can support very many entries efficiently:
    src        (client IP)
    dst        (server IP)
    dstdomain  (server hostname / domain)
    proxy_auth (username)
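For large lists of these types you would normally keep the entries in a file, one per line, and point the acl at the file. A minimal sketch (the file paths are just examples):

    acl blockeddomains dstdomain "/etc/squid/blockeddomains.txt"
    acl localnets src "/etc/squid/localnets.txt"

    http_access deny blockeddomains
    http_access allow localnets

Each line of blockeddomains.txt holds one entry, e.g. .example.com to match a whole domain.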
For these the memory usage is approximately 4 times the size of the list, and parsing speed on a P3 450 MHz is approximately 20K entries per second. Runtime lookup time is not very dependent on the size of the list.
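As a rough worked example: a dstdomain list of 100,000 entries totalling about 2 MB would then need on the order of 8 MB of memory and take around 5 seconds to parse on such a machine.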
For the other ACL types the runtime lookup time is linear in the size of the list, which makes them unsuitable for very large lists. The memory usage and parsing speed are about the same as above, except for the regex based acls, where both memory usage and parsing time are significantly higher.
While talking about large acls it is also worth mentioning the external acl interface of Squid. This allows you to instruct Squid to automatically query a backend database of your choice, performing large scale acl lookups in a dynamic and reasonably efficient manner.
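As a rough sketch of how this can look (the helper path, format tokens and ttl are examples; check the external_acl_type documentation for your Squid version):

    external_acl_type bigdb ttl=60 %SRC %DST /usr/local/bin/squid_acl_helper
    acl biglookup external bigdb
    http_access deny biglookup

The helper itself just reads one formatted line per request on stdin and answers OK or ERR. A minimal Python sketch, assuming a flat blocklist file (a real helper would more likely query SQL, LDAP or similar):

    #!/usr/bin/env python
    # Minimal external acl helper: reads "%SRC %DST" lines from Squid
    # and answers OK (match) or ERR (no match).
    import sys

    # Load the lookup table once at startup; the path is an example.
    blocked = set(line.strip() for line in open("/etc/squid/blocked_dst.txt"))

    for line in sys.stdin:
        parts = line.split()
        if len(parts) == 2 and parts[1] in blocked:
            sys.stdout.write("OK\n")
        else:
            sys.stdout.write("ERR\n")
        sys.stdout.flush()   # Squid waits for each answer, so flush every reply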
Regards
Henrik
