IMHO you are much better off using squidGuard instead of Squid ACLs when operating with large blacklists.
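
Just as an illustration, a minimal squidGuard setup could look roughly like this (paths and list names are only examples, adjust them to your installation):

  # /etc/squid/squidGuard.conf
  dbhome /var/lib/squidGuard/db
  logdir /var/log/squidGuard

  dest blacklist {
          domainlist blacklist/domains
          urllist    blacklist/urls
  }

  acl {
          default {
                  pass !blacklist all
                  redirect http://localhost/blocked.html
          }
  }

  # squid.conf (Squid 2.x style):
  redirect_program /usr/local/bin/squidGuard -c /etc/squid/squidGuard.conf

If I remember correctly you can also convert the plain text lists into db files with 'squidGuard -C all', which makes lookups a lot faster on large lists.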

There are several blacklists available on the web, sorted by category; you can combine them with your file if you want (sort | uniq).
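
Merging is trivial on the shell, e.g. (file names are just placeholders):

  sort -u your_isa_list.txt some_downloaded_list.txt > blacklist/domains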

I don't know whether squidGuard parses entries like *.domain.tld correctly, but it should be possible to get rid of those '*' with a one-line shell script.
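
Something like this (untested, and it assumes the '*' only shows up as a leading '*.') should do:

  sed 's/^\*\.//' denied_urls.txt > domains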

squidGuard is really fast, but try to improve performance by adding more RAM to your machine. A second disk can't hurt either, but that depends on the load.

Regards, Hendrik.


Flavio Borup wrote:

I have a .TXT file with approximately 1.7 MB of denied URLs.
The denied URLs .TXT was imported from an MS ISA Server web-cache/firewall solution.
The .TXT has one URL per line, sometimes in the format domain.tld, but sometimes in
the format *.domain.tld.

Question 1) Can I use the .TXT as a blacklist in Squid?

Question 2) Can a huge file like that impose severe restrictions on performance?
(Pentium III, 128 MB RAM, 5,200 RPM IDE disk)
