I use a file called bad_url.squid to hold the sites I want blocked.  I
think I have reached a limit on what it can hold: when I do a
reconfigure, it can take a few minutes for the data to be scanned, and
processing power gets sucked up.  I know there is DansGuardian and a few
other ways to maintain such a list, but can squid itself handle a list
like that?  I have about 19,000 sites to block.
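
For reference, this is roughly the setup in question: a minimal
squid.conf sketch, assuming the file is loaded as a dstdomain ACL (the
ACL name and file path here are illustrative, not taken from my actual
config):

```
# bad_url.squid contains one domain per line, e.g.:
#   .example.com
#   .badsite.net
# A leading dot matches the domain and all of its subdomains.
acl badsites dstdomain "/etc/squid/bad_url.squid"
http_access deny badsites
```

If the entries are url_regex patterns instead of plain domains, every
entry is a compiled regular expression and reconfigure time grows much
faster with list size than it does for dstdomain.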

Thanks
