If you run "squidGuard -dC all" from the command line, it will create
.db files for all of the domainlists and urllists. Since this only needs
to be done once, startup time under squid should be improved considerably.
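For completeness, squid hands each request to squidGuard through a
redirector line in squid.conf. A minimal sketch, using the 2002-era
directive names; the binary and config paths are assumptions based on the
layout used later in this mail:

```conf
# squid.conf fragment (paths are illustrative)
redirect_program /usr/local/squidGuard/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
redirect_children 4
```

With the .db's prebuilt, each of those redirector children starts up
without reparsing the text lists.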

 Also, I'm not sure why you're making squidGuard setuid, but that aside,
the permissions I've been using happily so far look something like this:

chown -R root:root /usr/local/squidGuard
chmod -R 755 /usr/local/squidGuard
/usr/local/squidGuard/bin/squidGuard -dC all
chown proxy /usr/local/squidGuard/log/*
chown blmaint /usr/local/squidGuard/db/*/*.db
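Those paths assume a squidGuard.conf along these lines. dbhome, logdir,
dest, domainlist and urllist are standard squidGuard.conf directives; the
"adult" list name below is purely illustrative:

```conf
# squidGuard.conf fragment (list name is hypothetical)
dbhome /usr/local/squidGuard/db
logdir /usr/local/squidGuard/log

dest adult {
    domainlist adult/domains
    urllist    adult/urls
}
```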

 .. squidGuard only needs write access to its log directory (or log
files, to be precise), while read access to the .db's is adequate, provided
they are pregenerated with -C all as another user. I use the blmaint
userid for a blacklist maintainer, who can update the .db files using the
perl scripts supplied with the documentation. Of course, the .db's quickly
lose synchronisation with the text urllists, but the text lists become
irrelevant once blmaint has proper tools to manipulate the .db's directly.
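If you do keep maintaining the text lists, a quick way to spot .db files
that have fallen behind is to compare timestamps. A minimal sketch; the
list_stale helper and the directory layout it expects are my own
illustration, not part of squidGuard:

```shell
#!/bin/sh
# list_stale DBHOME
# Print every .db file that is missing or older than its source text
# list, i.e. a candidate for a rebuild with "squidGuard -C all".
# Assumes the usual dbhome layout: $DBHOME/<list>/domains, <list>/urls.
list_stale() {
    dbhome=$1
    for txt in "$dbhome"/*/domains "$dbhome"/*/urls; do
        [ -f "$txt" ] || continue        # skip unmatched globs
        db="$txt.db"
        if [ ! -f "$db" ] || [ "$txt" -nt "$db" ]; then
            printf '%s\n' "$db"
        fi
    done
}
```

Something like "list_stale /usr/local/squidGuard/db", run as blmaint,
tells you whether a rebuild is worth the trouble.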

On Tue, 26 Feb 2002, Chris Hedemark wrote:
>
> 2) Furthermore I had to "chown www /usr/local/bin/squidGuard" and "chmod 4755
> /usr/local/bin/squidGuard" (though 4500 may have been more appropriate).
> Also I had to "chown -R www /usr/local/squidGuard".
>
> ...
>
> While I am clearly no expert on this software, I have to wonder if
> performance could be dramatically improved if it were doing lookups against a
> proper RDBMS and not having every child process parse the whole blacklist in
> one go on startup.  This would certainly improve startup time, and possibly
> improve lookup time as well.
>
