Hi,
I'm the lead programmer at PC & Web Xperience, Inc., and we are
working on building a web-based squidGuard database administration
application. Its sole purpose is to update entries in your copy of the
blacklist databases and in an "allowed" database; it will not modify the
squidGuard.conf file (and may never do so). It is currently being
designed to run out of our PCX Portal (pcxportal.sf.net) and will use a
PostgreSQL database to store each user's additions to their allowed and
blacklist databases. We anticipate either sending you a tarball with
your "update" files or pushing the update to a server you specify and
instructing it to update itself.
Since we will be working from the blacklist tarball (minus all .diff
and .db files) and want to preserve any additions you make between
blacklist updates, I have come up with a different "update" solution.
I'm asking for feedback to see whether anyone is interested and/or sees
ways the process could be improved.
The current solution is this: for every blacklist group (porn, ads,
warez, etc.), the entries that you want to make sure exist (those not
in the robot-generated version) go into a domains.local or urls.local
file. My script then walks the blacklist directory; in any group where
it finds a domains.local or urls.local file, it checks the contents of
that file against the corresponding master text file. Any entry that
does not already exist is appended to the end of the master file
(domains or urls). The db file is then rebuilt with squidGuard -C.
The script I currently use is attached for review and/or use by
anyone interested.
--
James A. Pattie
[EMAIL PROTECTED]
Linux -- SysAdmin / Programmer
PC & Web Xperience, Inc.
http://www.pcxperience.com/
#!/bin/sh
# squidGuard.cron - Updated by James A. Pattie <PC & Web Xperience, Inc.>
# 04/30/2001 - http://www.pcxperience.com/
# Added support to traverse the directories and for any .local files, merge
# in any entries that were defined by the system administrator and are not
# in the distributed blacklist files. The db file for the .local file is
# then regenerated.
# Ex. porn/domains.local would update porn/domains and rebuild porn/domains.db
# no validation is being done to verify the downloaded file, etc.
# This is designed to run from /etc/cron.daily, etc.
echo "squidGuard Blacklist Download"
cd /usr/local/squidGuard/db || exit 1
echo "Running ftp....."
echo "open ftp.ost.eltele.no
user anonymous [EMAIL PROTECTED]
cd /pub/www/proxy/squidGuard/contrib
bin
get blacklists.tar.gz blacklists.tar.gz
close
quit" | ftp -n
echo "File downloaded, extracting..."
gzip -dc /usr/local/squidGuard/db/blacklists.tar.gz | tar xvf -
echo "Blacklists updated"
echo "Changing Ownership"
chown -R squid:squid /usr/local/squidGuard/db/blacklists
echo "Checking for local changes to merge in."
# now check and see if there are any local changes that need to be merged in.
cd blacklists || exit 1
# process each blacklist group directory (globbing instead of parsing ls output)
for d in *
do
if [ -d "$d" ]; then
echo "Processing directory '$d'..."
cd "$d" || continue
if [ -e domains.local ] || [ -e urls.local ]; then
for a in *.local
do
echo " Local file '$a' in '$d'"
db=${a%.local}  # strip the .local suffix to get the master file name
while read LINE
do
# whole-line fixed-string match, so e.g. "sex.com" does not also match "essex.com"
EXISTS=`grep -x -F "$LINE" "$db"`
if [ -z "$EXISTS" ]; then
# append the LINE to the end of the database file.
echo "$LINE" >> "$db"
# echo " Adding $LINE to $db..."
fi
done < "$a"
# now rebuild the database file
/usr/local/bin/squidGuard -C "$db"
echo " Database rebuilt for '$db'"
done
fi # no *.local files exist in this directory.
cd .. # go back to the parent directory
fi
done
echo "Restarting Squid..."
# restart squid
/etc/rc.d/init.d/squid restart