Yes, I'd like to see your scripts; please send them to me.

I've wondered if it would be worthwhile to check for and remove
redundancies. If xxxpics.com is in the domains file, why should
there be 50 urls in that domain in the urls file? (If the entire
domain is blocked, blocking individual urls in that domain is
superfluous, right?)
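A check like that could be scripted easily enough. Here's a rough
sketch of the idea (my own hypothetical code, not anything from the
distribution): it treats each entry in the urls file as "host/path",
and drops the entry if the host, or any parent domain of it, appears
in the domains file.

```python
def prune_redundant_urls(domains, urls):
    """Drop url entries whose host is already blocked at the domain level.

    domains: iterable of blocked domains (e.g. "xxxpics.com")
    urls:    iterable of squidGuard url entries (e.g. "xxxpics.com/free/pics")
    Returns the url entries that are still worth keeping.
    """
    blocked = {d.strip().lower() for d in domains if d.strip()}

    def covered(host):
        # A host is covered if it, or any parent domain of it,
        # is in the domains file (squidGuard blocks whole subtrees).
        parts = host.split(".")
        return any(".".join(parts[i:]) in blocked for i in range(len(parts)))

    kept = []
    for entry in urls:
        entry = entry.strip()
        if not entry:
            continue
        host = entry.split("/", 1)[0].lower()
        if not covered(host):
            kept.append(entry)
    return kept
```

So with xxxpics.com in the domains file, every urls entry under
xxxpics.com (including subdomains) would be pruned, and only urls in
domains that are not blocked outright would remain.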

I really wish the owner/operator of the robot would break silence
and talk about how the blacklists are maintained. There are items in
the lists and changes that happen week to week that don't seem to
follow any logic (unless it's "random" logic).

Thanks for the offer.

Rick Matthews


-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]] On Behalf Of Ruben
Fagundo
Sent: Thursday, March 28, 2002 6:02 PM
To: Squidguard Mailing List
Subject: Contributed Code - howto



I wrote a few scripts that do the following:

1) Find the domain name for every IP number in the domains file of
any of
the blacklists
2) Add the domain name to a local blacklist (which I call addendum)
3) Check for duplicate entries, and delete them.

This helps me keep a master list of domains matching the ip numbers
in the domains file.  This gives me the best coverage against domain
names and their matching ip numbers.  I suspect the reverse is true
also, but a program like that already exists in the contrib
directory.
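For anyone curious before the scripts arrive, the three steps above
could be sketched roughly like this (this is my own illustration, not
Ruben's actual code; the function name and the injectable resolver
are assumptions I've made so the lookup step can be swapped out):

```python
import ipaddress
import socket

def build_addendum(domains_lines, addendum_lines, resolver=None):
    """For each bare IP number in a blacklist "domains" file, look up its
    domain name and merge it into a local addendum list, skipping duplicates.

    resolver: callable mapping an IP string to a hostname; defaults to a
    reverse DNS lookup, but is injectable so the sketch can run offline.
    """
    if resolver is None:
        resolver = lambda ip: socket.gethostbyaddr(ip)[0]

    # Existing addendum entries, order preserved, duplicates collapsed.
    merged = dict.fromkeys(l.strip() for l in addendum_lines if l.strip())

    for line in domains_lines:
        entry = line.strip()
        if not entry:
            continue
        try:
            ipaddress.ip_address(entry)   # step 1: only bare IPs get resolved
        except ValueError:
            continue                      # already a domain name; leave it be
        try:
            name = resolver(entry)
        except OSError:
            continue                      # lookup failed; skip this IP
        merged.setdefault(name)           # steps 2-3: add unless duplicate

    return list(merged)
```

In practice you'd read the domains and addendum files line by line,
pass them through, and write the result back out as the addendum
blacklist.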

That's it.  If anyone wants these scripts, let me know and I'll send
them to you.  I would be happy to have them included in the contrib
directory of the distribution as well, should that be of value to the
group.

I'm happy to contribute back for such a great product.  I love
squidGuard!

Ruben
