We are trying to adapt squidGuard for content filtering for K-12 school kids
at our local school districts.  One problem we have is that the kids are
very creative and come up with new domains to visit every day.

One idea we had to solve this is to produce a "Whitelist" of perfectly fine
websites that the kids visit every day, and then use the combined Whitelist
and Blacklists to come up with a daily list of new domains that the kids
have visited that we have never seen before (the "Unknown list").  Then we
could run a robot against the "Unknown list" and see if there are any sites
in that list that need to be added to one of the Blacklists.

Before we start writing a bunch of code, my questions are:

Does anyone see problems with this approach?

Has anyone already tried it and rejected it?

What sort of "pattern" file should we use for this robot?  I looked here ->
http://ftp.teledanmark.no/pub/www/proxy/squidGuard/contrib/squidGuardRobot/
for a pattern file, but didn't find one.

Thank you,


Rob McCarthy
[EMAIL PROTECTED]
