Greetings! Your project looks interesting, and prompts a few questions /
comments.

> Our goal is to provide up-to-date acl-list with means to
> keep them updated on squid-servers too.

I'd like to make sure that I understand this statement. The squidGuard
configuration page (http://www.squidguard.org/config/) defines the
following components of the squidGuard configuration file:

Path declarations
Time space declarations
Source group declarations
Destination group declarations
Rewrite group declarations
Access control rule declarations

Using those definitions, what will your "up-to-date acl-list" contain?

> I believe that we have found a way to do it, using apt,
> at the moment only on redhat, but more distributions
> could be added.

What tasks, specifically, will "apt" be handling for you?

> By including .diff-files in the db-directory with lines
> with + in front for adding sites and - for removing.
> Should be easy to keep between packages.

While the .diff file system does an excellent job of making changes on
the fly (without downtime), it doesn't address the library issues and
maintenance requirements of a production system. I'll see if I'm capable
of explaining my thoughts on only one cup of coffee. ;-)

In my opinion, blacklist maintenance would be greatly simplified if the
.diff file contents were processed as:

"+" = Ensure that this entry exists in the db
"-" = Ensure that this entry does NOT exist in the db

as opposed to the way they are processed today:

"+" = Add this entry
"-" = Delete this entry

But, we must work with what we have.
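Until then, a maintenance tool could enforce the idempotent interpretation itself before handing anything to squidGuard. Here is a minimal sketch in POSIX shell; the file names and the flat one-entry-per-line list format are my assumptions, not anything squidGuard mandates:

```shell
#!/bin/sh
# apply_diff LIST DIFF
# Apply a .diff file to a plain-text site list idempotently:
#   "+site" = ensure the entry exists in LIST
#   "-site" = ensure the entry does NOT exist in LIST
# (as opposed to squidGuard's literal add/delete).
apply_diff() {
    list=$1; diff=$2
    while IFS= read -r entry; do
        site=${entry#?}    # strip the leading + or -
        case $entry in
            +*) grep -qxF "$site" "$list" || printf '%s\n' "$site" >> "$list" ;;
            -*) grep -vxF "$site" "$list" > "$list.tmp"; mv "$list.tmp" "$list" ;;
        esac
    done < "$diff"
}
```

Because both operations check current state first, re-applying the same diff (or applying "+" for a site that is already listed) is harmless.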

I would expect that you would make use of the blacklists that are
already "maintained" by others, for example:

<http://ftp.ost.eltele.no/pub/www/proxy/squidGuard/contrib/blacklists.tar.gz>

I'd suggest that you merge/dedupe blacklists from at least a couple of
sites to give better coverage. As an example, the blacklist from
univ-tlse1.fr includes a large number of porn sites that are not
included in the ost.eltele.no blacklist:

<ftp://ftp.univ-tlse1.fr/pub/reseau/cache/squidguard_contrib/adult.tar.gz>

You'll need to download the updated blacklists and merge/dedupe them
on a regular basis, then create new db files. Local changes will then
need to be applied to the new db. Additional local changes will probably
be needed between the scheduled major updates. If there is no process in
place to incorporate the local changes into the text files during major
updates (and prior to creating the new db), the local changes will need
to be maintained and applied to each new db (but NOT re-applied with
each interim local update).
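The merge/dedupe step itself is simple since the squidGuard text lists are one entry per line. A sketch, assuming the two tarballs above have been unpacked into directories I've named arbitrarily here:

```shell
#!/bin/sh
# merge_lists OUTPUT INPUT...
# Merge any number of one-entry-per-line blacklist files into OUTPUT,
# sorted with duplicates removed.
merge_lists() {
    out=$1; shift
    sort -u "$@" > "$out"
}

# Hypothetical usage after unpacking both distributions:
# merge_lists merged/porn/domains eltele/porn/domains univ-tlse1/porn/domains
```

The hard part is not the merge; it is re-applying your accumulated local changes afterwards, as described above.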

I have found that research is highly recommended before performing a
local update, so you'll need to provide tools and document the process
for the local admins. Remember that your local updates will include both
additions and deletions, and deletion requests will probably be more
urgent (e.g. the morning after a major update when cnn.com or yahoo.com
somehow snuck into the porn list).

When the "big boss" calls on Monday and wants to know why you've locked
him out of the news, you'll need to know the url that was blocked.  From
that, you'll need to determine if he was blocked by an expressionlist or
a destination group, and if destination group, which one. [SIDEBAR: I
find that a porn expressionlist is invaluable in blocking unlisted porn
sites and stopping searches for porn terms, but you must remember the
expressionlist when researching a blocked url.]

Getting back to the need for research, let's suppose that the boss
complains that this link is blocked as porn:

<http://news.cnet.com/news/0-1005-200-8171528.html?tag=mn_hd>

What local update will you make to give him access?  If you created an
"allow" destination group (and structured your access control
declaration so that "allow" takes precedence over the negative
destinations), you could quickly add the requested url, run an update
and let the boss know the good news in record time. But what if the
listed url was denied because the /porn/urls file contains
"news.cnet.com/news"? How long do you think it will be before the boss
calls you back?

The correct answer is that you do not know what local update is needed
until you find what entry is blocking the url. [The "allow" destination
group can be useful in correcting rare inadvertent blocks caused by an
expressionlist. (The expressionlist probably needs to be modified if the
inadvertent blocks are frequent.) I've had this occur on news stories
that contain specific text in the url; the last one that I remember was
a mainstream news story that included "sex" and "girlfriend" in the url.
It was much easier to add the url to an "allow" list than to try and
rewrite the expressionlist to allow this url without reducing its
effectiveness.]
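For reference, the "allow takes precedence" arrangement could look something like this in squidGuard.conf (the group names, paths and redirect URL are illustrative, not a prescription):

```
dest allow {
        domainlist     allow/domains
        urllist        allow/urls
}
dest porn {
        domainlist     porn/domains
        urllist        porn/urls
        expressionlist porn/expressions
}
acl {
        default {
                pass allow !porn all
                redirect http://proxy.example.com/blocked.html
        }
}
```

Because "allow" appears before "!porn" in the pass line, a url added to allow/urls is passed before the porn group is ever consulted.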

Once you know which destination group is blocking it, how do you
determine which specific entry is blocking any given url (so that you
can create the correct local update to reverse it)? Well, to research
the blocked cnet link above I would use grep to search /porn/domains and
/porn/urls for "cnet.com", then evaluate the results to determine the
correct entry. Note that this method *will not work* if the
"news.cnet.com/news" entry was inadvertently added through a diff file
in a local update, *unless* you also search your archived .diff file
maintenance. [If it was added locally you will need a
"-news.cnet.com/news" diff file entry to remove it, but do you really
want to apply "+news.cnet.com/news" and "-news.cnet.com/news" after the
next major update? Probably not, so you'll need to delete the
"+news.cnet.com/news" from the archive and you don't want to archive the
"-news.cnet.com/news".]

I apologize for the length of this message, but hopefully I've given you
a sense of some of the issues that arise in maintaining the blacklists.
I'd appreciate feedback on any of these issues where my logic may be
off-base. Thanks.

Rick Matthews



-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]]On Behalf Of Roy-Magne Mo
Sent: Friday, December 14, 2001 3:22 AM
To: Ivan E. Moore II
Cc: [EMAIL PROTECTED]
Subject: Re: New project: chastity


On Fri, Dec 14, 2001 at 02:13:49AM -0700, Ivan E. Moore II wrote:
> I'm working on it right now actually.  The only thing I'd like to work
> out (which I haven't bothered to do yet with my current blacklist
> package) is a way for local admins to modify (add/remove) entries and
> be able to maintain those changes through an upgrade.
>
> It would be nice if we had a way for local admins to have local
> overrides for the db files.  Not sure if that's possible or not but it
> would be nice. :)

By including .diff-files in the db-directory with lines with + in front
for adding sites and - for removing. Should be easy to keep between
packages.

> anyways..I'm committed to the Debian portion of this.  In fact I'll do
> an upload to Debian here soon so that it can make it into the 3.0
> release before the freeze hits.

:)

--
carpe noctem
