> ... we are working on building a squidGuard Database
> Administration application (web based).

James, is this something you are creating for a client? Or is it a
public subscription service that you will market? Or maybe you will be
selling the application, or even open-source?

It sounds like it will be a subscription service. Will you be running
some sort of robot on a regular basis to update the blacklists? I would
see that as the primary service that one would be willing to pay for.
The web-based administration would be icing on the cake, but I wouldn't
want to pay for the icing without the cake.

I like the direction that you are heading, but I'm not sure I understand
the purpose of the .local files that you've described. Am I correct that
in this scenario my .local files would be utilized *instead* of updating
my personal database at your site? What does the .local file process
give you that you wouldn't have with the existing .diff files?

In fact, the .local files seem to ignore the fact that it is equally
important that we are able to *remove* entries from the robot-generated
file. For example:

-netidentity.com
-email.com
-internettrafficreport.com
-ancestry.com
-bankmergers.com
-bankrate.com
-bestbuy.com
-web-search.com
-zwire.com/site/news.cfm
-wfaa.com/wfaa/articledisplay
-wired.com/news/business
-wired.com/news/culture
-wired.com/news/politics
-wired.com/news/print
-wired.com/news/technology
-wirednews.com/news/politics
-ananova.com/news/story
-zdnet.com/filters/printerfriendly
-zdnet.com/pcmag
-zdnet.com/zdnn/stories/news
-zdnet.com/zdsubs/yahoo/tree
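Just to make the idea concrete, here's a rough sketch of what applying those removal (and addition) lines to a domains list might look like. This is purely illustrative: I'm assuming the diff format is one entry per line with a `-` prefix for removals and `+` for additions, and `apply_diff` is a hypothetical helper, not anything that ships with squidGuard.

```python
def apply_diff(entries, diff_lines):
    """Apply +/- diff lines to a set of blacklist entries.

    Lines starting with '-' remove an entry; lines starting with '+'
    (or with no prefix) add one.  Assumed format -- the robot's actual
    .diff files may differ.
    """
    result = set(entries)
    for line in diff_lines:
        line = line.strip()
        if not line:
            continue
        if line.startswith("-"):
            result.discard(line[1:])
        elif line.startswith("+"):
            result.add(line[1:])
        else:
            result.add(line)
    return result

# Example: prune two false positives, add one new entry.
domains = {"cnn.com", "email.com", "bestbuy.com"}
updated = apply_diff(domains, ["-email.com", "-bestbuy.com", "+example.test"])
```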

Carrying that thought a bit further, I think there's another feature
that would be popular (if you are maintaining our personal +/- lists on
your server).

If cnn.com turns up in the porn list most of us would take the steps
necessary to remove it from our personal lists. In a perfect world, I'd
like for it to be removed from my list before I have a user complain
about it. The problem is, I wouldn't want to carte blanche accept
someone else's +/- as my own.

I'd like to have a web function that would allow me to review all users'
+/- entries, by group (porn, ads, warez, etc). All of the entries would
be alphabetized as a single group with no indication of the contributing
user. Entries that I am already using would not show on this list. There
would be a check box (or radio button) next to each entry, and I could
click the ones that I wanted and when I submitted the page they would be
added to my personal +/- list.

I think the menu hierarchy could look something like this:

<Review System-Wide Blacklist Changes>
        <Porn>
                <Additions> *Note 1*
                        <Complete List> *Note 2*
                        <Since my Last Update>
                        <Since a specific date>
                <Deletions>
        <Warez>
        <Drugs>
        <Gambling>
        <etc...>

Note 1 - I don't see a need to have separate menu branches for domains
and urls, as long as you can keep them straight on the backside. In
fact, it would be helpful for the urls and domains to be displayed
sorted together.

Note 2 - In each of the lists, there would be some indication of the
"most popular" entries (bold font, or most popular icon, etc...) "Most
popular" would be determined as follows: More than xx% (50%?) of the
users that reviewed the list *and* made at least one selection, also
selected this entry. The default would be to have the "most popular"
intermingled with the other entries in that section (in alpha order).
I'd like to have a clickable option on that page to sort the "most
popular" ahead of the others.
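The "most popular" rule in Note 2 could be computed with something as simple as the following sketch. All the names here are made up for illustration; `selections` maps each reviewing user to the set of entries they picked, and the 50% threshold matches the xx% suggestion above. Users who reviewed but selected nothing are excluded from the denominator, per the rule.

```python
def most_popular(selections, threshold=0.5):
    """Return the entries chosen by more than `threshold` of the users
    who reviewed the list *and* made at least one selection.

    selections: dict mapping user -> set of entries that user selected.
    Users with an empty set reviewed but selected nothing, so they do
    not count toward the denominator.
    """
    active = [picks for picks in selections.values() if picks]
    if not active:
        return set()
    counts = {}
    for picks in active:
        for entry in picks:
            counts[entry] = counts.get(entry, 0) + 1
    return {entry for entry, n in counts.items() if n / len(active) > threshold}

# Three reviewers; one made no selections, so the denominator is 2.
selections = {"alice": {"x.com", "y.com"}, "bob": {"x.com"}, "carol": set()}
popular = most_popular(selections)
```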

I guess that's enough for now. ;-)

Are you sorry you asked?

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]] On Behalf Of James A.
Pattie
Sent: Friday, November 02, 2001 11:19 AM
To: [EMAIL PROTECTED]
Subject: squidGuard db update service


Hi,

    I'm the lead programmer for PC & Web Xperience, Inc. and we are
working on building a squidGuard Database Administration application
(web based).  Its sole purpose is to update entries in your copy of the
blacklists databases and also in an allowed database.  It is currently
being designed to run out of our PCX Portal (pcxportal.sf.net) and will
use a PostgreSQL database to store each user's additions to their allowed
and blacklist databases.  The web-based app will not modify the
squidGuard.conf file (may never do that).  We are anticipating either
sending you a tarball with your "update" files or actually pushing it to
a server you specified and instructing it to update itself.

    Since the scenario we are working on is that we will ship the
blacklist tarball (minus all .diff and .db files) and want to preserve
any additions you have made between blacklist updates, I have come up
with a different "update" solution.  I'm asking for feedback to see if
anyone is interested and/or sees ways the process could be improved.

    The current solution I've come up with is that, for every blacklist
group (porn, ads, warez, etc.), the entries you want to make sure exist
(that aren't in the robot-generated version) go into a domains.local or
urls.local file.  My script then walks the blacklist directory and, for
any group containing a domains.local or urls.local file, checks the
contents of that file against its corresponding text file.  Any entry
that doesn't already exist is appended to the end of the master text
file (domains or urls).  I then rebuild the db files with squidGuard -C.
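For anyone who can't grab the attachment, the walk described above could be sketched roughly like this. This is not the actual script, just an illustration in Python, and it assumes one subdirectory per group under the blacklist root with plain-text domains/urls files; the real process would follow this with a squidGuard -C run to rebuild the .db files.

```python
import os

def merge_local_files(blacklist_root):
    """Walk the blacklist tree; for each group directory that contains a
    domains.local or urls.local file, append any entries that are missing
    from the corresponding master text file (domains or urls).
    """
    for group in os.listdir(blacklist_root):
        group_dir = os.path.join(blacklist_root, group)
        if not os.path.isdir(group_dir):
            continue
        for base in ("domains", "urls"):
            local = os.path.join(group_dir, base + ".local")
            master = os.path.join(group_dir, base)
            if not os.path.isfile(local):
                continue
            existing = set()
            if os.path.isfile(master):
                with open(master) as f:
                    existing = {line.strip() for line in f if line.strip()}
            with open(local) as f:
                missing = [line.strip() for line in f
                           if line.strip() and line.strip() not in existing]
            if missing:
                # Tack the new entries onto the end of the master file.
                with open(master, "a") as f:
                    for entry in missing:
                        f.write(entry + "\n")
```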

    The script I currently use is attached for review and/or use by
anyone interested.

--
James A. Pattie
[EMAIL PROTECTED]

Linux  --  SysAdmin / Programmer
PC & Web Xperience, Inc.
http://www.pcxperience.com/

