Rick,

Rick Matthews wrote:

>That's what we need, somebody who will
>run the robot on a consistent, regular basis.
>
If we had to, that's something we would look into doing.  It's
just a matter of having the bandwidth, both for the robot to go out and
crawl and for everyone downloading the resulting tarball.  :)

>cat domains domains.local | sort | uniq > domains.temp
>mv -f domains.temp domains
>

Thanks for the shortcut.  I'll incorporate it into the script.
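For anyone following along, here is roughly what that shortcut does, shown with throwaway sample data in a temp directory (the real script would run against the live blacklist files in place):

```shell
# Demonstration only: set up sample data in a scratch directory.
cd "$(mktemp -d)"
printf 'ads.example.com\nporn.example.com\n' > domains
printf 'ads.example.com\nlocal.example.net\n' > domains.local

# Rick's shortcut: merge the local additions into the master list,
# dropping duplicate entries, then replace the original atomically-ish.
cat domains domains.local | sort | uniq > domains.temp
mv -f domains.temp domains
```

After the merge, `domains` holds the union of both files with duplicates removed (`sort -u` would be an equivalent one-step form).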

>
>So instead of entering my changes into domains.diff and urls.diff files,
>I pay you so that I can enter them into a web page instead, right?
>
>I just don't see who would see value in that.
>
We are targeting the people who want to run squidGuard but are not as
technically savvy: folks who are familiar with "Winblows" interfaces and
don't want to have to log in to a box and use vi, joe, pico, etc.

Our "service" would include a set of scripts that run on your box to
download the resulting tarball of changes and integrate it into your
database.  The scripts would do all the work of updating your squidGuard
setup (database files only, of course) and restarting squid, which is what
90% of people want.
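A nightly client-side script along those lines might look like the sketch below. The URL and directory are placeholders I've made up for illustration, and it's shown in dry-run form (it records the steps instead of executing them, since a real run needs network access and a squid install); `squidGuard -C all` and `squid -k reconfigure` are the standard commands to rebuild the .db files and have squid reload:

```shell
# Hypothetical update script -- URL and paths are assumptions, not
# the actual service endpoints.
TARBALL_URL="http://blacklists.example.com/squidGuard-db.tar.gz"
DB_DIR="/usr/local/squidGuard/db"

# Dry-run helper: log each step instead of executing it.
STEPS=""
run() { STEPS="$STEPS$*; "; echo "would run: $*"; }

run wget -q -O /tmp/squidGuard-db.tar.gz "$TARBALL_URL"
run tar -xzf /tmp/squidGuard-db.tar.gz -C "$DB_DIR"
run squidGuard -C all          # rebuild the .db files from the text lists
run squid -k reconfigure       # tell the running squid to pick up the change
```

Dropping the `run` wrapper turns it into the real thing, suitable for a cron entry.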

-- 
James A. Pattie
[EMAIL PROTECTED]

Linux  --  SysAdmin / Programmer
PC & Web Xperience, Inc.
http://www.pcxperience.com/
