That was it... I had *ONE* directory in the path that wasn't allowing me 
through.  I am amazed at how fast this is.  The blacklist is also fairly 
accurate... I have done web searches and it blocks a lot of questionable 
material.  Although, like many other web filters, it also blocks 
"breast cancer", a search I can see some validity to.  Is anyone 
working on a manually maintained blacklist distribution, one where 
people submit URLs that squeeze through the filter?  Or does that even 
happen?

Ruben Fagundo wrote:

> Do you have file permissions all the way down the path into the 
> directory?  As a test, set the shell for nobody in /etc/passwd to tcsh, 
> then log in as nobody and see if you can change directory all the way 
> down to the domains file.  Then see if you can read it.
> 
> Chances are you have a unix file permission problem somewhere down the 
> path where one of the directories does not have the right privileges.
> 
> At 12:46 AM 3/29/02 -0600, Paul Lauss wrote:
> 
>> I have beat my head against the cache directory for 4 days now and I'm 
>> at a loss.  My squidGuard.log says:
>> 2002-03-28 23:51:28 [3625] init domainlist /var/db/squidGuard/ads/domains
>> 2002-03-28 23:51:28 [3625] /var/db/squidGuard/ads/domains: Permission 
>> denied
>> 2002-03-28 23:51:28 [3625] going into emergency mode
>>
>>
>> I have given full permissions to the directory:
>> [/var/db/squidGuard/ads]$ ls -la
>> total 256
>> drwxrwxrwx   2 nobody   nobody        512 Mar 28 23:35 .
>> drwxr-s---  16 root     wheel         512 Mar 27 16:45 ..
>> -rwxrwxrwx   1 nobody   nobody      58682 Mar 27 16:45 domains
>> -rwxrwxrwx   1 nobody   nobody     172032 Mar 27 16:45 domains.db
>> -rwxrwxrwx   1 nobody   nobody       3570 Mar 27 16:45 urls
>> -rwxrwxrwx   1 nobody   nobody      16384 Mar 27 16:45 urls.db
>>
>>
