Yeah, I looked at that. BTW, I searched for an existing script and couldn't find one, so I guess now I actually have to _work_ and _hack out my own_.
Oh, well.... Something to keep me out of trouble for a while. </whine>

Joe

----- Original Message -----
From: "Rick Matthews" <[EMAIL PROTECTED]>
To: "Joe Newby" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
Sent: Thursday, August 29, 2002 8:16 AM
Subject: RE: SquidGuard with SARG / Analyzing the log

> I've attached an 800 byte log file clip so that you can see the format.
> No, SARG cannot produce reports from it, but average script-writing
> skills can give you a pretty good feel for what is going on.
>
> FWIW.
>
> Rick
>
> > -----Original Message-----
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED]] On Behalf Of Joe Newby
> > Sent: Thursday, August 29, 2002 6:38 AM
> > To: [EMAIL PROTECTED]
> > Subject: Re: SquidGuard with SARG / Analyzing the log
> >
> > This is all well and good; however, SARG will only accept one log
> > variable (i.e., /var/log/squid/access.log), and squidGuard does not
> > write logs in the same format as squid.
> >
> > We have also used SARG very successfully for nearly 2 years, and it
> > is frustrating not to be able to see the sites that were blocked.
> >
> > Joe
> >
> > ----- Original Message -----
> > From: "Rick Matthews" <[EMAIL PROTECTED]>
> > To: "Elmar" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
> > Sent: Thursday, August 22, 2002 10:00 PM
> > Subject: RE: SquidGuard with SARG / Analyzing the log
> >
> > > > -----Original Message-----
> > > > From: Elmar
> > > > Sent: Thursday, August 22, 2002 4:57 PM
> > > >
> > > > SARG works very well for analyzing the squid log, but I use
> > > > squidGuard at school and would like to monitor which "forbidden"
> > > > sites users have tried to access, and AFAIK that is only written
> > > > to squidGuard's log.
> > >
> > > You can accomplish that by adding logfile and redirect statements to
> > > each destination declaration block. For example:
> > >
> > > dest porn {
> > >     domainlist blacklists/porn/domains
> > >     urllist blacklists/porn/urls
> > >     redirect ......
> > >     logfile /usr/local/squidGuard/log/porn.log
> > > }
> > >
> > > Use a different logfile for each group: aggressive.log, gambling.log,
> > > etc. You will need to first create each of those log files in your
> > > logfile directory and set the proper ownership and permissions, then
> > > bounce squid. Every redirect will then be logged with information
> > > that is extremely useful in research and debugging. Each time a
> > > person is redirected, the entry logged will include the following
> > > (if available): ident, ip, the source group squidGuard placed them
> > > in, the destination group that redirected them, and the url
> > > requested.
> > >
> > > Name them with a 'log' extension and then make sure you have an
> > > /etc/logrotate.d entry that looks something like this:
> > >
> > > --- /etc/logrotate.d --------
> > > /usr/local/squidGuard/log/*.log {
> > >     notifempty
> > >     missingok
> > >     sharedscripts
> > >     weekly
> > >     rotate 5
> > >     copytruncate
> > >     postrotate
> > >         /usr/sbin/squid -k reconfigure
> > >     endscript
> > > }
> > > ---- end -------------------
> > >
> > > Then all of your squidGuard logs will be rotated together.
> > >
> > > After adding the logfile and redirect statements to your destination
> > > blocks, the only redirect statement needed in your acl blocks is the
> > > one in the default acl.
> > >
> > > I think you will find those helpful.
> > >
> > > Rick
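[Editor's note] Since the thread ends with Joe about to hack out his own script, here is a minimal sketch of the kind of summary Rick describes: tallying blocked URLs and client IPs from the per-group redirect logs. The log path is taken from Rick's example config; the exact squidGuard log line layout is NOT shown in this thread (it was in an attachment), so the script deliberately avoids assuming fixed field positions and instead pulls out URLs and IPs with regexes. Treat it as a starting point, not SARG-quality reporting.

```python
#!/usr/bin/env python
# Summarize squidGuard per-group redirect logs.
# ASSUMPTION: log lines contain the requested URL and the client IP
# somewhere on the line (per Rick's description); field order is not
# assumed, so plain regexes are used to extract them.
import glob
import os
import re
from collections import Counter

URL_RE = re.compile(r'https?://\S+')
IP_RE = re.compile(r'\b(?:\d{1,3}\.){3}\d{1,3}\b')

def summarize(lines):
    """Return (url_counts, ip_counts) Counters for an iterable of log lines."""
    urls, ips = Counter(), Counter()
    for line in lines:
        m = URL_RE.search(line)
        if m:
            urls[m.group(0)] += 1
        m = IP_RE.search(line)
        if m:
            ips[m.group(0)] += 1
    return urls, ips

if __name__ == '__main__':
    # Path from Rick's example dest block; adjust for your install.
    for path in sorted(glob.glob('/usr/local/squidGuard/log/*.log')):
        with open(path) as f:
            urls, ips = summarize(f)
        group = os.path.basename(path)              # e.g. porn.log
        print('== %s: %d redirects ==' % (group, sum(urls.values())))
        for url, n in urls.most_common(10):
            print('%6d  %s' % (n, url))
        for ip, n in ips.most_common(10):
            print('%6d  %s' % (n, ip))
```

Run against each group's log (porn.log, gambling.log, ...) after the logfile statements are in place; piping the output to a file gives a rough stand-in for the blocked-sites report SARG can't produce.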
