You could use grep/sed, etc., and some shell scripts (or perl or python
scripts).  But instead of sed and grep (and regexes, inconsistent results,
and a lot of manual work), what perhaps a lot of folks need is unified,
aggregated logging that actually understands Zope's architecture and uses
mod_perl for the static content and CGIs served by Apache.

I've thought about this a bit, and it seems to me that many sites (at least
bigger ones) often use a combination of technologies (as well as multi-node
server farms) to serve up their site(s).  As such, if you are running stuff
served from Zope, Apache, CGI, other app servers, etc, you want a unified
logging platform.  Add to that the complication of HA/load-balanced/
multi-server farms, and you also want to aggregate logs from all boxes, on
all server platforms, into one unified system.

The answer, it seems to me, is a relational database, mod_perl, and the
example DBI Apache logger code in O'Reilly's _Writing Apache Modules_, with
the following tweaks:

1 - Add persistence to the mod_perl DB connection with Apache::DBI
2 - Tweak the tables to add the following fields:
        box (server node number, for HA farms)
        object_id (used, along with URL to determine what object is 
                being used by X method)
3 - mime-types determined for Apache-served static pages with RegEx
        on URL
4 - Zope-served pages are excluded from Apache's logging, use ZSQL
        methods inside Zope to log (below)
5 - ZSQL method for logging is created.  Logs the relevant stats to
        the relational database
        AUTHENTICATED_USER is logged to user, and <dtml-var id> is 
                logged to object_id - mime_type is the object's mime-type
6 - Insert <dtml-call expr="mylogmethod()"> into standard_html_header
        in top level (all acquired use of standard_html_header will log)
7 - Roll your own report-system using Zope, perhaps...
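To make the schema tweaks and the mime-type-by-regex idea (steps 2 and 3) concrete, here is a minimal sketch using Python and sqlite3 standing in for the mod_perl/MySQL pieces; all table and field names, and the extension map, are illustrative assumptions, not the actual schema:

```python
import re
import sqlite3

# Illustrative unified log table with the extra fields from step 2.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE access_log (
        box        INTEGER,   -- server node number, for HA farms
        remote_ip  TEXT,
        url        TEXT,
        object_id  TEXT,      -- Zope object id; NULL for static hits
        mime_type  TEXT,
        user       TEXT
    )
""")

# Step 3: guess the mime-type of an Apache-served static page from its URL.
EXT_TO_MIME = {"html": "text/html", "gif": "image/gif", "jpg": "image/jpeg"}

def mime_from_url(url):
    m = re.search(r"\.(\w+)$", url)
    if m:
        return EXT_TO_MIME.get(m.group(1).lower(), "application/octet-stream")
    return "text/html"  # assumption: extensionless URLs default to html

conn.execute(
    "INSERT INTO access_log VALUES (?, ?, ?, ?, ?, ?)",
    (1, "10.0.0.5", "/images/logo.gif", None,
     mime_from_url("/images/logo.gif"), None),
)
row = conn.execute("SELECT mime_type FROM access_log").fetchone()
print(row[0])  # image/gif
```

The Zope-side ZSQL method (steps 4-6) would insert into the same table, filling object_id and user instead of deriving the type from the URL.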

I've already started working on something of this sort for a long-term
project I have to implement logging for my company's new HA server farm.
Such an approach might also be a good thing if one is using a ZEO setup.  So
far, I have set up the Apache side of things, logging to MySQL, and I am
working on the Zope side of things.  The real work, though, is not in
logging, but in report generation, which would make a nice Zope product.
Perhaps in a few months, I might have something worth showing...

Any thoughts?


Sean Upton
Senior Programmer/Analyst
Web Infrastructure
The San Diego Union-Tribune

-----Original Message-----
From: Martin Winkler [mailto:[EMAIL PROTECTED]]
Sent: Sunday, December 10, 2000 5:26 PM
Subject: [Zope] access log analyzer (analog, webalizer etc.)

Hi listies,

since we all use names for URLs that are not "standardized" - ending with 
".html", ".gif", etc. - I am wondering how to do good logfile analysis 
with standard tools like those in the subject line. I usually run 
Zope behind an http-accelerated Squid, so I've configured Squid with 
"emulate_httpd_log on". But webalizer doesn't recognize whether a hit is an 
image or html output.

Did someone already write a tool that converts these logfiles like this:

the logfile converter reads the logfile line by line --> checks against the 
ZODB whether the url outputs a "jpg", "gif" or "html" --> converts the line 
in the logfile accordingly (e.g. "/mydir/mymethod" to "/mydir/mymethod.html" 
and "/mydir/mypic" to "/mydir/mypic.jpg") --> then saves the modified logfile 
to disk or pipes it to webalizer?
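A rough sketch of that converter pipeline in Python; the URL_TYPES dict is a stand-in assumption for the real check against the ZODB, and the log line is a made-up example:

```python
import re

# Stand-in for the real ZODB lookup: map each extensionless URL to the
# content type its Zope object actually returns (assumed for this sketch).
URL_TYPES = {"/mydir/mymethod": "html", "/mydir/mypic": "jpg"}

def convert_line(line):
    """Rewrite the request URL in a common-log-format line so that
    analyzers like webalizer can tell images from html output."""
    m = re.search(r'"(GET|POST|HEAD) (\S+) (HTTP/[\d.]+)"', line)
    if not m:
        return line
    url = m.group(2)
    if "." in url.rsplit("/", 1)[-1]:
        return line  # URL already carries an extension
    ext = URL_TYPES.get(url)
    if ext is None:
        return line  # unknown object: leave the line untouched
    return line.replace('"%s %s ' % (m.group(1), url),
                        '"%s %s.%s ' % (m.group(1), url, ext))

line = '1.2.3.4 - - [10/Dec/2000:17:26:00 +0100] "GET /mydir/mypic HTTP/1.0" 200 512'
print(convert_line(line))
```

Run over the whole file line by line, this could write a new logfile to disk or print to stdout for piping straight into webalizer.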

Thanks for your input!


Zope maillist  -  [EMAIL PROTECTED]
**   No cross posts or HTML encoding!  **
(Related lists - )
