I run the site Nutz.Org and have been using the analog weblog parser for a
while now and love it. I went through the archives and saw a brief
discussion of problems similar to mine, but no real solution, so I figured
I would bring it up and see if anyone has come up with anything.
My access_log grows by about 125-150MB a day, so it hits Linux's 2GB
file-size limit within a couple of weeks, depending on bursts of traffic.
The problem with this limit is that even if I rotate the logs and gzip a
week at a time after running a weekly report, I can't concatenate the
files into one for a monthly report, because the combined file would be
too big. Has anyone dealt with this problem effectively?
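One possible workaround, sketched below under two assumptions worth checking against the analog docs: that analog accepts configuration commands on the command line via +C, and that "LOGFILE -" makes it read the log from stdin. If both hold, the gzipped weekly logs can be streamed through a pipe, so no single concatenated file over 2GB ever has to exist on disk. The demo below creates two tiny sample "weekly" logs and counts the streamed lines as a stand-in for feeding analog:

```shell
# Create two small sample "weekly" logs to demonstrate the streaming step.
printf 'line1\nline2\n' > week1.log
printf 'line3\n' > week2.log
gzip -f week1.log week2.log

# gzip -dc decompresses and concatenates on the fly without writing a
# combined file. Here we just count lines; the real monthly run would be
# (assuming analog supports +C and "LOGFILE -" -- verify first):
#   gzip -dc week1.log.gz week2.log.gz | analog +C'LOGFILE -'
gzip -dc week1.log.gz week2.log.gz | wc -l
```

Because the decompressed data only ever exists in the pipe, the 2GB limit applies to each weekly .gz file individually, never to the month's worth of log data at once.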
When I wasn't getting much traffic I had a "real time" SQL-based logging
system, which was great, but as traffic increased the sheer number of SQL
inserts brought the system to its knees. I saw the suggestion in the
archives for SQL database logging, and I know that is not a solution for me.
Any help would be much appreciated. Maybe in the near future the people
who maintain analog could add support for pulling multiple logfiles into
a single report. That would remedy my filesize issue.
------------------------------------------------------------------------
This is the analog-help mailing list. To unsubscribe from this
mailing list, send mail to [EMAIL PROTECTED]
with "unsubscribe" in the main BODY OF THE MESSAGE.
List archived at http://www.mail-archive.com/[email protected]/
------------------------------------------------------------------------