I apologize if this shows up on the list twice. I posted to the
newsgroup yesterday, but it never appeared.

        First of all, I'd like to say that Analog is the fastest log
analysis software I have ever seen. On a fast Intel box it chews
through our ~12M lines of Apache logs in about 2 minutes. Great
software :)

        Now for my problem. We process ~3GB of logs daily for our main
domain. Management likes to see cumulative numbers from day to day, so
I process the log files and produce a computer-readable output file
for Report Magic, as well as a cachefile. The next day, when my
scripts run, they move the previous day's CACHEOUTFILE to the
CACHEFILE filename and process the new logfiles and the CACHEFILE
together to create a cumulative report.
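
        In case it helps, the relevant part of my configuration looks
roughly like this (the paths are placeholders, not our real ones):

        # yesterday's CACHEOUTFILE, renamed to this by my script
        CACHEFILE /stats/cumulative.cache
        # today's raw logs
        LOGFILE /logs/access_log.*
        # write an updated cumulative cache for tomorrow's run
        CACHEOUTFILE /stats/cumulative.cache.new
        # computer-readable output for Report Magic
        OUTPUT COMPUTER
        OUTFILE /stats/report.dat

The wrapper script then just does "mv cumulative.cache.new
cumulative.cache" before the next day's run.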

        The box I'm doing this on has 4GB of RAM, but Analog still
uses all of it and dies with a "Ran out of memory" error. I've read
the docs on cachefiles and low memory usage, but even with HOSTLOWMEM
3 and a FILEALIAS for our commonly accessed filenames, I still run
out of memory after about two days' worth of data.
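
        For completeness, these are the memory-related lines I added
(the FILEALIAS here is only a made-up illustration of the kind of
collapsing I mean, not our real pattern):

        # most aggressive low-memory setting for the host table
        HOSTLOWMEM 3
        # fold many commonly requested filenames into a single entry
        FILEALIAS /images/* /images/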

        Am I doing something wrong? I see people with over a year's
worth of data in their reports, but at this rate I won't manage even
a week. Can anyone offer some advice?
        
Sam