cron may not be the nicest way to go.... I can imagine a scenario where the
site has a problem (and is logging heavily), and just at that moment a size
limit hits, or cron fires on its own schedule, and the site loses exactly
the logging information it needed.
A better solution would be to have a "round-robin" writing of the log file,
and a way to set the max log-file size.
For example:
    if os.path.getsize(log_file) > filesize_limit:
        # seek to an overwrite position in the file; write from there
There are several strategies that make this possible, such as leaving an
"end of log" marker line, finding it, and always resuming writes from
there, or rewriting the entire file...
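As a rough sketch of the size-cap idea (all names here are illustrative,
not actual web2py settings: `cap_log`, `LOG_FILE`, and `SIZE_LIMIT` are
made up), one could keep only the newest half of the file once it grows
past the limit and rewrite it in place:

```python
import os

# Hypothetical names; adjust to whatever settings the app actually exposes.
LOG_FILE = "httpserver.log"
SIZE_LIMIT = 1024 * 1024  # 1 MB cap, chosen arbitrarily for illustration

def cap_log(path=LOG_FILE, limit=SIZE_LIMIT):
    """If the log exceeds `limit`, keep only the newest half and rewrite it."""
    if not os.path.exists(path) or os.path.getsize(path) <= limit:
        return
    with open(path, "rb") as f:
        # seek back half the cap from the end; everything after is the tail
        f.seek(-(limit // 2), os.SEEK_END)
        tail = f.read()
    with open(path, "wb") as f:
        f.write(tail)
```

This can cut a line in half at the truncation point, which is why the
marker-line strategy above is nicer if clean line boundaries matter.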
In any case, solutions exist... in the long run, we can consider whether
an appropriate one should be built into web2py.
I personally like the option of round-robin or scrolling logfiles, as the
concept minimizes information loss.
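For what it's worth, the Python standard library already implements this
round-robin idea in logging.handlers.RotatingFileHandler, which rolls the
log over to numbered backups once it reaches a size cap. A minimal sketch
(the filename, sizes, and logger name are illustrative; this is not how
web2py currently wires its logging):

```python
import logging
from logging.handlers import RotatingFileHandler

# Roll httpserver.log over at 1 MB, keeping up to 3 old copies
# (httpserver.log.1 ... httpserver.log.3); limits are illustrative.
handler = RotatingFileHandler("httpserver.log",
                              maxBytes=1024 * 1024,
                              backupCount=3)
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))

log = logging.getLogger("httpserver")
log.setLevel(logging.INFO)
log.addHandler(handler)

log.info("request served")
```

With backupCount set, the oldest backup is discarded on each rollover, so
disk usage stays bounded at roughly (backupCount + 1) * maxBytes while the
most recent history is always preserved.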
On Sun, May 3, 2009 at 9:45 AM, mdipierro <[email protected]> wrote:
>
> I just os.unlink()
>
> On May 3, 7:40 am, Iceberg <[email protected]> wrote:
> > Is there anyone who deploys web2py on a production server? How do you
> > prune httpserver.log? Otherwise this file grows bigger and bigger.
> > Currently I am planning to os.unlink() it via the cron feature. Is
> > there any more gentle way? Thanks.
> >
>
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups
"web2py Web Framework" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/web2py?hl=en
-~----------~----~----~----~------~----~------~--~---