Mike - I have more questions for you.. sorry - you've opened the floodgates by being helpful the first time :)
I'd like to set up my multi-process perl app to support chainsaw as a log
viewer.. I followed the instructions in the FAQ, and it worked.. but only if
chainsaw was up and running. If chainsaw wasn't started, log4perl would die,
complaining it couldn't establish the connection. I discovered the
'silent_recovery' flag - but although this keeps the application from dying
if chainsaw isn't running, it slows things down very significantly, as each
log call attempts (and fails) to establish a connection to the non-existent
chainsaw port. Any ideas how to configure things so I can use chainsaw, but
avoid impacting the performance of the application if it's not running?

Ideally I'd like to leverage Chainsaw's 'SocketHub' receiver, in order to
support multiple/remote chainsaw connections to my perl service.. I suspect
that I'll need some sort of process that serves as a socket hub - one that
accepts multiple connections on port A from my log4perl app, and zero or
more connections on port B from chainsaw - and routes log4perl messages to
any/all chainsaw connections. Have you done anything like this before? Any
existing modules I can reuse for this? All pointers gratefully received..

Thanks

Bob

-----Original Message-----
From: Mike Schilli [mailto:[EMAIL PROTECTED]
Sent: Sunday, November 04, 2007 3:55 PM
To: Strahan, Bob
Cc: Mike Schilli; log4perl-devel@lists.sourceforge.net
Subject: RE: [log4perl-devel] log4perl causing perl process to die (fwd)

On Sun, 4 Nov 2007, Strahan, Bob wrote:

> We do use the 'close_after_write' option... As I mentioned, there are
> multiple concurrent processes continually being spawned by the
> service, each using log4perl to log to the same logfile. So we
> figured we needed to use File::Locked along with close_after_write to
> ensure each process got an exclusive lock on the logfile before
> writing to it.

I see -- the recommended ways of synchronizing access to an appender are
listed in the Log4perl FAQ:

http://log4perl.sourceforge.net/d/Log/Log4perl/FAQ.html#23804

I'm not sure how well they work on Windows, though, but give the 'syswrite'
option a try; that should be the easiest.

-- Mike

Mike Schilli
[EMAIL PROTECTED]

> Let me know if there is a better (more efficient) way to handle
> multiple concurrent processes logging to the same file, e.g. would
> using socket appenders to route log messages to a single log server
> process which handles file i/o from one process be a better option?
>
> > Which version of Windows are you running by the way? On regular XP, it
> > seems to work as expected.
>
> Windows 2003 64-bit server.. I haven't tried it on other flavors of
> Windows.
>
> For now I have worked around the problem by inserting the open() call
> into a retry loop..
>
> #open $fh, "$self->{mode}$self->{filename}"
> #    or die "Cannot write to '$self->{filename}': $!";
> while (1) {
>     last if open $fh, "$self->{mode}$self->{filename}";
> }
>
> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Mike Schilli
> Sent: Saturday, November 03, 2007 6:32 PM
> To: Mike Schilli
> Cc: log4perl-devel@lists.sourceforge.net
> Subject: Re: [log4perl-devel] log4perl causing perl process to die (fwd)
>
> On Fri, 2 Nov 2007, Bob Strahan wrote:
>
> > However, it seems that if certain filesystem operations are
> > performed on the logfile, the logger can execute die(), taking my
> > service down with the following error:
> >
> > Cannot write to 'D:/Program Files (x86)/My App/logs/logfile.txt':
> > Permission denied at D:\Program Files (x86)\My
> > App\lib\perllibs\lib/Log/Dispatch/File.pm line 86.
>
> Hmm, this is Log::Dispatch::File's _open_file() function complaining
> that an open() failed. Does your service open the file after it's been
> running for a while? Typically, Log::Dispatch::File(::Locked) opens the
> file only once unless 'close_after_write' is given.
>
> Which version of Windows are you running by the way? On regular XP, it
> seems to work as expected.
>
> -- Mike
>
> Mike Schilli
> [EMAIL PROTECTED]
>
> > I am using log4perl in a Win32 service that needs to run forever..
> > However, I have encountered a situation where the logger call is
> > executing a die() and causing my service to die...
> >
> > The service spawns multiple child processes which run concurrently
> > but all log to the same logfile.. We're using File::Locked to avoid
> > contention.. Extract from our logger config below..
> >
> > "log4perl.appender.myapp"                   => "Log::Dispatch::File::Locked",
> > "log4perl.appender.myapp.filename"          => "D:/Program Files (x86)/My App/logs/logfile.txt",
> > "log4perl.appender.myapp.mode"              => "append",
> > "log4perl.appender.myapp.close_after_write" => "true",
> > "log4perl.appender.myapp.permissions"       => "0660",
> > Etc..
> >
> > I can reproduce the problem sporadically by simply opening the logfile
> > in Wordpad..
> > I can reproduce it reliably by repeatedly copying the logfile using the
> > test script below
> >
> > #!perl -w
> > use File::Copy;
> > while (1) {
> >     copy("D:/Program Files (x86)/My App/logs/logfile.txt",
> >          "D:/Program Files (x86)/My App/logs/logfileCOPY.txt");
> >     print ".";
> > }
> >
> > Any suggestions on how to defend against users copying or opening the
> > logfile? We should block and retry until open() succeeds, rather than
> > die(), I think.
> >
> > Please let me know if you can help with a patch, workaround, or
> > suggestion.
> >
> > Regards
> >
> > Bob Strahan
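
To make the 'syswrite' pointer from Mike's reply concrete: one way to try it
is sketched below, in the same hash-style config extract used in the original
message. The appender name 'myapp' and the filename are taken from that
extract; the class swap to Log::Log4perl::Appender::File (which is the
appender that understands the syswrite option) is an assumption based on the
FAQ entry, not something confirmed in this thread. With syswrite enabled, each
log call is emitted as a single syswrite() to a file opened in append mode, so
the per-message open/lock/close cycle goes away; how well the atomicity
guarantee holds on Windows 2003 64-bit would still need testing.

"log4perl.appender.myapp"          => "Log::Log4perl::Appender::File",
"log4perl.appender.myapp.filename" => "D:/Program Files (x86)/My App/logs/logfile.txt",
"log4perl.appender.myapp.mode"     => "append",
"log4perl.appender.myapp.syswrite" => 1,

If plain syswrite turns out not to be enough, the Synchronized appender
mentioned in the same FAQ section is the heavier-weight alternative.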
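On the socket-hub question at the top of the thread: there doesn't appear to
be a drop-in module for exactly this, but the hub itself is not much code.
Below is a minimal, untested sketch of the kind of relay process described
there. The port numbers (9991 for log4perl writers, 9992 for Chainsaw viewers)
and the forward-the-bytes-verbatim protocol are assumptions; the log4perl side
would still need a socket appender pointed at the hub with a layout Chainsaw's
receiver understands, as in the FAQ's Chainsaw recipe. Because the hub is
always up, the application's socket appender connects once at startup and
never pays a per-message connection penalty when no Chainsaw is attached.

#!/usr/bin/perl
# Sketch of a stand-alone "socket hub" relay: log4perl writers connect on
# one port, Chainsaw viewers on the other, and everything received from a
# writer is fanned out to all currently connected viewers. With several
# writers, events can interleave mid-message, so a per-writer line buffer
# would be a natural next step.
use strict;
use warnings;
use IO::Socket::INET;
use IO::Select;

$SIG{PIPE} = 'IGNORE';    # don't die if a viewer disappears mid-write

my $writer_port = 9991;   # log4perl socket appenders connect here (assumed)
my $viewer_port = 9992;   # Chainsaw instances connect here (assumed)

my $writer_listener = IO::Socket::INET->new(
    LocalPort => $writer_port, Listen => 5, ReuseAddr => 1)
    or die "Cannot listen on port $writer_port: $!";
my $viewer_listener = IO::Socket::INET->new(
    LocalPort => $viewer_port, Listen => 5, ReuseAddr => 1)
    or die "Cannot listen on port $viewer_port: $!";

my $select = IO::Select->new($writer_listener, $viewer_listener);
my %writers;    # connections we read log events from
my %viewers;    # connections we forward log events to

while (1) {
    for my $sock ($select->can_read) {

        # New connection on one of the two listening ports
        if ($sock == $writer_listener or $sock == $viewer_listener) {
            my $client = $sock->accept or next;
            $select->add($client);
            if ($sock == $writer_listener) { $writers{$client} = $client; }
            else                           { $viewers{$client} = $client; }
            next;
        }

        # Data or EOF on an existing connection
        my $n = sysread($sock, my $buf, 65536);
        if (!$n) {    # closed (or error): forget the connection
            $select->remove($sock);
            delete $writers{$sock};
            delete $viewers{$sock};
            close $sock;
            next;
        }

        next unless exists $writers{$sock};   # ignore chatter from viewers

        # Fan the writer's bytes out to every connected viewer, dropping
        # viewers whose connection has gone away.
        for my $key (keys %viewers) {
            my $viewer = $viewers{$key};
            unless (print {$viewer} $buf) {
                $select->remove($viewer);
                delete $viewers{$key};
                close $viewer;
            }
        }
    }
}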