Re: perl output from cron jobs mixing stdout and stderr

2001-06-03 Thread Sisyphus


----- Original Message -----
From: "Bennett Haselton" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Sunday, June 03, 2001 6:23 PM
Subject: RE: perl output from cron jobs mixing stdout and stderr


> Anybody have any ideas on how to fix this?  I'm growing increasingly
> desperate as the logs generated by this program continue to pile up, with
> no known way to (1) single out the programs that generate warnings/errors,
> and (2) preserve the order of the STDOUT and STDERR statements, at the
> same time.
>
> Is it possible to (a) redirect STDERR to a file from *within* a perl
> script (i.e. not on the command line like "perl somefile.pl 2>file.txt"),
> and (b) redirect STDERR to two different places simultaneously?
>
> That would solve my problem -- then I could (1) redirect STDOUT and STDERR
> to the same file (whose name would depend on the date and time), and
> unbuffer them both, so that the order of errors and outputs would be
> preserved, and (2) have STDERR simultaneously printed to another file
> (whose name would also depend on the date and time), so that I could tell
> which jobs generated error output, simply by looking for the existence of
> those files.
>
> -Bennett
>
Hi,
It's an intriguing situation. Let us know what you find. My thinking is that
it would be best to check for the presence of errors and have their
existence set a flag.
There may even be a good way of doing this, but I don't know what it is.
:-))
The best I can come up with is as follows:
On my machine (Win 2k, AS build 626) it seems that if there are no errors,
'$!' returns 'No such file or directory'. If there are any errors, then
unless they are of the type that sets '$!' itself, it returns '' (i.e. empty).
So my thinking is that:
if ($! eq 'No such file or directory') {
    $error_flag = 0;    # there are no errors
}
else {
    $error_flag = 1;    # errors exist
}

But I don't know how reliable that is, or under what circumstances, if any,
it is reliable - perhaps someone can enlighten me on that. If it works for
you (and you're game enough), use it to detect the presence of errors in a
script.
I had a play around with the other error variables and also
'Win32::GetLastError' but found nothing useful.
Surely someone knows of a way of determining whether any of a script's
output came from STDERR.
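
Another possibility that might be worth a try - an untested sketch, and note
it only catches Perl's own warnings and dies, not everything a program might
print to STDERR (the marker file name is invented):

my $error_flag = 0;
$SIG{__WARN__} = sub { $error_flag = 1; print STDERR @_ };
$SIG{__DIE__}  = sub { $error_flag = 1; die @_ };

# ... rest of the script ...

END {
    # leave a marker file behind only when something went wrong
    if ($error_flag) { open MARK, '>errors.flag'; close MARK; }
}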

Cheers,
Rob




Packages and scope

2001-06-03 Thread Lee Goddard

Over the past year I've got myself into the habit of using strict and
warnings, but have never come across this before: can someone please help me
clarify it?

package MIDI::Generate::FromText;
use warnings;
use strict;
use MIDI::Generate::FromFoo;
our @ISA = qw(MIDI::Generate::FromFoo);

my $scale_length = $#{ $SCALES{$scale} } - 1;

Perl says, Global symbol "%SCALES" requires explicit package name.

But %SCALES is a variable declared with 'our' in MIDI::Generate::FromFoo - is
this not inherited along with subroutines and methods?  Am I going to have
to explicitly import/export these?
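
If it helps anyone answer: the nearest I can get on my own (my understanding
is that @ISA only governs method lookup, not variables) is qualifying the
name in full, or exporting it - an untested sketch of both:

# Either qualify the package variable fully in the subclass ...
my $scale_length = $#{ $MIDI::Generate::FromFoo::SCALES{$scale} } - 1;

# ... or have the parent export it explicitly:
#   package MIDI::Generate::FromFoo;
#   require Exporter;
#   our @ISA       = qw(Exporter);
#   our @EXPORT_OK = qw(%SCALES);
# and then, in the subclass:
#   use MIDI::Generate::FromFoo qw(%SCALES);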

Thanks in anticipation
lee




Re: perl output from cron jobs mixing stdout and stderr

2001-06-03 Thread $Bill Luebkert

Bennett Haselton wrote:
> 
> Anybody have any ideas on how to fix this?  I'm growing increasingly
> desperate as the logs generated by this program continue to pile up, with
> no known way to (1) single out the programs that generate warnings/errors,
> and (2) preserve the order of the STDOUT and STDERR statements, at the same
> time.
> 
> Is it possible to (a) redirect STDERR to a file from *within* a perl script
> (i.e. not on the command line like "perl somefile.pl 2>file.txt"), and (b)
> redirect STDERR to two different places simultaneously?
> 
> That would solve my problem -- then I could (1) redirect STDOUT and STDERR
> to the same file (whose name would depend on the date and time), and
> unbuffer them both, so that the order of errors and outputs would be
> preserved, and (2) have STDERR simultaneously printed to another file
> (whose name would also depend on the date and time), so that I could tell
> which jobs generated error output, simply by looking for the existence of
> those files.
> 
> -Bennett
> 
> At 11:03 AM 5/26/2001 -0700, Bennett Haselton wrote:
> This worked (although I switched the order of the "select(STDOUT);" and
> "select(STDERR);" statements so that STDOUT would be select()'ed at the
> end) -- now I get stdout and stderr statements in the right order.
> 
> I'm still trying to solve a problem though: I want the jobs to generate
> output under normal functioning, so I can trace through and see what they
> did.  *And* I want to be alerted if one of them produces some output sent
> to stderr, so I can see what went wrong.  *And* when that happens, I want
> the stdout and stderr output to be in the correct order, so I can figure
> out when the error occurred.
> 
> None of the following methods achieve that:
> 
> 1) Run the jobs with stdout sent to one file and stderr sent to another
> file.  That way, I can easily see which jobs produced error output, and
> even when they don't, I'll have the stdout output if I want to look at
> it.  The problem is that there's no way to tell in which order the program
> printed the statements sent to stdout and stderr.  Even if the output to
> stdout were printed with a timestamp before each line, the output to stderr
> would not be, so there would be no way to merge the output in the two files
> into one file where everything was listed in the order in which it
> happened.
> 
> 2) (What I'm doing now) -- send stdout and stderr both to the terminal, so
> that when they're run as scheduled jobs, the output gets emailed to
> me.  The problem here is, I want to distinguish between jobs that generated
> error output, and jobs that didn't -- without reading through each email to
> see if any error output is shown.  Right now I have my email filter set to
> detect messages containing the text "Use of uninitialized", since almost
> all of my warning messages contain that text, but that's not an ideal
> solution; I want to catch all kinds of messages sent to stderr, not just
> "Use of uninitialized variable" warnings.
> 
> 3) Run the jobs with stdout and stderr directed into the same log file --
> has the same problem as (2), no easy way to detect jobs that generated
> error output.
> 
> Any suggestions?  Can STDERR be redirected to two different places at the
> same time (without piping it through a script that prints input to two
> separate outputs)?  In that case, I could have both STDOUT and STDERR
> printed to one log file (where everything is stored in the correct order),
> and have STDERR also sent to a separate log file where I get alerted if any
> errors are stored -- or just have STDERR sent to the terminal so it gets
> emailed to me.  But is there anything simpler, any standard solution to
> this kind of problem?

mailto:[EMAIL PROTECTED] has a module: Local::TeeOutput

Not sure where you can download it from - maybe check CPAN and AS or email him.
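
Failing that, a tie()d handle can do the tee job in pure Perl - a minimal
untested sketch (the My::Tee name and the file names are invented):

package My::Tee;
sub TIEHANDLE { my ($class, @handles) = @_; bless [@handles], $class }
sub PRINT     { my $self = shift; print {$_} @_ for @$self; 1 }
sub PRINTF    { my $self = shift; my $fmt = shift;
                printf {$_} $fmt, @_ for @$self; 1 }

package main;
open LOG, '>>all.log'    or die "all.log: $!";
open ERR, '>>errors.log' or die "errors.log: $!";
tie *STDERR, 'My::Tee', \*LOG, \*ERR;
print STDERR "this line lands in both files\n";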

-- 
  ,-/-  __  _  _ $Bill Luebkert   ICQ=14439852
 (_/   /  )// //   DBE Collectibles   http://www.todbe.com/
  / ) /--<  o // //  Mailto:[EMAIL PROTECTED] http://dbecoll.webjump.com/
-/-' /___/_<_http://www.freeyellow.com/members/dbecoll/



RE: perl output from cron jobs mixing stdout and stderr

2001-06-03 Thread Bennett Haselton

Anybody have any ideas on how to fix this?  I'm growing increasingly 
desperate as the logs generated by this program continue to pile up, with 
no known way to (1) single out the programs that generate warnings/errors, 
and (2) preserve the order of the STDOUT and STDERR statements, at the same 
time.

Is it possible to (a) redirect STDERR to a file from *within* a perl script 
(i.e. not on the command line like "perl somefile.pl 2>file.txt"), and (b) 
redirect STDERR to two different places simultaneously?

That would solve my problem -- then I could (1) redirect STDOUT and STDERR 
to the same file (whose name would depend on the date and time), and 
unbuffer them both, so that the order of errors and outputs would be 
preserved, and (2) have STDERR simultaneously printed to another file 
(whose name would also depend on the date and time), so that I could tell 
which jobs generated error output, simply by looking for the existence of 
those files.
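
In other words, something like this untested sketch (Unix-only for the
tee(1) part; the file names are invented):

use POSIX qw(strftime);
my $stamp = strftime "%Y%m%d-%H%M%S", localtime;

# (a) STDERR *can* be re-opened onto its own file from within the script:
#     open STDERR, ">errors-$stamp.log" or die "redirect: $!";

# (b) or, for two places at once: route normal output to the combined
#     log, and pipe STDERR through tee(1) so each error line lands both
#     in its own file and, via tee's stdout, in the same combined log.
open STDOUT, ">>combined-$stamp.log" or die "redirect STDOUT: $!";
open STDERR, "| tee errors-$stamp.log >> combined-$stamp.log"
    or die "can't tee STDERR: $!";

# unbuffer both so the combined log keeps the true output order
select STDERR; $| = 1;
select STDOUT; $| = 1;

A zero-length errors-$stamp.log would then mark a clean run, which would
also cover the "which jobs produced errors" half of the problem.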

-Bennett

At 11:03 AM 5/26/2001 -0700, Bennett Haselton wrote:
This worked (although I switched the order of the "select(STDOUT);" and 
"select(STDERR);" statements so that STDOUT would be select()'ed at the 
end) -- now I get stdout and stderr statements in the right order.

I'm still trying to solve a problem though: I want the jobs to generate 
output under normal functioning, so I can trace through and see what they 
did.  *And* I want to be alerted if one of them produces some output sent 
to stderr, so I can see what went wrong.  *And* when that happens, I want 
the stdout and stderr output to be in the correct order, so I can figure 
out when the error occurred.

None of the following methods achieve that:

1) Run the jobs with stdout sent to one file and stderr sent to another 
file.  That way, I can easily see which jobs produced error output, and 
even when they don't, I'll have the stdout output if I want to look at 
it.  The problem is that there's no way to tell in which order the program 
printed the statements sent to stdout and stderr.  Even if the output to 
stdout were printed with a timestamp before each line, the output to stderr 
would not be, so there would be no way to merge the output in the two files 
into one file where everything was listed in the order in which it 
happened.

2) (What I'm doing now) -- send stdout and stderr both to the terminal, so 
that when they're run as scheduled jobs, the output gets emailed to 
me.  The problem here is, I want to distinguish between jobs that generated 
error output, and jobs that didn't -- without reading through each email to 
see if any error output is shown.  Right now I have my email filter set to 
detect messages containing the text "Use of uninitialized", since almost 
all of my warning messages contain that text, but that's not an ideal 
solution; I want to catch all kinds of messages sent to stderr, not just 
"Use of uninitialized variable" warnings.

3) Run the jobs with stdout and stderr directed into the same log file -- 
has the same problem as (2), no easy way to detect jobs that generated 
error output.

Any suggestions?  Can STDERR be redirected to two different places at the 
same time (without piping it through a script that prints input to two 
separate outputs)?  In that case, I could have both STDOUT and STDERR 
printed to one log file (where everything is stored in the correct order), 
and have STDERR also sent to a separate log file where I get alerted if any 
errors are stored -- or just have STDERR sent to the terminal so it gets 
emailed to me.  But is there anything simpler, any standard solution to 
this kind of problem?

 -Bennett

At 11:43 AM 5/18/2001 +0100, Martin Moss wrote:
Hi,

It looks like you've not selected unbuffered output on your STDERR & STDOUT
filehandles. Try this:-

select(STDOUT);
$| = 1;
select(STDERR);
$| = 1;
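
Equivalently, without the select() juggling (IO::Handle ships with Perl):

use IO::Handle;
STDOUT->autoflush(1);
STDERR->autoflush(1);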

I would suggest that as a rule of thumb you shouldn't write ANY cron tasks
which print output to STDOUT. If you are using those email messages sent to
root, perhaps you could have a success/failure message printed to STDOUT
when the script has completed. I always find it more convenient to redirect
STDOUT & STDERR to a log file. Then you can keep all your script's logging
information in one place.

Regards

Marty

 > -----Original Message-----
 > From: [EMAIL PROTECTED]
 > [mailto:[EMAIL PROTECTED]]On Behalf Of
 > Bennett Haselton
 > Sent: Friday 18 May 2001 11:25
 > To: [EMAIL PROTECTED]
 > Subject: perl output from cron jobs mixing stdout and stderr
 >
 >
 > [slightly OT; more UNIX than Windows; send flames by postcard to 1600
 > Pennsylvania Ave, Washington, D.C.]
 >
 > I think this is probably an issue with cron jobs in general and not just
 > perl scripts that are run as cron jobs, but --
 >
 > I have some perl scripts that run as cron jobs and usually generate a lot
 > of output sent to stdout, plus some error output sent to stderr.  The
 > output gets stored as an email message in the mail account file
 > of the u