on 02/07/2001 01:22 PM, Jim Correia at [EMAIL PROTECTED] wrote:
> I know this is the MacPerl list, but I have an anyperl question.
>
> I have a script like so:
>
> #!/usr/bin/perl -w
>
> use strict;
>
> open(OUTPUT, ">out");
>
> my $count = 0;
>
> while ( 1 )
> {
> print "$count\n";
> print OUTPUT "$count\n";
> $count++;
> }
>
> # we terminate before we ever get here
>
> close(OUTPUT);
>
> While it is looping, I hit command-. in MacPerl to nuke it. While it
> runs data is flushed to the terminal window, and the same data ends up
> flushed to the file after the script is nuked even though I didn't
> explicitly auto-flush the file handle.
>
> Problem #1:
>
> Take that same script and run it on a unix machine. Run it from the
> terminal. Hit control-c while it is executing. Notice that all the
> data isn't flushed to the output file.
>
> Problem #2:
>
> Run this same script by exec'ing it from another C program. Kill the
> subprocess at a point where you know it is executing the loop. Notice
> that all of the data isn't flushed to the output file, and that you
> don't get all of the data back on your stdout pipe.
>
> Is there any way to avoid this?
>
> In particular, I want to nuke a program in mid-execution and have perl
> clean up as best as it can at this point, which includes flushing the
> buffers of any open files.
>
> Thanks,
> Jim
>
Put a

$| = 1;

before you start printing, and outside the while loop. That basically sets
your pipes to run 'hot', autoflushing as they go. :) One caveat: $| only
affects the currently selected file handle (STDOUT by default), so to get the
same behaviour on OUTPUT you need to select() it first, or use IO::Handle's
autoflush method.
Just think of it as turning the hot water pipe ON :)
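
Something like this, a rough, untested sketch of the same script with both
handles set hot (the "out" filename is just the one from your example):

#!/usr/bin/perl -w
use strict;

open(OUTPUT, ">out") or die "can't open out: $!";

# $| applies only to the currently selected handle, so turn it on
# for OUTPUT, then switch back to STDOUT and turn it on there too.
select(OUTPUT); $| = 1;
select(STDOUT); $| = 1;
# (equivalently: use IO::Handle; and call OUTPUT->autoflush(1);)

my $count = 0;
while ( 1 )
{
    print "$count\n";           # flushed straight to the terminal
    print OUTPUT "$count\n";    # flushed straight to the file
    $count++;
}

# never reached, but everything printed so far is already on disk
# when the script gets nuked
close(OUTPUT);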
--
Scott R. Godin | e-mail : [EMAIL PROTECTED]
Laughing Dragon Services | web : http://www.webdragon.net/