I'm still working on the script, and it's cleaning up even better; it now
runs cleaner by dumping anything that doesn't match certain specs.
What I'm trying to figure out is this:

I have 42 places for the output to go. One of those is a constant dump; the
others are all chosen based on whether or not there is data in a field. If
there is data in the field, the record is written to a file with the same
name as the data checked. If not, it is written to a global catch-all.
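In case it helps to see the routing rule spelled out: here is a minimal
sketch of that dispatch logic as a lookup. The field values and the
/home/multifax paths are just assumptions to illustrate the idea, not your
real config.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical set of "known" field values; each maps to a file of the
# same name, anything else falls through to the catch-all.
my %known = map { $_ => 1 } qw(102 104 118);

sub target_for {
    my ($field) = @_;
    return $known{$field}
        ? "/home/multifax/$field"      # per-value file
        : "/home/multifax/everyone";   # global catch-all
}
```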

<!-- snip -->

  open OUTFILE, '>', '/home/multifax/everyone'
    or die "Can't open /home/multifax/everyone: $!";
  open OUTFILE1, '>', '/home/multifax/102'
    or die "Can't open /home/multifax/102: $!";

   print OUTFILE "[EMAIL PROTECTED]";

   if ($fields[11] eq '102') {
    print OUTFILE1 "[EMAIL PROTECTED]";
   }

<!-- snap -->

I am wondering whether it is more processor-intensive to open all 42 files
at once, parse the data, and then close all the files, or whether I should
rewrite the parse to open the correct file, dump the data, close that file,
and repeat. I know that programmatically it is probably cleaner to open and
close the files in a loop, since there would be no more copying and pasting,
but I was worried about the processor cost. As it stands, the script takes
about 5 seconds to parse the 557 lines of data.
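One middle ground between the two approaches, offered only as an untested
sketch, is to open each file lazily the first time a record needs it, cache
the handle in a hash, and close everything once at the end. The directory
argument and field index here are assumptions for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %fh;    # cache: filename => open filehandle

# Return a filehandle for $name under $dir, opening it only on first use.
# '>>' appends, so a cached file is never truncated mid-run.
sub handle_for {
    my ($dir, $name) = @_;
    unless ($fh{$name}) {
        open my $out, '>>', "$dir/$name"
            or die "Can't open $dir/$name: $!";
        $fh{$name} = $out;
    }
    return $fh{$name};
}

# Inside the parse loop it might look like (hypothetical):
#   print { handle_for('/home/multifax', $fields[11]) } $record;

# After the loop, close every handle that was actually opened.
close $_ for values %fh;
```

This way a file is only ever opened once per run, however many records hit
it, and files that no record needs are never opened at all.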

If done in a loop, would it look something like this?

<!-- snip -->
   @filees = ('102', '104', '118');

if (grep { $fields[4] eq $_ } @filees) {
  # Outside the grep block, $_ is no longer the matched value,
  # so build the path from the field itself. '>>' appends, so the
  # file isn't truncated every time a matching record comes through.
  open OUTFILE1, '>>', "/home/multifax/$fields[4]"
    or die "Can't open /home/multifax/$fields[4]: $!";
  print OUTFILE1 "[EMAIL PROTECTED]";
  close OUTFILE1;
}

<!-- snap -->


I'm trying to apply what I've learned from you guys, and from breaking my
own code over the last few days, to every script I write.

Thanks,
Robert
