On Friday, May 3, 2002, at 09:17 , Teresa Raymond wrote:

> Someone mentioned that sucking a file into an array is not a good idea
> and I read the Perl FAQ on it but still am not sure why this is not a
> good idea, especially because a lot of code posted uses this method.
[..]
{ since wags and mark got the cool stuff, I'll take the leftover tech
stuff. }

the concern is about 'in core' memory management - if the file that
one is slurping in with

        my @input_from_file = <FH>;

is 'not that big to begin with' - then it makes no real difference
either way.
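
( if one wants to hedge that bet, one can let the file's size decide -
a rough sketch only, with the file name and the ten-meg cutoff made up
purely for illustration: )

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $file  = '/tmp/whatBobHandedUs.txt';   # made-up file name
        my $limit = 10 * 1024 * 1024;             # made-up ten-meg cutoff

        open(my $fh, '<', $file) or die "unable to open $file: $!\n";

        if ( (-s $file) < $limit ) {
                my @input_from_file = <$fh>;      # small enough - slurp away
                print scalar(@input_from_file), " lines slurped\n";
        }
        else {
                while ( my $line = <$fh> ) {      # too big - one line at a time
                        # ... chew on $line here ...
                }
        }
        close($fh);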

The problem comes about when suddenly JoeBob, who 'has always done that',
gets handed some 2-gig file... not everyone naturally builds their
machine with over 2 gigs of memory - so the system does the best it
can... and depending upon the OS, that can lead to some interesting
'page faults' and/or swapping cases.

        Can you say 'thrashing'?????

Hence, as a general principle, we advocate the more 'traditional'
line-by-line parsing approach - since this keeps things intrinsically
saner - and in the main it avoids the need to build your instance of
perl with the USE_LARGE_FILE componentry.

Thus, once one has grown accustomed to the stock

        while (<FH>) {
                next if (/^\s*$/);      # blow off blank lines
                chomp;                  # remove ONLY the EOL token
                my $line = $_;          # in case we need the whole line later

                ......

        } # end getting in the <FH>

such animals as

                open(TMPFILE, "> $fileName") ||
                        die "unable to open tempfile: $fileName\n";

         while ( 1 ) {
                 $len = sysread( $dtk_sock, $buf, $mtu_size );
                 unless ( defined $len ) {
                         next if $! =~ /^Interrupted/;   # retry on EINTR
                         die "System read error: $!\n";
                 }
                 last unless $len;                       # 0 bytes read means EOF
                 $ReadCount += $len;

                 $offset = 0;

                 while ($len) {
                         $written = syswrite(TMPFILE, $buf, $len, $offset);
                         die "System write error: $!\n"
                                 unless defined $written;
                         $len    -= $written;
                         $offset += $written;
                         $MostWrote += $written;
                 }
         }

do not seem quite as complex and as scary.

We shift from the line-oriented 'read' to a sysread() of a
given buffer size....

Given that the size of the file being 'downloaded' is bigger
than the in-core memory, this 'no slurpie' approach sorta helps....
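
( the same sysread/syswrite dance works on a plain old file too - a
rough sketch of copying a big file in fixed-size chunks, with the file
names and the 64k buffer made up purely for illustration: )

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $src      = '/tmp/big_input.dat';      # made-up source file
        my $dst      = '/tmp/big_copy.dat';       # made-up destination file
        my $buf_size = 64 * 1024;                 # 64k chunks - tune to taste

        open(my $in,  '<', $src) or die "unable to open $src: $!\n";
        open(my $out, '>', $dst) or die "unable to open $dst: $!\n";
        binmode($in);
        binmode($out);

        my $buf;
        while (1) {
                my $len = sysread($in, $buf, $buf_size);
                die "System read error: $!\n" unless defined $len;
                last unless $len;                 # 0 bytes read means EOF

                my $offset = 0;
                while ($len) {
                        my $written = syswrite($out, $buf, $len, $offset);
                        die "System write error: $!\n" unless defined $written;
                        $len    -= $written;
                        $offset += $written;
                }
        }

        close($in)  or die "close failed on $src: $!\n";
        close($out) or die "close failed on $dst: $!\n";

only the buffer's worth of the file is ever held in memory at a time.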

But to be honest - there are times when that 8-line config file
could just as easily be slurped... but why not do it right the
first time, a la:

http://www.wetware.com/drieux/CS/lang/Perl/Beginners/readConfigFile.txt
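
( and purely as a rough sketch of the usual 'key = value' walk - the
config file name and format below are made up for illustration, not
what is in that file: )

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $configFile = '/tmp/app.conf';          # made-up config file name
        my %config;

        open(my $cfg, '<', $configFile)
                or die "unable to open $configFile: $!\n";

        while (<$cfg>) {
                next if (/^\s*$/);                 # blow off blank lines
                next if (/^\s*#/);                 # and comment lines
                chomp;
                my ($key, $value) = split(/\s*=\s*/, $_, 2);
                $config{$key} = $value if defined $value;
        }
        close($cfg);

        # e.g. print "port is $config{port}\n" if exists $config{port};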

Your Mileage May Vary,
Void where prohibited by law,
do not bend, fold, spindle or mutilate.


ciao
drieux

---

