Hi all:

I have a script that is merging many files into a new file, by reading one
at a time and adding to the new one. The error I received was "Out of memory
during request for 1016 bytes, total sbrk() is 1186891848 bytes!" (I did not
add the "!"...). This is on a Sun box with 4 Gig, typically between 1 and 2
Gig free. Is there anything I can do to reduce memory usage? None of my
error messages were displayed, so I'm not even sure which statement caused
the error (I check for errors on all but the "closedir" and
"close(INFILE)"). I should mention that I'm processing 100,000 files of an
average of about 6 K each. (Plan B is to reduce the number of files being
processed.) Here is the code (stripped of the error checking):

opendir(INDIR, $indir);
@logfile=grep(/$mask/i, readdir INDIR); # filenames matching the mask
closedir(INDIR);
$nbrfiles=@logfile; # number of files matching mask
open(OUTFILE, ">$outdir$outfile");
for ( $ctr=0; $ctr<$nbrfiles; $ctr++ ) {
    open(INFILE, "$indir$logfile[$ctr]");
    print OUTFILE <INFILE>; # slurps the whole file into a list, then prints it
    close(INFILE);
} 
close(OUTFILE);
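
One thing I'm considering trying (untested, and I'm not sure it would
actually help) is copying each file a line at a time instead of slurping it
with <INFILE> in list context, something like this:

# Untested sketch: same merge, but reads each input file one line at a
# time (scalar <INFILE>) so only a single line is held in memory.
opendir(INDIR, $indir) or die "Can't open $indir: $!";
@logfile = grep(/$mask/i, readdir INDIR);
closedir(INDIR);
open(OUTFILE, ">$outdir$outfile") or die "Can't create $outdir$outfile: $!";
foreach $file (@logfile) {
    open(INFILE, "$indir$file") or die "Can't open $indir$file: $!";
    while ($line = <INFILE>) {    # scalar context: one line per iteration
        print OUTFILE $line;
    }
    close(INFILE);
}
close(OUTFILE);

Would that make a difference, or is the memory going somewhere else?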

Thanks

Rob
