Oops, I accidentally sent this only to Bart. Still getting used to OS X's Mail.app quirks...

On Wednesday, January 15, 2003, at 08:16 AM, Bart Lateur wrote:

On Wed, 15 Jan 2003 03:25:11 +0100 (MET), Louis Pouzin wrote:

Is there a way to handle the error condition "out of memory" ?
This may occur when reading a whole file of unpredictable size.
Recovering control would allow switching to a different method.
You might try to wrap it all in an eval { ... }, but I have some doubts
whether it'll work well. "Out of memory" would seem to be one of those very
low-level errors that leave MacPerl behind in a state of shock. I
doubt you can do much of use with it after that.

$_ = do { local $/; <IN> or warn 'problem with IN' };  # original: warn also fires on an empty file
	$_ = eval { local $/; <IN> };                  # eval traps a fatal error during the read
	warn "problem with IN: $@" if $@;

--
	Bart.
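A fuller sketch of the eval approach Bart describes, with a line-by-line fallback when the slurp dies. This is untested on MacPerl, and the sub name and fallback behavior are my own illustration, not from the thread:

```perl
use strict;
use warnings;

# Try to slurp the whole file inside eval; if the read dies
# (e.g. out of memory), rewind and fall back to line-by-line.
sub slurp_or_fallback {
    my ($path) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!\n";
    my $text = eval { local $/; <$fh> };   # undef $/ => read entire file
    if ($@) {
        warn "slurp failed ($@); falling back to line-by-line\n";
        seek $fh, 0, 0;                    # rewind to start of file
        $text = '';
        while (my $line = <$fh>) {
            $text .= $line;                # per-line processing would go here
        }
    }
    close $fh;
    return $text;
}
```

Whether eval actually catches a genuine out-of-memory condition is exactly the open question above; the fallback only helps if control returns.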
If large files are anticipated, I try to determine their size before slurping, and not slurp if a file seems too large.

$fsize = (stat $file)[7];
if ($fsize < $byte_max) {
    # slurp...
} else {
    # read-by-line...
}

# less precise...
open IN, $file or die "Can't open $file: $!\n";
while (<IN>) { $line_count++ }
close IN;
if ($line_count < $line_max) {
    # reopen and slurp...
} else {
    # reopen and read-by-line...
}

# another way...
open IN, $file or die "Can't open $file: $!\n";
while (<IN>) { $line_count++ }
seek IN, 0, 0; # reset to start of file
if ($line_count < $line_max) {
    # slurp from IN...
} else {
    # read-by-line from IN...
}
close IN;

[# All examples basic but untested; proper variable scoping assumed.]
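Putting the first (stat-based) check together into one runnable piece, here's a minimal sketch; BYTE_MAX and read_file are illustrative names I've chosen, and the threshold would need tuning against MacPerl's actual allocation:

```perl
use strict;
use warnings;
use constant BYTE_MAX => 1_000_000;   # assumed threshold, not a tested value

# Return the whole file as a string if it is small enough to slurp;
# otherwise return the open filehandle for line-by-line reading.
sub read_file {
    my ($path) = @_;
    my $fsize = (stat $path)[7];      # field 7 of stat is size in bytes
    open my $fh, '<', $path or die "Can't open $path: $!\n";
    if (defined $fsize && $fsize < BYTE_MAX) {
        local $/;                      # slurp mode
        my $text = <$fh>;
        close $fh;
        return defined $text ? $text : '';
    }
    return $fh;                        # caller reads line by line
}
```

The caller checks whether it got back a string or a filehandle and proceeds accordingly.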

You can experiment with values for the maxes, and you'll probably find that MacPerl hits its limit before Perl on UNIX would, because of differences in OS memory management. But you can raise the memory allocation for MacPerl (or your CGI, if that's what it is). As Bart suggests, though, out-of-memory problems rough up MacPerl fairly severely...

Look before you slurp!

- Bruce

__bruce__van_allen__santa_cruz__ca__
