Steve Fink wrote:

It's late here, but I'll try an answer ;-)


... I was thinking that it might be okay
to say that any single operation (even those that could conceivably
trigger many other operations) must fit all of its allocations within
the total amount of available memory when it starts.

We don't know that. There could be a clone of a PerlArray with 10^6 elements. Or one element containing another element, which is itself a PerlArray, nested (until (program (syntax (or (I (dont (know (ends));-)))))
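To make that concrete, here is a minimal sketch (not Parrot's real data structures or API; the pool, `pmc_header_new` and `parray_clone` are made up for illustration) of why a single clone op can need an unbounded number of fresh headers that you only discover while the op is already running:

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical PMC header and array layout, just to illustrate
     * the argument; this is not Parrot's real layout. */
    typedef struct PMC {
        struct PMC **elements;   /* child PMCs, possibly nested arrays */
        size_t       size;
    } PMC;

    /* Toy header pool with a fixed capacity, standing in for the
     * arena of free PMC headers. */
    #define POOL_CAPACITY 1000
    static size_t headers_used = 0;

    static PMC *pmc_header_new(void)
    {
        if (headers_used >= POOL_CAPACITY)
            return NULL;                 /* pool exhausted */
        headers_used++;
        return calloc(1, sizeof(PMC));
    }

    /* Cloning an aggregate needs one fresh header per element, and
     * recursion multiplies that for nested arrays, so the number of
     * headers a *single* clone op needs is only known after walking
     * the whole structure - it cannot be bounded when the op starts. */
    static PMC *parray_clone(const PMC *src)
    {
        PMC *copy = pmc_header_new();
        if (!copy)
            return NULL;                 /* ran out of headers mid-op */
        copy->size     = src->size;
        copy->elements = src->size ? calloc(src->size, sizeof(PMC *)) : NULL;
        for (size_t i = 0; i < src->size; i++)
            if (!(copy->elements[i] = parray_clone(src->elements[i])))
                return NULL;             /* possibly 10^6 elements deep */
        return copy;
    }

    int main(void)
    {
        /* Build a source array of 10^6 tiny elements outside the pool,
         * then try to clone it: the clone alone needs 10^6 + 1 fresh
         * headers, far more than the pool can ever hold. */
        enum { N = 1000 * 1000 };
        PMC *src = calloc(1, sizeof(PMC));
        src->size     = N;
        src->elements = calloc(N, sizeof(PMC *));
        for (size_t i = 0; i < N; i++)
            src->elements[i] = calloc(1, sizeof(PMC));
        printf("clone %s\n", parray_clone(src) ? "succeeded" : "failed");
        return 0;
    }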

... Or, in short, I'm
saying that maybe it's all right for it to crash even though it could
actually succeed if you ran a DOD immediately before every single
allocation.

We can only increase allocation resources by some heuristic. The growth factor was 4, now it's 1.75, but you can never allocate for the future; you might still have more crashes - and a DOD run does not help when you are in the middle of a clone and have no free headers.

You just get more and more crashes, each causing another DOD run and a bigger allocation, until the pool finally reaches the number of headers the operation needs.
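A minimal sketch of that feedback loop, assuming a simple header pool (the names `dod_run`, `grow_pool` and `get_free_header` are invented for illustration, not Parrot's internals):

    #include <stddef.h>
    #include <stdio.h>

    /* Toy model of the header-pool feedback loop: each "crash"
     * (exhausted free list) triggers a DOD run, and if that frees
     * nothing - as when a clone keeps everything alive - the pool is
     * grown by the heuristic factor (4 in the old code, 1.75 now). */
    #define GROWTH_FACTOR 1.75

    typedef struct HeaderPool {
        size_t total;   /* headers allocated so far            */
        size_t live;    /* headers currently in use (not free) */
    } HeaderPool;

    /* Stand-in for the real collector: while the clone is in
     * progress, nothing is dead, so nothing gets freed. */
    static size_t dod_run(HeaderPool *pool)
    {
        (void)pool;
        return 0;
    }

    static void grow_pool(HeaderPool *pool)
    {
        pool->total = (size_t)(pool->total * GROWTH_FACTOR) + 1;
    }

    /* Allocation path: keep running DOD and growing until a free
     * header turns up.  When nothing can be freed, this degenerates
     * into repeated DOD runs plus growth steps until the pool finally
     * reaches whatever size the current operation happens to need. */
    static void get_free_header(HeaderPool *pool)
    {
        while (pool->live >= pool->total) {   /* free list is empty    */
            size_t freed = dod_run(pool);     /* try to reclaim first  */
            if (freed == 0)
                grow_pool(pool);              /* nothing dead: grow    */
            else
                pool->live -= freed;
        }
        pool->live++;                         /* hand out one header   */
    }

    int main(void)
    {
        HeaderPool pool = { 16, 16 };         /* pool already full     */
        for (size_t i = 0; i < 1000 * 1000; i++)  /* clone wants 10^6  */
            get_free_header(&pool);
        printf("pool grew to %zu headers\n", pool.total);
        return 0;
    }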

Or, in short, it doesn't work (all IMHO, of course).

leo
