Leopold Toetsch wrote:
[snip]
> [1] when we want to thaw/clone big structures, we should have some means
> to estimate the amount of needed headers. If we will not have enough, we
> do a DOD run before clone/thaw and then turn DOD off - it will not yield
> any more free headers anyway. This can avoid a couple of DOD runs that
> do just nothing except burning a lot of cycles and massive cache
> pollution.
> To achieve this, we might call aggregates.elements() first by means of
> the iterator again or with some depth restriction and returning, when we
> reach the free-header limit.

Even with a depth restriction, a recursive estimate can produce
misleading results because of circular references.  Only actually
walking the structure gets the number right, but walking the structure
*twice* would be silly.  So, begin with an estimate of "unknown"
(serialize the integer -1), and then, after the whole structure has
been frozen, seek backwards (if the stream allows it) and replace that
"unknown" with the actual number of PMCs that were serialized.
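The backpatching idea above can be sketched roughly as follows.  This
is a hypothetical illustration, not Parrot's freeze code: `freeze` and
its repr-based "serialization" are stand-ins, and a Python `set` of
object ids plays the role of the seen-PMC table that breaks cycles.

```python
import io
import struct

def freeze(pmcs, out):
    """Write a stream: [count][items...], where count isn't known up front.

    Emit -1 ("unknown"), serialize everything in one pass, then seek
    back and overwrite the placeholder with the real count.
    """
    count_pos = out.tell()
    out.write(struct.pack("<i", -1))        # placeholder: "unknown"
    n = 0
    seen = set()                            # guards against circular references
    stack = list(pmcs)
    while stack:
        pmc = stack.pop()
        if id(pmc) in seen:
            continue
        seen.add(id(pmc))
        data = repr(pmc).encode()           # stand-in for real serialization
        out.write(struct.pack("<I", len(data)))
        out.write(data)
        n += 1
        if isinstance(pmc, (list, tuple)): # descend into aggregates
            stack.extend(pmc)
    end = out.tell()
    out.seek(count_pos)
    out.write(struct.pack("<i", n))         # backpatch the actual count
    out.seek(end)
    return n
```

On thaw, a reader that finds a non-negative count at the front knows
how many headers to reserve before allocating anything; only a stream
that could not be patched (count still -1) would fall back to guessing.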

-- 
$a=24;split//,240513;s/\B/ => /for@@=qw(ac ab bc ba cb ca
);{push(@b,$a),($a-=6)^=1 for 2..$a/6x--$|;print "[EMAIL PROTECTED]
]\n";((6<=($a-=6))?$a+=$_[$a%6]-$a%6:($a=pop @b))&&redo;}
