Begin forwarded message:
From: Eliot Miranda <[email protected]>
Date: August 17, 2009 8:09:25 PM CEDT
To: Henrik Johansen <[email protected]>
Cc: Stéphane Ducasse <[email protected]>, Marcus Denker <[email protected]>, John McIntosh <[email protected]>
Subject: Re: [Pharo-project] SequenceableCollection do: causes GCs with closure bytecodes
On Mon, Aug 17, 2009 at 7:02 AM, Stéphane Ducasse <[email protected]> wrote:
Begin forwarded message:
From: Henrik Johansen <[email protected]>
Date: August 17, 2009 3:21:53 PM CEDT
To: [email protected]
Subject: [Pharo-project] SequenceableCollection do: causes GCs with closure bytecodes
Reply-To: [email protected]
Something I noticed when looking at ByteArray>>hex: does anyone know why there's GC activity going on in do: with closure images, but not when using the old bytecodes?
Because with the old bytecodes a block activation simply reuses the
BlockContext allocated to implement the block, whereas with the new
bytecodes each block activation allocates a new context. The stack VM
reduces context allocation markedly, so simple examples like the one
below won't allocate: in the stack VM, context allocation is deferred
until the context object is actually used. Contexts are allocated on
process switch, when a block is created, or when thisContext is used.
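One way to observe this allocation pressure from inside the image is to read the VM's GC counters around the loop. A minimal sketch for a Squeak/Pharo image of that era; the use of vmParameterAt: 9 as the incremental-GC count is an assumption:

"Count incremental GCs triggered by the do: loop.
 VM parameter index 9 is assumed to be the incremental GC count."
| intArray before |
intArray := (1 to: 10000000) asArray.
before := SmalltalkImage current vmParameterAt: 9.
intArray do: [:ix | ix yourself].
Transcript showln: 'incremental GCs: ',
    ((SmalltalkImage current vmParameterAt: 9) - before) printString

On a closure-bytecode image with the interpreter VM this count should climb with the number of block activations; on the stack VM it should stay near zero for a loop like this.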
HTH
Even with no closed-over variables in the block, it seems it creates
extra fodder for the garbage collector...
Is it something the stack VM / Improved Garbage Collector will change?
Cheers,
Henry
Simple test:
intArray := (1 to: 10000000) asArray.
TimeProfileBrowser onBlock: [intArray do: [:ix | ix yourself]]
(It's not due to sampling overhead; Time millisecondsToRun: returns
similar runtimes.)
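For reference, the non-profiled timing check mentioned above might look like this (a sketch; absolute numbers will vary by VM and image):

"Wall-clock timing of the same loop, without the profiler."
| intArray ms |
intArray := (1 to: 10000000) asArray.
ms := Time millisecondsToRun: [intArray do: [:ix | ix yourself]].
Transcript showln: ms printString, ' ms'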
Without closures (I used 250):
-941 tallies, 965 msec.
**GCs**
incr 4 totalling 0ms
With closures (I used 414):
-1938 tallies, 1938 msec.
**GCs**
inc 2510 totalling 614ms
_______________________________________________
Pharo-project mailing list
[email protected]
http://lists.gforge.inria.fr/cgi-bin/mailman/listinfo/pharo-project