Hello,
I'm writing a library that will require many blocks of binary data
of various sizes (some very large) to be stored on the heap. I'm a
little worried about the effect this will have on the efficiency of
garbage collection. I'm not sure how GHC's GC works these days, but
I seem to remember
GHC does not copy big objects, so don't worry about the copying cost.
(Instead of copying, it allocates big objects to (a contiguous series
of) heap blocks, with no other objects in those blocks. Then the object
can move simply by swizzling the heap-block descriptor.)
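As an illustration of the point above (a hedged sketch, not from the original thread): strict ByteStrings keep their payloads in pinned byte arrays, which the GC never copies, and any heap object larger than GHC's large-object threshold (a few kilobytes) is in any case allocated in its own block(s) rather than copied during collection. The sizes and block count here are arbitrary examples.

```haskell
import qualified Data.ByteString as BS

main :: IO ()
main = do
  -- Allocate several large binary blocks (1 MB .. 8 MB). Each exceeds
  -- the large-object threshold, so the copying collector relinks their
  -- block descriptors instead of memcpy'ing the payloads.
  let blocks = [ BS.replicate (n * 1024 * 1024) 0 | n <- [1 .. 8] ]
  print (sum (map BS.length blocks))
```

So for a library holding many large binary blocks, the copying cost of minor GCs should not grow with the size of those blocks, only with the amount of ordinary (small) live data.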
Simon
I had wanted to CC this to the list, but of course I forgot:
Stephen Pitts <[EMAIL PROTECTED]> wrote:
Is there an easy way to profile stack usage without rebuilding with
ticky-ticky profiling? I have two implementations of an algorithm;
the one with straight lists seems to use constant stack,