I'm not sure I can suggest very much. The most likely cause is a change in Isabelle but it could be something to do with Poly/ML.

PolyML.objSize PolyML.rootFunction will tell you the size of the data that is reachable from the root. It won't include any data that is only reachable from the stacks of running threads, although these will be part of the active heap. There is a function PolyML.objProfile that prints the number of cells of different sizes, and that may give you a clue to what's going on. It distinguishes mutable from immutable cells, but not between different sorts of immutable data.
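For example, evaluating something along these lines at the top level should work (a sketch; objSize reports a size in words, objProfile just prints its table, and the exact output format may vary between Poly/ML versions):

    (* size, in words, of everything reachable from the root *)
    val rootSize = PolyML.objSize PolyML.rootFunction;

    (* print counts of cells of different sizes, mutable and immutable *)
    val () = PolyML.objProfile PolyML.rootFunction;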

If you're having difficulty saving the state, it could be that there's insufficient space left to allocate the buffers needed to do the saving. You could try a smaller value for the -H option: that will cause the heap to be spread over more, but smaller, blocks and would increase the chances that some space can be released before the state is saved.
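For instance, if you are currently requesting a large initial heap, starting with something like

    poly -H 100

(i.e. a smaller figure than whatever you use now; the -H argument is in megabytes) would reduce the initial allocation and let the heap grow in additional blocks instead.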

I've been considering ways to improve the instrumentation of the garbage collector, but so far I haven't got round to implementing anything.

Regards,
David

On 07/07/2011 08:07, Gerwin Klein wrote:
What is the best way to measure memory consumption in Poly/ML?

Background: we're running out of memory after porting some of our
tools and proofs to Isabelle-2011. The main problem seems to be that
we are now producing more objects that could later be shared; we were
able to get memory consumption down by sharing more often (using
PolyML.shareCommonData PolyML.rootFunction, though we're not sure this
is the sane thing to do). It's still too much for saving the state at
the end of the session on a 32-bit system, though. On 64-bit systems,
we seem to need about 12GB for something that previously fit into
about 2GB on 32-bit.
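Concretely, the sharing step we tried was roughly this (a sketch of
what we ran):

    (* measure, share common immutable data, measure again *)
    val sizeBefore = PolyML.objSize PolyML.rootFunction;
    val () = PolyML.shareCommonData PolyML.rootFunction;
    val sizeAfter = PolyML.objSize PolyML.rootFunction;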

We tried various measurements and tried to bisect where in Isabelle
or our code the problem was introduced, but we're only getting
strange results: the blow-up seems to appear with completely
unrelated changes.

My guess is that we didn't measure right. We used PolyML.objSize
PolyML.rootFunction as an approximation of the overall memory
consumption, but that doesn't always seem to correspond to the
actual usage.

What is the best way to figure out more precisely what is going on?

More specifically, I'd like to figure out:
- maximum total memory consumption of the polyml process in a complete program run
- current non-garbage heap consumption at a given point in the program
- a more detailed view of which type of object is how big and occurs how often at a given point

Does anyone have more experience with this kind of thing?

Cheers, Gerwin


_______________________________________________
polyml mailing list
[email protected]
http://lists.inf.ed.ac.uk/mailman/listinfo/polyml