What is the best way to measure memory consumption in polyml?

Background: we're running out of memory after porting some of our tools and 
proofs to Isabelle-2011. The main problem seems to be that we now produce 
more objects that can later be shared; we think that is why we were able to 
get memory consumption down by sharing more often (using 
PolyML.shareCommonData PolyML.rootFunction, not sure if this is the sane 
thing to do). It's still too much to save the state at the end of the session 
on a 32-bit system, though. On 64-bit systems, we seem to need about 12GB for 
something that previously fit into about 2GB on 32-bit.
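
For concreteness, the sharing step is essentially the following (a minimal 
sketch; the byte conversion at the end assumes an 8-byte word on 64-bit):

  (* Share equal immutable substructures reachable from the root
     function and compare PolyML.objSize before and after.
     objSize returns a size in words. *)
  val sizeBefore = PolyML.objSize PolyML.rootFunction;
  val () = PolyML.shareCommonData PolyML.rootFunction;
  val sizeAfter = PolyML.objSize PolyML.rootFunction;
  val savedBytes = (sizeBefore - sizeAfter) * 8;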

We tried various measurements and tried to bisect where in Isabelle or in our 
code the problem was introduced, but we're only getting strange results: the 
bisection keeps pointing at seemingly completely unrelated changes.

My guess is that we aren't measuring the right thing. We used PolyML.objSize 
PolyML.rootFunction as an approximation of the overall memory consumption, 
but it doesn't always seem to correspond to the actual memory usage we 
observe.
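
Would it help to force a full collection before measuring, so that at least 
successive figures are taken against a freshly collected heap and are 
comparable with each other? Something like this sketch:

  (* Run a full garbage collection first, then measure the live
     data reachable from the root function. *)
  val () = PolyML.fullGC ();
  val liveWords = PolyML.objSize PolyML.rootFunction;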

What is the best way to figure out more precisely what is going on?

More specifically, I'd like to figure out:
 - the maximum total memory consumption of the polyml process over a complete 
program run (see the sketch after this list)
 - the current non-garbage heap consumption at a given point in the program
 - a more detailed view of which types of objects are how big and how often 
they occur at a given point
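
For the first point, the closest we have come is asking the OS. On Linux, 
/proc/self/status carries per-process peak figures; a sketch (Linux-specific; 
VmPeak is the peak virtual size, VmHWM the peak resident set):

  (* Pick the peak-memory lines out of /proc/self/status.
     VmPeak = peak virtual size, VmHWM = peak resident set
     ("high water mark"). *)
  fun peakMemLines () =
    let
      val ins = TextIO.openIn "/proc/self/status"
      fun loop acc =
        case TextIO.inputLine ins of
          NONE => (TextIO.closeIn ins; List.rev acc)
        | SOME line =>
            if String.isPrefix "VmPeak" line
               orelse String.isPrefix "VmHWM" line
            then loop (line :: acc)
            else loop acc
    in loop [] end;

Calling peakMemLines () just before the process exits would give the 
high-water marks for the whole run, but it says nothing about which ML 
objects are responsible.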

Does anyone have more experience with this kind of thing? 

Cheers,
Gerwin

