> (By the way, this could be done automatically - when the real
> time system runs out of memory, it could simply delete some cached
> values, as they can almost always be recalculated.)
People keep toying with this idea, but no one ever seems to implement it. One of the problems is that heap usage can go up as well as down. A thunk like:

  length [1,2,3,4,5,6,7,8,9,10] :: Int

or

  1+1+1+1+1+1+1+1+1+1+1+1+1+1 :: Int

will get smaller when fully evaluated. A thunk like:

  enumFromTo 1 10 :: [Int]

will get larger when fully evaluated.

Another problem is that if you want to be able to revert an object to its original form, you have to keep the unevaluated thunk around. That thunk may be larger than the evaluated object; even if it is smaller, it is still an overhead that we would normally strive to avoid.

Finally, it is hard to determine the size of a thunk, because it often shares some of its structure with other thunks.

-- Alastair Reid [EMAIL PROTECTED] Reid Consulting (UK) Limited http://www.reid-consulting-uk.ltd.uk/alastair/
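A minimal sketch of the two cases described above, using only the base library. The size remarks in the comments describe typical GHC heap representations and are approximate, not measured; the program itself just forces the two kinds of thunk:

```haskell
import Data.List (foldl')

main :: IO ()
main = do
  -- A thunk that shrinks: the chain of unevaluated (+) closures is
  -- larger than the single boxed Int it reduces to.
  let shrinking = foldl' (+) 0 [1 .. 10 :: Int]
  print shrinking  -- 55, a single boxed Int after evaluation

  -- A thunk that grows: one small enumFromTo closure expands into
  -- ten cons cells plus ten boxed Ints when fully forced.
  let growing = enumFromTo 1 10 :: [Int]
  print (length growing)  -- forcing the spine allocates the whole list
  print growing
```

Reverting `growing` back to its thunk form would save heap, but only if the original `enumFromTo 1 10` closure were retained alongside the evaluated list, which is exactly the overhead noted above.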
