Mon, 06 Aug 2001 15:40:50 -0700, Thomas Hallgren <[EMAIL PROTECTED]> writes:

> Regarding the maximum heap size, to avoid letting the heap grow too 
> large, you could perhaps take into account the number of page faults 
> that occur during garbage collection, or the ratio between CPU time and 
> real time...

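For concreteness, here is a minimal sketch of the CPU-time-versus-real-time
heuristic from inside a Haskell program, using the (much newer) GHC.Stats
interface; the name gcLikelyPaging and the factor of 2 are just illustrative
assumptions of mine, and the statistics are only available when the program
is run with +RTS -T:

  import GHC.Stats (RTSStats(..), getRTSStats, getRTSStatsEnabled)
  import Control.Monad (when)

  -- Rough indicator of paging during GC: if collection takes far more
  -- wall-clock time than CPU time, the collector is probably waiting on
  -- page faults, i.e. the heap no longer fits in physical memory.
  gcLikelyPaging :: IO Bool
  gcLikelyPaging = do
    enabled <- getRTSStatsEnabled            -- needs +RTS -T at run time
    if not enabled
      then pure False
      else do
        s <- getRTSStats
        let cpu     = fromIntegral (gc_cpu_ns s)     :: Double
            elapsed = fromIntegral (gc_elapsed_ns s) :: Double
        pure (cpu > 0 && elapsed > 2 * cpu)  -- factor of 2 is arbitrary

  main :: IO ()
  main = do
    paging <- gcLikelyPaging
    when paging $ putStrLn "GC appears to be paging; the heap may be too large"
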
A disadvantage of taking many factors into account is that the same
program will run or fail non-deterministically.

IMHO the default maximum heap size should be well-defined: taken from
an environment variable, a fixed value, or no limit at all. It would
be bad, for example, to set it *automatically* depending on the amount
of free physical memory, because that would lead to exchanges like
the following:
   - your program doesn't compile
   - sorry, works for me
   - compilation dies with out of memory
   - you must have low physical memory; please set a flag: it will
     swap a lot but will finally compile
   - thanks, I don't know why but now it compiled without setting
     any flags.

It would not be a problem if the limit were reached only rarely.
Unfortunately that is not the case.
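
A well-defined default is also easy to state concretely. If I have the
flags right, GHC itself accepts a fixed limit through the RTS's -M
option, and the RTS reads the GHCRTS environment variable, so either
of these gives a reproducible limit (512m is just an example size):

    ghc Main.hs +RTS -M512m -RTS     # explicit, fixed limit for this run
    GHCRTS='-M512m' ghc Main.hs      # the same limit taken from the environment

Either way the behaviour is the same on every machine, which is the point.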

-- 
 __("<  Marcin Kowalczyk * [EMAIL PROTECTED] http://qrczak.ids.net.pl/
 \__/
  ^^                      PLACEHOLDER SIGNATURE
QRCZAK


