Christoph,
Actually, I don't like programs that abort gracefully at the drop of
a hat. While not strictly necessary, it would be nicer to have some sort of
wait state: if the program runs out of memory, it writes a message to that
effect to the screen (with the memory for the message allocated up front),
and then waits there so that the user can either free up memory or kill the
program himself.
        While aborting gracefully is more aesthetically pleasant, when you
get down to it, what is the difference between your program dying cleanly
when the system runs out of memory and your program segfaulting? Either way
it's gone. It's not a good thing that programs can't be hardy enough to
withstand a temporary memory shortage.
Think of it this way:
You're a user. Your box runs out of memory. Your box then crashes.
You're a user. Your box runs out of memory. Every program dies gracefully,
but at least the kernel is still running and the login prompt respawns once
everything is done dying.
Which is better? I realize that a simple game isn't really that
important, but how hard would it be to write our own sci_malloc() which
does all of the appropriate checking and waits on error conditions instead
of dying? We'd still be at the mercy of the libraries we're using, but it
would be better than dying on conditions that aren't necessarily fatal.
Really, who among us hasn't run some program with a bug that caused it to
eat all memory, and then killed it? Does it really make sense for
everything else to die because of one errant program?
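For what it's worth, here is a minimal sketch of what I mean by a
wait-and-retry sci_malloc(). The retry interval, the message text, and the
use of sleep() are just my assumptions, not anything we've agreed on:

```c
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

/* Sketch of the proposed sci_malloc(): instead of aborting when malloc()
 * fails, warn the user and wait, giving them a chance to free up memory
 * or kill the program themselves. */
void *sci_malloc(size_t size)
{
    void *p;

    while ((p = malloc(size)) == NULL) {
        /* fprintf() to stderr needs no heap allocation of its own,
         * so this should still work under memory pressure. */
        fprintf(stderr,
                "sci_malloc: out of memory (%lu bytes wanted); "
                "retrying in 5 seconds -- free some memory or kill me\n",
                (unsigned long) size);
        sleep(5);
    }
    return p;
}
```

Library-internal allocations would of course still bypass this, which is
the "mercy of the libraries" problem above.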
-Chris
--
[EMAIL PROTECTED]
"If I had had more time I would have written a shorter letter." - Pascal
Linux Programs: http://cs.alfred.edu/~lansdoct/linux/
Linux - Get there. Today.
Evil Overlord Quote of the Day (www.eviloverlord.com):
99. Any data file of crucial importance will be padded to 1.45Mb in size.