On Thu, 24 May 2012 15:33:07 -0400, Sean Kelly <s...@invisibleduck.org>
wrote:
> On May 24, 2012, at 11:39 AM, Steven Schveighoffer wrote:
Can "Out of memory" be an Error? No, because e.g. if I read a user
file that require me to create a large array (> 100 MiB, e.g.) I don't
want to crash, but just tell, that "Dear user, the file can't be
opened because it requires..."
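
(For concreteness, the kind of handling asked for above would look
something like the sketch below, assuming the failed allocation shows up
as core.exception.OutOfMemoryError and that catching it here is
permitted:)

import core.exception : OutOfMemoryError;
import std.stdio : writefln;

// Sketch: allocate a buffer big enough for the whole file, and report
// a failed allocation to the user instead of crashing.
ubyte[] tryLoad(string fileName, size_t needed)
{
    try
    {
        auto buf = new ubyte[](needed);   // may fail for a huge file
        // ... read the file into buf ...
        return buf;
    }
    catch (OutOfMemoryError e)
    {
        writefln("Dear user, %s can't be opened because it requires %s bytes of memory.",
                 fileName, needed);
        return null;
    }
}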
>> Right, out of memory is only an error if your program's invariant
>> depends on the memory allocation.  You can plan for the above easily
>> enough, but not realistically for all tasks and all library code that
>> require allocation.
>> For example, let's say you are restructuring a hash table, and you
>> reallocate some nodes.  You have half transferred over the old
>> structure to the new structure, and you run out of memory.  How to
>> recover from this?
> I think it's fair to expect code that allocates to be exception-safe in
> the face of allocation errors.  I know I'm always very careful with
> containers so that an allocation failure doesn't result in corruption,
> for example.
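
For the hash table case, one way to get that kind of safety is to do
every allocation up front and only commit to the new structure at the
end.  Roughly (just a sketch, not real container code):

// "Allocate first, commit last": every allocation happens before the
// old table is touched, so an OutOfMemoryError in the middle leaves
// the container intact.
struct Node { size_t key; int value; Node* next; }

Node*[] rehash(Node*[] oldBuckets, size_t newLen)
{
    auto newBuckets = new Node*[](newLen);        // may throw; nothing mutated yet
    foreach (head; oldBuckets)
    {
        for (auto n = head; n !is null; n = n.next)
        {
            auto copy = new Node(n.key, n.value); // may throw; old list still whole
            copy.next = newBuckets[n.key % newLen];
            newBuckets[n.key % newLen] = copy;
        }
    }
    return newBuckets;  // the caller swaps this in only after everything succeeded
}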
I don't think it's fair to expect *all* code to be able to safely recover
from an out of memory exception. I pretty much *never* write code that
worries about out of memory errors. One cannot always expect an operation
involving hundreds of allocations to be atomic.
That being said, we should provide a mechanism so you can handle it,
since running out of memory is reliably detectable and quite recoverable
in many situations.
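
One possible shape for such a mechanism (just a sketch; the name and
signature are made up for illustration):

import core.exception : OutOfMemoryError;

// Hypothetical helper: attempt an allocation and return null on
// failure, so code that can actually do something about running out
// of memory gets the chance, while everything else keeps the current
// behavior.
T[] tryAllocArray(T)(size_t n)
{
    try
    {
        return new T[](n);
    }
    catch (OutOfMemoryError e)
    {
        return null;
    }
}

Then the file-loading case above just checks for null and tells the
user, instead of tearing down the whole program.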
-Steve