Michel Fortin wrote:
On 2009-10-06 20:26:48 -0400, Andrei Alexandrescu <seewebsiteforem...@erdani.org> said:

The matter has been discussed quite a bit around here and in other places. I don't have as much time as I'd like to explain things. In short, destroying without freeing memory avoids dangling references and preserves memory safety without impacting other resources.

It's a safety hack, not a performance hack.

In my opinion, it's mostly an illusion of safety. If you call the destructor on an object, the object's state after the call doesn't necessarily respect the object's invariants, and doing anything with it could result in, well, anything, from returning wrong results to falling into an infinite loop (basically undefined behaviour). What you gain is that no object will be allocated on top of the old one, and thus new objects can't get corrupted. But it's still undefined behaviour, only with fewer side effects and more memory consumption.
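
To make the hazard concrete, here's a minimal sketch using present-day D's destroy() as a stand-in for "call the destructor without freeing"; File, fd and write are hypothetical names for illustration:

    import std.stdio : writeln;

    class File
    {
        private int fd = -1;          // -1 means "not open"

        this() { fd = 42; }           // pretend open() gave us a descriptor
        ~this() { fd = -1; }          // pretend we closed it

        // final, so the call doesn't depend on the state of the vtable pointer
        final void write(string s)
        {
            // the invariant "fd is a valid descriptor" no longer holds here
            writeln("writing '", s, "' to fd ", fd);
        }
    }

    void main()
    {
        auto f = new File;
        destroy(f);        // runs ~this(); the memory stays allocated
        f.write("oops");   // memory-safe (no other object gets corrupted),
                           // but the File's invariants are broken: fd is -1
    }

Nothing is corrupted, but the call to write() silently does the wrong thing: exactly "undefined behaviour with fewer side effects".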

I don't think it's such a bad idea on the whole, but it'd be more valuable if accessing an invalidated object could be made an error instead of undefined behaviour. If this can't be done, then we should encourage "destructors" to put the object in a clean state and not leave any dirt behind. But should that still be called a "destructor"?
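
One way to get an error instead of undefined behaviour, with no language change, is the "clean state" approach: a sketch assuming a hypothetical Connection class where a close() method takes the destructor's role and later use is a checked error:

    import std.exception : enforce;

    class Connection
    {
        private bool open = true;

        void close() { open = false; /* release the real resource here */ }

        void send(string msg)
        {
            enforce(open, "send() on a closed Connection");
            // ... actual send ...
        }
    }

But as the question above suggests, close() here is really a method with documented semantics, not a destructor in the usual sense.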

Perhaps we could change the paradigm a little and replace "deletion" with "recycling". Recycling an object would call the destructor and immediately call the default constructor, so the object is never left in an invalid state. Objects with no default constructor cannot be recycled. This way you know memory is always left in a clean state, and you encourage programmers to safely reuse the memory blocks from objects they have already allocated when possible.
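
In present-day D terms, recycling could look roughly like the following sketch; recycle() is a hypothetical name, not an existing library function, and it assumes T has a no-argument constructor (or no constructor at all):

    void recycle(T)(T obj)
        if (is(T == class) && __traits(compiles, new T()))
    {
        destroy(obj);                    // run ~this() and reset the memory to T.init
        static if (__traits(hasMember, T, "__ctor"))
            obj.__ctor();                // re-run the default constructor in place
    }

    unittest
    {
        static class Counter
        {
            int n;
            this() { n = 1; }
        }
        auto c = new Counter;
        c.n = 99;
        recycle(c);
        assert(c.n == 1);                // back to a freshly constructed state
    }

The extra cost mentioned below is the second step: every recycle pays for a default construction on top of the destructor call.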

Yes, recycling is best and I'm considering it. I'm only worried about the extra cost.

Andrei
