Barry Warsaw wrote:

> It's been a long while since I programmed on the NeXT, so Mac folks
> here please chime in, but isn't there some Foundation idiom where
> temporary Objective-C objects didn't need to be explicitly released
> if their lifetime was exactly the duration of the function in which
> they were created?
I think you're talking about the autorelease mechanism. It's a kind of
delayed decref, the delay being until execution reaches some safe place,
usually the main event loop of the application.

It exists because Cocoa mostly manages refcounts on a much
coarser-grained scale than Python. You don't normally count all the
temporary references created by parameters and local variables, only
"major" ones such as references stored in an instance variable of an
object. The problem then is that an object might get released while it
is in the middle of executing one or more of its methods, while there
are still references to it in active stack frames. By delaying the
decref until returning to the main loop, all these references have
hopefully gone away by the time the object gets freed.

You couldn't translate this scheme directly into Python, because there
are various differences in the way refcounts are used. There's also not
really any safe place to do the delayed decrefs. The interpreter loop is
*not* a safe place, because there can be nested invocations of it, with
C stack frames outside the current one holding references.

--
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | Carpe post meridiem!                 |
Christchurch, New Zealand          | (I'm not a morning person.)          |
[EMAIL PROTECTED]                  +--------------------------------------+

_______________________________________________
Python-3000 mailing list
Python-3000@python.org
http://mail.python.org/mailman/listinfo/python-3000
Unsubscribe: http://mail.python.org/mailman/options/python-3000/archive%40mail-archive.com
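[For readers unfamiliar with the mechanism being discussed: the
delayed-decref idea can be sketched as a toy in Python. The names
`AutoreleasePool` and `RefCounted` below are illustrative stand-ins for
the Foundation concepts, not any real CPython or Cocoa API.]

```python
class RefCounted:
    """Toy manually-refcounted object, in the Objective-C style."""
    def __init__(self):
        self.refcount = 1
        self.freed = False

    def retain(self):
        self.refcount += 1
        return self

    def release(self):
        self.refcount -= 1
        if self.refcount == 0:
            self.freed = True   # stand-in for deallocation


class AutoreleasePool:
    """Defers releases until a 'safe place' (draining the pool),
    analogous to returning to the application's main event loop."""
    def __init__(self):
        self._pending = []

    def autorelease(self, obj):
        # Don't release now; remember the object for later.
        self._pending.append(obj)
        return obj

    def drain(self):
        # Safe point: no stack frames hold stale references anymore,
        # so the deferred releases can finally happen.
        for obj in self._pending:
            obj.release()
        self._pending.clear()


pool = AutoreleasePool()
obj = pool.autorelease(RefCounted())
# obj can be passed around freely within this "event loop turn";
# nobody bothers retaining it for such temporary uses...
assert not obj.freed
pool.drain()   # ...and it is released only here, at the safe point.
assert obj.freed
```

The point of the sketch is the *timing*: `autorelease` turns an
immediate decref into one deferred until `drain`, which is why the
scheme depends on having a genuinely safe place to drain, something the
nested CPython interpreter loop doesn't provide.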