On Feb 16, 2008 3:12 PM, Amaury Forgeot d'Arc <[EMAIL PROTECTED]> wrote:
> Should we however intensively search and correct all of them?
> Is there a clever way to prevent these problems globally, for example
> by delaying finalizers "just a little"?
A simple way to do this would be to push objects whose refcounts had reached 0 onto a list instead of finalizing them immediately, and have PyEval_EvalFrameEx periodically swap in a new to-delete list and delete the objects on the old one.

A linked list would cost an extra pointer in PyObject_HEAD, but a growable array would only cost allocations, which would be amortized over the allocations of the objects you're deleting, so that's probably the way to go. A fixed-size queue that just delayed finalization by a constant number of objects would usually work without any allocations, but there would sometimes be single finalizers that recursively freed too many other objects, which would defeat the delay.

Jeffrey

_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev