On 05/18/2012 12:57 AM, Nick Coghlan wrote:
> I think the main things we'd be looking for would be:
> - a clear explanation of why a new metaclass is considered too complex a
>   solution
> - what the implications are for classes that have nothing to do with the
>   SciPy/NumPy ecosystem
> - how subclassing would behave (both at the class and metaclass level)
>
> Yes, defining a new metaclass for fast signature exchange has its
> challenges - but it means that *our* concerns about maintaining
> consistent behaviour in the default object model and avoiding adverse
> effects on code that doesn't need the new behaviour are addressed
> automatically.
>
> Also, I'd consider a functioning reference implementation using a custom
> metaclass a requirement before we considered modifying type anyway, so I
> think that's the best thing to pursue next rather than a PEP. It also
> has the virtue of letting you choose which Python versions to target and
> iterating at a faster rate than CPython.
This seems right on target. I could write a utility C header defining
such a metaclass, which the different libraries could all include; they
would then handshake through sys.modules during module initialization
on which implementation becomes the canonical one. That way an eventual
PEP would only be a natural incremental step to make things more
polished, whether that happens by making such a metaclass part of the
standard library or by extending PyTypeObject.
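
To make the handshake concrete, here's a rough sketch of what each
participating extension module could do at init time. All the names in
it (_fastsig_handshake, "fastsig.metaclass", FastSigMeta_Type) are just
placeholders for illustration, not a proposed API:

/* Sketch only: assumes the shared header declares the candidate
 * metaclass FastSigMeta_Type; every name here is a placeholder. */
#include <Python.h>

extern PyTypeObject FastSigMeta_Type;

/* The metaclass this module ends up using after the handshake. */
static PyTypeObject *fastsig_meta = NULL;

static int
fastsig_handshake(void)
{
    /* A dummy module in sys.modules acts as the process-wide registry;
       PyImport_AddModule returns the existing entry or creates it. */
    PyObject *registry = PyImport_AddModule("_fastsig_handshake");
    if (registry == NULL)
        return -1;

    PyObject *cap = PyObject_GetAttrString(registry, "metaclass");
    if (cap != NULL) {
        /* Some other library initialized first: adopt its copy. */
        fastsig_meta = (PyTypeObject *)PyCapsule_GetPointer(
            cap, "fastsig.metaclass");
        Py_DECREF(cap);
        return fastsig_meta != NULL ? 0 : -1;
    }
    PyErr_Clear();

    /* We are first: publish our copy for everyone else to pick up. */
    if (PyType_Ready(&FastSigMeta_Type) < 0)
        return -1;
    cap = PyCapsule_New(&FastSigMeta_Type, "fastsig.metaclass", NULL);
    if (cap == NULL)
        return -1;
    if (PyObject_SetAttrString(registry, "metaclass", cap) < 0) {
        Py_DECREF(cap);
        return -1;
    }
    Py_DECREF(cap);
    fastsig_meta = &FastSigMeta_Type;
    return 0;
}

The dummy module is only there as a shared, importable place to stash
the capsule; whichever library initializes first publishes its copy of
the metaclass, and everyone who comes later adopts that one instead of
their own, so all participating classes end up sharing a single
metaclass implementation.
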
Thanks,
Dag