On 9 August 2016 at 06:18, Guido van Rossum <gu...@python.org> wrote:
> I think Nick would be interested in understanding why this is the case.
> What does the decorator do that could be so expensive?

Reviewing https://hg.python.org/cpython/file/default/Lib/contextlib.py#l57,
Chris's analysis seems plausible to me: most of the cost is going to be in
the fact that instead of a single function call for __enter__ and __exit__,
we have a function call *and* a generator frame resumption, and in the case
of __exit__, the expected flow includes catching StopIteration.

The object creation is also marginally more expensive, since it's indirect
through a factory function rather than calling the type directly. There's
also currently some introspection of "func" being done in __init__ as a
result of issue #19330 that should probably be moved out to the
contextmanager decorator and passed in from the closure as a separate
argument, rather than being figured out on each call to
_GeneratorContextManager.__init__.

So I don't see any obvious reason we shouldn't be able to get the standard
library version to a level of performance similar to that of Chris's
simpler alternatives. There are also some potential object allocation
efficiencies we could consider, like using __slots__ to eliminate the
__dict__ allocation (that does have backwards compatibility implications,
but may be worth it if it buys a non-trivial speed improvement).

Beyond that, while I think a C accelerator may be worth trying, I'm not
sure how much it will actually gain us, as it seems plausible that a
function call + generator frame resumption will inherently take twice as
long as just doing a function call.

Regards,
Nick.

-- 
Nick Coghlan | ncogh...@gmail.com | Brisbane, Australia
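[For anyone following along without the contextlib source handy, here's a
minimal sketch of the mechanics being described. The names
_SketchGeneratorCM and sketch_contextmanager are made up for illustration;
this is deliberately simplified, not the stdlib implementation, but it
shows where the extra work lands: __enter__ and __exit__ each resume the
generator frame, __exit__'s normal path catches StopIteration, and object
creation goes through a factory function. A plain class does one method
call per protocol hook instead.]

```python
from functools import wraps

class _SketchGeneratorCM:
    """Illustrative stand-in for _GeneratorContextManager."""

    def __init__(self, gen):
        self.gen = gen

    def __enter__(self):
        # Method call *plus* a frame resumption: run up to the yield.
        return next(self.gen)

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            try:
                # Second frame resumption: run past the yield to the end.
                # The expected (non-error!) outcome is StopIteration.
                next(self.gen)
            except StopIteration:
                return False
            raise RuntimeError("generator didn't stop")
        try:
            # Re-raise the with-block's exception inside the generator.
            self.gen.throw(exc)
        except StopIteration:
            return True  # generator swallowed it: suppress the exception
        return False

def sketch_contextmanager(func):
    @wraps(func)
    def helper(*args, **kwds):
        # Object creation is indirect through this factory function
        # rather than calling the type directly at the with-statement.
        return _SketchGeneratorCM(func(*args, **kwds))
    return helper

# The hand-written equivalent: one ordinary method call per hook,
# no generator frame, no StopIteration handling.
class PlainCM:
    def __enter__(self):
        return "resource"

    def __exit__(self, exc_type, exc, tb):
        return False
```

Timing the two with the timeit module makes the per-`with` overhead of the
generator machinery easy to see directly.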
_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com