Nick Coghlan added the comment:

As Raymond notes, the main downside here is increased code complexity. 
However, the concrete gain is that APIs that rely on pickling callables, such as 
concurrent.futures with a ProcessPoolExecutor, would be consistently compatible 
with functools.partial:

>>> from concurrent.futures import ProcessPoolExecutor
>>> from functools import partial
>>> with ProcessPoolExecutor() as pool:
...     pool.submit(print, "hello")
...     pool.submit(partial(print, "hello"))
... 
<Future at 0x7f4fdb47ce48 state=running>
<Future at 0x7f4fd4f9cb00 state=pending>
hello
hello

At the moment, such code will fail if _functools is unavailable, since closures 
don't support pickling: unpickling a function involves looking it up by its 
qualified name, which isn't possible for a closure.
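To make that failure mode concrete, here's a sketch contrasting a hypothetical closure-based emulation (the name `make_partial` is illustrative, not stdlib code) with the class-based functools.partial:

```python
import pickle
from functools import partial

def make_partial(func, *args):
    # Closure-based emulation of partial (illustrative only)
    def newfunc(*more_args, **kwargs):
        return func(*args, *more_args, **kwargs)
    return newfunc

# Pickle serializes plain functions by qualified name, and the closure
# created inside make_partial has no importable name, so this fails:
closure_picklable = True
try:
    pickle.dumps(make_partial(print, "hello"))
except (pickle.PicklingError, AttributeError):
    closure_picklable = False
print("closure picklable:", closure_picklable)  # closure picklable: False

# The class-based functools.partial supports the pickle protocol,
# so it round-trips cleanly:
restored = pickle.loads(pickle.dumps(partial(print, "hello")))
restored()  # prints "hello"
```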

The other main benefit is retaining the custom __repr__ when falling back to 
the Python implementation:

>>> partial(print, "hello")
functools.partial(<built-in function print>, 'hello')

At the moment, the closure-based version instead gives:

>>> partial(print, "hello")
<function functools.partial.<locals>.newfunc at 0x7f4fd6e0aea0>
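For illustration, a minimal class-based sketch (the name `Partial` is hypothetical, and this is far simpler than the real fallback) shows why a class gives a useful repr, and picklability, essentially for free:

```python
import pickle

class Partial:
    """Minimal class-based sketch of a pure-Python partial (illustrative)."""

    def __init__(self, func, *args, **kwargs):
        self.func, self.args, self.kwargs = func, args, kwargs

    def __call__(self, *more_args, **more_kwargs):
        return self.func(*self.args, *more_args,
                         **{**self.kwargs, **more_kwargs})

    def __repr__(self):
        # A meaningful repr falls out of plain attribute access.
        parts = [repr(self.func), *map(repr, self.args)]
        parts += [f"{k}={v!r}" for k, v in self.kwargs.items()]
        return f"{type(self).__name__}({', '.join(parts)})"

print(Partial(print, "hello"))  # Partial(<built-in function print>, 'hello')

# Instances of a module-level class pickle by default; no closure involved:
p = pickle.loads(pickle.dumps(Partial(pow, 2)))
print(p(5))  # 32
```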

Preserving those two capabilities seems sufficiently worthwhile to me to 
justify the extra code complexity and the greater speed penalty when the 
accelerator module isn't available (I'm assuming that in runtimes with a JIT 
compiler the speed difference should be negligible in practice).

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue27137>
_______________________________________