On Thu, Nov 12, 2015 at 9:00 PM, Guido van Rossum <[email protected]> wrote:
> Vladimir, I would appreciate it if you filed a 3.4 bug -- if you have
> a patch that would be even better! Thanks for reporting this.

Guido, I have opened the following bug reports for the two issues mentioned
in this discussion:

https://bugs.python.org/issue25647
https://bugs.python.org/issue25648

Regards,

Vladimir Rutsky

>
> On Thu, Nov 12, 2015 at 2:01 AM, Vladimir Rutsky
> <[email protected]> wrote:
>> On Mon, Nov 9, 2015 at 11:14 PM, Guido van Rossum <[email protected]> wrote:
>>> The intention of this code is to make it possible (and legal) to decorate a
>>> non-generator function with `@asyncio.coroutine`, and to make that
>>> effectively equivalent to the same decorated function with an unreachable
>>> yield inside it (which makes it a generator). IOW these two should be
>>> equivalent:
>>>
>>> @asyncio.coroutine
>>> def foo():
>>>     return 42
>>>
>>> @asyncio.coroutine
>>> def foo():
>>>     return 42
>>>     yield  # dummy to make it a coroutine
>>>
>>> You should be able to write in some other coroutine
>>>
>>>     x = yield from foo()
>>>     assert x == 42
>>>
>>> regardless of which definition of foo() you've got.
>>>
>>> Note that for the first version the decorator always matters, while for the
>>> second version it only matters in debug mode.
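>>>
>>> (A minimal way to check the equivalence, assuming the default event loop
>>> and an illustrative helper named check(), is something like:
>>>
>>> import asyncio
>>>
>>> @asyncio.coroutine
>>> def check():
>>>     x = yield from foo()
>>>     assert x == 42
>>>
>>> asyncio.get_event_loop().run_until_complete(check())
>>>
>>> which should pass with either definition of foo() above.)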
>>>
>>> But it looks like the code has some subtle bug. :-(
>>>
>>> I think one of the problems is that we don't always test with debug mode.
>>>
>>> There are probably also some cases that the code is trying to support, e.g.
>>> for generators in general (outside the context of asyncio) there is usually
>>> no difference between these two examples:
>>>
>>> def foo():
>>>     yield 1
>>>     yield 2
>>> def bar():
>>>     return foo()
>>>
>>> vs.
>>>
>>> def bar():
>>>     yield 1
>>>     yield 2
>>>
>>> -- in either case, calling bar() returns a generator that when iterated over
>>> generates the sequence [1, 2]. But what you've discovered is that if we
>>> decorate both functions with @asyncio.coroutine, there *is* a difference.
>>> And I think the code is making a half-hearted attempt to paper over the
>>> difference, but failing due to debug mode.
>>
>> Thanks for the explanation. Reading the asyncio documentation, I thought of
>> asyncio coroutines as a special kind of function (implemented using
>> generators) that can be run as coroutines by the event loop.
>> I almost always develop asyncio-based programs with PYTHONASYNCIODEBUG
>> enabled, which in some cases doesn't allow mixing ordinary functions and
>> asyncio.coroutine functions due to the discussed bug --- this is one of the
>> reasons I thought that asyncio coroutines can't be used as simple generator
>> functions.
>>
>> It would be nice if asyncio had better documentation for this case, or at
>> least a link to a good tutorial.
>>
>> Another bug I experienced in Python 3.4, where behaviour differs when
>> PYTHONASYNCIODEBUG is enabled, occurs when the function wrapped with
>> asyncio.coroutine doesn't have special attributes like
>> "__name__". In Python 3.4.3 with asyncio debug enabled, a function is
>> wrapped using the following code:
>>
>> @functools.wraps(func)
>> def wrapper(*args, **kwds):
>>     w = CoroWrapper(coro(*args, **kwds), func)
>>     if w._source_traceback:
>>         del w._source_traceback[-1]
>>     w.__name__ = func.__name__
>>     if hasattr(func, '__qualname__'):
>>         w.__qualname__ = func.__qualname__
>>     w.__doc__ = func.__doc__
>>     return w
>>
>> note the unconditional access to the "__name__" and "__doc__" attributes,
>> which a function may not have under some conditions.
>>
>> Although this use case looks strange and unrealistic, I ran into it in a
>> real application that used mocking in tests. The code there was something
>> like the following:
>>
>> import asyncio
>> from unittest.mock import Mock
>>
>> def f():
>>     return "f result"
>>
>> loop = asyncio.get_event_loop()
>>
>> mocked_f = Mock(wraps=f)
>> coro_func = asyncio.coroutine(mocked_f)
>> print(loop.run_until_complete(coro_func()))
>> mocked_f.assert_called_once_with()
>>
>> which failed only in debug mode:
>>
>> $ python3 asyncio_debug_attr_access.py
>> f result
>> $ PYTHONASYNCIODEBUG=X python3 asyncio_debug_attr_access.py
>> Traceback (most recent call last):
>>   File "asyncio_debug_attr_access.py", line 21, in <module>
>>     print(loop.run_until_complete(coro_func()))
>>   File "/usr/lib/python3.4/asyncio/coroutines.py", line 154, in wrapper
>>     w.__name__ = func.__name__
>>   File "/usr/lib/python3.4/unittest/mock.py", line 570, in __getattr__
>>     raise AttributeError(name)
>> AttributeError: __name__
>> Exception ignored in: $
>>
>> Here is complete example:
>> https://gist.github.com/rutsky/65cee7728135b05d49c3
>>
>> It might be a bug in the unittest.mock library that mocks don't have these
>> special attributes, but it's still unexpected to get such different
>> behaviour in debug and non-debug modes.
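>>
>> Building on the example above, one possible workaround on 3.4 (just a
>> sketch; other mock configurations may behave differently) seems to be to
>> give the mock the attribute that the debug wrapper reads:
>>
>> mocked_f = Mock(wraps=f)
>> # Mock raises AttributeError for dunder lookups such as __name__, but an
>> # explicit instance attribute can still be assigned.
>> mocked_f.__name__ = f.__name__
>>
>> coro_func = asyncio.coroutine(mocked_f)
>> print(loop.run_until_complete(coro_func()))  # "f result" in both modes
>> mocked_f.assert_called_once_with()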
>>
>> This issue does not occur in Python 3.5, since the special attributes are
>> accessed more defensively there:
>>
>> @functools.wraps(func)
>> def wrapper(*args, **kwds):
>>     w = CoroWrapper(coro(*args, **kwds), func=func)
>>     if w._source_traceback:
>>         del w._source_traceback[-1]
>>     # Python < 3.5 does not implement __qualname__
>>     # on generator objects, so we set it manually.
>>     # We use getattr as some callables (such as
>>     # functools.partial may lack __qualname__).
>>     w.__name__ = getattr(func, '__name__', None)
>>     w.__qualname__ = getattr(func, '__qualname__', None)
>>     return w
>>
>> I worked around both of these issues with monkeypatching in my project.
>>
>> Should I file bug reports against Python 3.4 about them on the Python bug
>> tracker?
>>
>>
>> Regards,
>>
>> Vladimir Rutsky
>>
>>>
>>> On Mon, Nov 9, 2015 at 10:25 AM, Vladimir Rutsky <[email protected]>
>>> wrote:
>>>>
>>>> Hello!
>>>>
>>>> Can anybody explain this part of the asyncio.coroutine code to me:
>>>>
>>>> <https://github.com/python/asyncio/blob/3b6a64a9fb6ec4ad0c984532aa776b130067c901/asyncio/coroutines.py#L201>
>>>>
>>>> # asyncio/coroutines.py:201:
>>>> def coroutine(func):
>>>>     """Decorator to mark coroutines.
>>>>     If the coroutine is not yielded from before it is destroyed,
>>>>     an error message is logged.
>>>>     """
>>>>     if _inspect_iscoroutinefunction(func):
>>>>         # In Python 3.5 that's all we need to do for coroutines
>>>>         # defined with "async def".
>>>>         # Wrapping in CoroWrapper will happen via
>>>>         # 'sys.set_coroutine_wrapper' function.
>>>>         return func
>>>>
>>>>     if inspect.isgeneratorfunction(func):
>>>>         coro = func
>>>>     else:
>>>>         @functools.wraps(func)
>>>>         def coro(*args, **kw):
>>>>             res = func(*args, **kw)
>>>>             if isinstance(res, futures.Future) or inspect.isgenerator(res):  # <--- This part
>>>>                 res = yield from res
>>>>             ...
>>>>
>>>> What this code does: if the function wrapped by asyncio.coroutine is not a
>>>> generator function and returns a generator, Future, or awaitable object,
>>>> then the resulting coroutine function will yield from the returned value.
>>>>
>>>> E.g. the following coroutine function will actually return the result of
>>>> the future, not the future itself:
>>>>
>>>> @asyncio.coroutine
>>>> def return_future():
>>>>     fut = asyncio.Future()
>>>>     fut.set_result("return_future")
>>>>
>>>>     return fut
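>>>>
>>>> For instance (a minimal check, assuming asyncio is imported and the
>>>> default event loop is used), this resolves to the string rather than the
>>>> Future:
>>>>
>>>> loop = asyncio.get_event_loop()
>>>> assert loop.run_until_complete(return_future()) == "return_future"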
>>>>
>>>> I can't find where this behaviour is documented. Is this the desired
>>>> behaviour?
>>>>
>>>> According to
>>>> <https://docs.python.org/3/library/asyncio-task.html?highlight=iscoroutine#coroutines>:
>>>> > Coroutines used with asyncio may be implemented using the async def
>>>> > statement, or by using generators.
>>>> > ...
>>>> > Generator-based coroutines should be decorated with @asyncio.coroutine,
>>>> > although this is not strictly enforced.
>>>>
>>>> So technically, according to this part of the documentation, it's not
>>>> explicitly allowed to use asyncio.coroutine with non-generator functions.
>>>>
>>>> The special behaviour when an asyncio.coroutine-wrapped non-generator
>>>> function returns a generator or Future object
>>>> leads to inconsistent behaviour on Python 3.4 depending on whether asyncio
>>>> debugging is enabled or not.
>>>>
>>>> Consider the following function:
>>>>
>>>> @asyncio.coroutine
>>>> def return_coroutine_object():
>>>>     @asyncio.coroutine
>>>>     def g():
>>>>         yield from asyncio.sleep(0.01)
>>>>         return "return_coroutine_object"
>>>>
>>>>     return g()
>>>>
>>>> When debugging is disabled, the following code works:
>>>>
>>>> assert (loop.run_until_complete(return_coroutine_object()) ==
>>>>         "return_coroutine_object")
>>>>
>>>> When debugging is enabled, the same code does not work on Python 3.4,
>>>> because the result of "g()" is not a generator or a Future --- it's the
>>>> debugging CoroWrapper.
>>>> The outer asyncio.coroutine will not yield from the CoroWrapper and will
>>>> return the inner "g()" CoroWrapper instead.
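>>>>
>>>> A user-level way to avoid relying on this implicit unwrapping (just a
>>>> sketch, not a fix to asyncio itself) is to yield from the inner coroutine
>>>> explicitly, which behaves the same with and without debug mode:
>>>>
>>>> @asyncio.coroutine
>>>> def return_coroutine_object():
>>>>     @asyncio.coroutine
>>>>     def g():
>>>>         yield from asyncio.sleep(0.01)
>>>>         return "return_coroutine_object"
>>>>
>>>>     # The explicit "yield from" makes the outer function a generator
>>>>     # function, so the non-generator branch of the decorator is not used.
>>>>     return (yield from g())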
>>>>
>>>> I made complete example demonstrating this issue:
>>>> <https://gist.github.com/rutsky/c72be2edeb1c8256d680>
>>>>
>>>> This issue does not occur in Python 3.5, since CoroWrapper is awaitable
>>>> there,
>>>> but I still can't find out whether this is the desired behaviour, and why.
>>>>
>>>>
>>>> Regards,
>>>>
>>>> Vladimir Rutsky
>>>>
>>>
>>>
>>>
>>> --
>>> --Guido van Rossum (python.org/~guido)
>
>
>
> --
> --Guido van Rossum (python.org/~guido)
