Nick, Nathaniel, I'll be replying in full to your emails when I have
time to do some experiments.  Now I just want to address one point
that I think is important:

On Sat, Aug 12, 2017 at 1:09 PM, Nick Coghlan <> wrote:
> On 12 August 2017 at 17:54, Nathaniel Smith <> wrote:
>> ...and now that I've written that down, I sort of feel like that might
>> be what you want for all the other sorts of context object too? Like,
>> here's a convoluted example:
>> def gen():
>>     a = decimal.Decimal("1.111")
>>     b = decimal.Decimal("2.222")
>>     print(a + b)
>>     yield
>>     print(a + b)
>> def caller():
>>     # let's pretend this context manager exists; the actual API
>>     # is more complicated
>>     with decimal_context_precision(3):
>>         g = gen()
>>     with decimal_context_precision(2):
>>         next(g)
>>     with decimal_context_precision(1):
>>         next(g)
>> Currently, this will print "3.3 3", because when the generator is
>> resumed it inherits the context of the resuming site. With PEP 550, it
>> would print "3.33 3.33" (or maybe "3.3 3.3"? it's not totally clear
>> from the text), because it inherits the context when the generator is
>> created and then ignores the calling context. It's hard to get strong
>> intuitions, but I feel like the current behavior is actually more
>> sensible -- each time the generator gets resumed, the next bit of code
>> runs in the context of whoever called next(), and the generator is
>> just passively inheriting context, so ... that makes sense.
> Now that you raise this point, I think it means that generators need
> to retain their current context inheritance behaviour, simply for
> backwards compatibility purposes. This means that the case we need to
> enable is the one where the generator *doesn't* dynamically adjust its
> execution context to match that of the calling function.

Nobody *intentionally* iterates a generator manually in different
decimal contexts (or any other contexts). This is an extremely
error-prone thing to do, because a refactoring of the generator --
rearranging its yields -- would wreck your custom iteration/context
logic. I don't think that any real code relies on this, and I don't
think that we are breaking backwards compatibility here in any way.
How many users even know about this behaviour?

If someone does need this, it's possible to flip
`gi_isolated_execution_context` to `False` (as contextmanager does
now) and get this behaviour. This might be needed for frameworks like
Tornado which support coroutines via generators without 'yield from',
but I'll have to verify this.
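
To make that concrete, here is a pure-Python sketch of what the flag
would control; the `resume()` helper and the dict standing in for the
thread-local execution context are illustrative assumptions, not the
PEP's actual machinery:

```python
# Pure-Python model (hypothetical; the real flag lives in the interpreter)
# of what `gi_isolated_execution_context` controls. A plain dict stands in
# for the thread-local execution context.

_current = {"precision": 28}

def resume(gen, captured, isolated=True):
    """Resume `gen` one step, either under its captured context (isolated)
    or under the caller's ambient context (legacy behaviour)."""
    global _current
    if isolated:
        saved = _current
        _current = captured          # switch to the generator's own context
        try:
            return next(gen)
        finally:
            _current = saved         # restore the caller's context on suspend
    return next(gen)                 # non-isolated: inherit caller's context

def gen():
    yield _current["precision"]
    yield _current["precision"]

captured = dict(_current)            # snapshot taken at creation time
g1, g2 = gen(), gen()
_current = {"precision": 3}          # the caller changes the ambient context

isolated_values = [resume(g1, captured), resume(g1, captured)]
legacy_values = [resume(g2, captured, isolated=False),
                 resume(g2, captured, isolated=False)]
print(isolated_values, legacy_values)   # [28, 28] [3, 3]
```

With the flag flipped to `False`, a framework driving generators by
hand (Tornado-style) keeps the old inherit-from-the-caller semantics.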

What I'm saying here is that any sort of context leaking *into* or
*out of* a generator *while* it is iterating will likely cause only
bugs or undefined behaviour. Take a look at the precision example in
the Rationale section of the PEP.
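
For reference, the leak looks roughly like this (a simplified
reconstruction of the kind of example the Rationale uses, under
current CPython semantics): a generator that adjusts the decimal
context inside a `with` block leaves that context active for everyone
else while it is suspended.

```python
import decimal

def fractions(precision, x, y):
    # localcontext() is only exited when the generator finishes, so the
    # modified precision stays active while the generator is suspended
    with decimal.localcontext() as ctx:
        ctx.prec = precision
        yield decimal.Decimal(x) / decimal.Decimal(y)
        yield decimal.Decimal(x) / decimal.Decimal(y ** 2)

g1 = fractions(precision=2, x=1, y=3)
g2 = fractions(precision=6, x=1, y=3)
results = [next(g1), next(g2), next(g1), next(g2)]
print(results)
# The third value is computed at precision 6 (leaked from g2), not 2:
# [Decimal('0.33'), Decimal('0.333333'),
#  Decimal('0.111111'), Decimal('0.111111')]
```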

Most of the time generators are created and iterated in the same
spot; you rarely create generator closures. One way the behaviour
could be changed, however, is to capture the execution context when
the generator is first iterated (as opposed to when it is
instantiated), but I don't think it makes any real difference.
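
The difference between the two capture points can be sketched with a
toy model (a global dict stands in for the execution context;
`make_gen` and `capture_now` are invented names): for the common
create-and-iterate-in-one-spot pattern both policies agree, and they
only diverge when the context changes between creation and the first
next().

```python
_ctx = {"prec": 28}   # stand-in for the execution context

def make_gen(capture_now):
    # eager policy: snapshot the context when the generator is created
    captured = dict(_ctx) if capture_now else None
    def _gen():
        nonlocal captured
        if captured is None:
            captured = dict(_ctx)   # lazy policy: snapshot at first iteration
        yield captured["prec"]
    return _gen()

# Created and immediately iterated: the two policies are indistinguishable.
a = next(make_gen(capture_now=True))
b = next(make_gen(capture_now=False))
assert a == b == 28

# Context changes between creation and the first next(): they diverge.
g_eager = make_gen(capture_now=True)
g_lazy = make_gen(capture_now=False)
_ctx = {"prec": 3}
eager, lazy = next(g_eager), next(g_lazy)
print(eager, lazy)   # 28 3
```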

Another idea: in one of my initial PEP implementations, I exposed
gen.gi_execution_context (same for coroutines) to Python as a
read/write attribute. That made it possible to

(a) get the execution context out of a generator (for introspection
or other purposes);

(b) inject an execution context for event loops; for instance,
asyncio.Task could do that for some purpose.

Maybe this would be useful for someone who wants to mess with
generators and contexts.
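
A rough sketch of what (a) and (b) would enable; the
`gi_execution_context` attribute name comes from that early
implementation, while `MiniTask` and the dict-based context are
invented here for illustration:

```python
# Hypothetical sketch of a read/write execution-context attribute used by
# an event loop: the loop re-injects the task's own context on each step
# and re-captures it when the task suspends. A plain dict models the
# context; none of this is asyncio API.

_ctx = {}   # ambient (thread-local) execution context

class MiniTask:
    def __init__(self, gen):
        self.gen = gen
        self.gi_execution_context = dict(_ctx)   # (a) introspectable snapshot

    def step(self):
        global _ctx
        saved = _ctx
        _ctx = self.gi_execution_context          # (b) inject before resuming
        try:
            return next(self.gen)
        except StopIteration:
            return None
        finally:
            self.gi_execution_context = _ctx      # re-capture on suspend
            _ctx = saved

def worker():
    _ctx["user"] = "alice"    # set inside the task's own context
    yield
    yield _ctx.get("user")

t = MiniTask(worker())
t.step()
_ctx["user"] = "bob"          # the loop / another task mutates its context
result = t.step()
print(result)                 # alice: the task kept its injected context
```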

>     def autonomous_generator(gf):
>         @functools.wraps(gf)
>         def wrapper(*args, **kwds):
>             gi = genfunc(*args, **kwds)
>             gi.gi_back = gi.gi_frame
>             return gi
>         return wrapper

Nick, I still have to fully grasp the idea of `gi_back`, but one quick
thing: I specifically designed the PEP to avoid touching frames. The
current design only needs TLS and a little help from the
interpreter/core objects to adjust that TLS. It should be very
straightforward to implement the PEP in any interpreter (with a JIT or
without) or in compilers like Cython.
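
As a sanity check of the "TLS plus a resume/suspend hook" claim, here
is a toy model in pure Python; `IsolatedGen` and `current_context()`
are invented names, and the real hooks would of course live in the
interpreter rather than in a wrapper class:

```python
import threading

# Minimal model of the design: the interpreter swaps a thread-local pointer
# when a generator is resumed and restores it when the generator suspends.
# No frame objects are touched.

_tls = threading.local()

def current_context():
    return getattr(_tls, "ctx", {})

class IsolatedGen:
    """Wrap a generator so each resume runs under its captured context."""
    def __init__(self, gen):
        self.gen = gen
        self.ctx = dict(current_context())   # captured at creation

    def __iter__(self):
        return self

    def __next__(self):
        saved = current_context()
        _tls.ctx = self.ctx                  # the "hook" on resume
        try:
            return next(self.gen)
        finally:
            _tls.ctx = saved                 # the "hook" on suspend

def g():
    yield current_context().get("prec")

_tls.ctx = {"prec": 28}
wrapped = IsolatedGen(g())
_tls.ctx = {"prec": 3}                       # caller changes its context
value = next(wrapped)
print(value)                                 # 28
```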

> Given that, you'd have the following initial states for "revert
> context" (currently called "isolated context" in the PEP):
> * unawaited coroutines: true (same as PEP)
> * awaited coroutines: false (same as PEP)
> * generators (both sync & async): false (opposite of current PEP)
> * autonomous generators: true (set "gi_revert_context" or
> "ag_revert_context" explicitly)

If generators do not isolate their context, then the example in the
Rationale section will not work as expected (or am I missing
something?). Fixing generator state leaks was one of the main goals
of the PEP.

Python-ideas mailing list