On 12 August 2017 at 17:54, Nathaniel Smith <n...@pobox.com> wrote:
> ...and now that I've written that down, I sort of feel like that might
> be what you want for all the other sorts of context object too? Like,
> here's a convoluted example:
>
> def gen():
>     a = decimal.Decimal("1.111")
>     b = decimal.Decimal("2.222")
>     print(a + b)
>     yield
>     print(a + b)
>
> def caller():
>     # let's pretend this context manager exists, the actual API is
> more complicated
>     with decimal_context_precision(3):
>         g = gen()
>     with decimal_context_precision(2):
>         next(g)
>     with decimal_context_precision(1):
>         next(g)
>
> Currently, this will print "3.3 3", because when the generator is
> resumed it inherits the context of the resuming site. With PEP 550, it
> would print "3.33 3.33" (or maybe "3.3 3.3"? it's not totally clear
> from the text), because it inherits the context when the generator is
> created and then ignores the calling context. It's hard to get strong
> intuitions, but I feel like the current behavior is actually more
> sensible -- each time the generator gets resumed, the next bit of code
> runs in the context of whoever called next(), and the generator is
> just passively inheriting context, so ... that makes sense.

Now that you raise this point, I think it means that generators need
to retain their current context inheritance behaviour, simply for
backwards compatibility purposes. This means that the case we need to
enable is the one where the generator *doesn't* dynamically adjust its
execution context to match that of the calling function.
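For reference, here's the current behaviour that the backwards compatibility constraint is protecting, rewritten using the real decimal.localcontext() API rather than the hypothetical decimal_context_precision() above:

```python
import decimal

def gen():
    a = decimal.Decimal("1.111")
    b = decimal.Decimal("2.222")
    results = []
    results.append(str(a + b))   # runs in the context of the first next()
    yield
    results.append(str(a + b))   # runs in the context of the second next()
    yield results

def caller():
    g = gen()
    with decimal.localcontext() as ctx:
        ctx.prec = 2
        next(g)          # first addition: precision 2
    with decimal.localcontext() as ctx:
        ctx.prec = 1
        return next(g)   # second addition: precision 1

print(caller())  # ['3.3', '3'] today: each resumption inherits the resuming context
```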

One way that could work (using the cr_back/gi_back convention I suggested):

- generators start with gi_back not set
- if gi_back is NULL/None, gi.send() and gi.throw() set it to the
calling frame for the duration of the synchronous call and *don't*
adjust the execution context (i.e. the inverse of coroutine behaviour)
- if gi_back is already set, then gi.send() and gi.throw() *do* save
and restore the execution context around synchronous calls in to the
generator frame

To create an autonomous generator (i.e. one that doesn't dynamically
update its execution context), you'd use a decorator like:

    def autonomous_generator(gf):
        @functools.wraps(gf)
        def wrapper(*args, **kwds):
            gi = gf(*args, **kwds)
            gi.gi_back = gi.gi_frame
            return gi
        return wrapper

Asynchronous generators would then work like synchronous generators:
ag_back would be NULL/None by default, and dynamically set for the
duration of each __anext__ call. If you wanted to create an autonomous
one, you'd make its back reference a circular reference to itself to
disable the implicit dynamic updates.

When I put it in those terms though, I think the
cr_back/gi_back/ag_back idea should actually be orthogonal to the
"revert_context" flag (so you can record the link back to the caller
even when maintaining an autonomous context).

Given that, you'd have the following initial states for "revert
context" (currently called "isolated context" in the PEP):

* unawaited coroutines: true (same as PEP)
* awaited coroutines: false (same as PEP)
* generators (both sync & async): false (opposite of current PEP)
* autonomous generators: true (set "gi_revert_context" or
"ag_revert_context" explicitly)

Open question: whether having "yield" inside a with statement implies
the creation of an autonomous generator (synchronous or otherwise), or
whether you'd need a decorator to get your context management right in
such cases.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/
