[Python-Dev] Re: PEP 654 except* formatting

2021-10-06 Thread Yury Selivanov
I don't like `except group` or any variant with soft keywords.

I'll list a few reasons here:

1. `try: .. except group:` is valid syntax today, and it will continue to
be. Having both `try: .. except group:` (catch exception
`group`) and `try: .. except group E:` (catch exceptions of E into a group)
in the same grammar worries me.

1a. It can be especially confusing if someone has a local/global variable
called `group`.

1b. Or, for example, if a user forgets to type `E` and leaves just `except
group`, it would fall back to the regular try..except behavior. And it would
be a runtime error ("group" is undefined).

1c. This will all be even more complicated because syntax highlighters in
IDEs and on sites like GitHub will likely just always highlight `except
group` as a pair of keywords (even in the `except group:` variant).
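The ambiguity in points 1 and 1b can be demonstrated with code that is valid today. This is a hypothetical sketch: the `GroupError` class and the function names are invented for illustration only.

```python
# Hypothetical sketch of points 1 and 1b: `group` is just a name, so
# `except group:` is already valid Python today.
class GroupError(Exception):
    pass

group = GroupError  # a local/global variable that happens to be called `group`

def catch_via_name():
    try:
        raise GroupError("boom")
    except group:  # point 1: catches the exception type *named* group
        return "caught"

def forgot_the_e():
    # Point 1b: under an `except group E` spelling, dropping `E` silently
    # falls back to this form -- which fails only at runtime when the
    # name is unbound (simulated here with an undefined name).
    try:
        raise ValueError("boom")
    except undefined_group:  # NameError raised while matching the handler
        return "caught"

assert catch_via_name() == "caught"

got_name_error = False
try:
    forgot_the_e()
except NameError:
    got_name_error = True
assert got_name_error
```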

2. I'm not sure I like the "sound" of it. IMO it would make more sense to
write `except all E`, but `all()` is a built-in and so this would be at
odds with (1).

3. This is a niche feature. People who use async/await will get used to
`except*` in no time. `except*` is also about unpacking in some
metaphysical sense (looks similar enough to `*args` in function signatures
to me) so I think it reads just fine.

So I'm -1 on `except group` or any variant that uses soft keywords. If the
SC considers making `group` a proper keyword I can possibly change my mind
on this.

Yury


On Tue, Oct 5, 2021 at 6:28 PM Barry Warsaw  wrote:

> What do the PEP authors think about `except group`?  Bikeshedding aside,
> that’s still the best alternative I’ve seen.  It’s unambiguous,
> self-descriptive, and can’t be confused with unpacking syntax.
>
> -Barry
>
> > On Oct 5, 2021, at 11:15, sascha.schlemmer--- via Python-Dev <
> python-dev@python.org> wrote:
> >
> > I agree that *(E1, E2) looks like unpacking; how about
> >
> > except *E1 as error: ...
> > except (*E1, *E2) as error: ...
> >
> > even better would be if we could drop the parentheses:
> > except *E1, *E2 as error: ...


-- 
 Yury
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/UTWXVURISUQ4HC4SZQV3MN6R6U2FCQKA/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: Accepting PEP 654.

2021-09-24 Thread Yury Selivanov
Ah, that's no problem; both spellings are good. Since I'm replying on
Python-Dev, I'll quote my Discord response here:

"Thank you Thomas and the SC. I’ll start working on incorporating
TaskGroups into asyncio in the next few weeks."

Thanks,
Yury


On Fri, Sep 24, 2021 at 3:13 PM, Thomas Wouters  wrote:

> On Sat, Sep 25, 2021 at 12:06 AM Thomas Wouters  wrote:
>
>
> Irit, Guido, Yuri, Nathaniel,
>
>
> I do apologise for the typo in your name, Yury... somehow none of us
> caught it in our proofreading of the response :( I fixed it on Discourse
> with an edit, but email, alas...
>
> --
> Thomas Wouters 
>
> Hi! I'm an email virus! Think twice before sending your email to help me
> spread!
>


[Python-Dev] Re: Discrepancy between what aiter() and `async for` requires on purpose?

2021-09-08 Thread Yury Selivanov
We have already merged it; the fix is part of rc2.

yury


On Wed, Sep 8 2021 at 12:48 PM, Brett Cannon  wrote:

> On Thu, Sep 2, 2021 at 7:43 PM Yury Selivanov 
> wrote:
>
>> Comments inlined:
>>
>> On Thu, Sep 2, 2021 at 6:23 PM Guido van Rossum  wrote:
>>
>>> First of all, we should ping Yury, who implemented `async for` about 6
>>> years ago (see PEP 492), and Joshua Bronson, who implemented aiter() and
>>> anext() about 5 months ago (see https://bugs.python.org/issue31861).
>>> I've CC'ed them here.
>>>
>>
>> Looks like PyAiter_Check was added along with the aiter/anext builtins.
>> I agree it's unnecessary to check for __aiter__ in it, so let's just fix
>> it.
>>
>>
>>
>>>
>>> My own view:
>>>
>>> A. iter() doesn't check that the thing returned implements __iter__,
>>> because it's not needed -- iterators having an __iter__ method is a
>>> convention, not a requirement.
>>>
>>
>> Yeah.
>>
>>
>>> You shouldn't implement __iter__ returning something that doesn't
>>> implement __iter__ itself, because then "for x in iter(a)" would fail even
>>> though "for x in a" works. But you get an error, and anyone who implements
>>> something like that (or uses it) deserves what they get. People know about
>>> this convention and the ABC enforces it, so in practice it will be very
>>> rare that someone gets bitten by this.
>>>
>>> B. aiter() shouldn't need to check either, for exactly the same reason.
>>> I *suspect* (but do not know) that the extra check for the presence of
>>> __aiter__ is simply an attempt by the implementer to enforce the convention.
>>> There is no *need* other than ensuring that "async for x in aiter(a)" works
>>> when "async for x in a" works.
>>>
>>
>> I agree.
>>
>
> [SNIP]
>
>
>
>> Bottom line: let's fix PyAiter_Check to only look for __anext__. It's a
>> new function so we can still fix it to reflect PyIter_Check and not worry
>> about anything.
>>
>
> I don't know if Pablo wants such a change in 3.10 since we are at rc2 at
> this point, so this might have to wait for 3.11 (although there's no
> deprecation here since it's a loosening of requirements so it could go in
> straight away).
>


[Python-Dev] Re: Discrepancy between what aiter() and `async for` requires on purpose?

2021-09-02 Thread Yury Selivanov
Comments inlined:

On Thu, Sep 2, 2021 at 6:23 PM Guido van Rossum  wrote:

> First of all, we should ping Yury, who implemented `async for` about 6
> years ago (see PEP 492), and Joshua Bronson, who implemented aiter() and
> anext() about 5 months ago (see https://bugs.python.org/issue31861). I've
> CC'ed them here.
>

Looks like PyAiter_Check was added along with the aiter/anext builtins. I
agree it's unnecessary to check for __aiter__ in it, so let's just fix it.



>
> My own view:
>
> A. iter() doesn't check that the thing returned implements __iter__,
> because it's not needed -- iterators having an __iter__ method is a
> convention, not a requirement.
>

Yeah.


> You shouldn't implement __iter__ returning something that doesn't
> implement __iter__ itself, because then "for x in iter(a)" would fail even
> though "for x in a" works. But you get an error, and anyone who implements
> something like that (or uses it) deserves what they get. People know about
> this convention and the ABC enforces it, so in practice it will be very
> rare that someone gets bitten by this.
>
> B. aiter() shouldn't need to check either, for exactly the same reason. I
> *suspect* (but do not know) that the extra check for the presence of
> __aiter__ is simply an attempt by the implementer to enforce the convention.
> There is no *need* other than ensuring that "async for x in aiter(a)" works
> when "async for x in a" works.
>

I agree.
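A minimal sketch of the convention described in point A, using hypothetical `CountDown`/`Bag` classes: an iterator only *needs* `__next__`, and the missing `__iter__` only bites when the iterator itself is iterated.

```python
# Sketch: an iterator only *needs* __next__; __iter__ returning self is
# a convention. "for x in a" works, but "for x in iter(a)" breaks.
class CountDown:
    def __init__(self, n):
        self.n = n

    def __next__(self):               # the only required method
        if self.n == 0:
            raise StopIteration
        self.n -= 1
        return self.n

class Bag:
    def __iter__(self):
        return CountDown(3)           # an iterator with no __iter__

assert list(Bag()) == [2, 1, 0]       # "for x in a" works fine

broke = False
try:
    for x in iter(Bag()):             # iter(a) returns the CountDown...
        pass                          # ...which the for-loop can't iter() again
except TypeError:
    broke = True
assert broke
```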

>
> Note that PEP 525, which defines async generators, seems to imply that an
> __aiter__ returning self is always necessary, but I don't think it gives a
> reason.
>

PEP 525 implies that specifically for asynchronous generators, not
iterators. That's due to the fact that synchronous generators return self
from their __iter__.

>
> I do notice there's some backwards compatibility issue related to
> __aiter__, alluded to in both PEP 492 (
> https://www.python.org/dev/peps/pep-0492/#api-design-and-implementation-revisions)
> and PEP 525 (
> https://www.python.org/dev/peps/pep-0525/#aiter-and-anext-builtins). So
> it's *possible* that it has to do with this (maybe really old code
> implementing the 3.5 version of __aiter__ would be caught out by the extra
> check) but I don't think it is. Hopefully Yury and/or Joshua remembers?
>

That wasn't related.

In the first iteration of PEP 492, __aiter__ was required to be a
coroutine. Some time after shipping 3.5.0 I realized that that would
complicate asynchronous generators for no reason (and I think there were
also some bigger problems than just complicating them). So I updated the
PEP to change __aiter__ return type from `Awaitable[AsyncIterator]` to
`AsyncIterator`. ceval code was changed to call __aiter__ and see if the
object that it returned had __anext__. If not, it tried to await on it.

Bottom line: let's fix PyAiter_Check to only look for __anext__. It's a new
function so we can still fix it to reflect PyIter_Check and not worry about
anything.

Yury


[Python-Dev] Re: PEP 654: Exception Groups and except* [REPOST]

2021-04-29 Thread Yury Selivanov
On Wed, Apr 28, 2021 at 8:53 PM Nathaniel Smith  wrote:
> Looking at the relevant section of the PEP again [1], it notes the
> same fatal flaw with my first suggestion, and then says that the
> multiple-except-executions option should be rejected because users
> have written code like 'except SomeError: ...' with the expectation
> that the 'except' clause would run exactly once. That's definitely
> true, and it's a downside of the multiple-except-executions approach,
> but I don't think it's convincing enough to rule this out on its own.
> The problem is, *all* our options for how 'except' should interact
> with ExceptionGroups will somehow break previous expectations.

Well, this is where we respectfully disagree. The case of changing how
the regular try..except works in a backwards-incompatible way is a
very convincing blocker to us.

>
> Concretely: imagine you have a pre-existing 'except SomeError', and
> some new code inside the 'try' block raises some number of
> 'SomeError's wrapped in an ExceptionGroup. There are three options:
>
> - Execute the 'except' block multiple times. This breaks the
> expectation that it should be executed at most once.
> - Execute the 'except' block exactly once. But if there are multiple
> SomeError's, this requires they be grouped and delivered as a single
> exception, which breaks typing.
> - Execute the 'except' block zero times. This is what the current PEP
> chooses, and breaks the expectation that 'except SomeError' should
> catch 'SomeError'.
>
> So we have to pick our poison.

We did. The PEP talks at length as to why this isn't a problem.

> > I'm confused about the flattening suggestion - above you talk about "flat 
> > EG", but below about tracebacks. It's not clear to me whether you want EG 
> > to be flat (ie no nesting of EGs) or just the traceback to be flat (but you 
> > can still have a nested EG).
>
> Hmm, I was thinking about making both of them flat, so no nested EGs.
> In all my designs, the only reason I ever had nesting was because I
> couldn't figure out any other way to make the tracebacks work. Do you
> have some other motivation for wanting nesting? If so that would be
> interesting, because it might point to why we're talking past each
> other and help us understand the problem better...
>
> > I also don't know what problem you are trying to solve with this.
>
> I'm not saying that there's some fatal problem with the current PEP.
> (In my first message I explicitly said that it would be an improvement
> over the status quo :-).) But I think that nesting will be really
> counterintuitive/confusing for users in some ways. And concurrency
> APIs will be offputting if they force you to use a different special
> form of 'except' all the time. Basically the 'flat' version might be a
> lot more ergonomic, and that's important for a language like Python.

You keep saying that your idea of flat EGs is "ergonomic" and
"important for a language like Python". The problem is that after all
these emails I still have absolutely no idea about:

- what exactly are you trying to propose?
- what specific problem do you want to address? (that the current PEP,
in your opinion, does not address)
- why do you think that what you propose would be more ergonomic or simple?
- what "flat EGs" or "flat tracebacks" even mean; I can't figure out
the high-level API, and I don't even understand what you're
talking about at the data-structures level.

Nathaniel, at this point it's clear that this thread somehow does not
help us understand what you want. Could you please just write your own
PEP clearly outlining your proposal, its upsides and downsides?
Without a PEP from you this thread is just a distraction.


Yury


[Python-Dev] Re: PEP 654: Exception Groups and except* [REPOST]

2021-04-21 Thread Yury Selivanov
On Wed, Apr 21, 2021 at 11:50 AM srku...@mail.de  wrote:
>
> Removing two concepts and preserving semantics simplifies the matter for 
> users. People need less to memorize and less to learn.
>
> Or am I missing something here? Couldn’t we achieve our goal without these 
> two new classes?

No, we can't. What you are proposing would make it very hard for users
to understand at a glance whether what you have in an innocent-looking
`except Exception` is correct or not. In my async/await code I'd have
to always check the `__group__` attribute to make sure it's not an
exception group in disguise.

So while you're "simplifying" the proposal by removing a couple of
types, you're complicating it in all other places. Besides, I don't
think that adding the ExceptionGroup type is a controversial idea that
needs any simplification.

Yury


[Python-Dev] Re: PEP 654: Exception Groups and except* [REPOST]

2021-04-05 Thread Yury Selivanov
Just wanted to elaborate a little bit on StopIteration to add to Irit's reply:

On Mon, Apr 5, 2021 at 9:52 AM Irit Katriel via Python-Dev
 wrote:
> On Mon, Apr 5, 2021 at 11:01 AM Nathaniel Smith  wrote:
>> - There are a number of places where the Python VM itself catches exceptions 
>> and has hard-coded handling for certain exception types. For example:
>>
>>   - Unhandled exceptions that reach the top of the main thread generally 
>> cause a traceback to be printed, but if the exception is SystemExit then the 
>> interpreter instead exits silently with status exc.args[0].
>>
>>   - 'for' loops call iter.__next__, and catch StopIteration while allowing 
>> other exceptions to escape.
>>
>>   - Generators catch StopIteration from their bodies and replace it with 
>> RuntimeError (PEP 479)
>>
>>   With this PEP, it's now possible for the main thread to terminate with 
>> ExceptionGroup(SystemExit), __next__ to raise ExceptionGroup(StopIteration), 
>> a generator to raise ExceptionGroup(StopIteration), either alone or mixed 
>> with other exceptions. How should the VM handle these new cases? Should they 
>> be using except* or except?
>>
>>   I don't think there's an obvious answer here, and possibly the answer is 
>> just "don't do that". But I feel like the PEP should say what the language 
>> semantics are in these cases, one way or another.

There's no need to do anything about ExceptionGroups potentially
wrapping StopIteration or StopAsyncIteration exceptions. The point of
PEP 479 was to solve the problem of a nested frame raising a
StopIteration (often by mistake) and silently stopping the outer
generator. That led to some really tricky situations to debug. In our
case, a rogue StopIteration wrapped in an EG would not stop a
generator silently; it would do so loud and clear.
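For reference, a quick sketch of the PEP 479 behavior being contrasted here:

```python
# Sketch of PEP 479: a StopIteration that escapes a generator's body is
# converted into a RuntimeError rather than silently ending the generator.
def gen():
    yield 1
    raise StopIteration  # pre-PEP-479 this would have silently stopped gen

g = gen()
assert next(g) == 1

converted = False
try:
    next(g)
except RuntimeError:     # "generator raised StopIteration"
    converted = True
assert converted
```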

As for SystemExit, we'll change "asyncio.run()" to unpack SystemExit
and propagate it unwrapped, potentially allowing the exception tree to
be dumped into a log file for later debugging, if configured. Trio
should do the same. Problem solved.

It's important to understand that the PEP doesn't propose a magical
mechanism to turn all exceptions into EGs automatically; it's up to
the framework/user code how to build them and what to propagate out.
In Python 3.9 you can just as well write `except BaseException: pass`
and silence a SystemExit (and people do that from time to time!)
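A sketch of that last point, runnable on any modern Python (the function name is illustrative):

```python
import sys

# A bare `except BaseException` already swallows SystemExit today,
# no ExceptionGroup machinery needed.
def try_to_exit():
    try:
        sys.exit(1)        # raises SystemExit
    except BaseException:
        pass               # ...and it is silently discarded
    return "still running"

assert try_to_exit() == "still running"
```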

Yury


[Python-Dev] Re: PEP 654: Exception Groups and except* [REPOST]

2021-03-29 Thread Yury Selivanov
Just a few comments to add to Irit's response.

On Sat, Mar 27, 2021 at 11:03 AM Paul Sokolovsky  wrote:
[..]
> Bottom line: this seems like a Trio's special-purpose feature, with
> good wishes of becoming the de facto standard.

The bottom line is that Trio nurseries have proven to be a very useful
and intuitive primitive. But the error handling API around them is
unintuitive and hard to use. Now that we want to add an equivalent of
nurseries to asyncio (we'll likely call them Task Groups) we need to
sort out the error handling mechanism, finally.

> From my PoV, a solution which doesn't add particular complex behavior
> into the language core, but allows to plug it in, and keep whole thing
> more explicit, is something that "works better".

I understand your PoV, but there's a point where building explicit
complex APIs becomes so detrimental to the overall usability that
it warrants solving the problem in syntax and with builtin
types.

Yury


[Python-Dev] Re: aiter/anext review request

2021-03-20 Thread Yury Selivanov
Hi Daniel,

I agree that coding async in C is complicated, I've done a fair share
of that and can attest that the code is not straightforward or easily
maintainable. But in this very case I think we care more about
discoverability of these two functions and the overall developer
experience. Having them as builtins makes a bit more sense than
exiling them into the operator module. Given that they are the last
two missing pieces I think we can merge those couple hundred lines of
C. And if we drop the 2-args version of aiter() we'll have a more
reasonable diff.

Yury

On Sat, Mar 20, 2021 at 4:45 PM Daniel Pope  wrote:
>
> As someone who was involved in implementing these, I think they should not be 
> in builtins if that means they have to be in C.
>
> My argument is one of maintainability. Writing them was plenty of
> effort: Josh had originally written them in idiomatic async Python; my
> contribution was to unroll that into sync Python code, and then port that
> to (sync) C code. It was a lot of effort and a lot of code -
> several hundred lines and 4(?) new types. The Python code was a few lines - 
> very readable and likely to be practically as fast. We weren't writing this 
> in C to speed it up or to make the code better, but because we *had to*.
>
> Implementing async functionality in C is a pain because to implement an 
> awaitable type you need not just that awaitable type, but a new type to 
> represent the iterator that am_await returns. I could imagine having generic 
> type objects and other helpers for implementing async PyObjects in C but I 
> don't really envisage anyone doing that; if you want to write async helpers 
> for Python the best framework is Python.
>
> As Josh can attest I was in two minds while implementing this change; I 
> argued firstly that having them in the operator module is fine, and later, 
> that if we want async builtins in general, maybe we could implement them in 
> Python and freeze them into the binary. We pushed on with the C approach 
> mostly because we were already 70% done, and this was what Yury asked for, so 
> it seemed more likely that this would get merged.
>
> But, if we're still discussing whether this should be merged in builtins or 
> operator, and that dictates whether it is in Python or C, I'm 100% behind 
> having this code be Python.



-- 
 Yury


[Python-Dev] Re: aiter/anext review request

2021-03-20 Thread Yury Selivanov
On Sat, Mar 20, 2021 at 2:35 PM Guido van Rossum  wrote:

>
> However I'm still skeptical about the two-argument version of aiter() (see
> my previous message about this). Do you have any indication that a use case
> for that exists?
>
>
In my experience this isn't a popular feature. Now that I've looked into
the docs, I *think* I remember using it once myself. Generally if I need this
functionality I'd just write a simple generator. Not that this is a
definitive indicator of the popularity of this thing; I'm just sharing my
experience.

That said, I wouldn't mind aiter() supporting the two-argument mode, as it
could make it easier to convert some sync code bases (that use greenlets,
for example) to async. And given that async iteration mirrors the sync
iteration protocol pretty closely, I think that aiter() fully mirroring
iter() is expected. I do realize that my arguments here are weak, so my
vote is +0 on supporting the two-argument mode.
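For reference, a sketch of the sync precedent that a two-argument aiter() would mirror:

```python
# The sync precedent: iter(callable, sentinel) keeps calling `callable`
# until it returns `sentinel`, then stops.
values = iter([3, 1, 4, 0, 5])
collected = list(iter(lambda: next(values), 0))  # 0 is the sentinel
assert collected == [3, 1, 4]
```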

Yury


[Python-Dev] Re: aiter/anext review request

2021-03-20 Thread Yury Selivanov
Hi Joshua,

First of all, thanks for working on this! I quickly looked over the PR and
it looks ready to be merged, great work.

I've been oscillating between wanting to have aiter/anext as builtins and
putting them into the operator module for quite a while. On the one hand,
asynchronous iteration is a niche thing compared to regular iteration; on
the other, 'async for' and asynchronous generators are language constructs.
Overall, I'm leaning towards having them as builtins. That would make them
more discoverable and slightly more convenient to use with things like
'aclosing', especially if the code is following the "import only modules"
convention. And in my opinion there's almost no overhead with regard to
how big the list of builtins is (especially with the globals opcode cache).

So my personal vote would be to make them builtins and merge your PR as is.

Yury


On Fri, Mar 19, 2021 at 3:18 PM Joshua Bronson  wrote:

> Thanks for all the feedback so far (and for the kind words, Guido!).
>
> Discussion here so far is converging on resurrecting my original PR from
> 2018 adding these to operator. Anyone else we should hear from before
> considering the more recent PR not worth pursuing for now? Would be good to
> hear from Yury given his previous feedback, but seems like he’s been too
> busy to respond. Should we wait (for some limited amount of time, in light
> of the upcoming 3.10 feature freeze?) for more feedback?
>
> I’m ready to update whichever PR we’re going ahead with, once I know which
> one that is.
>
> Thanks,
> Josh
>
>
> On Fri, Mar 19, 2021 at 17:23 Brett Cannon  wrote:
>
>> I personally would be okay with aiter() (with the modern API) and
>> anext() in the `operator` module. There's already precedent in having things
>> there that are rarely used directly but still implement the use of a
>> special method, e.g. operator.index() (
>> https://docs.python.org/3/library/operator.html#operator.index).
>>
>> On Fri, Mar 19, 2021 at 10:29 AM Guido van Rossum 
>> wrote:
>>
>>> I assume that one of the concerns is that these functions are trivial.
>>> aiter(x) is just x.__aiter__(), and anext(it) is just it.__anext__(). I'm
>>> not convinced that we need aiter(x, sentinel) at all — for iter() it’s
>>> mostly a legacy compatibility API.
>>>
>>> If you use these a lot it’s simple enough to add one-liners to the top
>>> of your module or to your project’s utilities.
>>>
>>> I also feel (but I may be alone in this) that maybe we went overboard
>>> with the design of async for (and async with).
>>>
>>> That said the work itself is impeccable. While you’re waiting for a
>>> resolution you may want to try working on other contributions!
>>>
>>> —Guido
>>>
>>> On Fri, Mar 19, 2021 at 09:59 Luciano Ramalho 
>>> wrote:
>>>
 OK, but it seems clear to me that if there are any lingering doubts it
 would be better to add the functions to a module than to the built-ins, and
 later promote them to built-ins if people actually find them widely useful.

 On the other hand, adding something to built-ins that turns out to be
 rarely useful adds unnecessary noise and is much harder to fix later
 without causing further problems.

 Best,

 Luciano


 On Fri, Mar 19, 2021 at 1:22 PM Joshua Bronson 
 wrote:

> Thanks for taking a look at this, Luciano.
>
> Yury immediately replied to the comment from Jelle that you quoted with
> the following:
>
> > Do these really need to be builtins?
>>
>> We're only beginning to see async iterators being used in the wild,
>> so we can't have a definitive answer at this point.
>>
>> > They seem too specialized to be widely useful; I've personally
>> never needed them in any async code I've written. It would make more 
>> sense
>> to me to put them in a module like operators.
>>
>> I think putting them in the operator module makes sense, at least
>> for 3.8.  Do you want to work on a pull request?
>
>
>
> That was on 2018-06-14. On 2018-08-24, I submitted
> https://github.com/python/cpython/pull/8895, "Add operator.aiter and
> operator.anext". On 2018-09-07, Yury left the following comment on that
> PR:
>
> Please don't merge this yet. I'm not convinced that aiter and anext
>> shouldn't be builtins.
>
>
>
> So there has been some back-and-forth on this, and some more years
> have passed, but all the latest signals we've gotten up to now have
> indicated a preference for adding these to builtins.
>
> In any case, as of my latest PR, the Python core developers now have
> both options to choose from.
>
> As community contributors, is there anything further we can do to help
> drive 

[Python-Dev] Re: PEP 622 version 2 (Structural Pattern Matching)

2020-07-17 Thread Yury Selivanov
On Fri, Jul 17, 2020 at 3:54 PM Guido van Rossum  wrote:
>
> On Fri, Jul 17, 2020 at 1:45 PM Yury Selivanov  
> wrote:
>>
>> I've built the reference implementation and I'm experimenting with the
>> new syntax in the edgedb codebase. It seems to have plenty of places
>> where pattern matching adds clarity. I'll see if I find particularly
>> interesting examples of that to share.
>>
>> So far I'm +1 on the proposal, and I like the second iteration of it.
>> Except that I'm really sad to see the __match__ protocol gone.
>
>
> It will be back, just not in 3.10. We need more experience with how 
> match/case are actually used to design the right `__match__` protocol.

Makes sense.

>
>>
>> Quoting the PEP:
>>
>> > One drawback of this protocol is that the arguments to __match__ would be 
>> > expensive to construct, and could not be pre-computed due to the fact 
>> > that, because of the way names are bound, there are no real constants in 
>> > Python.
>
>
> Note: That's not referring to the `__match__` protocol from version 1 of the 
> PEP, but to a hypothetical (and IMO sub-optimal) `__match__` protocol that 
> was discussed among the authors prior to settling on the protocol from 
> version 1.
>
>>
>> While it's not possible to precompute the arguments ahead of time, it
>> certainly should be possible to cache them similarly to how I
>> implemented the global names lookup cache in CPython. That should
>> alleviate this particular performance consideration entirely.
>
>
> Where's that global names lookup cache? I seem to have missed its 
> introduction. (Unless you meant PEP 567?)

Here are the related bpos of where Inada-san and I worked on this:

https://bugs.python.org/issue28158
https://bugs.python.org/issue26219

>
>>
>> Having __match__ would allow some really interesting use cases. For
>> example, for binary protocol parsers it would be possible to replicate the
>> Erlang approach, e.g.:
>>
>>   match buffer:
>>       case Frame(char('X'), len := UInt32(),
>>                  flags := Bits(0, 1, flag1, flag2, 1, 1)):
>>
>> would match a Frame of message type 'X', capture its length, and
>> extract two bit flags. This perhaps isn't the greatest example of how
>> a full matching protocol could be used, but it's something that I
>> personally wanted to implement.
>
>
> I see, you'd want the *types* of the arguments to be passed into 
> `Frame.__match__`. That's interesting, although I have a feeling that if I 
> had a real use case like this I'd probably be able to come up with a better 
> DSL for specifying messages than this.

Yeah, it's an open question if this is a good idea or not. FWIW here's
a relevant quick erlang tutorial:
https://dev.to/l1x/matching-binary-patterns-11kh that shows what it
looks like in erlang (albeit the syntax is completely alien to
Python).
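
To make the idea concrete, here is a rough sketch of the kind of hook being discussed. This is *not* the protocol from any version of PEP 622; the class name `Frame`, the `match_frame` method, and the wire format are all invented for illustration — a classmethod inspects a byte buffer and either returns the extracted fields or None:

```python
import struct

class Frame:
    """Toy binary frame: 1-byte message type, big-endian uint32 length,
    then a flags byte.  Purely illustrative."""

    @classmethod
    def match_frame(cls, buffer, msg_type):
        # Return (length, flag1, flag2) if `buffer` is a frame of the
        # requested message type, or None if it doesn't match.
        if len(buffer) < 6 or buffer[0:1] != msg_type:
            return None
        (length,) = struct.unpack_from(">I", buffer, 1)
        flags = buffer[5]
        return length, bool(flags & 0b01), bool(flags & 0b10)

buf = b"X" + struct.pack(">I", 128) + bytes([0b11])
print(Frame.match_frame(buf, b"X"))   # (128, True, True)
print(Frame.match_frame(buf, b"Y"))   # None
```

A real `__match__` protocol would additionally need to thread the sub-patterns (`char('X')`, `UInt32()`, etc.) through to the hook, which is exactly the open design question.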

Yury
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/LMWGQX4RNZE2WS34OM6QU7SLFVQIKYT3/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: PEP 622 version 2 (Structural Pattern Matching)

2020-07-17 Thread Yury Selivanov
I've built the reference implementation and I'm experimenting with the
new syntax in the edgedb codebase. It seems to have plenty of places
where pattern matching adds clarity. I'll see if I find particularly
interesting examples of that to share.

So far I'm +1 on the proposal, and I like the second iteration of it.
Except that I'm really sad to see the __match__ protocol gone.

Quoting the PEP:

> One drawback of this protocol is that the arguments to __match__ would be 
> expensive to construct, and could not be pre-computed due to the fact that, 
> because of the way names are bound, there are no real constants in Python.

While it's not possible to precompute the arguments ahead of time, it
certainly should be possible to cache them similarly to how I
implemented global names lookup cache in CPython. That should
alleviate this particular performance consideration entirely.

Having __match__ would allow some really interesting use cases. For
example, for binary protocol parsers it would be possible to replicate
erlang approach, e.g.:

  match buffer:
case Frame(char('X'), len := UInt32(), flags := Bits(0, 1, flag1,
flag2, 1, 1))

would match a Frame of message type 'X', capture its length, and
extract two bit flags. This perhaps isn't the greatest example of how
a full matching protocol could be used, but it's something that I
personally wanted to implement.

Yury


Re: [Python-Dev] Lost sight

2019-01-19 Thread Yury Selivanov
Sorry to hear this, Serhiy.  Hope you'll get better soon.

Yury

On Sat, Jan 19, 2019 at 5:15 AM Serhiy Storchaka  wrote:
>
> I have virtually completely lost the sight in my right eye (and the loss
> is progressing quickly), and the sight in my left eye is weak. That is why
> my activity as a core developer has decreased significantly recently.
> My apologies to those who are waiting for my review. I will do it
> slowly.
>





Re: [Python-Dev] Documenting the private C API (was Re: Questions about signal handling.)

2018-09-25 Thread Yury Selivanov
On Tue, Sep 25, 2018 at 3:27 PM Barry Warsaw  wrote:
>
> On Sep 25, 2018, at 12:09, Yury Selivanov  wrote:
> >
> > My main concern with maintaining a *separate* documentation of
> > internals is that it would make it harder to keep it in sync with the
> > actual implementation.  We often struggle to keep the comments in the
> > code in sync with that code.
>
> Well, my goal is that the internal API would show up when I search for 
> function names on docs.python.org.   Right now, I believe the “quick search” 
> box does search the entire documentation suite.  I don’t care too much 
> whether they would reside in a separate section in the current C API, or in a 
> separate directory, listed or not under “Parts of the documentation” on the 
> front landing page.  But I agree they shouldn’t be intermingled with the 
> public C API.

An idea: it would be cool to have something like Sphinx autodoc for C
headers to pull this documentation from source.

Yury


Re: [Python-Dev] Documenting the private C API (was Re: Questions about signal handling.)

2018-09-25 Thread Yury Selivanov
On Tue, Sep 25, 2018 at 11:55 AM Barry Warsaw  wrote:
>
> On Sep 25, 2018, at 11:28, Victor Stinner  wrote:
> >
> > But if we have a separated documented for CPython internals, why not
> > documenting private functions. At least, I would prefer to not put it
> > at the same place an the *public* C API. (At least, a different
> > directory.)
>
> I like the idea of an “internals” C API documentation, separate from the 
> public API.

For that we can just document them in the code, right?  Like this one,
from Include/internal/pystate.h:

/* Initialize _PyRuntimeState.
   Return NULL on success, or return an error message on failure. */
PyAPI_FUNC(_PyInitError) _PyRuntime_Initialize(void);

My main concern with maintaining a *separate* documentation of
internals is that it would make it harder to keep it in sync with the
actual implementation.  We often struggle to keep the comments in the
code in sync with that code.

Yury


Re: [Python-Dev] Questions about signal handling.

2018-09-24 Thread Yury Selivanov
On Mon, Sep 24, 2018 at 4:19 PM Eric Snow  wrote:
[..]
> Is there a good place where this weirdness is documented?

I'll need to look through uvloop & libuv commit log to remember that;
will try to find time tonight/tomorrow.

[..]
> This matters to me because I'd like to use "pending" calls for
> subinterpreters, which means dealing with signals *in*
> Py_MakePendingCalls() is problematic.  Pulling the
> PyErr_CheckSignals() call out would eliminate that problem.

Py_MakePendingCalls is a public API, even though it's not documented.
If we change it to not call PyErr_CheckSignals and if there are C
extensions that block pure Python code execution for long time (but
call Py_MakePendingCalls explicitly), such extensions would stop
reacting to ^C.

Maybe a better workaround would be to introduce a concept of "main"
sub-interpreter? We can then fix Py_MakePendingCalls to only check for
signals when it's called from the main interpreter.

Yury


Re: [Python-Dev] Questions about signal handling.

2018-09-24 Thread Yury Selivanov
On Fri, Sep 21, 2018 at 7:04 PM Eric Snow  wrote:
>
> Hi all,
>
> I've got a pretty good sense of how signal handling works in the
> runtime (i.e. via a dance with the eval loop), but still have some
> questions:
>
> 1. Why do we restrict calls to signal.signal() to the main thread?
> 2. Why must signal handlers run in the main thread?
> 3. Why does signal handling operate via the "pending calls" machinery
> and not distinctly?

Here's my take on this:

Handling signals in a multi-threaded program is hard. Some signals can
be delivered to an arbitrary thread, some to the one that caused them.
Posix provides lots of mechanisms to tune how signals are received (or
blocked) by individual threads, but (a) Python doesn't expose those
APIs, (b) using those APIs correctly is insanely hard.  By restricting
signal delivery to the main thread we remove all that complexity.
Restricting signal.signal() so that it can only be called from the main
thread just makes the API more consistent (and also, IIRC, avoids weird
sigaction() behaviour when it is called from different threads within
one program).

Next, you can only call async-signal-safe (reentrant) functions in your
signal handlers.  For instance, the printf() function isn't safe to use.
Therefore one common practice is to set a flag recording that a signal
was received and check it later (exactly what we do with the pending
calls machinery).
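
In Python terms the flag-and-check-later pattern looks like this (a minimal POSIX-only sketch; Python-level handlers are themselves already deferred by the interpreter to a safe point between bytecodes, which is this very pattern applied for you):

```python
import os
import signal

got_signal = False

def handler(signum, frame):
    # Do only trivial, safe work in the handler: record the fact.
    global got_signal
    got_signal = True

signal.signal(signal.SIGUSR1, handler)   # must run in the main thread
os.kill(os.getpid(), signal.SIGUSR1)     # deliver the signal to ourselves

# Back in normal code, check the flag at a convenient, safe point.
print("signal received:", got_signal)
```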

Therefore, IMO, the current way we handle signals in Python is the
safest, most predictable, and most cross-platform option there is.
And changing how Python signals API works with threads in any way will
actually break the world.

Yury


Re: [Python-Dev] Postponed annotations break inspection of dataclasses

2018-09-22 Thread Yury Selivanov
On Sat, Sep 22, 2018 at 3:11 PM Guido van Rossum  wrote:
[..]
> Still, I wonder if there's a tweak possible of the globals and locals used 
> when exec()'ing the function definitions in dataclasses.py, so that 
> get_type_hints() gets the right globals for this use case.
>
> It's really tough to be at the intersection of three PEPs...

If it's possible to fix exec() to accept any Mapping (not just dicts),
then we can create a proxy mapping for "Dataclass.__init__.__module__"
module and everything would work as expected.

Here's a very hack-ish fix we can use in meanwhile (even in 3.7.1?):
https://gist.github.com/1st1/37fdd3cc84cd65b9af3471b935b722df
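
The constraint being worked around is that exec() today insists on a real dict for its globals, so a Mapping-based proxy is rejected. A minimal illustration (the `NamespaceProxy` class is invented for this example):

```python
from collections.abc import Mapping

class NamespaceProxy(Mapping):
    # A read-only mapping over another namespace -- the kind of proxy
    # that exec() would need to accept for the fix described above.
    def __init__(self, namespace):
        self._ns = namespace

    def __getitem__(self, key):
        return self._ns[key]

    def __iter__(self):
        return iter(self._ns)

    def __len__(self):
        return len(self._ns)

try:
    exec("x = 1", NamespaceProxy({"__builtins__": {}}))
except TypeError as exc:
    print("rejected:", exc)
```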

Yury


Re: [Python-Dev] Late Python 3.7.1 changes to fix the C locale coercion (PEP 538) implementation

2018-09-21 Thread Yury Selivanov
On Wed, Sep 19, 2018 at 4:26 PM Ned Deily  wrote:
> On Sep 19, 2018, at 13:30, Yury Selivanov  wrote:
[..]
> > Currently it's designed to expose "PyContext*" and "PyContextVar*"
> > pointers.  I want to change that to "PyObject*" as using non-PyObject
> > pointers turned out to be a very bad idea (interfacing with Cython is
> > particularly challenging).
> >
> > Is it a good idea to change this in Python 3.7.1?
>
> It's hard to make an informed decision without a concrete PR to review.  What 
> would be the impact on any user code that has already adopted it in 3.7.0?

Ned, I've created an issue to track this: https://bugs.python.org/issue34762

Yury


Re: [Python-Dev] Late Python 3.7.1 changes to fix the C locale coercion (PEP 538) implementation

2018-09-19 Thread Yury Selivanov
Ned, Nick, Victor,

There's an issue with the new PEP 567 (contextvars) C API.

Currently it's designed to expose "PyContext*" and "PyContextVar*"
pointers.  I want to change that to "PyObject*" as using non-PyObject
pointers turned out to be a very bad idea (interfacing with Cython is
particularly challenging).

Is it a good idea to change this in Python 3.7.1?

Yury


Re: [Python-Dev] Use of Cython

2018-09-04 Thread Yury Selivanov
On Tue, Sep 4, 2018 at 2:58 PM Stefan Behnel  wrote:
[..]
> Cython has four ways to provide type declarations: cdef statements in
> Cython code, external .pxd files for Python or Cython files, special
> decorators and declaration functions, and PEP-484/526 type annotations.

Great to hear that PEP 484 type annotations are supported.  Here's a
link to the docs:
https://cython.readthedocs.io/en/latest/src/tutorial/pure.html#static-typing

[..]
> > I know that Cython has a mode to use decorators in
> > pure Python code to annotate types, but they are less intuitive than
> > using typing annotations in 3.6+.
>
> You can use PEP-484/526 type annotations to declare Cython types in Python
> code that you intend to compile. It's entirely up to you, and it's an
> entirely subjective measure which "is better". Many people prefer Cython's
> non-Python syntax because it allows them to apply their existing C
> knowledge. For them, PEP-484 annotations may easily be non-intuitive in
> comparison.

Yeah, but if we decide to use Cython in CPython we probably need to
come up with something like PEP 7 to recommend one particular style
and have an overall guideline.  Using PEP 484 annotations means that
we have pure Python code that PyPy and other interpreters can still
run.
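
For example, a function annotated this way stays ordinary Python — any interpreter can run and introspect it, while a static compiler is free to treat the hints as type declarations (a minimal sketch; nothing here is Cython-specific):

```python
import typing

def clip(x: int, lo: int, hi: int) -> int:
    # Pure-Python code with PEP 484 annotations: runnable everywhere,
    # and a compiler may treat the hints as static type declarations.
    return max(lo, min(hi, x))

print(clip(15, 0, 10))                 # 10
print(typing.get_type_hints(clip))
```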

[..]
> > I'd be +0.5 on using Cython (optionally?) to compile some pure Python
> > code to make it 30-50% faster.  asyncio, for instance, would certainly
> > benefit from that.
>
> Since most of this (stdlib) Python code doesn't need to stay syntax
> compatible with Python < 3.6 (actually 3.8) anymore, you can probably get
> much higher speedups than that by statically typing some variables and
> functions here and there. I recently tried that with difflib, makes a big
> difference.

I'd be willing to try this in asyncio if we start using Cython.

Yury


Re: [Python-Dev] Use of Cython

2018-09-04 Thread Yury Selivanov
Hi Stefan,

On Sat, Sep 1, 2018 at 6:12 PM Stefan Behnel  wrote:
>
> Yury,
>
> given that people are starting to quote enthusiastically the comments you
> made below, let me set a couple of things straight.

To everyone reading this thread: please keep in mind that I'm not in a
position to "defend" mypyc or to "promote" it, and I'm not affiliated
with the project at all.  I am just excited about yet another tool to
statically compile Python and I'm discussing it only from a
theoretical standpoint.

>
> Yury Selivanov schrieb am 07.08.2018 um 19:34:
> > On Mon, Aug 6, 2018 at 11:49 AM Ronald Oussoren via Python-Dev wrote:
> >
> >> I have no strong opinion on using Cython for tests or in the stdlib, other 
> >> than that it is a fairly large dependency.  I do think that adding a 
> >> “Cython-lite” tool the CPython distribution would be less ideal, creating 
> >> and maintaining that tool would be a lot of work without clear benefits 
> >> over just using Cython.
> >
> > Speaking of which, Dropbox is working on a new compiler they call "mypyc".
> >
> > mypyc will compile type-annotated Python code to an optimized C.
>
> That's their plan. Saying that "it will" is a bit premature at this point.
> The list of failed attempts at writing static Python compilers is rather
> long, even if you only count those that compile the usual "easy subset" of
> Python.
>
> I wish them the best of luck and endurance, but they have a long way to go.

I fully agree with you here.

>
>
> > The
> > first goal is to compile mypy with it to make it faster, so I hope
> > that the project will be completed.
>
> That's not "the first goal". It's the /only/ goal. The only intention of
> mypyc is to be able to compile and optimise enough of Python to speed up
> the kind or style of code that mypy uses.
>
>
> > Essentially, mypyc will be similar
> > to Cython, but mypyc is a *subset of Python*, not a superset.
>
> Which is bad, right? It means that there will be many things that simply
> don't work, and that you need to change your code in order to make it
> compile at all. Cython is way beyond that point by now. Even RPython will
> probably continue to be way better than mypyc for quite a while, maybe
> forever, who knows.

To be clear I'm not involved with mypyc, but my understanding is that
the entire Python syntax will be supported, except some dynamic
features like patching `globals()`, `locals()`, or classes, or
__class__.  IMO this is *good*, and in general Python programs don't do
that anyway.

>
>
> > Interfacing with C libraries can be easily achieved with cffi.
>
> Except that it will be fairly slow. cffi is not designed for static
> analysis but for runtime operations.

Could you please clarify this point?  My current understanding is that
you can build a static compiler with a knowledge about cffi so that it
can compile calls like `ffi.new("something_t[]", 80)` to pure C.

> You can obviously also use cffi from
> Cython – but then, why would you, if you can get much faster code much more
> easily without using cffi?

The "much more easily" part is debatable here and is highly
subjective.  For me using Cython is also easier *at this point*
because I've spent so much time working with it. Although getting
there wasn't easy for me :(

>
> That being said, if someone wants to write a static cffi optimiser for
> Cython, why not, I'd be happy to help with my advice. The cool thing is
> that this can be improved gradually, because compiling the cffi code
> probably already works out of the box. It's just not (much) faster than
> when interpreted.

Yeah, statically compiling cffi-enabled code is probably the way to go
for mypyc and Cython.

>
>
> > Being a
> > strict subset of Python means that mypyc code will execute just fine
> > in PyPy.
>
> So does normal (non-subset) Python code. You can run it in PyPy, have
> CPython interpret it, or compile it with Cython if you want it to run
> faster in CPython, all without having to limit yourself to a subset of
> Python. Seriously, you make this sound like requiring users to rewrite
> their code to make it compilable with mypyc was a good thing.

But that's the point: unless you add Cython types to your Python code
it gets only moderate speedups.  Using Cython/C types usually means
that you need to use pxd/pyx files which means that the code isn't
Python anymore.  I know that Cython has a mode to use decorators in
pure Python code to annotate types, but they are less intuitive than
using typing annotations in 3.6+.

[..]
> > I'd be more willing to start using mypyc+cffi in CPython stdlib
> > *eventually*, than Cython 

Re: [Python-Dev] Use of Cython

2018-08-07 Thread Yury Selivanov
On Mon, Aug 6, 2018 at 11:49 AM Ronald Oussoren via Python-Dev
 wrote:

> I have no strong opinion on using Cython for tests or in the stdlib, other 
> than that it is a fairly large dependency.  I do think that adding a 
> “Cython-lite” tool the CPython distribution would be less ideal, creating and 
> maintaining that tool would be a lot of work without clear benefits over just 
> using Cython.

Speaking of which, Dropbox is working on a new compiler they call "mypyc".

mypyc will compile type-annotated Python code to an optimized C. The
first goal is to compile mypy with it to make it faster, so I hope
that the project will be completed. Essentially, mypyc will be similar
to Cython, but mypyc is a *subset of Python*, not a superset.
Interfacing with C libraries can be easily achieved with cffi. Being a
strict subset of Python means that mypyc code will execute just fine
in PyPy. They can even apply some optimizations to it eventually, as
it has a strict and static type system.

I'd be more willing to start using mypyc+cffi in CPython stdlib
*eventually*, than Cython now.  Cython is a relatively complex and
still poorly documented language.  I'm speaking from experience after
writing thousands of lines of Cython in uvloop & asyncpg.  In skillful
hands Cython is amazing, but I'd be cautious to advertise and use it
in CPython.

I'm also -1 on using Cython to test C API. While writing C tests is
annoying (I wrote a fair share myself), their very purpose is to make
third-party tools/extensions more stable. Using a third-party tool to
test C API to track regressions that break third-party tools feels
wrong.

Yury


Re: [Python-Dev] Finding Guido's replacement

2018-07-23 Thread Yury Selivanov
On Mon, Jul 23, 2018 at 12:03 PM Antoine Pitrou  wrote:

>
> I suspect Chris A. was merely joking, though I'm not sure what the joke
> ultimately is supposed to be about.
>


Ah, right, I stopped reading his email after the quoted line. Well executed.

Yury



Re: [Python-Dev] Finding Guido's replacement

2018-07-23 Thread Yury Selivanov
On Sun, Jul 22, 2018 at 11:18 PM Chris Angelico  wrote:

>
> * Lately, all Guido's actions have been to benefit his employer, not
> the Common Pythonista. We have proof of this from reliable reporting
> sources such as Twitter and social media.
>

This accusation is ridiculous and not appreciated. Type hinting is one of
the most praised Python features in pretty much any big company, where
managing millions of lines of Python is challenging. Next time, I also
suggest you cite and name your 'reliable reporting sources'; otherwise
this is just bs.

Yury


Re: [Python-Dev] PEP 572: Do we really need a ":" in ":="?

2018-07-05 Thread Yury Selivanov
I think I tried a variation of your proposal here
https://mail.python.org/pipermail/python-dev/2018-April/152939.html
and nobody really liked it.

Yury
On Thu, Jul 5, 2018 at 7:44 PM Alexander Belopolsky
 wrote:
>
> I wish I had more time to make my case, but with the PEP 572 pronouncement 
> imminent, let me make an attempt to save Python from having two assignment 
> operators.
>
> I've re-read the PEP, and honestly I am warming up to the idea of allowing a 
> limited form of assignment in expressions.  It looks like in the current 
> form, the PEP supports only well-motivated cases where the return value of 
> the assignment expression is non-controversial.  It also appears that there 
> are no cases where = can be substituted for := and not cause a syntax error.  
> This means that ":" in ":=" is strictly redundant.
>
> Interestingly, Python already has a precedent for using redundant ":" - the 
> line-ending ":" in various statements is redundant, but it is helpful both 
> when reading and writing the code.
>
> On the other hand, ':' in ':=' looks like an unnecessary embellishment.  When 
> we use ':=', we already know that we are inside an expression and being 
> inside an expression is an obvious context for the reader, the writer and the 
> interpreter.
>
> I also believe, allowing a limited form of assignment in expressions is a 
> simpler story to tell to the existing users than an introduction of a new 
> operator that is somewhat like '=', but cannot be used where you currently 
> use '=' and only in places where '=' is currently prohibited.
>





Re: [Python-Dev] Examples for PEP 572

2018-07-04 Thread Yury Selivanov
On Wed, Jul 4, 2018 at 2:16 PM Tim Peters  wrote:
>
> [Yury Selivanov]
> > Wow, I gave up on this example before figuring this out (and I also
> > stared at it for a good couple of minutes).  Now it makes sense.  It's
> > funny that this super convoluted snippet is shown as a good example
> > for PEP 572.  Although almost all PEP 572 examples are questionable.
>
> And another who didn't actually read the PEP Appendix.  See my reply just 
> before this one:  yes, the Appendix gave that as a good example, but as a 
> good example of assignment-expression ABUSE.  The opposite of something 
> desirable.
>
> I've never insisted there's only one side to this, and when staring at code 
> was equally interested in cases where assignment expressions would hurt as 
> where they would help.  I ended up giving more examples where they would 
> help, because after writing up the first two bad examples in the Appendix 
> figured it was clear enough that "it's a bad idea except in cases where it 
> _obviously_ helps".
>
> Same way,  e.g., as when list comprehensions were new, I focused much more on 
> cases where they might help after convincing myself that a great many nested 
> loops building lists were much better left _as_ nested loops.  So I looked 
> instead for real-code cases where they would obviously help, and found plenty.
>
> And I'm really glad Python added listcomps too, despite the possibility of 
> gross abuse ;-)

Thank you for the clarification, Tim.  I agree, list comprehensions
can indeed be abused and we all see that happening occasionally.
However, assignment expressions make it easy (almost compelling) to
push more logic even to simple comprehensions (with one "for" / one
"if").

You probably understand why most core devs are irritated with the PEP:
the majority thinks that the potential of its abuse can't be compared
to any other Python syntax.  It's sad that the PEP doesn't really
address that except saying "This is a tool, and it is up to the
programmer to use it where it makes sense, and not use it where
superior constructs can be used."  Well, to those of us who routinely
review code written by people who aren't proficient Python coders
(read most people working at Google, Facebook, Dropbox, Microsoft, or
really any company) this argument is very weak.

In other words: I'm looking forward to reviewing clever code written
by clever people.  Spending extra time arguing if a snippet is
readable or not should be fun and productive, right? :)  Disabling :=
as a company-wide policy will probably be the only way out.

Yury


Re: [Python-Dev] Examples for PEP 572

2018-07-04 Thread Yury Selivanov
On Wed, Jul 4, 2018 at 1:35 PM Ivan Pozdeev via Python-Dev
 wrote:
>
> On 04.07.2018 11:54, Serhiy Storchaka wrote:

> >> while total != (total := total + term):
> >> term *= mx2 / (i*(i+1))
> >> i += 2
> >> return total
> >
> > This code looks cleverer than the original while loop with a break in
> > the middle. I like clever code. But it needs more mental effort to
> > understand.
> >
> > I admit that this is a good example.
> >
> > There is a tiny problem with it (and with rewriting a while loop as a
> > for loop, as I like). Often the body contains more than a single break. In
> > that case a large part of the cleverness disappears. :-(
>
> It took me a few minutes to figure out that this construct actually
> checks term == 0.

Wow, I gave up on this example before figuring this out (and I also
stared at it for a good couple of minutes).  Now it makes sense.  It's
funny that this super convoluted snippet is shown as a good example
for PEP 572.  Although almost all PEP 572 examples are questionable.
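
For reference, the construct can be made runnable. This reconstruction assumes the loop is summing a Taylor series (with these initial values it converges to cosh(x)); the loop stops precisely when `term` has become so small that `total + term == total`, i.e. effectively when term == 0 (requires Python 3.8+ for `:=`):

```python
import math

def cosh_series(x):
    # Sum 1 + x**2/2! + x**4/4! + ... until adding `term` no longer
    # changes `total` -- i.e. until term has underflowed to nothing.
    total, term, i = 0.0, 1.0, 1
    mx2 = x * x
    while total != (total := total + term):
        term *= mx2 / (i * (i + 1))
        i += 2
    return total

print(abs(cosh_series(1.0) - math.cosh(1.0)) < 1e-12)   # True
```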

Yury


Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Yury Selivanov
On Thu, Jun 14, 2018 at 12:40 PM Tin Tvrtković  wrote:
>
> Hi,
>
> I've been using asyncio a lot lately and have encountered this problem 
> several times. Imagine you want to do a lot of queries against a database, 
> spawning 1 tasks in parallel will probably cause a lot of them to fail. 
> What you need in a task pool of sorts, to limit concurrency and do only 20 
> requests in parallel.
>
> If we were doing this synchronously, we wouldn't spawn 1 threads using 
> 1 connections, we would use a thread pool with a limited number of 
> threads and submit the jobs into its queue.
>
> To me, tasks are (somewhat) logically analogous to threads. The solution that 
> first comes to mind is to create an AsyncioTaskExecutor with a submit(coro, 
> *args, **kwargs) method. Put a reference to the coroutine and its arguments 
> into an asyncio queue. Spawn n tasks pulling from this queue and awaiting the 
> coroutines.
>
> It'd probably be useful to have this in the stdlib at some point.

Sounds like a good idea!  Feel free to open an issue to prototype the API.
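
A minimal sketch of such a pool using only stdlib asyncio — worker tasks pulling jobs off a queue, so at most `limit` coroutines run at once (function names here are invented for illustration, not a proposed API):

```python
import asyncio

async def worker(queue, results):
    # Each worker pulls (index, coroutine) jobs until the queue drains.
    while True:
        try:
            i, coro = queue.get_nowait()
        except asyncio.QueueEmpty:
            return
        results[i] = await coro

async def run_limited(coros, limit):
    # Run at most `limit` of the given coroutines concurrently,
    # preserving the order of results.
    queue = asyncio.Queue()
    results = [None] * len(coros)
    for job in enumerate(coros):
        queue.put_nowait(job)
    await asyncio.gather(*(worker(queue, results) for _ in range(limit)))
    return results

async def main():
    jobs = [asyncio.sleep(0, result=i * i) for i in range(10)]
    return await run_limited(jobs, limit=3)

print(asyncio.run(main()))   # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```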

Yury


Re: [Python-Dev] A more flexible task creation

2018-06-13 Thread Yury Selivanov
On Wed, Jun 13, 2018 at 4:47 PM Michel Desmoulin
 wrote:
>
> I was working on a concurrency limiting code for asyncio, so the user
> may submit as many tasks as one wants, but only a max number of tasks
> will be submitted to the event loop at the same time.

What does that "concurrency limiting code" do?  What problem does it solve?

>
> However, I wanted that passing an awaitable would always return a task,
> no matter if the task was currently scheduled or not. The goal is that
> you could add done callbacks to it, decide to force schedule it, etc

The obvious advice is to create a new class "DelayedTask" with a
Future-like API.  You can then schedule the real awaitable that it
wraps with `loop.create_task` at any point.  Providing
"add_done_callback"-like API is trivial.  DelayedTask can itself be an
awaitable, scheduling itself on a first __await__ call.

As a benefit, your implementation will support any Task-like objects
that alternative asyncio loops can implement. No need to mess with
policies either.
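
A hedged sketch of what such a wrapper could look like (the class name and methods are invented here, not an asyncio API):

```python
import asyncio

class DelayedTask:
    # Hypothetical "DelayedTask" wrapper: holds a coroutine without
    # scheduling it; scheduling happens on an explicit .schedule()
    # call or on first await.
    def __init__(self, coro):
        self._coro = coro
        self._task = None

    def schedule(self):
        if self._task is None:
            self._task = asyncio.ensure_future(self._coro)
        return self._task

    def add_done_callback(self, cb):
        # Attaching a callback forces scheduling, like a real Task.
        self.schedule().add_done_callback(cb)

    def __await__(self):
        return self.schedule().__await__()

async def main():
    dt = DelayedTask(asyncio.sleep(0, result=42))
    # Nothing has been scheduled yet; awaiting schedules and runs it.
    return await dt

print(asyncio.run(main()))   # 42
```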

>
> I dug in the asyncio.Task code, and encountered:
>
> def __init__(self, coro, *, loop=None):
> ...
> self._loop.call_soon(self._step)
> self.__class__._all_tasks.add(self)
>
> I was surprised to see that instantiating a Task class has any side
> effect at all, let alone 2, and one of them being to be immediately
> scheduled for execution.

To be fair, implicitly scheduling a task for execution is what all
async frameworks (twisted, curio, trio) do when you wrap a coroutine
into a task.  I don't recall them having a keyword argument to control
when the task is scheduled.

>
> I couldn't find a clean way to do what I wanted: either you
> loop.create_task() and you get a task but it runs, or you don't run
> anything, but you don't get a nice task object to hold on to.

A clean way is to create a new layer of abstraction (e.g. DelayedTask
I suggested above).

[..]
> I tried creating a custom task, but it was even harder, setting a custom
> event policy, to provide a custom event loop with my own create_task()
> accepting parameters. That's a lot to do just to provide a parameter to
> Task, especially if you already use a custom event loop (e.g: uvloop). I
> was expecting to have to create a task factory only, but task factories
> can't get any additional parameters from create_task()).

I don't think creating a new Task implementation is needed here, a
simple wrapper should work just fine.

[..]
> Hence I have 2 distinct, but independent albeit related, proposals:
>
> - Allow Task to be created but not scheduled for execution, and add a
> parameter to ensure_future() and create_task() to control this. Awaiting
> such a task would just behave like asyncio.sleep(0) until it is scheduled
> for execution.
>
> - Add a parameter to ensure_future() and create_task() named "kwargs"
> that accept a mapping and will be passed as **kwargs to the underlying
> created Task.
>
> I insist on the fact that the 2 proposals are independent, so please
> don't reject both if you don't like one or the other. Passing a
> parameter to the underlying custom Task is still of value even without
> the unscheduled instantiation, and vice versa.

Well, to add a 'kwargs' parameter to ensure_future() we need kwargs in
Task.__init__.  So far we only have 'loop' and it's not something that
ensure_future() should allow you to override.  So unless we implement
the first proposal, we don't need the second.

Yury
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev)

2018-06-08 Thread Yury Selivanov
On Fri, Jun 8, 2018 at 9:24 AM Ronald Oussoren  wrote:
[..]
> Wouldn’t it be enough to visit just the newly tracked object in 
> PyObject_GC_Track with a visitor function that does something minimal to 
> verify that the object value is sane, for example by checking 
> PyType_Ready(Py_TYPE(op)).

+1.

Yury


Re: [Python-Dev] Why not using "except: (...) raise" to cleanup on error?

2018-06-04 Thread Yury Selivanov
On Mon, Jun 4, 2018 at 3:38 PM Victor Stinner  wrote:
>
> 2018-06-04 18:45 GMT+02:00 Guido van Rossum :
> > It is currently a general convention in asyncio to only catch Exception, not
> > BaseException. I consider this a flaw and we should fix it, but it's
> > unfortunately not so easy -- the tests will fail if you replace all
> > occurrences of Exception with BaseException, and it is not always clear
> > what's the right thing to do. E.g. catching KeyboardInterrupt may actually
> > make it harder to stop a runaway asyncio app.
>
> I recall vaguely something about loop.run_until_complete() which
> didn't behave "as expected" when interrupted by CTRL+c, like the
> following call to loop.run_until_complete() didn't work as expected.
> But this issue has been sorted out, no?

No, the issue is still there.  And it's not an easy fix.

Yury


Re: [Python-Dev] Why not using "except: (...) raise" to cleanup on error?

2018-06-04 Thread Yury Selivanov
On Mon, Jun 4, 2018 at 12:50 PM Chris Angelico  wrote:
>
> On Tue, Jun 5, 2018 at 2:11 AM, Victor Stinner  wrote:
[..]
> > For me, it's fine to catch any exception using "except:" if the block
> > contains "raise", typical pattern to cleanup a resource in case of
> > error. Otherwise, there is a risk of leaking open file or not flushing
> > data on disk, for example.
>
> Pardon the dumb question, but why is try/finally unsuitable?

Because try..finally isn't equivalent to try..except?  Perhaps you
should look at the actual code:
https://github.com/python/cpython/blob/b609e687a076d77bdd687f5e4def85e29a044bfc/Lib/asyncio/base_events.py#L1117-L1123

Yury


Re: [Python-Dev] Why not using "except: (...) raise" to cleanup on error?

2018-06-04 Thread Yury Selivanov
> It is currently a general convention in asyncio to only catch Exception, not 
> BaseException. I consider this a flaw and we should fix it, but it's 
> unfortunately not so easy -- the tests will fail if you replace all 
> occurrences of Exception with BaseException, and it is not always clear 
> what's the right thing to do. E.g. catching KeyboardInterrupt may actually 
> make it harder to stop a runaway asyncio app.

Yes.

Catching BaseExceptions or KeyboardInterrupts in start_tls() would be
pointless. Currently asyncio's internal state isn't properly hardened
to survive a BaseException in all other places it can occur.  Fixing
that is one of my goals for 3.8.

Yury


Re: [Python-Dev] Reminder: Please elaborate commit messages

2018-05-22 Thread Yury Selivanov
On Tue, May 22, 2018 at 8:52 AM Victor Stinner  wrote:

> Usually, I don't open a new bug to fix or enhance a test. So I
> wouldn't say that it's mandatory. It's really on a case by case basis.

> It seems like test_asyncio failures are a hot topic these days :-)
> It's one of the reasons why Python 3.7rc1 has been delayed by 2 days,
> no? :-)

Yes, getting Windows tests stable wasn't easy. I think it's solved now
(thanks to Andrew), but we always welcome any help from other core devs :)

Yury


Re: [Python-Dev] please help triage VSTS failures

2018-05-18 Thread Yury Selivanov
On Fri, May 18, 2018 at 4:15 PM Steve Dower  wrote:
[..]
> The asyncio instability is apparently really hard to fix. There were 2-3
> people looking into it yesterday on one of the other systems, but
> apparently we haven’t solved it yet (my guess is lingering state from a
> previous test). The multissl script was my fault for not realising that we
> don’t use it on 3.6 builds, but that should be fixed already. Close/reopen
> PR is the best way to trigger a rebuild right now.

I asked Andrew Svetlov to help with asyncio CI triage.  Hopefully we'll
resolve most of them early next week.

Yury


Re: [Python-Dev] PEP 572: Assignment Expressions

2018-04-30 Thread Yury Selivanov
On Mon, Apr 30, 2018 at 1:03 PM Chris Angelico  wrote:
> > That's a weird argument, Chris :-)
> >
> > If `f(x)` has no meaningful name, then *what* is the result of the
> > comprehension?  Perhaps some meaningless data? ;)

> f(x) might have side effects. Can you give a meaningful name to the
> trivial helper function?

I don't understand your question. How is `f(x)` having side effects or
not relevant to the discussion? Does ':=' work only with pure
functions?

> Not every trivial helper can actually have a
> name that saves people from having to read the body of the function.

I don't understand this argument either, sorry.

> >> We've been over this argument plenty, and I'm not going to rehash it.
> >
> > Hand-waving the question the way you do simply alienates more core devs
to
> > the PEP.  And PEP 572 hand-waves a lot of questions and concerns.
Asking
> > people to dig for answers in 700+ emails about the PEP is a bit too
much,
> > don't you agree?
> >
> > I think it's PEP's author responsibility to address questions right in
> > their PEP.

> If I answer every question, I make that number into 800+, then 900+,
> then 1000+. If I don't, I'm alienating everyone by being dismissive.
> If every question is answered in the PEP, the document itself becomes
> so long that nobody reads it. Damned if I do, damned if I don't. Got
> any alternative suggestions?

IMO, a big part of why we have hundreds of emails is that people are very
concerned with readability.  The PEP just hand-waves the question
entirely, instead of listing good and realistic examples of code
alongside bad ones.  So that, you know, people could compare them and
understand *both* pros and cons.

Instead we have a few very questionable examples in the PEP that most
people don't like at all. Moreover, half of the PEP is devoted to fixing
comprehension scoping, which is almost an orthogonal problem to adding
new syntax.

So my suggestion remains to continue working on the PEP, improving it and
making it more comprehensive. You're free to ignore this advice, but don't
be surprised that you see new emails about what ':=' does to code
readability (with the same arguments).  PEP 572 proponents answering to
every email with the same dismissive template doesn't help either.

> >> > def do_things(fire_missiles=False, plant_flowers=False): ...
> >> > do_things(plant_flowers:=True) # whoops!
> >
> >> If you want your API to be keyword-only, make it keyword-only. If you
> >
> > Another hand-waving.  Should we deprecate passing arguments by name if
> > their corresponding parameters are not keyword-only?
> >
> > Mark shows another potential confusion between '=' and ':=' that people
> > will have, and it's an interesting one.

> A very rare one compared to the confusions that we already have with
> '=' and '=='. And this is another argument that we've been over,
> multiple times.

How do you know if it's rare or not?  '=' is used to assign, ':=' is used
to assign, '==' is used to compare.  I can easily imagine people being
confused why '=' works for setting an argument, and why ':=' doesn't.
Let's agree to disagree on this one :)

> > Strange. I see people who struggle to format their code properly or use
the
> > language properly *every day* ;)

> And do they blame the language for having a comparison operator that
> is so easy to type? Or do they fix their bugs and move on? Again,
> language syntax is not the solution to bugs.

I'm not sure how to correlate what I was saying with your reply, sorry.

Anyways, Chris, I think that the PEP hand-waves a lot of questions and
doesn't have a comprehensive analysis of how the PEP will affect syntax and
readability. It's up to you to consider taking my advice or not. I'll try
to (again) restrain myself posting about this topic.

Y


Re: [Python-Dev] PEP 572: Assignment Expressions

2018-04-30 Thread Yury Selivanov
On Mon, Apr 30, 2018 at 11:32 AM Chris Angelico  wrote:

> On Tue, May 1, 2018 at 12:30 AM, Mark Shannon  wrote:
> > List comprehensions
> > ---
> > The PEP uses the term "simplifying" when it really means "shortening".
> > One example is
> > stuff = [[y := f(x), x/y] for x in range(5)]
> > as a simplification of
> > stuff = [(lambda y: [y,x/y])(f(x)) for x in range(5)]

> Now try to craft the equivalent that captures the condition in an if:

> results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

Easy:

    results = []
    for x in input_data:
        y = f(x)
        if y > 0:
            results.append((x, y, x/y))

Longer, but way more readable and debuggable if you're into that.  This has
worked for us many years and only a handful of people complained about
this.

OTOH, I see plenty of people complaining that nested list comprehensions
are hard to read.  In my own code reviews I ask people to avoid using
complex comprehensions all the time.

> Do that one with a lambda function.

Why would I?  Is using lambda functions mandatory?


> > IMO, the "simplest" form of the above is the named helper function.
> >
> > def meaningful_name(x):
> > t = f(x)
> > return t, x/t
> >
> > [meaningful_name(i) for i in range(5)]
> >
> > Is longer, but much simpler to understand.

> Okay, but what if there is no meaningful name? It's easy to say "pick
> a meaningful name". It's much harder to come up with an actual name
> that is sufficiently meaningful that a reader need not go look at the
> definition of the function.

That's a weird argument, Chris :-)

If `f(x)` has no meaningful name, then *what* is the result of the
comprehension?  Perhaps some meaningless data? ;)


> > I am also concerned that the ability to put assignments anywhere
> > allows weirdnesses like these:
> >
> > try:
> > ...
> > except (x := Exception) as x:
> > ...
> >
> > with (x: = open(...)) as x:
> > ...

> We've been over this argument plenty, and I'm not going to rehash it.

Hand-waving the question the way you do simply alienates more core devs
from the PEP.  And PEP 572 hand-waves a lot of questions and concerns.  Asking
people to dig for answers in 700+ emails about the PEP is a bit too much,
don't you agree?

I think it's PEP's author responsibility to address questions right in
their PEP.


> > def do_things(fire_missiles=False, plant_flowers=False): ...
> > do_things(plant_flowers:=True) # whoops!

> If you want your API to be keyword-only, make it keyword-only. If you

Another hand-waving.  Should we deprecate passing arguments by name if
their corresponding parameters are not keyword-only?

Mark shows another potential confusion between '=' and ':=' that people
will have, and it's an interesting one.

> want a linter that recognizes unused variables, get a linter that
> recognizes unused variables.

Many want Python to be readable and writeable without linters.

> Neither of these is the fault of the
> proposed syntax; you could just as easily write this:

> do_things(plant_flowers==True)

> but we don't see myriad reports of people typing too many characters
> and blaming the language.

Strange. I see people who struggle to format their code properly or use the
language properly *every day* ;)

Yury


Re: [Python-Dev] Review of Pull Request 5974 please

2018-04-29 Thread Yury Selivanov
Reviewed. This seems to be an omission that needs to be fixed, thanks for
the PR! Almost good to go in 3.8. As for 3.7, since this isn't a bug fix
it's up to Ned whether he wants to accept it.
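For context, a sketch of the behaviour under discussion; `mock_open` and
`read_data` are real `unittest.mock` APIs, and iterating over the mocked
handle is exactly the part the PR adds, so the last line only holds on
versions that include the fix:

```python
from unittest import mock

m = mock.mock_open(read_data="line1\nline2\nline3\n")

with m() as f:
    via_readlines = f.readlines()   # has always used read_data

with m() as f:
    via_iteration = [line for line in f]   # the case bpo-32933 fixes

assert via_readlines == ["line1\n", "line2\n", "line3\n"]
assert via_iteration == via_readlines   # holds once the fix is in
```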

Yury
On Sun, Apr 29, 2018 at 8:02 AM Anthony Flury via Python-Dev <
python-dev@python.org> wrote:

> All,

> Can someone please review Pull Request 5974
>  on Python3.8 - the Pull
> request was submitted on 4th March - this pull request is associated
> with bpo-32933 

> To summarize the point of this pull request:

> It fixes a bug of omission within mock_open
> <
https://docs.python.org/3/library/unittest.mock.html?highlight=mock_open#unittest.mock.mock_open

> (part of unittest.mock)

> The functionality of mock_open enables the test code to mock a file
> being opened with some data which can be read. Importantly, mock_open
> has a read_data attribute which can be used to specify the data to read
> from the file.

> The mocked file which is opened correctly supports file.read(),
> file.readlines(), file.readline(). These all make use of the read_data
> as expected, and the mocked file also supports being opened as a context
> manager.

> But the mock_open file does not support iteration  - so pythonic code
> which uses a for loop to iterate around the file content will only ever
> appear to iterate around an empty file, regardless of the read_data
> attribute when the mock_open is created

> So non-pythonic methods to iterate around the file contents - such as
> this :

>   data = opened_file.readlines()
>   for line in data:
>   process_line(line)

> and this :

>  line = opened_file.readline()
>  while line:
>  process_line(line)
>  line = opened_file.readline()

> Can both be tested with the mocked file containing simulated data (using
> the read_data attribute) as expected.

> But this code (which by any standard is the 'correct' way to iterate
> around the file content of a text file):

>   for line in opened_file:
>   process_line(line)

> Will only ever appear to iterate around an empty file when tested using
> mock_open.

> I would like this to be reviewed so it can be back-ported into Python3.7
> and 3.6 if at all possible. I know that the bug has existed since the
> original version of mock_open, but it does seem strange that code under
> test which uses a pythonic code structure can't be tested fully
> using the standard library.

> --
> Anthony Flury
> email : *anthony.fl...@btinternet.com*
> Twitter : *@TonyFlury *




-- 
  Yury


Re: [Python-Dev] (name := expression) doesn't fit the narrative of PEP 20

2018-04-25 Thread Yury Selivanov
On Wed, Apr 25, 2018 at 8:22 PM Chris Angelico  wrote:
[..]
> >   my_func(arg, buffer=(buf := [None]*get_size()), size=len(buf))
> >
> > To my eye this is an anti-pattern.  One line of code was saved, but the
> > other line becomes less readable.  The fact that 'buf' can be used after
> > that line means that it will be harder for a reader to trace the origin
of
> > the variable, as a top-level "buf = " statement would be more visible.

> Making 'buf' more visible is ONLY a virtue if it's going to be used
> elsewhere. Otherwise, the name 'buf' is an implementation detail of
> the fact that this function wants both a buffer and a size. Should you
> want to expand this out over more lines, you could do this:

Chris, you didn't read that paragraph in my email to the end or I did a
poor job at writing it.

My point is that "buf" can still be used below that line, and therefore
sometimes it will be used, as a result of quick refactoring or poor coding
style.  It's just how things happen when you write code: it gets rewritten
and parts of it are left outdated or not properly revised.  *If* "buf" is used
below that line it *will* be harder to find where it was initially set.

Anyways, I don't want to distract everyone further so I'm not interested
in continuing the discussion about what is readable and what is not.
My own opinion on this topic is unlikely to change.  I wanted to explain
my -1; hopefully it will be noted.

Yury


Re: [Python-Dev] (name := expression) doesn't fit the narrative of PEP 20

2018-04-25 Thread Yury Selivanov
On Wed, Apr 25, 2018 at 5:58 PM Guido van Rossum  wrote:
[..]
> It was meant dismissive. With Chris, I am tired of every core dev
starting their own thread about how PEP 572 threatens readability or
doesn't reach the bar for new syntax (etc.). These arguments are entirely
emotional and subjective.

FWIW I started my thread for allowing '=' in expressions to make sure that
we fully explore that path.  I don't like ':=' and I thought that using '='
can make the idea more appealing to myself and others. It didn't, sorry if
it caused any distraction. Although adding a new ':=' operator isn't my main
concern.

I think it's a fact that PEP 572 makes Python more complex.
Teaching/learning Python will inevitably become harder, simply because
there's one more concept to learn.

Just yesterday this snippet was used on python-dev to show how great the
new syntax is:

  my_func(arg, buffer=(buf := [None]*get_size()), size=len(buf))

To my eye this is an anti-pattern.  One line of code was saved, but the
other line becomes less readable.  The fact that 'buf' can be used after
that line means that it will be harder for a reader to trace the origin of
the variable, as a top-level "buf = " statement would be more visible.
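Both spellings side by side, with stub definitions; `my_func` and
`get_size` are hypothetical names carried over from the thread's example,
and the `:=` form requires Python 3.8+:

```python
def get_size():
    return 3

calls = []

def my_func(arg, buffer=None, size=0):
    calls.append((arg, buffer, size))

# One-liner with the assignment expression (keyword arguments are
# evaluated left to right, so buf exists by the time size is computed):
my_func("arg", buffer=(buf := [None] * get_size()), size=len(buf))

# The two-statement form argued for in this email:
buf = [None] * get_size()
my_func("arg", buffer=buf, size=len(buf))

assert calls[0] == calls[1] == ("arg", [None, None, None], 3)
```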

The PEP lists this example as an improvement:

  [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

I'm an experienced Python developer and I can't read/understand this
expression after one read. I have to read it 2-3 times before I trace where
'y' is set and how it's used.  Yes, an expanded form would be ~4 lines
long, but it would be simple to read and therefore review, maintain, and
update.
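The expanded form described above, next to the PEP's one-liner; `f` and
`input_data` are stand-ins for the example's names, and the one-liner
needs Python 3.8+:

```python
def f(x):
    return x - 2

input_data = [1, 2, 3, 4, 5]

# The PEP's version:
results = [(x, y, x / y) for x in input_data if (y := f(x)) > 0]

# The ~4-line expansion:
expanded = []
for x in input_data:
    y = f(x)
    if y > 0:
        expanded.append((x, y, x / y))

assert results == expanded
```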

Assignment expressions seem to optimize the *writing code* part, while
making *reading* part of the job harder for some of us.  I write a lot of
Python, but I read more code than I write. If the PEP gets accepted I'll
use
the new syntax sparingly, sure.  My main concern, though, is that this PEP
will likely make my job as a code maintainer harder in the end, not easier.

I hope I explained my -1 on the PEP without sounding emotional.

Thank you,
Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 12:03 PM, Ethan Furman  wrote:
[..]
> But I do write this:
>
>   def wrapper(func, some_value):
> value_I_want = process(some_value)
> def wrapped(*args, **kwds):
>   if value_I_want == 42:
>  ...

But this pattern is more rare than comparing local variables. That's
the point I'm trying to make.  Besides, to make it an assignment
expression under my proposal you would need to use parens. Which makes
it even less likely that you confuse '=' and '=='.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 11:58 AM, Chris Angelico <ros...@gmail.com> wrote:
> On Wed, Apr 25, 2018 at 1:49 AM, Yury Selivanov <yselivanov...@gmail.com> 
> wrote:
>> On Tue, Apr 24, 2018 at 11:34 AM, Steven D'Aprano <st...@pearwood.info> 
>> wrote:
[..]
>>> There's no advantage to using binding-expressions unless you're going to
>>> re-use the name you just defined, and that re-use will give you a hint
>>> as to what is happening:
>>>
>>> my_func(arg, buffer=(buf := [None]*get_size()), size=len(buf))
>>
>> Again, this is very subjective, but this code would fail my code review :)
>>
>> Don't you find
>>
>>   buf = [None] * get_size()
>>   my_func(arg, buffer=buf, size=len(buf))
>>
>> to be more readable?
>
> Only if 'buf' is going to be used elsewhere. I'd be looking down below
> for some other use of 'buf'. Technically the same could be true of the
> inline assignment, but it makes more sense for a "this statement only"
> name binding to be within that statement, not broken out and placed
> above it as another operation at equal importance.

Well, you can use empty lines to visually indicate that 'buf' is
related to the call.

Moreover, 'buf' is still available to the code below that call and will
sometimes be used there. You can't tell for sure until you glance over
the entire file/function. PEP 572 does not implement any sort of
sub-scoping.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 11:51 AM, Ethan Furman  wrote:

>> When I compare to variables from outer scopes they *usually* are on
>> the *right* side of '=='.
>
>
> You mean something like
>
>   if 2 == x:
>
> ?  I never write code like that, and I haven't seen it, either.

Hm. I mean this:

    const = 'something'

    def foo(arg):
        if arg == const:
            ...  # do something

Note that "const" is on the right side of "==".

Would you write this as

    def foo(arg):
        if const == arg:

? ;)

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 11:34 AM, Steven D'Aprano <st...@pearwood.info> wrote:
> On Tue, Apr 24, 2018 at 11:05:57AM -0400, Yury Selivanov wrote:
>
>> Well, `my_func(a=(b:=foo))` or `my_func(b:=foo)` are also barely
>> readable to my eye.
>
> There's no advantage to using binding-expressions unless you're going to
> re-use the name you just defined, and that re-use will give you a hint
> as to what is happening:
>
> my_func(arg, buffer=(buf := [None]*get_size()), size=len(buf))

Again, this is very subjective, but this code would fail my code review :)

Don't you find

  buf = [None] * get_size()
  my_func(arg, buffer=buf, size=len(buf))

to be more readable?

IMHO this example is why we shouldn't implement any form of assignment
expressions in Python :)

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 11:28 AM, Chris Angelico  wrote:

> On re-thinking this, I think the distinction IS possible, but (a) only
> in function/class scope, not at global; and (b) would be defined in
> terms of lexical position, not run-time. For instance:
>
> def f():
> (a = 1) # Legal; 'a' has not been used yet
> a = 2 # doesn't change that
>
> def f(a):
> (a = 1) # Invalid - 'a' has been used already
>
> def f():
> while (a = get_next()): # Legal
> ...

Now *this* is a weird rule. Moving functions around files would become
impossible.

Please experiment with my reference implementation; it already
implements my proposal in full. Loops and inline assignments work as
expected in it.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 11:27 AM, Steven D'Aprano <st...@pearwood.info> wrote:
> On Tue, Apr 24, 2018 at 11:03:35AM -0400, Yury Selivanov wrote:
>
>> My point was that when you see lots of '=' and ':=' used at the
>> statement level, one might try to write "if x = 1" instead of "if x :=
>> 1" -- boom, we have an unexpected SyntaxError for some users.
>
> That's a *good* thing. They will then learn not to write x = 1 as an
> expression.
>
> Also, if I write lots of x := 1 binding-expressions as statements, my
> code is bad and deserves to fail code-review. But why would I write the
> extra colon (one character, two key-presses) to use
>
> x := 1
>
> as a statement, when x = 1 will work? That's a sure sign that I don't
> know what I'm doing. (Or that I desperately wish I was writing Pascal.)

In JavaScript there's a new backtick syntax for strings (template
literals), their variant of f-strings.  I'm seeing a lot of JS coders
that use backticks everywhere, regardless of whether there's formatting
in them or not.  The result is that some JS code in popular libraries
now has *three* different string literal syntaxes separated by one line
of code. It looks weird. I expect to see something similar in Python
code if we adopt ':='.  I don't think the language will benefit from this.

FWIW I'm fine with keeping the status quo and not adding new syntax at all.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 11:31 AM, Nick Coghlan  wrote:
> On 25 April 2018 at 00:54, Eric Snow  wrote:
>> Regardless, your 3 rules would benefit either syntax.  Nick may have a
>> point that the rules might be an excessive burden, but I don't think
>> it's too big a deal since the restrictions are few (and align with the
>> most likely usage) and are limited to syntax so the compiler will be
>> quick to point mistakes.
>
> I think the "single name target only" rule should be in place no
> matter the syntax for the name binding operator itself.
>
> I don't mind too much either way on the mandatory parentheses question
> (it's certainly an easy option to actively discourage use of binding
> expressions as a direct alternative to assignment statements, but as
> with the single-name-only rule, it's independent of the choice of
> syntax)

Mandatory parentheses around `(name := expr)` would at least solve the
problem of users mixing up '=' and ':=' in statements.

>
> I *do* think the "no name rebinding except in a while loop header"
> restriction would be annoying for the if/elif use case and the while
> use case:
>
> while (item = get_item()) is not first_delimiter:
> # First processing loop
> while (item = get_item()) is not second_delimiter:
> # Second processing loop
> # etc...
>
> if (target = get_first_candidate()) is not None:
> ...
> elif (target = get_second_candidate()) is not None:
> ...
> elif (target = get_third_candidate()) is not None:
> ...

Yes, it would force users to come up with better names *iff* they want
to use this new sugar:

  if (first_target = get_first_candidate()) ...
  elif (second_target = get_second_candidate()) ...

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 11:15 AM, Steven D'Aprano  wrote:
[..]

>> >> 3. Most importantly: it is *not* allowed to mask names in the current
>> >> local scope.
>
> That means you can't rebind existing variables. That means you can't
> rebind to the same variable in a loop.

No, it doesn't. The check is performed at compile time, and
Python does not unroll loops. Anyway, read below.

> I believe that one of the most important use-cases for binding-
> expression syntax is while loops, like this modified example taken from
> PEP 572 version 3:
>
> while (data = sock.read()):
> print("Received data:", data)
>
> If you prohibit re-binding data, that prohibits cases like this, or even
> using it inside a loop:
>
> for value in sequence:
> process(arg, (item = expression), item+1)

No it doesn't. symtable in Python works differently. I encourage you
to test my reference implementation:

py> for val in [1, 2, 3]:
...   print((item=val), item+1)
...
1 2
2 3
3 4
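For comparison, the `:=` spelling from PEP 572 (valid on Python 3.8+, and
unlike the `(item=val)` form above, real syntax today) also rebinds the
same name on every iteration:

```python
# Rebinding `item` on each loop iteration with an assignment expression:
out = []
for val in [1, 2, 3]:
    out.append(((item := val), item + 1))

assert out == [(1, 2), (2, 3), (3, 4)]
```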

> Why is this allowed?
>
> x = 1  # both are statement forms
> x = 2
>
> but this is prohibited?
>
> x = 1
> (x = 2)  # no rebinding is allowed
>
> and even more confusing, this is allowed!
>
> (x = 1)  # x doesn't exist yet, so it is allowed
> x = 2  # statement assignment is allowed to rebind

These are all very limited code snippets that you're unlikely to see
in real code.  I can write (and I did in this thread) a bunch of
examples of where PEP 572 is also inconsistent.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 11:07 AM, Chris Angelico  wrote:
[..]

> x = 1
> if (x = 2): ...
>
> This, according to your proposal, raises SyntaxError - not because a
> comparison was wanted and an assignment was made, but because the name
> already had a value. And, even worse, this is NOT an error:

Yes, because I'm trying to think about this from a pragmatic side of
things. My question to myself: "what syntax could I use that would
prevent me from making the '=' vs '==' mistake when I code?"  To me, the
answer is that I usually want to compare local variables.

When I compare against variables from outer scopes, they are *usually*
on the *right* side of '=='.

>
> x = 1
> def f():
> if (x = 2):
> ...
>
> That's a bizarre distinction.

Chris, FWIW I'm trying to avoid using 'bizarre', 'arcane' etc with
regards to PEP 572 or any proposal, really. For example, I,
personally, find ':=' bizarre, but it's subjective and it's
unproductive to say that.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 10:54 AM, Anthony Flury via Python-Dev
 wrote:
[..]
> As discussed previously by others on this exact proposals, you now have the
> issue of  confusion when using keyword arguments : *my_func(a = b)* :
> clearly that is a call to `my_func' where argument a has the value of b, but
> if you want to do an assigment expression when calling the function you now
> have to do *my_func((a=b)) -* which frankly looks messy in my opinion; you
> get the same issue when you are wanting to do assignment expressions in
> tuples.

Well, `my_func(a=(b:=foo))` or `my_func(b:=foo)` are also barely
readable to my eye.  My expectation is that users won't use any form
of assignment expression in function calls; it's painful with both
proposals.
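As a hypothetical aside (the ':=' spelling only landed later, in Python 3.8), the form PEP 572 ultimately adopted separates the two cases like this:

```python
def my_func(a):
    return a

b = 10

# Keyword argument: binds the parameter 'a' inside my_func; no local
# name is created at the call site.
r1 = my_func(a=b)

# Assignment expression (PEP 572 spelling, Python 3.8+): binds a local
# name 'a' at the call site *and* passes the value positionally.
r2 = my_func(a := b * 2)

print(r1, a, r2)  # prints: 10 20 20
```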

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 10:49 AM, Paul Moore  wrote:
[..]
>>> 3. Most importantly: it is *not* allowed to mask names in the current
>>> local scope.
>>
>> While I agree this would be unambiguous to a computer, I think for
>> most humans it would be experienced as a confusing set of arcane and
>> arbitrary rules about what "=" means in Python.
>
> Also, there's the ambiguity and potential for misreading in the
> opposite direction (accidentally *reading* = as == even though it
> isn't):
>
> if (diff = x - x_base) and (g = gcd(diff, n)) > 1:
>  return g

Since 'diff' and 'g' must be new names according to rule (3), those
who read the code will notice that both were not previously bound.
Therefore both are new variables, so it can't be a comparison.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 10:56 AM, Chris Angelico  wrote:
[..]
>> A lot of other questions arise though.  PEP 572 proposes:
>>
>> a = 1  # assignment
>> a := 1  # also assignment
>> (a := 1)  # also assignment
>> (a = 1)  # error, why?
>
> Your third example is just the same as the second, with parentheses
> around it. In most of Python, parentheses (if legal) have no effect
> other than grouping; "a + b * c" is the same thing as "(a + b) * c",
> just done in the other order. The last one is a clear demonstration
> that "=" is a statement, not an expression. Are people confused by
> this sort of thing:
>
> if x > 1:
> print("x is more than 1")
> (if x > 1:)
> print("SyntaxError")

This is a very far-fetched example :)

My point was that when lots of '=' and ':=' are used at the
statement level, one might try to write "if x = 1" instead of "if x :=
1" -- boom, we have an unexpected SyntaxError for some users.

In my opinion, adding *any* assignment expression syntax to Python
*will* create this sort of issue.  PEP 572 isn't free of them, and my
proposal isn't free of them.  My proposal avoids adding a new ':='
operator at the cost of slightly complicating the rules around '='.  PEP
572 avoids complicating '=', but adds an entirely new form of
assignment.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 10:07 AM, Nick Coghlan  wrote:

>> "=" is always an assignment.
>> "==" is always an equality check.
>
> That's not the distinction I meant, I meant the difficulty of
> explaining the discrepancies in this list:
>
> a = 1 # Assignment
> (a = 1) # Also assignment
>
> a, b = 1, 2 # Tuple assignment
> (a, b = 1, 2) # SyntaxError. Why?
>
> ...
> Whereas if binding expressions use a different symbol, the question is
> far less likely to arise, and if it does come up, then the answer is
> the same as the one for def statements vs lambda expressions: because
> one is a statement, and the other is an expression.

A lot of other questions arise though.  PEP 572 proposes:

a = 1  # assignment
a := 1  # also assignment
(a := 1)  # also assignment
(a = 1)  # error, why?

It's also difficult to explain which one to use when.  The net result
is that code will be littered with both at random places.  That will
decrease the readability of Python code at least for some users who
have similar taste to myself.

With '=' in expressions, the code will look uniform.  There will be a
simple rule to put parens around assignments in expression and use
simple names.  After one or two descriptive SyntaxError users will
learn how this syntax works (like people learn everything in coding).

This all is very subjective.

Yury


Re: [Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
On Tue, Apr 24, 2018 at 9:46 AM, Nick Coghlan <ncogh...@gmail.com> wrote:
> On 24 April 2018 at 23:38, Yury Selivanov <yselivanov...@gmail.com> wrote:
>> I propose to use the following syntax for assignment expressions:
>>
>> ( NAME = expr )
>>
>> I know that it was proposed before and this idea was rejected, because
>> accidentally using '=' in place of '==' is a pain point in
>> C/C++/JavaScript.
>>
>> That said, I believe we can still use this syntax as long as we impose
>> the following three restrictions on it:
>>
>> 1. Only NAME token is allowed as a single target.
>>
>> 2. Parentheses are required.
>>
>> 3. Most importantly: it is *not* allowed to mask names in the current
>> local scope.
>
> While I agree this would be unambiguous to a computer, I think for
> most humans it would be experienced as a confusing set of arcane and
> arbitrary rules about what "=" means in Python.

I respectfully disagree.  There are no "arcane and confusing rules"
about "="; it's rather simple:

"=" is always an assignment.
"==" is always an equality check.

Having two assignment operators feels way more arcane to me,
especially in Python, which is guided by the "there should be one way" Zen.

Yury


[Python-Dev] assignment expressions: an alternative proposal

2018-04-24 Thread Yury Selivanov
I propose to use the following syntax for assignment expressions:

( NAME = expr )

I know that it was proposed before and this idea was rejected, because
accidentally using '=' in place of '==' is a pain point in
C/C++/JavaScript.

That said, I believe we can still use this syntax as long as we impose
the following three restrictions on it:

1. Only NAME token is allowed as a single target.

2. Parentheses are required.

3. Most importantly: it is *not* allowed to mask names in the current
local scope.


Let's see how each restriction affects the syntax in detail:

(1) NAME tokens only:

  if (a[1] = value)    # SyntaxError
  if (a.attr = value)  # SyntaxError

(2) Required parens disambiguate the new syntax from keyword
arguments and prevent using '=' in place of '==':

   if a = value             # SyntaxError
   if expr and a = value    # SyntaxError

(3) No masking of existing names in local scope makes using '=' in
place of '==' by mistake even less probable:

   flag = get_flag()
   ...
   if (flag = 'win')     # SyntaxError

   # or

   def foo(value):
       if (value = 1)    # SyntaxError

   # or

   py> (c = 1) and (c = 2)   # SyntaxError

   # etc


The following code snippets are perfectly valid though:

py> a = (b = (c = 3))
py> a, b, c
(3, 3, 3)

  # and

py> f = lambda x: x * 10
py> [[(y = f(x)), x/y] for x in range(1,5)]
[[10, 0.1], [20, 0.1], [30, 0.1], [40, 0.1]]

  # and

  def read():
      while (command = input("> ")) != "quit":
          print('you entered', command)

  # and

py> if (match = re.search(r'wor\w+', 'hello world')):
...     print(match)


  # and

 if (diff = x - x_base) and (g = gcd(diff, n)) > 1:
     return g


Enabling '=' for assignment expressions introduces a limited form of
the more general assignment statement. It is designed to be useful in
expressions and is deliberately simple to make it hard for users to
shoot themselves in the foot. The required Python grammar changes are simple and
unambiguous.

Although it is still possible to accidentally mask a global name or a
name from an outer scope, the risk of that is significantly lower than
masking a local name.  IDEs and linters can improve the usability
further by highlighting invalid or suspicious assignment expressions.

I believe that this syntax is the best of both worlds: it allows
writing succinct code just like PEP 572, but without introducing a new
':=' operator.

An implementation of this proposal is available here:
https://github.com/1st1/cpython/tree/assign.

If this idea is deemed viable I will write a PEP detailing the
grammar/compiler changes and syntax restrictions.

Thanks,
Yury


Re: [Python-Dev] PEP 572: Assignment Expressions

2018-04-17 Thread Yury Selivanov
Hi Chris,

Thank you for working on this PEP!  Inline assignments is a long
requested feature and this seems to be the first serious attempt
at adding it.

That said I'm very -1 on the idea.


1. I switch between Python / JavaScript / C frequently (although
I code in Python 70% of my time.)  C and JS have inline
assignments but I don't find myself using them often.

JavaScript has inline assignments and they are useful to get the
match object after applying a regex. You use the same example
in your PEP.  But in my experience, this is the only common pattern
in JavaScript.  I don't see people using inline assignments for
anything else, at least it's not a common pattern.

C is low-level and has no exceptions. It uses function return values
to signal if there was an error or not. It's a popular pattern to call
a function from an 'if' statement like this: "if ((ret = func()))" to
save a line of code.  If we ignore this particular pattern, we see that
inline assignment isn't used that often.

In your PEP you use comprehensions and regex match object to
show how inline assignment can simplify the code. In my experience,
comprehensions that are a little more complex than
"(f(x) for x in something)" are always better being rewritten to an
expanded form.  I don't find "stuff = [[y := f(x), x/y] for x in range(5)]"
very readable, and yes, I think that the simple expanded version
of this comprehension is better.

Using inline assignments in "while" statements is neat, but how
often do we use "while" statements?
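As a hedged aside, the `while` pattern mentioned here is exactly where the syntax PEP 572 eventually shipped (`:=`, Python 3.8+) ended up being most useful; a minimal sketch:

```python
import io

# Stand-in for a file or socket; readline() returns '' at EOF.
stream = io.StringIO("line1\nline2\n")

# Bind and test in one step: the loop ends when readline() returns ''.
while (line := stream.readline()):
    print(line.rstrip())
```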


2. We all try to follow the Python zen when we are designing new
language features.  With the exception of string formatting, Python
follows the "There should be one-- and preferably only one --
obvious way to do it." line.  Your PEP, in my personal opinion,
goes against this one, and also a few other lines.


I simply don't see a very compelling use case to have two forms
of assignment in Python.  It does complicate the grammar by adding
a new operator, it invites people to write more complex code, and it
has only a couple of good use cases.


Yury


Re: [Python-Dev] How is the GitHub workflow working for people?

2018-02-21 Thread Yury Selivanov
On Wed, Feb 21, 2018 at 10:27 PM, Guido van Rossum  wrote:
[..]
> I honestly expect that running either with close-to-default flags on stdlib
> code would be a nightmare, and I wouldn't want *any* directives for either
> one to appear in stdlib code, ever.

It would be great to enable the linter on a per-module basis then.
For instance, I believe that all files in the asyncio package pass
flake8 with default flags (at least I'm doing my best to keep it that
way).  Sometimes it takes an extra review round to fix the code style;
having CI enforce it would save time for everybody.

Something similar to "cpython/.github/CODEOWNERS" but for enabling
linters would work.

Yury


Re: [Python-Dev] How is the GitHub workflow working for people?

2018-02-21 Thread Yury Selivanov
FWIW I'm extremely happy with the current workflow. The recent
improvements to @miss-islington (kudos to Mariatta!) allowing her to
auto-backport PRs and commit them is a big time saver.

I can only suggest a couple improvements:

1. Make our bots check the code style -- fully enforce PEP 8, lint the
code, and detect trailing whitespace on all lines that a PR modifies.

2. AppVeyor and Travis are a bit slow at times.  Maybe it's possible
to ask them to slightly increase our quotas (again).  Although usually
this isn't a problem and CI is fast enough.

3. It would be great if our buildbots could update the PR at fault
when they detect a regression (I understand that this is a hard
feature to implement...)

Huge thanks to the core-workflow team!

Yury

On Tue, Feb 20, 2018 at 8:58 PM, Brett Cannon  wrote:
> It's been a year and 10 days since we moved to GitHub, so I figured now is
> as good a time as any to ask people if they are generally happy with the
> workflow and if there is a particular sticking point to please bring it up
> on the core-workflow mailing list so we can potentially address it.
>


Re: [Python-Dev] Dataclasses and correct hashability

2018-02-02 Thread Yury Selivanov
On Fri, Feb 2, 2018 at 10:51 AM, Paul Moore  wrote:
[..]
> To put it another way, using your words above, "The moment you want to
> use a dataclass as a dict key, or put it in a set, you need it to be
> *immutable*" (not hashable, unless you really know what you're doing).

Can someone clarify what is the actual use case of someone *knowingly*
making a mutable collection hashable?  Why can't that advanced user
write their own __hash__ implementation?  It's easy to do so.

For what it's worth, I think this argument is being blindly used to
justify the current questionable design of "dataclass(hash=True)"
being the same as the "dataclass(hash=True, frozen=False)" case.  At least
a few other core developers are concerned with this, but all I see is
"attrs does the same thing".

Eric, in my opinion we shouldn't copy attrs.  It was designed as an
external package with its own backwards-compatibility story.  At some
point it was realized that "attrs(hash=True, frozen=False)" is an
anti-pattern, but it couldn't be removed at that point.  Hence the
warning in the documentation.  We can do better.

We are designing a new API that is going to be hugely popular.  Why
can't we ship it with dangerous options prohibited in 3.7 (it's easy
to do that!) and then enable them in 3.8 when there's an actual clear
use case?
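For reference, the API as it eventually shipped in Python 3.7 renamed the dangerous flag to `unsafe_hash=` and made mutable-and-comparable dataclasses unhashable by default; a quick sketch of the shipped defaults:

```python
from dataclasses import dataclass

@dataclass                # eq=True, frozen=False (the defaults)
class Point:
    x: int

@dataclass(frozen=True)   # frozen + eq: hashable by field values
class FrozenPoint:
    x: int

# A mutable dataclass with eq=True gets __hash__ = None:
try:
    hash(Point(1))
except TypeError:
    print("mutable dataclass with eq=True is unhashable")

# A frozen dataclass hashes by its field values:
print(hash(FrozenPoint(1)) == hash(FrozenPoint(1)))
```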

Yury


Re: [Python-Dev] Is static typing still optional?

2018-01-28 Thread Yury Selivanov
On Mon, Jan 29, 2018 at 1:36 AM Nick Coghlan  wrote:

> [...]
> Currently the answers are:
>
> - A: not hashable
> - B: hashable (by identity) # Wat?
> - C: hashable (by field hash)
> - D: hashable (by identity) # Wat?
> - E: hashable (by field hash)
> - F: hashable (by field hash)
> - G: hashable (by field hash)
> - H: hashable (by field hash)


This is very convoluted.

+1 to make hashability an explicit opt-in.

Yury


Re: [Python-Dev] Merging the implementation of PEP 563

2018-01-25 Thread Yury Selivanov
I looked at the PR and I think the code is fine.

Yury

On Thu, Jan 25, 2018 at 4:39 PM, Victor Stinner
 wrote:
> Hi,
>
> If nobody is available to review your PR, I suggest to push it anyway,
> to get it merged before the feature freeze. The code can be reviewed
> later. Merging it sooner gives more time to test it and spot bugs. It
> also gives more time to fix bugs ;-) Well, at the end, it's up to you.
>
> Victor
>
> 2018-01-25 22:07 GMT+01:00 Lukasz Langa :
>> Hi all,
>> Serhiy looks busy these days. I'd appreciate somebody looking at and
>> hopefully merging https://github.com/python/cpython/pull/4390. Everything
>> there was reviewed by Serhiy except for the latest commit.
>>
>> This should be ready to merge and maybe tweak in the beta stage. I'd like to
>> avoid merging it myself but I'd really hate missing the deadline.
>>
>> - Ł
>>


Re: [Python-Dev] Intention to accept PEP 567 (Context Variables)

2018-01-22 Thread Yury Selivanov
Yay! Thank you, Guido!

Yury

On Mon, Jan 22, 2018 at 6:52 PM, Guido van Rossum  wrote:
> Yury,
>
> I am hereby *accepting* the latest version of PEP 567[1]. Congrats!
>
> --Guido
>
> [1]
> https://github.com/python/peps/commit/a459539920b9b8c8394ef61058e88a076ef8b133#diff-9d0ccdec754459da5f665cc6c6b2cc06
>
> On Fri, Jan 19, 2018 at 9:30 AM, Guido van Rossum  wrote:
>>
>> There has been useful and effective discussion on several of the finer
>> points of PEP 567. I think we've arrived at a solid specification, where
>> every part of the design is well motivated. I plan to accept it on Monday,
>> unless someone brings up something significant that we've overlooked before
>> then. Please don't rehash issues that have already been debated -- we're
>> unlikely to reach a different conclusion upon revisiting the same issue
>> (read the Rejected Ideas section first).
>>
>> --
>> --Guido van Rossum (python.org/~guido)
>
>
>
>
> --
> --Guido van Rossum (python.org/~guido)
>


Re: [Python-Dev] PEP 567 v3

2018-01-17 Thread Yury Selivanov
On Wed, Jan 17, 2018 at 8:53 PM, Yury Selivanov <yselivanov...@gmail.com> wrote:
> On Wed, Jan 17, 2018 at 2:24 PM, Guido van Rossum <gvanros...@gmail.com> 
> wrote:
>> Perhaps you can update the PEP with a summary of the rejected ideas from
>> this thread?
>
> The Rejected Ideas section of the PEP is now updated with the below:

I've added two more subsections to Rejected Ideas:


Make Context a MutableMapping
-----------------------------

Making the ``Context`` class implement the ``abc.MutableMapping``
interface would mean that it is possible to set and unset variables
using ``Context[var] = value`` and ``del Context[var]`` operations.

This proposal was deferred to Python 3.8+ because of the following:

1. If in Python 3.8 it is decided that generators should support
   context variables (see :pep:`550` and :pep:`568`), then ``Context``
   would be transformed into a chain-map of context variables mappings
   (as every generator would have its own mapping).  That would make
   mutation operations like ``Context.__delitem__`` confusing, as
   they would operate only on the topmost mapping of the chain.

2. Having a single way of mutating the context
   (``ContextVar.set()`` and ``ContextVar.reset()`` methods) makes
   the API more straightforward.

   For example, it would be non-obvious why the below code fragment
   does not work as expected::

      var = ContextVar('var')

      ctx = copy_context()
      ctx[var] = 'value'
      print(ctx[var])   # Prints 'value'

      print(var.get())  # Raises a LookupError

   While the following code would work::

      ctx = copy_context()

      def func():
          ctx[var] = 'value'

          # Contrary to the previous example, this would work
          # because 'func()' is running within 'ctx'.
          print(ctx[var])
          print(var.get())

      ctx.run(func)


Have initial values for ContextVars
-----------------------------------

Nathaniel Smith proposed to have a required ``initial_value``
keyword-only argument for the ``ContextVar`` constructor.

The main argument against this proposal is that for some types
there is simply no sensible "initial value" except ``None``.
E.g. consider a web framework that stores the current HTTP
request object in a context variable.  With the current semantics
it is possible to create a context variable without a default value::

    # Framework:
    current_request: ContextVar[Request] = \
        ContextVar('current_request')


    # Later, while handling an HTTP request:
    request: Request = current_request.get()

    # Work with the 'request' object:
    return request.method

Note that in the above example there is no need to check if
``request`` is ``None``.  It is simply expected that the framework
always sets the ``current_request`` variable, or it is a bug (in
which case ``current_request.get()`` would raise a ``LookupError``).

If, however, we had a required initial value, we would have
to guard against ``None`` values explicitly::

    # Framework:
    current_request: ContextVar[Optional[Request]] = \
        ContextVar('current_request', initial_value=None)


    # Later, while handling an HTTP request:
    request: Optional[Request] = current_request.get()

    # Check if the current request object was set:
    if request is None:
        raise RuntimeError

    # Work with the 'request' object:
    return request.method

Moreover, we can loosely compare context variables to regular
Python variables and to ``threading.local()`` objects.  Both
of them raise errors on failed lookups (``NameError`` and
``AttributeError`` respectively).
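These lookup semantics can be exercised directly with the `contextvars` module as released in Python 3.7; a short runnable sketch (`current_request` is the hypothetical framework variable from the text, holding a plain string here for brevity):

```python
from contextvars import ContextVar

current_request: ContextVar[str] = ContextVar('current_request')

# No default was supplied, so a lookup before any set() fails loudly
# instead of silently returning None:
try:
    current_request.get()
except LookupError:
    print("no request set yet")

current_request.set("GET /index")
print(current_request.get())
```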

Yury


Re: [Python-Dev] PEP 567 v3

2018-01-17 Thread Yury Selivanov
On Wed, Jan 17, 2018 at 2:24 PM, Guido van Rossum  wrote:
> Perhaps you can update the PEP with a summary of the rejected ideas from
> this thread?

The Rejected Ideas section of the PEP is now updated with the below:

Token.reset() instead of ContextVar.reset()
-------------------------------------------

Nathaniel Smith suggested to implement the ``ContextVar.reset()``
method directly on the ``Token`` class, so instead of::

token = var.set(value)
# ...
var.reset(token)

we would write::

token = var.set(value)
# ...
token.reset()

Having ``Token.reset()`` would make it impossible for a user to
attempt to reset a variable with a token object created by another
variable.

This proposal was rejected because ``ContextVar.reset()`` makes it
clearer to the human reader of the code which variable is being
reset.
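A runnable sketch of the retained `ContextVar.reset()` pairing, using the `contextvars` module as released in Python 3.7:

```python
from contextvars import ContextVar

var = ContextVar('var', default='spam')

token = var.set('ham')
print(var.get())      # 'ham'

# reset() restores whatever value was in effect before set():
var.reset(token)
print(var.get())      # 'spam'
```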


Make Context objects picklable
------------------------------

Proposed by Antoine Pitrou, this could enable transparent
cross-process use of ``Context`` objects, so the
`Offloading execution to other threads`_ example would work with
a ``ProcessPoolExecutor`` too.

Enabling this is problematic because of the following reasons:

1. ``ContextVar`` objects do not have ``__module__`` and
   ``__qualname__`` attributes, making straightforward pickling
   of ``Context`` objects impossible.  This is solvable by modifying
   the API to either auto detect the module where a context variable
   is defined, or by adding a new keyword-only "module" parameter
   to ``ContextVar`` constructor.

2. Not all context variables refer to picklable objects.  Making a
   ``ContextVar`` picklable must be an opt-in.

Given the time frame of the Python 3.7 release schedule it was decided
to defer this proposal to Python 3.8.


Yury


Re: [Python-Dev] PEP 567 v3

2018-01-17 Thread Yury Selivanov
On Wed, Jan 17, 2018 at 6:03 AM, Antoine Pitrou <solip...@pitrou.net> wrote:
> On Tue, 16 Jan 2018 17:18:06 -0800
> Nathaniel Smith <n...@pobox.com> wrote:
>> On Tue, Jan 16, 2018 at 5:06 PM, Yury Selivanov <yselivanov...@gmail.com> 
>> wrote:
>> >
>> > I think it would be a very fragile thing in practice: if you have even
>> > one variable in the context that isn't pickleable, your code that uses
>> > a ProcessPool would stop working.  I would defer Context pickleability
>> > to 3.8+.
>>
>> There's also a more fundamental problem: you need some way to match up
>> the ContextVar objects across the two processes, and right now they
>> don't have an attached __module__ or __qualname__.
>
> They have a name, though.  So perhaps the name could serve as a unique
> identifier?  Instead of being serialized as a bunch of ContextVars, the
> Context would then be serialized as a {name: value} dict.

One of the points of the ContextVar design is to avoid requiring
unique identifiers. Names can clash, which leads to data being lost.
If you prohibit them from clashing, then if libraries A and B
happen to use the same context variable name you can't use them both
in your project.  And without enforcing name uniqueness, your
approach of serializing the context as a dict with string keys won't work.

I like Nathaniel's idea to explicitly enable ContextVars pickling
support on a per-var basis.  Unfortunately we don't have time to
seriously consider and debate (and implement!) this idea in time
before the 3.7 freeze.

In the meanwhile, given that Context objects are fully introspectable,
users can implement their own ad-hoc solutions for serializers or
cross-process execution.
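Such an ad-hoc solution is straightforward because `Context` implements the `Mapping` interface; a hedged sketch (this serializes values by variable *name*, so it inherits the name-clash caveat described above):

```python
import pickle
from contextvars import ContextVar, copy_context

user = ContextVar('user')
user.set('alice')

# Context maps ContextVar objects to values; snapshot it by name so
# the payload contains only picklable strings and values.
snapshot = {var.name: value for var, value in copy_context().items()}

restored = pickle.loads(pickle.dumps(snapshot))
print(restored['user'])
```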

Yury


Re: [Python-Dev] PEP 567 v3

2018-01-16 Thread Yury Selivanov
On Tue, Jan 16, 2018 at 8:27 PM, Nathaniel Smith  wrote:
[..]
> token = cvar.set(...)
> token.reset()

I see the point, but I think that having the 'reset' method defined on
the ContextVar class is easier to grasp.  It also feels natural that a
pair of set/reset methods is defined on the same class.  This is
highly subjective though, so let's see which option Guido likes more.

Yury


Re: [Python-Dev] PEP 567 v3

2018-01-16 Thread Yury Selivanov
On Tue, Jan 16, 2018 at 7:45 PM, Guido van Rossum <gu...@python.org> wrote:
> On Tue, Jan 16, 2018 at 4:37 PM, Antoine Pitrou <solip...@pitrou.net> wrote:
>>
>> On Tue, 16 Jan 2018 17:44:14 -0500
>> Yury Selivanov <yselivanov...@gmail.com> wrote:
>>
>> > Offloading execution to other threads
>> > -------------------------------------
>> >
>> > It is possible to run code in a separate OS thread using a copy
>> > of the current thread context::
>> >
>> > executor = ThreadPoolExecutor()
>> > current_context = contextvars.copy_context()
>> >
>> > executor.submit(
>> > lambda: current_context.run(some_function))
>>
>> Does it also support offloading to a separate process (using
>> ProcessPoolExecutor in the example above)?  This would require the
>> Context to support pickling.
>
>
> I don't think that's a requirement. The transparency between the two
> different types of executor is mostly misleading anyway -- it's like the old
> RPC transparency problem, which was never solved IIRC. There are just too
> many things you need to be aware of before you can successfully offload
> something to a different process.

I agree.

I think it would be a very fragile thing in practice: if you have even
one variable in the context that isn't pickleable, your code that uses
a ProcessPool would stop working.  I would defer Context pickleability
to 3.8+.
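The thread-offloading pattern quoted above does work as released in Python 3.7; a self-contained sketch:

```python
import contextvars
from concurrent.futures import ThreadPoolExecutor

var = contextvars.ContextVar('var')
var.set('main-thread value')

def work():
    return var.get()

ctx = contextvars.copy_context()
with ThreadPoolExecutor() as executor:
    # Run work() in a worker thread, inside a copy of this
    # thread's context.
    future = executor.submit(ctx.run, work)
    print(future.result())
```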

Yury


Re: [Python-Dev] PEP 567 v3

2018-01-16 Thread Yury Selivanov
Thanks, Victor!


Re: [Python-Dev] PEP 567 v3

2018-01-16 Thread Yury Selivanov
On Tue, Jan 16, 2018 at 6:53 PM, Guido van Rossum  wrote:
> On Tue, Jan 16, 2018 at 3:26 PM, Victor Stinner 
[..]
>> I don't think that it's worth it to prevent misuage of reset(). IMHO
>> it's fine if calling reset() twice reverts the variable state twice.
>
>
> Maybe the effect of calling it twice should be specified as undefined -- the
> implementation can try to raise in simple cases.
>
> Unless Yury has a use case for the idempotency? (But with __enter__/__exit__
> as the main use case for reset() I wouldn't know what the use case for
> idempotency would be.)

I don't have any use case for idempotent reset, so I'd change it to
raise an error on second call. We can always relax this in 3.8 if
people request it to be idempotent.

Yury


[Python-Dev] PEP 567 v3

2018-01-16 Thread Yury Selivanov
Hi,

This is a third version of PEP 567.

Changes from v2:

1. PyThreadState now references Context objects directly (instead of
referencing _ContextData).  This fixes out of sync Context.get() and
ContextVar.get().

2. Added a new Context.copy() method.

3. Renamed Token.old_val property to Token.old_value

4. ContextVar.reset(token) now raises a ValueError if the token was
created in a different Context.

5. All areas of the PEP were updated to be more precise. Context is
*no longer* defined as a read-only or an immutable mapping;
ContextVar.get() behaviour is fully defined; the immutability is only
mentioned in the Implementation section to avoid confusion; etc.

6. Added a new Examples section.

The reference implementation has been updated to include all these changes.

The only open question I personally have is whether ContextVar.reset()
should be idempotent or not.  Maybe we should be strict and raise an
error if a user tries to reset a variable more than once with the same
token object?

Other than that, I'm pretty happy with this version.  Big thanks to
everybody helping with the PEP!


PEP: 567
Title: Context Variables
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <y...@magic.io>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 12-Dec-2017
Python-Version: 3.7
Post-History: 12-Dec-2017, 28-Dec-2017, 16-Jan-2018


Abstract
========

This PEP proposes a new ``contextvars`` module and a set of new
CPython C APIs to support context variables.  This concept is
similar to thread-local storage (TLS), but, unlike TLS, it also allows
correctly keeping track of values per asynchronous task, e.g.
``asyncio.Task``.

This proposal is a simplified version of :pep:`550`.  The key
difference is that this PEP is concerned only with solving the case
for asynchronous tasks, not for generators.  There are no proposed
modifications to any built-in types or to the interpreter.

This proposal is not strictly related to Python context managers,
although it does provide a mechanism that context managers can use
to store their state.


Rationale
=========

Thread-local variables are insufficient for asynchronous tasks that
execute concurrently in the same OS thread.  Any context manager that
saves and restores a context value using ``threading.local()`` will
have its context values bleed to other code unexpectedly when used
in async/await code.

A few examples where having a working context local storage for
asynchronous code is desirable:

* Context managers like ``decimal`` contexts and ``numpy.errstate``.

* Request-related data, such as security tokens and request
  data in web applications, language context for ``gettext``, etc.

* Profiling, tracing, and logging in large code bases.


Introduction
============

The PEP proposes a new mechanism for managing context variables.
The key classes involved in this mechanism are ``contextvars.Context``
and ``contextvars.ContextVar``.  The PEP also proposes some policies
for using the mechanism around asynchronous tasks.

The proposed mechanism for accessing context variables uses the
``ContextVar`` class.  A module (such as ``decimal``) that wishes to
use the new mechanism should:

* declare a module-global variable holding a ``ContextVar`` to
  serve as a key;

* access the current value via the ``get()`` method on the
  key variable;

* modify the current value via the ``set()`` method on the
  key variable.

The notion of "current value" deserves special consideration:
different asynchronous tasks that exist and execute concurrently
may have different values for the same key.  This idea is well-known
from thread-local storage but in this case the locality of the value is
not necessarily bound to a thread.  Instead, there is the notion of the
"current ``Context``" which is stored in thread-local storage.
Manipulation of the current context is the responsibility of the
task framework, e.g. asyncio.

A ``Context`` is a mapping of ``ContextVar`` objects to their values.
The ``Context`` itself exposes the ``abc.Mapping`` interface
(not ``abc.MutableMapping``!), so it cannot be modified directly.
To set a new value for a context variable in a ``Context`` object,
the user needs to:

* make the ``Context`` object "current" using the ``Context.run()``
  method;

* use ``ContextVar.set()`` to set a new value for the context
  variable.

The ``ContextVar.get()`` method looks for the variable in the current
``Context`` object using ``self`` as a key.

It is not possible to get a direct reference to the current ``Context``
object, but it is possible to obtain a shallow copy of it using the
``contextvars.copy_context()`` function.  This ensures that the
caller of ``Context.run()`` is the sole owner of its ``Context``
object.
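Putting the pieces together — copying the current context and then
mutating the copy through ``Context.run()`` — looks like this:

```python
import contextvars

var = contextvars.ContextVar('var', default='outer')

ctx = contextvars.copy_context()   # shallow copy of the current Context

def update():
    var.set('inner')               # mutates whichever Context is current
    return var.get()

assert ctx.run(update) == 'inner'
assert var.get() == 'outer'        # the caller's context is untouched
assert ctx[var] == 'inner'         # the copy now maps var -> 'inner'
```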


Specification
=============

A new standard library module ``contextvars`` is added with the
following APIs:

1. ``copy_context() -> Context`` function is used to get a copy of
the current Context object for the current OS thread.

Re: [Python-Dev] PEP 567 pre v3

2018-01-10 Thread Yury Selivanov
On Thu, Jan 11, 2018 at 10:35 AM, Chris Jerdonek
 wrote:
> On Mon, Jan 8, 2018 at 11:02 PM, Nathaniel Smith  wrote:
>> Right now, the set of valid states for a ContextVar are: it can hold
>> any Python object, or it can be undefined. However, the only way it
>> can be in the "undefined" state is in a new Context where it has never
>> had a value; once it leaves the undefined state, it can never return
>> to it.
>
> I know Yury responded to one aspect of this point later on in the
> thread. However, in terms of describing the possible states without
> reference to the internal Context mappings, IIUC, wouldn't it be more
> accurate to view a ContextVar as a stack of values rather than just
> the binary "holding an object or not"? This is to reflect the number
> of times set() has been called (and so the number of times reset()
> would need to be called to "empty" the ContextVar).


But why do you want to think of ContextVar as a stack of values?  Or
as something that is holding even one value?

Do Python variables hold/envelop the objects they reference?  No, they
don't.  They are simple names used to look up objects in
globals/locals dicts.  ContextVars are very similar!  They are *keys*
in Context objects—that is it.

ContextVar.default is returned by ContextVar.get() when it cannot find
the value for the context variable in the current Context object.  If
ContextVar.default was not provided, a LookupError is raised.

The reason why this is simpler for regular variables is because they
have a dedicated syntax.  Instead of writing

print(globals()['some_variable'])

we simply write

print(some_variable)

Similarly for context variables, we could have written:

   print(copy_context()[var])

But instead we use a ContextVar.get():

   print(var.get())

If we had a syntax support for context variables, it would be like this:

   context var
   print(var)   # Lookups 'var' in the current context

Although I very much doubt that we would *ever* want to have a
dedicated syntax for context variables (they are very niche and are
only needed in some very special cases), I hope that this line of
thinking would help to clear the waters.

Yury


Re: [Python-Dev] PEP 567 pre v3

2018-01-10 Thread Yury Selivanov
On Thu, Jan 11, 2018 at 10:39 AM, Ethan Furman <et...@stoneleaf.us> wrote:
> On 01/10/2018 10:23 PM, Yury Selivanov wrote:
[..]
>> Therefore I'm still in favour of keeping the current PEP 567
>> behaviour.
>
>
> To be clear:  We'll now be able to specify a default when we create the
> variable, but we can also leave it out so a LookupError can be raised later?

Correct.

Yury


Re: [Python-Dev] PEP 567 pre v3

2018-01-10 Thread Yury Selivanov
On Thu, Jan 11, 2018 at 4:44 AM, Nathaniel Smith  wrote:
[..]
> It may have gotten lost in that email, but my actual favorite approach
> is that we make the signatures:
>
> ContextVar(name, *, initial_value)  # or even (*, name, initial_value)
> ContextVar.get()
> ContextVar.set(value)
>
> so that when you create a ContextVar you always state the initial
> value, whatever makes sense in a particular case. (Obviously None will
> be a very popular choice, but this way it won't be implicit, and
> no-one will be surprised to see it returned from get().)

Alright, you've shown that most of the time when we use
threading.local in the standard library we subclass it in order to
provide a default value (and avoid AttributeError being thrown).  This
is a solid argument in favour of keeping the 'default' parameter for
the ContextVar constructor.  Let's keep it.

However, I still don't like the idea of making defaults mandatory.  I
have at least one illustrative use case (and I can come up with more
such examples, btw) which shows that it's not always desirable to have
a None default: getting the current request object in a web application.
With the current PEP 567 semantics:

request_var: ContextVar[Request] = ContextVar('current_request')

and later:

request : Request = request_var.get()

'request_var.get()' will throw a LookupError, which will indicate that
something went wrong in the framework layer.  The user should never
see this error, and they can just rely on the fact that the current
request is always available (cannot be None).

With mandatory defaults, the type of 'request' variable will be
'Optional[Request]', and the user will be forced to add an 'if'
statement to guard against None values.  Otherwise the user risks
having occasional AttributeErrors that don't really explain what
actually happened.  I would prefer them to see a LookupError('cannot
lookup current_request context variable') instead.

I think that when you have an int stored in a context variable it
would usually make sense to give it a 0 default (or some other
number). However, for a complex object (like current request object)
there is *no* sensible default value sometimes.  Forcing the user to
set it to None feels like a badly designed API that forces the user to
work around it.
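A sketch of the pattern described above (``Request``, ``handle`` and
``view`` are hypothetical names for illustration, not a real framework
API):

```python
import contextvars

request_var: contextvars.ContextVar = contextvars.ContextVar('current_request')

class Request:
    pass

def handle(request):
    # The framework layer always sets the current request...
    request_var.set(request)
    return view()

def view():
    # ...so user code can rely on get() never returning None;
    # a LookupError here would indicate a framework bug.
    return request_var.get()

req = Request()
assert contextvars.copy_context().run(handle, req) is req
```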

Therefore I'm still in favour of keeping the current PEP 567
behaviour.  It feels very consistent with how variable lookups and
threading.local objects work in Python now.

Yury


Re: [Python-Dev] Thoughts on "contexts". PEPs 550, 555, 567, 568

2018-01-09 Thread Yury Selivanov
Wasn't PEP 555 rejected by Guido? What's the point of this post?

Yury

On Wed, Jan 10, 2018 at 4:08 AM Koos Zevenhoven  wrote:

> Hi all,
>
> I feel like I should write some thoughts regarding the "context"
> discussion, related to the various PEPs.
>
> I like PEP 567 (+ 568?) better than PEP 550. However, besides providing
> cvar.set(), I'm not really sure about the gain compared to PEP 555 (which
> could easily have e.g. a dict-like interface to the context). I'm still not
> a big fan of "get"/"set" here, but the idea was indeed to provide those on
> top of a PEP 555 type thing too.
>
> "Tokens" in PEP 567, seems to resemble assignment context managers in PEP
> 555. However, they feel a bit messy to me, because they make it look like
> one could just set a variable and then revert the change at any point in
> time after that.
>
> PEP 555 is in fact a simplification of my previous sketch that had a
> .set(..) in it, but was somewhat different from PEP 550. The idea was to
> always explicitly define the scope of contextvar values. A context manager
> / with statement determined the scope of .set(..) operations inside the
> with statement:
>
> # Version A:
> cvar.set(1)
> with context_scope():
> cvar.set(2)
>
> assert cvar.get() == 2
>
> assert cvar.get() == 1
>
> Then I added the ability to define scopes for different variables
> separately:
>
> # Version B
> cvar1.set(1)
> cvar2.set(2)
> with context_scope(cvar1):
> cvar1.set(11)
> cvar2.set(22)
>
> assert cvar1.get() == 1
> assert cvar2.get() == 22
>
>
> However, in practice, most libraries would wrap __enter__, set and
> __exit__ into another context manager. So maybe one might want to allow
> something like
>
> # Version C:
> assert cvar.get() == something
> with context_scope(cvar, 2):
> assert cvar.get() == 2
>
> assert cvar.get() == something
>
>
> But this then led to combining "__enter__" and ".set(..)" into
> Assignment.__enter__ -- and "__exit__" into Assignment.__exit__ like this:
>
> # PEP 555 draft version:
> assert cvar.value == something
> with cvar.assign(1):
> assert cvar.value == 1
>
> assert cvar.value == something
>
>
> Anyway, given the schedule, I'm not really sure about the best thing to do
> here. In principle, something like in versions A, B and C above could be
> done (I hope the proposal was roughly self-explanatory based on earlier
> discussions). However, at this point, I'd probably need a lot of help to
> make that happen for 3.7.
>
> -- Koos
>


Re: [Python-Dev] PEP 567 pre v3

2018-01-09 Thread Yury Selivanov
By default, threading.local raises an AttributeError (unless you subclass
it).  Similar to that, and to NameError, I think it's a good idea for
ContextVars to raise a LookupError if a variable was not explicitly set.
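The analogy, in runnable form:

```python
import threading
import contextvars

local = threading.local()
try:
    local.value                    # never set on this thread
    local_error = None
except AttributeError as exc:
    local_error = type(exc)

var = contextvars.ContextVar('value')
try:
    var.get()                      # never set, and no default given
    var_error = None
except LookupError as exc:
    var_error = type(exc)

assert local_error is AttributeError
assert var_error is LookupError
```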

Yury


On Tue, Jan 9, 2018 at 7:15 PM Victor Stinner <victor.stin...@gmail.com>
wrote:

> 2018-01-09 12:41 GMT+01:00 Yury Selivanov <yselivanov...@gmail.com>:
> > But I'd be -1 on making all ContextVars have a None default
> > (effectively have a "ContextVar.get(default=None)" signature. This
> > would be a very loose semantics in my opinion.
>
> Why do you think that it's a loose semantics? For me
> ContextVar/Context are similar to Python namespaces and thread local
> storage.
>
> To "declare" a variable in a Python namespace, you have to set it:
> "global x" doesn't create a variable, only "x = None".
>
> It's not possible to define a thread local variable without specifying
> a "default" value neither.
>
> Victor
>


Re: [Python-Dev] PEP 567 pre v3

2018-01-09 Thread Yury Selivanov
On Tue, Jan 9, 2018 at 11:02 AM, Nathaniel Smith <n...@pobox.com> wrote:
> On Mon, Jan 8, 2018 at 11:34 AM, Yury Selivanov <yselivanov...@gmail.com> 
> wrote:
>> 1. Proposal: ContextVar has default set to None.
>>
>> From the typing point of view that would mean that if a context
>> variable is declared without an explicit default, its type would be
>> Optional.  E.g. say we have a hypothetical web framework that allows
>> to access the current request object through a context variable:
>>
>>   request_var: ContextVar[Optional[Request]] = \
>>   ContextVar('current_request')
>>
>> When we need to get the current request object, we would write:
>>
>>   request: Optional[Request] = request_var.get()
>>
>> And we'd also need to explicitly handle when 'request' is set to None.
>> Of course we could create request_var with its default set to some
>> "InvalidRequest" object, but that would complicate things.  It would
>> be easier to just state that the framework always sets the current
>> request and it's a bug if it's not set.
>>
>> Therefore, in my opinion, it's better to keep the current behaviour:
>> if a context variable was created without a default value,
>> ContextVar.get() can raise a LookupError.
>
> All the different behaviors here can work, so I don't want to make a
> huge deal about this. But the current behavior is bugging me, and I
> don't think anyone has brought up the reason why, so here goes :-).
>
> Right now, the set of valid states for a ContextVar are: it can hold
> any Python object, or it can be undefined. However, the only way it
> can be in the "undefined" state is in a new Context where it has never
> had a value; once it leaves the undefined state, it can never return
> to it.

Is "undefined" the state when a context variable doesn't have a default
and isn't yet set?  If so, why can't it return to the
"undefined" state?  That's why we have the 'reset' method:

   c = ContextVar('c')
   c.get()  # LookupError

   t = c.set(42)
   c.get()   # 42
   c.reset(t)

   c.get()   # LookupError

I don't like how context variables are defined in Option 1 and Option
2.  I view ContextVars as keys in some global context mapping--akin to
Python variables.  Similar to how we have a NameError for variables,
we have a LookupError for context variables.  When we write a variable
name, Python looks it up in locals and globals.  When we call
ContextVar.get(), Python will look up that context variable in the
current Context.  I don't think we should try to classify ContextVar
objects as containers or something capable of holding a value on their
own.

Even when you have a "del some_var" statement, you are only guaranteed
to remove the "some_var" name from the innermost scope.  This is
similar to what ContextVar.unset() will do in PEP 568, by removing the
variable only from the head of the chain.

So the sole purpose of ContextVar.default is to make ContextVar.get()
convenient.  Context objects don't know about ContextVar.default, and
ContextVars don't know about values they are mapped to in some Context
object.

In any case, at this point I think that the best option is to simply
drop the "default" parameter from the ContextVar constructor.  This
would leave us with only one default in ContextVar.get() method:

c.get()   # Will raise a LookupError if 'c' is not set
c.get('python')  # Will return 'python' if 'c' is not set

I also now see how having two different 'default' values: one defined
when a ContextVar is created, and one can be passed to
ContextVar.get() is confusing.

But I'd be -1 on making all ContextVars have a None default
(effectively have a "ContextVar.get(default=None)" signature. This
would be a very loose semantics in my opinion.

Yury


Re: [Python-Dev] PEP 567 v2

2018-01-09 Thread Yury Selivanov


> On Jan 9, 2018, at 11:18 AM, Nathaniel Smith  wrote:
> 
>> On Thu, Jan 4, 2018 at 9:42 PM, Guido van Rossum  wrote:
>>> On Thu, Jan 4, 2018 at 7:58 PM, Nathaniel Smith  wrote:
>>> This does make me think that I should write up a short PEP for
>>> extending PEP 567 to add context lookup, PEP 550 style: it can start
>>> out in Status: deferred and then we can debate it properly before 3.8,
>>> but at least having the roadmap written down now would make it easier
>>> to catch these details. (And it might also help address Paul's
>>> reasonable complaint about "unstated requirements".)
>> 
>> Anything that will help us kill a 550-pound gorilla sounds good to me. :-)
>> 
>> It might indeed be pretty short if we follow the lead of ChainMap (even
>> using a different API than MutableMapping to mutate it). Maybe
>> copy_context() would map to new_child()? Using ChainMap as a model we might
>> even avoid the confusion between Lo[gi]calContext and ExecutionContext which
>> was the nail in PEP 550's coffin. The LC associated with a generator in PEP
>> 550 would be akin to a loose dict which can be pushed on top of a ChainMap
>> using cm = cm.new_child(). (Always taking for granted that instead of
>> an actual dict we'd use some specialized mutable object implementing the
>> Mapping protocol and a custom mutation protocol so it can maintain
>> ContextVar cache consistency.)
> 
> The approach I took in PEP 568 is even simpler, I think. The PEP is a
> few pages long because I wanted to be exhaustive to make sure we
> weren't missing any details, but the tl;dr is: The ChainMap lives
> entirely inside the threadstate, so there's no need to create a LC/EC
> distinction -- users just see Contexts, or there's the one stack
> introspection API, get_context_stack(), which returns a List[Context].
> Instead of messing with new_child, copy_context is just
> Context(dict(chain_map)) -- i.e., it creates a flattened copy of the
> current mapping. (If we used new_child, then we'd have to have a way
> to return a ChainMap, reintroducing the LC/EC mess.)

This sounds reasonable. Although keep in mind that merging HAMTs is still an
expensive operation, so flattening shouldn't always be performed (this is
covered in PEP 550).

I also wouldn't call LC/EC a "mess". Your pep just names things differently, 
but otherwise is entirely built on concepts and ideas introduced in pep 550.

Yury



[Python-Dev] PEP 567 pre v3

2018-01-08 Thread Yury Selivanov
Hi,

Thanks to everybody participating in the PEP 567 discussion!  I want
to summarize a few topics to make sure that we are all on the same
page (and maybe provoke more discussion).


1. Proposal: ContextVar has default set to None.

From the typing point of view that would mean that if a context
variable is declared without an explicit default, its type would be
Optional.  E.g. say we have a hypothetical web framework that allows
to access the current request object through a context variable:

  request_var: ContextVar[Optional[Request]] = \
  ContextVar('current_request')

When we need to get the current request object, we would write:

  request: Optional[Request] = request_var.get()

And we'd also need to explicitly handle when 'request' is set to None.
Of course we could create request_var with its default set to some
"InvalidRequest" object, but that would complicate things.  It would
be easier to just state that the framework always sets the current
request and it's a bug if it's not set.

Therefore, in my opinion, it's better to keep the current behaviour:
if a context variable was created without a default value,
ContextVar.get() can raise a LookupError.


2. Context.__contains__, Context.__getitem__ and ContexVar.default

So if we keep the current PEP 567 behaviour w.r.t. defaults,
ContextVar.get() might return a different value from Context.get():

v = ContextVar('v', default=42)
ctx = contextvars.copy_context()

ctx.get(v)   # returns None
v.get()   # returns 42
v in ctx  # returns False

I think this discrepancy is OK.  Context is a mapping-like object and
it reflects the contents of the underlying _ContextData mapping
object.

ContextVar.default is meant to be used only by ContextVar.get().
Context objects should not use it.

Maybe we can rename ContextVar.get() to ContextVar.lookup()?  This
would help to avoid potential confusion between Context.get() and
ContextVar.get().


3. Proposal: Context.get() and __getitem__() should always return up
to date values.

The issue with the current PEP 567 design is that PyThreadState points
to a _ContextData object, and not to the current Context.  The
following code illustrates how this manifests in Python code:

v = ContextVar('v')

def foo():
v.set(42)
print(v.get(), ctx.get(v, 'missing'))

ctx = Context()
ctx.run(foo)

The above code will print "42 missing", because 'ctx' points to an
outdated _ContextData.

This is easily fixable if we make PyThreadState to point to the
current Context object (instead of it pointing to a _ContextData).
This change will also make "contextvars.copy_context()" easier to
understand--it will actually return a copy of the current context that
the thread state points to.

Adding a private Context._in_use attribute would allow us to make sure
that Context.run() cannot be simultaneously called in two OS threads.
As Nathaniel points out, this will also simplify cache implementation
in ContextVar.get().  So let's do this.


4. Add Context.copy().

I was actually going to suggest this addition myself.  With the
current PEP 567 design, Context.copy() can be implemented with
"ctx.run(contextvars.copy_context)", but this is very cumbersome.

An example of when a copy() method could be useful is capturing the
current context and executing a few functions with it using
ThreadPoolExecutor.map().  Copying the Context object will ensure that
every mapped function executes in its own context copy (i.e.
isolated).  So I'm +1 for this one.
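A sketch of that use case, assuming the ``Context.copy()`` method
described in point 4:

```python
import contextvars
from concurrent.futures import ThreadPoolExecutor

var = contextvars.ContextVar('var', default=0)
var.set(10)

ctx = contextvars.copy_context()   # capture the current context

def work(x):
    var.set(var.get() + x)         # mutates only this call's copy
    return var.get()

with ThreadPoolExecutor() as pool:
    # Each mapped call runs in its own isolated copy of 'ctx'.
    results = list(pool.map(lambda x: ctx.copy().run(work, x), [1, 2, 3]))

assert sorted(results) == [11, 12, 13]
assert var.get() == 10             # the captured context is unchanged
```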


5. PEP language.

I agree that PEP is vague about some details and is incorrect in some
places (like calling Context objects immutable, which is not really
true, because .run() can modify them). I'll fix the language in v3
once I'm back home.


Yury


Re: [Python-Dev] PEP 567 v2

2018-01-03 Thread Yury Selivanov
I think we can expose the default property. If it's not set we can return 
MISSING.

Yury

Sent from my iPhone

> On Jan 3, 2018, at 1:04 PM, Victor Stinner  wrote:
> 
> Le 3 janv. 2018 06:38, "Guido van Rossum"  a écrit :
> But is there a common use case? For var.get() I'd rather just pass the 
> default or catch the exception if the flow is different. Using ctx[var] is 
> rare (mostly for printing contexts, and perhaps for explaining var.get()).
> 
> I don't think that it would be a common use case. Maybe we don't need 
> is_set(), I'm fine with catching an exception.
> 
> But for introspection at least, it would help to expose the default as a 
> read-only attribute, no?
> 
> Another example of a mapping with default value:
> 
> https://docs.python.org/dev/library/collections.html#collections.defaultdict
> 
> And defaultdict has a default_factory attribute. The difference here is that 
> default_factory is mandatory. ContextVar would be simpler if the default 
> would be mandatory as well :-)
> 
> Victor


Re: [Python-Dev] PEP 567 v2

2018-01-03 Thread Yury Selivanov

> On Jan 3, 2018, at 12:26 PM, Victor Stinner <victor.stin...@gmail.com> wrote:
> 
> Le 3 janv. 2018 06:05, "Yury Selivanov" <yselivanov...@gmail.com> a écrit :
> tuples in Python are immutable, but you can have a tuple with a dict as its 
> single element. The tuple is immutable, the dict is mutable.
> 
> At the C level we have APIs that can mutate a tuple though.
> 
> Now, tuple is not a direct analogy to Context, but there are some parallels.  
> Context is a container like tuple, with some additional APIs on top.
> 
> Sorry, I don't think that it's a good analogy. Context.run() is a public 
> method accessible in Python which allows to modify the context. A tuple 
> doesn't have such method.
> 
> While it's technically possible to modify a tuple or a str at C level, it's a 
> bad practice leading to complex bugs when it's not done carefully: see 
> https://bugs.python.org/issue30156 property_descr_get() optimization was 
> fixed twice but still has a bug. I proposed a PR to remove the hack.
> 
>> Why Context could not inherit from MutableMapping? (Allow ctx.set(var, 
>> value) and ctx[var] = value.) Is it just to keep the API small: changes 
>> should only be made using var.set()?
> 
> Because that would be confusing to end users.
> 
>   ctx = copy_context()
>   ctx[var] = something
> 
> What did we just do?  Did we modify the 'var' in the code that is currently 
> executing? No, you still need to call Context.run to see the new value for 
> var.
> 
> IMHO it's easy to understand that modifying a *copy* of the current context 
> doesn't impact the current context. It's one the first thing to learn when 
> learning Python:
> 
> a = [1, 2]
> b = a.copy()
> b.append(3)
> assert a == [1, 2]
> assert b == [1, 2, 3]
> 
> Another problem is that MutableMapping defines a __delitem__ method, which I 
> don't want the Context to implement.
> 
> I wouldn't be shocked if "del ctx[var]" would raise an exception.
> 
> I almost never use del anyway. I prefer to assign a variable to None, since 
> "del var" looks like C++ destructor whereas it's more complex than a direct 
> call to the destructor.
> 
> But it's annoying to have to call a function with Context.run() whereas 
> context is just a mutable mapping. It seems overkill to me to have to call 
> run() to modify a context variable:

Do you have any use case for modifying a variable inside some context?

numpy, decimal, or some sort of tracing for http requests or async frameworks 
like asyncio do not need that.

> run() temporarily changes the context and requires using the indirect 
> ContextVar API, while I know that ContextVar.set() modifies the context.
> 
> Except of del corner case, I don't see any technical reason to prevent direct 
> modification of a context.
> 
> contextvars isn't new, it extends what we already have: decimal context. And 
> decimal quick start documentation shows how to modify a context and then set 
> it as the current context:
> 

I think you are confusing context in decimal and pep 567.

Decimal context is a mutable object. We use threading.local to store it. With 
pep 567 you will use a context variable behind the scenes to store it.

I think it's incorrect to compare decimal contexts to pep567 in any way.

Yury

> >>> myothercontext = Context(prec=60, rounding=ROUND_HALF_DOWN)
> >>> setcontext(myothercontext)
> >>> Decimal(1) / Decimal(7)
> Decimal('0.142857142857142857142857142857142857142857142857142857142857')
> 
> https://docs.python.org/dev/library/decimal.html
> 
> Well, technically it doesn't modify a context. An example closer to 
> contextvars would be:
> 
> >>> mycontext = getcontext().copy()
> >>> mycontext.prec = 60
> >>> setcontext(mycontext)
> >>> Decimal(1) / Decimal(7)
> Decimal('0.142857142857142857142857142857142857142857142857142857142857')
> 
> Note: "getcontext().prec = 6" does modify the decimal context directly, and 
> it's the *first* example in the doc. But here contextvars is different since 
> there is no API to get the current API. The lack of API to access directly 
> the current contextvars context is the main difference with decimal context, 
> and I'm fine with that.
> 
> It's easy to see a parallel since decimal context can be copied using 
> Context.copy(), it has also multiple (builtin) "variables", it's just that 
> the API is different (decimal context variables are modified as attributes), 
> and it's possible to set a context using decimal.setcontext().
> 
> Victor


Re: [Python-Dev] PEP 567 v2

2018-01-02 Thread Yury Selivanov
I don't want to expose a SetContext operation because of, again, potential
incompatibility with PEP 550, where generators expect to fully control
push/pop context operation.

Second, Context.run is 100% enough for *any* async framework to add support
for PEP 567. And because the PEP is focused just on async, I think that we
don't need anything more than 'run'.

Third, I have a suspicion that we focus too much on actual Context and
Context.run. These APIs are meant for asyncio/twisted/trio/etc maintainers,
not for an average Python user. An average person will likely not interact
with any of the PEP 567 machinery directly, even when using PEP 567-enabled
libraries like numpy/decimal.
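
For illustration, the way a framework uses Context.run for a scheduled callback can be sketched like this (a toy example, not asyncio's actual code):

```python
from contextvars import ContextVar, copy_context

var = ContextVar('var', default='spam')
results = []

def callback():
    results.append(var.get())

# A framework captures the context when the callback is scheduled...
var.set('ham')
ctx = copy_context()
var.set('eggs')

# ...and later runs the callback inside the captured context.
ctx.run(callback)
assert results == ['ham']
```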

Yury

On Wed, Jan 3, 2018 at 2:56 AM Victor Stinner 
wrote:

> PEP:
> "int PyContext_Enter(PyContext *) and int PyContext_Exit(PyContext *)
> allow to set and restore the context for the current OS thread."
>
> What is the difference between Enter and Exit? Why not having a single
> Py_SetContext() function?
>
> Victor


Re: [Python-Dev] PEP 567 v2

2018-01-02 Thread Yury Selivanov
On Wed, Jan 3, 2018 at 3:04 AM Victor Stinner 
wrote:

> What is the behaviour of ContextVar.reset(token) if the token was created
> from a different variable? Raise an exception?
>
> token = var1.set("value")
> var2.reset(token)
>
> The PEP states that Token.var only exists for debug or introspection.
>

It will raise an error. I'll specify this in the PEP.
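
A sketch of both behaviours in the implementation that eventually shipped, where the mismatched reset raises ValueError:

```python
from contextvars import ContextVar

var1 = ContextVar('var1')
var2 = ContextVar('var2')

token = var1.set('value')

# Resetting a different variable with var1's token is an error.
try:
    var2.reset(token)
except ValueError:
    pass
else:
    raise AssertionError('expected ValueError')

# Resetting the owning variable restores the previous (unset) state.
var1.reset(token)
assert var1.get('missing') == 'missing'
```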

Yury


> Victor
>
>
> On Jan 3, 2018 at 00:51, "Victor Stinner"  wrote:
>
> Why does ContextVar.reset(token) do nothing on the second call with the same
> token? What is the purpose of Token._used? I guess that there is a use
> case to justify this behaviour.
>
> reset() should have a result: true if the variable was restored to its
> previous state, false if reset() did nothing because the token was already
> used. And/Or Token should have a read-only "used" property.
>
> Victor
>
>


Re: [Python-Dev] PEP 567 v2

2018-01-02 Thread Yury Selivanov
On Wed, Jan 3, 2018 at 2:36 AM Victor Stinner 
wrote:

> > I would really like to invite more people to review this PEP! I expect
> I'll be accepting it in the next two weeks, but it needs to go through more
> rigorous review.
>
> I read again the PEP and I am still very confused by Context.run().
>
> The PEP states multiple times that a context is immutable:
>
> * "read-only mapping"
> * inherit from Mapping, not from MutableMapping
>

> But run() does modify the context (or please correct me if I completely 
> misunderstood
> the PEP! I had to read it 3 times to check whether run() mutates the
> context or not).
>
> It would help if the ctx.run() example in the PEP would not only test
> var.get() but also test ctx.get(var). Or maybe show that the variable value
> is kept in a second function call, but the variable is "restored" between
> run() calls.
>
> The PEP tries hard to hide "context data", which is the only read only
> thing in the whole PEP, whereas it's a key concept to understand the
> implementation.
>
> I understood that:
>
> * _ContextData is immutable
> * ContextVar.set() creates a new _ContextData and sets it in the current
> Python thread state
> * When the called function completes, Context.run() sets its context data
> to the new context data from the Python thread state: so run() does modify
> the "immutable" context
>

Tuples in Python are immutable, but you can have a tuple with a dict as its
single element. The tuple is immutable, the dict is mutable.

At the C level we have APIs that can mutate a tuple though.

Now, tuple is not a direct analogy to Context, but there are some
parallels.  Context is a container like tuple, with some additional APIs on
top.
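
The same parallel in a few lines:

```python
# The tuple itself is immutable, but the dict it holds is not.
t = ({'state': 'old'},)
t[0]['state'] = 'new'            # fine: mutates the dict, not the tuple
assert t == ({'state': 'new'},)

try:
    t[0] = {}                    # rebinding a tuple item is forbidden
except TypeError:
    pass
else:
    raise AssertionError('expected TypeError')
```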


>
> The distinction between the internal/hiden *immutable* context data and
> public/visible "mutable" (from my point of view) context is unclear to me
> in the PEP.
>
> The concept of "current context" is not defined in the PEP. In practice,
> there is no "current context", there is only a "current context data" in
> the current Python thread. There is no need for a concrete context instance
> to store variable variables values. It's also hard to understand that in
> the PEP.
>
>
> Why Context could not inherit from MutableMapping? (Allow ctx.set(var,
> value) and ctx [var] = value.) Is it just to keep the API small: changes
> should only be made using var.set()?
>

Because that would be confusing to end users.

  ctx = copy_context()
  ctx[var] = something

What did we just do?  Did we modify the 'var' in the code that is currently
executing? No, you still need to call Context.run to see the new value for
var.

Another problem is that MutableMapping defines a __delitem__ method, which
I don't want the Context to implement.  Deleting variables like that is
incompatible with PEP 550, where it's ambiguous (due to the stacked nature
of contexts).

Now we don't want PEP 550 in 3.7, but I want to keep the door open for its
design, in case we want context to work with generators.


> Or maybe Context.run() should really be immutable and return the result of
> the called function *and* a new context? But I dislike such a theoretical API,
> since it would be complex to return the new context if the called function
> raises an exception.
>

It can't return a new context because the callable you're running can raise
an exception. In which case you'd lose modifications prior to the error.

P.S. I'm on vacation and don't always have an internet connection.

Yury


> Victor


Re: [Python-Dev] PEP 567 v2

2017-12-28 Thread Yury Selivanov
On Thu, Dec 28, 2017 at 4:51 AM, Victor Stinner
 wrote:
> Hi,
>
> I like the new version of the PEP using "read only mapping" and
> copy_context(). It's easier to understand.

Thanks, Victor!

>
> I'm ok with seeing a context as a mapping, but I am confused about a context
> variable considered as a mapping item. I still see a context variable as a
> variable, so something which has a value or not. I just propose to rename
> the default parameter of the ContextVar constructor.

As Nathaniel already explained, a 'default' for ContextVars is
literally a default -- default value returned when a ContextVar hasn't
been assigned a value in a context.  So my opinion on this is that
'default' is the less ambiguous name here.
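
A sketch of how the default behaves in the shipped API:

```python
from contextvars import ContextVar

var = ContextVar('var', default=42)
assert var.get() == 42            # the declared default
var.set(7)
assert var.get() == 7             # an assigned value wins

novar = ContextVar('novar')       # no default declared
try:
    novar.get()                   # nothing set -> LookupError
except LookupError:
    pass
assert novar.get('fallback') == 'fallback'   # per-call default
```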

[..]
>
> * a read-only attribute ``Token.old_value`` set to the value the
>   variable had before the ``set()`` call, or to ``Token.MISSING``
>   if the variable wasn't set before.
>
>
> Hum, I also suggest renaming Token.MISSING to Token.NOT_SET. It would be
> more consistent with the last sentence.

I like MISSING more than NOT_SET, but this is very subjective, of
course.  If Guido wants to rename it I rename it.


> C API
> -
>
>
> Would it be possible to make this API private?

We want _decimal and numpy to use the new API, and they will call
ContextVar.get() on basically all operations, so it needs to be as
fast as possible.  asyncio/uvloop also want the fastest copy_context()
and Context.run() possible, as they use them for *every* callback.  So
I think it's OK for us to add new C APIs here.


>
> 2. ``int PyContextVar_Get(PyContextVar *, PyObject *default_value,
> PyObject **value)``:
> (...)  ``value`` is always a borrowed
>reference.
>
>
> I'm not sure that it's a good idea to add a new public C function which
> returns a borrowed reference. I would prefer to only use (regular) strong
> references in the public API.

Sure, I'll change it.

Yury


Re: [Python-Dev] PEP 567 v2

2017-12-28 Thread Yury Selivanov
On Thu, Dec 28, 2017 at 5:28 AM, Chris Jerdonek
 wrote:
> I have a couple basic questions around how this API could be used in
> practice. Both of my questions are for the Python API as applied to Tasks in
> asyncio.
>
> 1) Would this API support looking up the value of a context variable for
> **another** Task? For example, if you're managing multiple tasks using
> asyncio.wait() and there is an exception in some task, you might want to
> examine and report the value of a context variable for that task.

No, unless that another Task explicitly shares the value or captures
its context and shares it.  Same as with threading.local.

>
> 2) Would an appropriate use of this API be to assign a unique task id to
> each task? Or can that be handled more simply? I'm wondering because I
> recently thought this would be useful, and it doesn't seem like asyncio
> means for one to subclass Task (though I could be wrong).

The API should be used to share one ID between a Task and tasks it
creates. You can use it to store individual Task IDs, but a
combination of a WeakKeyDictionary and Task.current_task() seems to be
a better/easier option.
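
A sketch of that WeakKeyDictionary approach (using asyncio.current_task(), the modern spelling of Task.current_task()):

```python
import asyncio
import itertools
import weakref

_task_ids = weakref.WeakKeyDictionary()
_next_id = itertools.count(1)

def task_id():
    # Assign each task a stable ID on first access.
    task = asyncio.current_task()
    if task not in _task_ids:
        _task_ids[task] = next(_next_id)
    return _task_ids[task]

async def worker():
    return task_id()

async def main():
    return await asyncio.gather(worker(), worker())

ids = asyncio.run(main())
assert sorted(ids) == [1, 2]
```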

Yury


[Python-Dev] PEP 567 v2

2017-12-27 Thread Yury Selivanov
This is a second version of PEP 567.

A few things have changed:

1. I now have a reference implementation:
https://github.com/python/cpython/pull/5027

2. The C API was updated to match the implementation.

3. The get_context() function was renamed to copy_context() to better
reflect what it is really doing.

4. Few clarifications/edits here and there to address earlier feedback.


Yury


PEP: 567
Title: Context Variables
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov <y...@magic.io>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 12-Dec-2017
Python-Version: 3.7
Post-History: 12-Dec-2017, 28-Dec-2017


Abstract


This PEP proposes a new ``contextvars`` module and a set of new
CPython C APIs to support context variables.  This concept is
similar to thread-local storage (TLS), but, unlike TLS, it also allows
correctly keeping track of values per asynchronous task, e.g.
``asyncio.Task``.

This proposal is a simplified version of :pep:`550`.  The key
difference is that this PEP is concerned only with solving the case
for asynchronous tasks, not for generators.  There are no proposed
modifications to any built-in types or to the interpreter.

This proposal is not strictly related to Python Context Managers,
although it does provide a mechanism that Context Managers can use
to store their state.


Rationale
=

Thread-local variables are insufficient for asynchronous tasks that
execute concurrently in the same OS thread.  Any context manager that
saves and restores a context value using ``threading.local()`` will
have its context values bleed to other code unexpectedly when used
in async/await code.

A few examples where having a working context local storage for
asynchronous code is desirable:

* Context managers like ``decimal`` contexts and ``numpy.errstate``.

* Request-related data, such as security tokens and request
  data in web applications, language context for ``gettext``, etc.

* Profiling, tracing, and logging in large code bases.


Introduction


The PEP proposes a new mechanism for managing context variables.
The key classes involved in this mechanism are ``contextvars.Context``
and ``contextvars.ContextVar``.  The PEP also proposes some policies
for using the mechanism around asynchronous tasks.

The proposed mechanism for accessing context variables uses the
``ContextVar`` class.  A module (such as ``decimal``) that wishes to
store a context variable should:

* declare a module-global variable holding a ``ContextVar`` to
  serve as a key;

* access the current value via the ``get()`` method on the
  key variable;

* modify the current value via the ``set()`` method on the
  key variable.

The notion of "current value" deserves special consideration:
different asynchronous tasks that exist and execute concurrently
may have different values for the same key.  This idea is well-known
from thread-local storage but in this case the locality of the value is
not necessarily bound to a thread.  Instead, there is the notion of the
"current ``Context``" which is stored in thread-local storage, and
is accessed via the ``contextvars.copy_context()`` function.
Manipulation of the current ``Context`` is the responsibility of the
task framework, e.g. asyncio.

A ``Context`` is conceptually a read-only mapping, implemented using
an immutable dictionary.  The ``ContextVar.get()`` method does a
lookup in the current ``Context`` with ``self`` as a key, raising a
``LookupError``  or returning a default value specified in
the constructor.

The ``ContextVar.set(value)`` method clones the current ``Context``,
assigns the ``value`` to it with ``self`` as a key, and sets the
new ``Context`` as the new current ``Context``.


Specification
=

A new standard library module ``contextvars`` is added with the
following APIs:

1. ``copy_context() -> Context`` function is used to get a copy of
   the current ``Context`` object for the current OS thread.

2. ``ContextVar`` class to declare and access context variables.

3. ``Context`` class encapsulates context state.  Every OS thread
   stores a reference to its current ``Context`` instance.
   It is not possible to control that reference manually.
   Instead, the ``Context.run(callable, *args, **kwargs)`` method is
   used to run Python code in another context.


contextvars.ContextVar
--

The ``ContextVar`` class has the following constructor signature:
``ContextVar(name, *, default=_NO_DEFAULT)``.  The ``name`` parameter
is used only for introspection and debug purposes, and is exposed
as a read-only ``ContextVar.name`` attribute.  The ``default``
parameter is optional.  Example::

# Declare a context variable 'var' with the default value 42.
var = ContextVar('var', default=42)

(``_NO_DEFAULT`` is an internal sentinel object used to detect
whether a default value was provided.)

``ContextVar.get()`` returns a value for context variab

Re: [Python-Dev] PEP 567 -- Context Variables

2017-12-18 Thread Yury Selivanov
On Mon, Dec 18, 2017 at 6:00 PM, Ivan Levkivskyi <levkivs...@gmail.com> wrote:
> On 13 December 2017 at 22:35, Yury Selivanov <yselivanov...@gmail.com>
> wrote:
>>
>> [..]
>> >> A new standard library module ``contextvars`` is added
>> >
>> > Why not add this to contextlib instead of adding a new module?  IIRC
>> > this was discussed relative to PEP 550, but I don't remember the
>> > reason.  Regardless, it would be worth mentioning somewhere in the
>> > PEP.
>> >
>>
>> The mechanism is generic and isn't directly related to context
>> managers.  Context managers can (and in many cases should) use the new
>> APIs to store global state, but the contextvars APIs do not depend on
>> context managers or require them.
>>
>
> This was the main point of confusion for me when reading the PEP.
> Concept of TLS is independent of context managers, but using word "context"
> everywhere leads to doubts like "Am I getting everything right?" I think
> just adding the
> two quoted sentences will clarify the intent.

I'll try to clarify this in the Abstract section.

>
> Otherwise the PEP is easy to read, the proposed API looks simple, and this
> definitely will be a useful feature.

Thanks, Ivan!

Yury


Re: [Python-Dev] PEP 567 -- Context Variables

2017-12-18 Thread Yury Selivanov
> 3. The connection pool has a queue, and creates a task for each connection to 
> serve requests from that queue. Naively, each task could inherit the context 
> of the request that caused it to be created, but the task would outlive the 
> request and go on to serve other requests. The connection pool would need to 
> specifically suppress the caller's context when creating its worker tasks.

I haven't used this pattern myself, but it looks like a good case for
adding a keyword-only 'context' argument to `loop.create_task()`.  This
way the pool can capture the context when some API method is called
and pass it down to the queue along with the request.  The queue task
can then run connection code in that context.
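
Until such a 'context' argument exists, a pool can emulate it by creating the task while the captured context is active — a sketch that relies on Task copying the current context at construction time, as the PEP specifies:

```python
import asyncio
from contextvars import ContextVar, copy_context

request_id = ContextVar('request_id', default=None)

async def serve(queue_item):
    return (queue_item, request_id.get())

async def main():
    request_id.set('req-1')
    ctx = copy_context()        # capture at the API-call site
    request_id.set('req-2')     # the caller moves on

    # Constructing the task inside ctx makes it run with 'req-1':
    task = ctx.run(asyncio.ensure_future, serve('query'))
    return await task

result = asyncio.run(main())
assert result == ('query', 'req-1')
```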

Yury


Re: [Python-Dev] PEP 567 -- Context Variables

2017-12-17 Thread Yury Selivanov
Hi Ben,

On Sun, Dec 17, 2017 at 10:38 AM, Ben Darnell <b...@bendarnell.com> wrote:
> On Tue, Dec 12, 2017 at 12:34 PM Yury Selivanov <yselivanov...@gmail.com>
> wrote:
>>
>> Hi,
>>
>> This is a new proposal to implement context storage in Python.
>>
>> It's a successor of PEP 550 and builds on some of its API ideas and
>> datastructures.  Contrary to PEP 550 though, this proposal only focuses
>> on adding new APIs and implementing support for it in asyncio.  There
>> are no changes to the interpreter or to the behaviour of generator or
>> coroutine objects.
>
>
> I like this proposal. Tornado has a more general implementation of a similar
> idea
> (https://github.com/tornadoweb/tornado/blob/branch4.5/tornado/stack_context.py),
> but it also tried to solve the problem of exception handling of
> callback-based code so it had a significant performance cost (to interpose
> try/except blocks all over the place). Limiting the interface to
> coroutine-local variables should keep the performance impact minimal.

Thank you, Ben!

Yes, task local API of PEP 567 has no impact on generators/coroutines.
Impact on asyncio should be well within 1-2% slowdown, visible only in
microbenchmarks (and asyncio will be 3-6% faster in 3.7 at least due
to some asyncio.Task C optimizations).

[..]
> One caveat based on Tornado's experience with stack_context: There are times
> when the automatic propagation of contexts won't do the right thing (for
> example, a database client with a connection pool may end up hanging on to
> the context from the request that created the connection instead of picking
> up a new context for each query).

I can see two scenarios that could lead to that:

1. The connection pool explicitly captures the context with 'get_context()' at
the point where it was created. It later schedules all of its code within the
captured context with Context.run().

2. The connection pool calls ContextVar.get() once and _caches_ it.

Both (1) and (2) are anti-patterns.  The documentation of the asyncio and
contextvars modules will explain that users are supposed to simply call
ContextVar.get() whenever they need a context value (there is no need to
cache/persist it) and that they should not manage the context manually
(just trust asyncio to do that for you).
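
A sketch contrasting anti-pattern (2) with the intended usage:

```python
from contextvars import ContextVar

timeout = ContextVar('timeout', default=30)

# Anti-pattern: reading the value once and caching it.
_cached_timeout = timeout.get()

def query_with_cache():
    return _cached_timeout       # goes stale when the context changes

def query():
    return timeout.get()         # intended: look it up on every call

timeout.set(5)
assert query_with_cache() == 30  # stale
assert query() == 5              # current
```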

Thank you,
Yury


Re: [Python-Dev] Make __class_getitem__ a class method

2017-12-15 Thread Yury Selivanov
On Fri, Dec 15, 2017 at 12:45 PM, Serhiy Storchaka <storch...@gmail.com> wrote:
> 15.12.17 18:47, Yury Selivanov wrote:
>>
>> Shouldn't we optimize the usability for pure-Python first, and then for C
>> API?
>>
>> Right now we have the '__new__' magic method, which isn't a
>> @classmethod.  Making '__class_getitem__' a @classmethod will confuse
>> regular Python users.  For example:
>>
>> class Foo:
>>def __new__(cls, ...): pass
>>
>>@classmethod
>>def __class_getitem__(cls, item): pass
>>
>> To me it makes sense that type methods that are supposed to be called
>> on type by the Python interpreter don't need the classmethod
>> decorator.
>>
>> METH_STATIC is a public working API, and in my opinion it's totally
>> fine if we use it. It's not even hard to use it, it's just *mildly*
>> inconvenient at most.
>
>
> __new__ is not a class method, it is an "automatic" static method.

I never said that __new__ is a class method :)

> The following two declarations are equivalent:
>
> class A:
> def __new__(cls): return cls.__name__
>
> class B:
> @staticmethod
> def __new__(cls): return cls.__name__


But nobody decorates __new__ with a @staticmethod.  And making
__class_getitem__ a @classmethod will only confuse users -- that's all
I'm saying.

So I'm +1 to keep the things exactly as they are now.  It would be
great do document that in order to implement __class_getitem__ in C
one should add it as METH_STATIC.  I also think we should merge your
PR that tests that it works the way it's expected.

Yury


Re: [Python-Dev] Make __class_getitem__ a class method

2017-12-15 Thread Yury Selivanov
Shouldn't we optimize the usability for pure-Python first, and then for C API?

Right now we have the '__new__' magic method, which isn't a
@classmethod.  Making '__class_getitem__' a @classmethod will confuse
regular Python users.  For example:

   class Foo:
  def __new__(cls, ...): pass

  @classmethod
  def __class_getitem__(cls, item): pass

To me it makes sense that type methods that are supposed to be called
on type by the Python interpreter don't need the classmethod
decorator.

METH_STATIC is a public working API, and in my opinion it's totally
fine if we use it. It's not even hard to use it, it's just *mildly*
inconvenient at most.
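
As shipped in Python 3.7, __class_getitem__ is indeed converted implicitly (the way __new__ becomes a staticmethod), so no decorator is needed:

```python
class Foo:
    # No @classmethod decorator: the interpreter binds the class
    # automatically, just as it makes __new__ an implicit staticmethod.
    def __class_getitem__(cls, item):
        return (cls, item)

assert Foo[int] == (Foo, int)
```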

Yury


Re: [Python-Dev] Accepting PEP 560 -- Core support for typing module and generic types

2017-12-15 Thread Yury Selivanov
> I don't see any problems with implementing this on types defined in C. This 
> isn't harder than implementing __sizeof__ or pickling support, and NumPy 
> classes already have implemented both. Maybe Yury forgot about METH_STATIC 
> and METH_CLASS?

I just tested __class_getitem__ defined via METH_STATIC and it works.
This means we don't need to add slots.  Thanks for the hint, Serhiy!

Ivan, this might be worth mentioning in the PEP or in the docs.

Yury


Re: [Python-Dev] Accepting PEP 560 -- Core support for typing module and generic types

2017-12-14 Thread Yury Selivanov
On Thu, Dec 14, 2017 at 9:00 PM, Guido van Rossum  wrote:
> In the light of Antoine's and Stephan's feedback I think this can be
> reconsidered -- while I want to take a cautious stance about resource
> consumption I don't want to stand in the way of progress.

I've created an issue to discuss this further:
https://bugs.python.org/issue32332

Yury


Re: [Python-Dev] Accepting PEP 560 -- Core support for typing module and generic types

2017-12-14 Thread Yury Selivanov
On Thu, Dec 14, 2017 at 7:03 PM, Guido van Rossum  wrote:
> A slot is pretty expensive, as *every* class in existence will be another 8
> bytes larger (and possibly more due to malloc rounding). So unless we find
> that there's a significant performance benefit I think we should hold back
> on this. IIRC Ivan has already measured an order of magnitude's speedup
> (well, 7x), so we may not need it. :-)

My motivation to add the slot wasn't the performance: it's just not
possible to have a class-level __getitem__ on types defined in C. The
only way is to define a base class in C and then extend it in
pure-Python.  This isn't too hard usually, though.

BTW that slot could also host the new __mro_entries__ method, and,
potentially, other magic methods like __subclasscheck__ and
__instancecheck__.

Yury


Re: [Python-Dev] Accepting PEP 560 -- Core support for typing module and generic types

2017-12-14 Thread Yury Selivanov
Ivan, Guido,

Would it be possible to add a slot so that types defined in C can
implement __class_getitem__?

static PyClassMethodDef class_methods = {
foo_class_getitem /* cm_class_getitem */
}

static PyTypeObject Foo = {
.tp_class_methods = class_methods
}

Yury

On Mon, Dec 4, 2017 at 5:18 PM, Ivan Levkivskyi  wrote:
> Thank you! It looks like we have a bunch of accepted PEPs today.
> It is great to see all this! Thanks everyone who participated in discussions
> here, on python-ideas and
> on typing tracker. Special thanks to Mark who started this discussion.
>
> --
> Ivan
>
>
>


Re: [Python-Dev] PEP 567 -- Context Variables

2017-12-13 Thread Yury Selivanov
Hi Eric,

Thanks for a detailed review!

On Wed, Dec 13, 2017 at 3:59 PM, Eric Snow <ericsnowcurren...@gmail.com> wrote:
> Overall, I like this PEP.  It's definitely easier to follow
> conceptually than PEP 550.  Thanks for taking the time to re-think the
> idea.  I have a few comments in-line below.
>
> -eric
>
> On Tue, Dec 12, 2017 at 10:33 AM, Yury Selivanov
> <yselivanov...@gmail.com> wrote:
>> This is a new proposal to implement context storage in Python.
>
> +1
>
> This is something I've had on my back burner for years.  Getting this
> right is non-trivial, so having a stdlib implementation will help open
> up clean solutions in a number of use cases that are currently
> addressed in more error-prone ways.

Right!

>
>>
>> It's a successor of PEP 550 and builds on some of its API ideas and
>> datastructures.  Contrary to PEP 550 though, this proposal only focuses
>> on adding new APIs and implementing support for it in asyncio.  There
>> are no changes to the interpreter or to the behaviour of generator or
>> coroutine objects.
>
> Do you have any plans to revisit extension of the concept to
> generators and coroutine objects?  I agree they can be addressed
> separately, if necessary.  TBH, I'd expect this PEP to provide an
> approach that allows such applications of the concept to effectively
> be implementation details that can be supported later.

Maybe we'll extend the concept to work for generators in Python 3.8,
but that's a pretty remote topic to discuss (and we'll need a new PEP
for that).  In case we decide to do that, PEP 550 provides a good
implementation plan, and PEP 567 is forward-compatible with it.

>
>> Abstract
>> 
>>
>> This PEP proposes the new ``contextvars`` module and a set of new
>> CPython C APIs to support context variables.  This concept is
>> similar to thread-local variables but, unlike TLS, it allows
>
> s/it allows/it also allows/

Will fix it.

[..]
>> A new standard library module ``contextvars`` is added
>
> Why not add this to contextlib instead of adding a new module?  IIRC
> this was discussed relative to PEP 550, but I don't remember the
> reason.  Regardless, it would be worth mentioning somewhere in the
> PEP.
>

The mechanism is generic and isn't directly related to context
managers.  Context managers can (and in many cases should) use the new
APIs to store global state, but the contextvars APIs do not depend on
context managers or require them.

I also feel that contextlib is a big module already, so having the new
APIs in their separate module and having a separate documentation page
makes it more approachable.

>> with the
>> following APIs:
>>
>> 1. ``get_context() -> Context`` function is used to get the current
>>``Context`` object for the current OS thread.
>>
>> 2. ``ContextVar`` class to declare and access context variables.
>
> It may be worth explaining somewhere in the PEP the reason why you've
> chosen to add ContextVar instead of adding a new keyword (e.g.
> "context", a la global and nonlocal) to do roughly the same thing.
> Consider that execution contexts are very much a language-level
> concept, a close sibling to scope.  Driving that via a keyword would a
> reasonable approach, particularly since it introduces less coupling
> between a language-level feature and a stdlib module.  (Making it a
> builtin would sort of help with that too, but a keyword would seem
> like a better fit.)  A keyword would obviate the need for explicitly
> calling .get() and .set().
>
> FWIW, I agree with not adding a new keyword.  To me context variables
> are a low-level tool for library authors to implement their high-level
> APIs.  ContextVar, with its explicit .get() and .set() methods is a
> good fit for that and better communicates the conceptual intent of the
> feature.  However, it would still be worth explicitly mentioning the
> alternate keyword-based approach in the PEP.

Yeah, adding keywords is way harder than adding a new module.  It
would require a change in Grammar, new opcodes, changes to frameobject
etc.  I also don't think that ContextVars will be popular enough to deserve
their own syntax -- how many thread-locals do you see every day?

For PEP 567/550 a keyword isn't really needed, we can implement the
concept with a ContextVar class.

>>
>> 3. ``Context`` class encapsulates context state.  Every OS thread
>>stores a reference to its current ``Context`` instance.
>>It is not possible to control that reference manually.
>>Instead, the ``Context.run(callable, *args)`` method is used to run
>>Python code in another context.
>
> I'd call that "Context.call()" since its for cal

Re: [Python-Dev] PEP 567 -- Context Variables

2017-12-12 Thread Yury Selivanov
On Tue, Dec 12, 2017 at 10:36 PM, Guido van Rossum  wrote:
> Some more feedback:
>
>> This proposal builds directly upon concepts originally introduced
>> in :pep:`550`.
>
> The phrase "builds upon" typically implies that the other resource must be
> read and understood first. I don't think that we should require PEP 550 for
> understanding of PEP 567. Maybe "This proposal is a simplified version of
> :pep:`550`." ?

I agree, "simplified version" is better.

>
>> The notion of "current value" deserves special consideration:
>> different asynchronous tasks that exist and execute concurrently
>> may have different values.  This idea is well-known from thread-local
> storage but in this case the locality of the value is not always
> necessarily bound to a thread.  Instead, there is the notion of the
>> "current ``Context``" which is stored in thread-local storage, and
>> is accessed via ``contextvars.get_context()`` function.
>> Manipulation of the current ``Context`` is the responsibility of the
>> task framework, e.g. asyncio.
>
> This begs two (related) questions:
> - If it's stored in TLS, why isn't it equivalent to TLS?
> - If it's read-only (as mentioned in the next paragraph) how can the
> framework modify it?
>
> I realize the answers are clear, but at this point in the exposition you
> haven't given the reader enough information to answer them, so this
> paragraph may confuse readers.

I'll think about how to rephrase it.

>
>> Specification
>> =============
>> [points 1, 2, 3]
>
> Shouldn't this also list Token? (It must be a class defined here so users
> can declare the type of variables/arguments in their code representing these
> tokens.)
>
>> The ``ContextVar`` class has the following constructor signature:
>> ``ContextVar(name, *, default=no_default)``.
>
> I think a word or two about the provenance of `no_default` would be good. (I
> think it's an internal singleton right?) Ditto for NO_DEFAULT in the C
> implementation sketch.

Fixed.
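To make that behavior concrete, a small sketch of how the no-default
case surfaces to user code (the internal sentinel itself never leaks
out):

```python
import contextvars

v = contextvars.ContextVar('v')   # no default supplied

try:
    v.get()                       # no value set, no default -> LookupError
except LookupError:
    print('no value and no default')

print(v.get('fallback'))          # a per-call default can be passed instead
```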

>
>> class Task:
>> def __init__(self, coro):
>
> Do we need a keyword arg 'context=None' here too? (I'm not sure what would
> be the use case, but somehow it stands out in comparison to call_later()
> etc.)

call_later() is low-level and it needs the 'context' argument as Task
and Future use it in their implementation.

It would be easy to add 'context' parameter to Task and
loop.create_task(), but I don't know about any concrete use-case for
that just yet.
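For reference, the behavior Tasks ended up with in Python 3.7: each
Task captures a copy of the current context when it is created, so a
child task's changes stay local to it.  A minimal sketch:

```python
import asyncio
import contextvars

var = contextvars.ContextVar('var', default='main')

async def child():
    var.set('child')     # lands in the task's private copy of the context
    return var.get()

async def main():
    result = await asyncio.create_task(child())
    print(result)        # 'child'
    print(var.get())     # 'main' -- the parent task is unaffected

asyncio.run(main())
```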

>
>> CPython C API
>> -------------
>> TBD
>
> Yeah, what about it? :-)

I've added it: https://github.com/python/peps/pull/508/files

I didn't want to get into too much detail about the C API until I have
a working PR, although I feel that the API I describe in the PEP now
is very close to what we'll have.

>
>> The internal immutable dictionary for ``Context`` is implemented
>> using Hash Array Mapped Tries (HAMT).  They allow for O(log N) ``set``
>> operation, and for O(1) ``get_context()`` function.  [...]
>
> I wonder if we can keep the HAMT out of the discussion at this point. I have
> nothing against it, but given that you already say you're leaving out
> optimizations and nothing in the pseudo code given here depends on them I
> wonder if they shouldn't be mentioned later. (Also the appendix with the
> perf analysis is the one thing that I think we can safely leave out, just
> reference PEP 550 for this.)

I've added a new section "Implementation Notes" that mentions HAMT and
ContextVar.get() cache.  Both refer to PEP 550's lengthy explanations.
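To see why an immutable mapping helps, here is a deliberately naive
copy-on-write sketch -- not the real implementation.  Snapshotting the
current context becomes O(1) (just keep a reference); the HAMT's job
is to make set() cost O(log N) instead of the O(N) full copy shown
below:

```python
class ImmutableMap:
    """Toy immutable mapping: every set() returns a new mapping."""

    def __init__(self, data=None):
        self._data = dict(data or {})

    def set(self, key, value):
        new = dict(self._data)    # the HAMT avoids this O(N) copy
        new[key] = value
        return ImmutableMap(new)

    def get(self, key, default=None):
        return self._data.get(key, default)

m1 = ImmutableMap()
m2 = m1.set('x', 1)
print(m1.get('x'), m2.get('x'))   # None 1 -- m1 is untouched
```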

>
>> class _ContextData
>
> Since this isn't a real class anyway I think the __mapping attribute might
> as well be named _mapping. Ditto for other __variables later.

Done.

Yury
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 567 -- Context Variables

2017-12-12 Thread Yury Selivanov
On Tue, Dec 12, 2017 at 9:55 PM, Guido van Rossum <gu...@python.org> wrote:
> On Tue, Dec 12, 2017 at 5:35 PM, Yury Selivanov <yselivanov...@gmail.com>
> wrote:
>>
>> On Tue, Dec 12, 2017 at 6:49 PM, Victor Stinner
>> <victor.stin...@gmail.com> wrote:
>> > I like the overall idea and I prefer this PEP over PEP 550 since it's
>> > shorter and easier to read :-)
>> >
>> > Question: Is there an API to list all context variables?
>>
>> Context implements abc.Mapping, so 'get_context().keys()' will give
>> you a list of all ContextVars in the current context.
>
>
> This was hinted at in the PEP, but maybe an explicit example would be nice.

Sure.
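An explicit example of enumerating the variables via the Mapping
interface (using copy_context(), the name the draft's get_context()
shipped under in Python 3.7):

```python
import contextvars

a = contextvars.ContextVar('a')
b = contextvars.ContextVar('b')
a.set(1)
b.set(2)

ctx = contextvars.copy_context()
# A Context is keyed by the ContextVar objects themselves.
names = sorted(var.name for var in ctx.keys())
print(names)   # includes 'a' and 'b'
```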

>
>>
>> > Each get_context() call returns a new Context object. It may be worth
>> > to mention it. I understand why, but it's surprising that "assert
>> > get_context() is not get_context()" fails. Maybe it's a naming issue?
>> > Maybe rename it to contextvars.context()?
>>
>> I think the name is fine.  While get_context() will return a new instance
>> every time you call it, those instances will have the same context
>> variables/values in them, so I don't think it's a problem.
>
>
> I'm fine with this, but perhaps == should be supported so that those two are
> guaranteed to be considered equal? (Otherwise an awkward idiom to compare
> contexts using expensive dict() copies would be needed to properly compare
> two contexts for equality.)

I've no problem with implementing 'Context.__eq__'.  I think
abc.Mapping also implements it.

>
>>
>> > At the first read, I understood that that ctx.run() creates a new
>> > temporary context which is removed once ctx.run() returns.
>> >
>> > Now I understand that context variable values are restored to their
>> > previous values once run() completes. Am I right?
>>
>> ctx.run(func) runs 'func' in the 'ctx' context.  Any changes to
>> ContextVars that func makes will stay isolated to the 'ctx' context.
>>
>> >
>> > Maybe add a short comment to explain that?
>>
>> Added.
>
>
> The PEP still contains the following paragraph:
>
>> Any changes to the context will be contained and persisted in the
>> ``Context`` object on which ``run()`` is called on.
>
> This phrase is confusing; it could be read as implying that context changes
> made by the function *will* get propagated back to the caller of run(),
> contradicting what was said earlier. Maybe it's best to just delete it?
> Otherwise if you intend it to add something it needs to be rephrased. Maybe
> "persisted" is the key word causing confusion?

I'll remove "persisted" now; I agree it adds more confusion than
clarity.  Victor is also confused by how 'Context.run()' is currently
explained, so I'll try to make it clearer.
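In other words, "contained and persisted" describes two sides of the
same fact; roughly (again with copy_context() standing in for the
draft's get_context()):

```python
import contextvars

var = contextvars.ContextVar('var', default='outer')
ctx = contextvars.copy_context()

def modify():
    var.set('inner')

ctx.run(modify)
print(var.get())   # 'outer' -- contained: the caller's context is unchanged
print(ctx[var])    # 'inner' -- persisted: the change lives on inside ctx
```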

Thank you,
Yury

