Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Nathaniel Smith
On Sun, Nov 26, 2017 at 9:33 PM, Caleb Hattingh wrote:
> The PEP only says that __await__ must return an iterator, but it turns out
> that it's also required that that iterator
> should not return any intermediate values.

I think you're confused :-). When the iterator yields an intermediate
value, it does two things:

(1) it suspends the current call stack and returns control to the
coroutine runner (i.e. the event loop)
(2) it sends some arbitrary value back to the coroutine runner

The whole point of `await` is that it can do (1) -- this is what lets
you switch between executing different tasks, so they can pretend to
execute in parallel. However, you do need to make sure that your
__await__ and your coroutine runner are on the same page with respect
to (2) -- if you send a value that the coroutine runner isn't
expecting, it'll get confused. Generally async libraries control both
the coroutine runner and the __await__ method, so they get to invent
whatever arbitrary convention they want.

In asyncio, the convention is that the values you send back must be
Future objects, and the coroutine runner interprets this as a request
to wait for the Future to be resolved, and then resume the current
call stack. In curio, the convention is that you send back a special
tuple describing some operation you want the event loop to perform
[1], and then it resumes your call stack once that operation has
finished. And Trio barely uses this channel at all. (It does transfer
a bit of information that way for convenience/speed, but the main work
of setting up the task to be resumed at the appropriate time happens
through other mechanisms.)

What you observed is that the asyncio coroutine runner gets cranky if
you send it an integer when it was expecting a Future.
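
To make the convention concrete, here is a minimal sketch of an awaitable
that follows asyncio's rules (the class name and values are invented for
illustration): it arranges for a Future to be completed later and then hands
that Future back to the coroutine runner by delegating to the Future's own
__await__.

import asyncio

class ValueLater:
    """Toy awaitable: resolves to `value` after `delay` seconds (illustrative only)."""
    def __init__(self, value, delay):
        self.value = value
        self.delay = delay

    def __await__(self):
        loop = asyncio.get_event_loop()
        fut = loop.create_future()
        # Arrange for the Future to be completed later; the loop resumes
        # whoever is suspended on it once that happens.
        loop.call_later(self.delay, fut.set_result, self.value)
        # Delegate to the Future's own __await__, so what gets sent back to
        # the coroutine runner is a Future -- the asyncio convention.
        return (yield from fut.__await__())

async def main():
    print(await ValueLater(42, 0.1))   # prints 42 after ~0.1s

asyncio.get_event_loop().run_until_complete(main())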

Since most libraries assume that they control both __await__ and the
coroutine runner, they don't tend to give great error messages here
(though trio does [2] ;-)). I think this is also why the asyncio docs
don't talk about this. I guess in asyncio's case it is technically a
semi-public API because you need to know how it works if you're the
author of a library like tornado or twisted that wants to integrate
with asyncio. But most people aren't the authors of tornado or
twisted, and the ones who are already know how this works, so the lack
of docs isn't a huge deal in practice...

-n

[1] 
https://github.com/dabeaz/curio/blob/bd0e2cb7741278d1d9288780127dc0807b1aa5b1/curio/traps.py#L48-L156
[2] 
https://github.com/python-trio/trio/blob/2b8e297e544088b98ff758d37c7ad84f74c3f2f5/trio/_core/_run.py#L1521-L1530

-- 
Nathaniel J. Smith -- https://vorpus.org


Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Antoine Pitrou
On Mon, 27 Nov 2017 00:41:55 -0800
Nathaniel Smith  wrote:
> 
> Since most libraries assume that they control both __await__ and the
> coroutine runner, they don't tend to give great error messages here
> (though trio does [2] ;-)). I think this is also why the asyncio docs
> don't talk about this. I guess in asyncio's case it is technically a
> semi-public API because you need to know how it works if you're the
> author of a library like tornado or twisted that wants to integrate
> with asyncio. But most people aren't the authors of tornado or
> twisted, and the ones who are already know how this works, so the lack
> of docs isn't a huge deal in practice...

This does seem to mean that it can be difficult to provide a __await__
method that works with different coroutine runners, though.  For
example, Tornado Futures implement __await__ for compatibility with the
asyncio event loop.  But what if Tornado wants to make its Future class
compatible with an event loop that requires a different __await__
convention?

Regards

Antoine.




Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Paul Sokolovsky
Hello,

On Mon, 27 Nov 2017 15:33:51 +1000
Caleb Hattingh  wrote:

[]

> The PEP only says that __await__ must return an iterator, but it
> turns out that it's also required that that iterator
> should not return any intermediate values.  This requirement is only
> enforced in the event loop, not
> in the `await` call itself.  I was surprised by that:

[]

> So we drive the coroutine manually using send(), and we see that
> intermediate calls return the illegally-yielded values.  I broke the

You apparently mix up the language and a particular asynchronous
scheduling library (even if that library ships with the language).

There are a gazillion async scheduling libraries for Python, and at
least some of them welcome the use of "yield" in coroutines (even if
old-style). Moreover, you can always use yield in your own generators,
over which you iterate yourself, all running inside the coroutine async
scheduler. When you do this, you will need to check the type of the
values you get from iteration: if they are "yours", you consume them; if
they are not yours, you re-yield them for higher levels to consume
(ultimately, for the scheduler itself).
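
As a toy sketch of that pattern (the tuple-based "traps" and the scheduler
here are invented for illustration, not any particular library's protocol):

def producer():
    yield 1                 # ours: plain data we want to consume
    yield ('sleep', 0.1)    # the scheduler's: a request to pass upward
    yield 2                 # ours again

def consumer():
    total = 0
    for item in producer():
        if isinstance(item, tuple):
            yield item      # not ours -- re-yield it for the scheduler
        else:
            total += item   # ours -- consume it
    print('consumed total:', total)

def toy_scheduler(gen):
    # Stand-in for the real event loop: it only sees the re-yielded traps.
    for trap in gen:
        print('scheduler got trap:', trap)

toy_scheduler(consumer())   # handles ('sleep', 0.1); prints consumed total: 3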


-- 
Best regards,
 Paul  <[email protected]>


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Eric V. Smith

On 11/27/2017 1:04 AM, Nick Coghlan wrote:

On 27 November 2017 at 15:04, Greg Ewing  wrote:

Nick Coghlan wrote:


Perhaps the check could be:

   (type(lhs) == type(rhs) or fields(lhs) == fields(rhs)) and all (individual fields match)



I think the types should *always* have to match, or at least
one should be a subclass of the other. Consider:

@dataclass
class Point3d:
 x: float
 y: float
 z: float

@dataclass
class Vector3d:
 x: float
 y: float
 z: float

Points and vectors are different things, and they should never
compare equal, even if they have the same field names and values.


And I guess if folks actually want more permissive structure-based
matching, that's one of the features that collections.namedtuple
offers that data classes don't.


And in this case you could also do:
astuple(point) == astuple(vector)

Eric.



Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Caleb Hattingh
On 27 November 2017 at 18:41, Nathaniel Smith  wrote:
>
> In asyncio, the convention is that the values you send back must be
> Future objects,


Thanks, this is useful. I didn't pick this up from the various PEPs or
documentation.  I guess I need to go through the src :)


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Sebastian Rittau

On 25.11.2017 22:06, Eric V. Smith wrote:
The updated version should show up at 
https://www.python.org/dev/peps/pep-0557/ shortly.


This PEP looks very promising and will make my life quite a bit easier, 
since we are using a pattern involving data classes. Currently, we write 
the constructor by hand.



The major changes from the previous version are:

- Add InitVar to specify initialize-only fields. 


This is the only feature that does not sit right with me. It looks very 
obscure and "hacky". From what I understand, we are supposed to use the 
field syntax to define constructor arguments. I'd argue that the name 
"initialize-only fields" is a misnomer, which only hides the fact that 
this has nothing to do with fields at all. Couldn't dataclasses just 
pass *args and **kwargs to __post_init__()? Type checkers need to be 
special-cased for InitVar anyway; couldn't they instead be special-cased 
to look at __post_init__ argument types?


 - Sebastian


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Eric V. Smith

On 11/27/2017 6:01 AM, Sebastian Rittau wrote:

On 25.11.2017 22:06, Eric V. Smith wrote:



The major changes from the previous version are:

- Add InitVar to specify initialize-only fields. 


This is the only feature that does not sit right with me. It looks very 
obscure and "hacky". From what I understand, we are supposed to use the 
field syntax to define constructor arguments. I'd argue that the name 
"initialize-only fields" is a misnomer, which only hides the fact that 
this has nothing to do with fields at all. Couldn't dataclassses just 
pass *args and **kwargs to __post_init__()? Type checkers need to be 
special-cases for InitVar anyway, couldn't they instead be special cased 
to look at __post_init__ argument types?


First off, I expect this feature to be used extremely rarely. I'm 
tempted to remove it, since it's infrequently needed and it could be 
added later. And as the PEP points out, you can get most of the way with 
an alternate classmethod constructor.


I had something like your suggestion half coded up, except I inspected 
the args to __post_init__() and added them to __init__, avoiding the 
API-unfriendly *args and **kwargs.


So in:

@dataclass
class C:
    x: int
    y: int

    def __post_init__(self, database: DatabaseType): pass

Then the __init__ signature became:

def __init__(self, x:int, y:int, database:DatabaseType):

In the end, that seems like a lot of magic (but what about this isn't?), 
it required the inspect module to be imported, and I thought it made 
more sense for all of the init params to be near each other:


@dataclass
class C:
    x: int
    y: int
    database: InitVar[DatabaseType]

    def __post_init__(self, database): pass

No matter what we do here, static type checkers are going to have to be 
aware of either the InitVars or the hoisting of params from 
__post_init__ to __init__.


One other thing about InitVar: it lets you control where the init-only 
parameter goes in the __init__ call. This is especially important with 
default values:


@dataclass
class C:
    x: int
    database: InitVar[DatabaseType]
    y: int = 0

    def __post_init__(self, database): pass

In this case, if I were hoisting params from __post_init__ to __init__, 
the __init__ call would be:


def __init__(self, x, y=0, database)

Which is an error. I guess you could say the init-only parameters would 
go first in the __init__ definition, but then you have the same problem 
if any of them have default values.
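
Putting the above together, a runnable sketch of the intended pattern
(for Python 3.7+; the Database class and the derived field are invented
here purely for illustration):

from dataclasses import dataclass, field, InitVar

class Database:
    """Stand-in for the DatabaseType used in the examples above."""
    def lookup(self, key):
        return key * 2

@dataclass
class C:
    x: int
    database: InitVar[Database]     # init-only: accepted by __init__, not stored
    y: int = 0
    derived: int = field(init=False, default=0)

    def __post_init__(self, database):
        # The init-only value is available here, then discarded.
        self.derived = database.lookup(self.x)

c = C(3, Database())
print(c)   # C(x=3, y=0, derived=6) -- note there is no `database` field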


Eric.


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Sebastian Rittau

On 27.11.2017 12:01, Sebastian Rittau wrote:



The major changes from the previous version are:

- Add InitVar to specify initialize-only fields. 


This is the only feature that does not sit right with me. It looks 
very obscure and "hacky". From what I understand, we are supposed to 
use the field syntax to define constructor arguments. I'd argue that 
the name "initialize-only fields" is a misnomer, which only hides the 
fact that this has nothing to do with fields at all. Couldn't 
dataclassses just pass *args and **kwargs to __post_init__()? Type 
checkers need to be special-cases for InitVar anyway, couldn't they 
instead be special cased to look at __post_init__ argument types? 
I am sorry for the double post, but I thought a bit more about why this 
does not sit right with me:


 * As written above, InitVars look like fields, but aren't.
 * InitVar goes against the established way to pass through arguments,
   *args and **kwargs. While type checking those is an unsolved
   problem, from what I understand, I don't think we should introduce a
   second way just for dataclasses.
 * InitVars look like a way to satisfy the type checker without
   providing any benefit to the programmer. Even when I'm not
   interested in type checking, I have to declare init vars.
 * InitVars force me to repeat myself. I have the InitVar declaration
   and then I have to repeat myself in the signature of
   __post_init__(). This has all the usual problems of repeated code.

I hope I did not misunderstand the purpose of InitVar.

 - Sebastian



Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Sebastian Rittau

On 27.11.2017 13:23, Eric V. Smith wrote:
I had something like your suggestion half coded up, except I inspected 
the args to __post_init__() and added them to __init__, avoiding the 
API-unfriendly *args and **kwargs.
I understand your concerns with *args and **kwargs. I think we need to 
find a solution for that eventually.


One other thing about InitVar: it lets you control where the init-only 
parameter goes in the __init__ call. This is especially important with 
default values:


This is indeed a nice property. I was thinking about that myself and how 
to best handle it. One use case that could occur in our codebase is 
passing in a "context" argument. By convention, this is always the first 
argument to the constructor, so it would be nice if this would also work 
for dataclasses.


 - Sebastian



Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Eric V. Smith

On 11/27/2017 7:31 AM, Sebastian Rittau wrote:

On 27.11.2017 13:23, Eric V. Smith wrote:
I had something like your suggestion half coded up, except I inspected 
the args to __post_init__() and added them to __init__, avoiding the 
API-unfriendly *args and **kwargs.
I understand your concerns with *args and **kwargs. I think we need to 
find a solution for that eventually.


One other thing about InitVar: it lets you control where the init-only 
parameter goes in the __init__ call. This is especially important with 
default values:


This is indeed a nice property. I was thinking about that myself and how 
to best handle it. One use case that could occur in out codebase is 
passing in a "context" argument. By convention, this is always the first 
argument to the constructor, so it would be nice if this would also work 
for dataclasses.


And that's the one thing that you can't do with an alternate classmethod 
constructor, and is the reason I added InitVar: you can't force a 
non-field parameter such as a context (or in my example, a database) to 
be always present when instances are constructed. And also consider the 
"replace()" module method. InitVars must also be supplied there, whereas 
with a classmethod constructor, they wouldn't be.


This is for the case where a context or database is needed to construct 
the instance, but isn't stored as a field on the instance. Again, not 
super-common, but it does happen. My point here is not that InitVar is 
better than __post_init__ parameter hoisting for this specific need, but 
that both of them provide something that classmethod constructors do 
not. I'll add some wording on this to the PEP.


Eric.


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Eric V. Smith

On 11/27/2017 7:26 AM, Sebastian Rittau wrote:

On 27.11.2017 12:01, Sebastian Rittau wrote:



The major changes from the previous version are:

- Add InitVar to specify initialize-only fields. 


This is the only feature that does not sit right with me. It looks 
very obscure and "hacky". From what I understand, we are supposed to 
use the field syntax to define constructor arguments. I'd argue that 
the name "initialize-only fields" is a misnomer, which only hides the 
fact that this has nothing to do with fields at all. Couldn't 
dataclassses just pass *args and **kwargs to __post_init__()? Type 
checkers need to be special-cases for InitVar anyway, couldn't they 
instead be special cased to look at __post_init__ argument types? 
I am sorry for the double post, but I thought a bit more about why this 
does not right with me:


  * As written above, InitVars look like fields, but aren't.


Same as with ClassVars, which is where the inspiration came from.


  * InitVar goes against the established way to pass through arguments,
*args and **kwargs. While type checking those is an unsolved
problem, from what I understand, I don't think we should introduce a
second way just for dataclasses.
  * InitVars look like a way to satisfy the type checker without
providing any benefit to the programmer. Even when I'm not
interested in type checking, I have to declare init vars.


Same as with ClassVars, if you're using them. And that's not just a 
dataclasses thing, although dataclasses is the first place I know of 
where it would change the code semantics.



  * InitVars force me to repeat myself. I have the InitVar declaration
and then I have the repeat myself in the signature of
__post_init__(). This has all the usual problems of repeated code.


There was some discussion about this starting at 
https://github.com/ericvsmith/dataclasses/issues/17#issuecomment-345529717, 
in particular a few messages down where we discussed what would be 
repeated, and what mypy would be able to deduce. You won't need to 
repeat the type declaration.



I hope I did not misunderstood the purpose of InitVar.


I think you understand it perfectly well, especially with the "context" 
discussion. Thanks for bringing it up.


Eric.


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Greg Ewing

Chris Angelico wrote:

I'm not sure there's any distinction between a "point" and a "vector
from the origin to a point".


They transform differently. For example, translation affects
a point, but makes no difference to a vector.

There are two ways of dealing with that. One is to use vectors
to represent both and have two different operations, "transform
point" and "transform vector".

The other is to represent them using different types and have
one operation that does different things depending on the
type.

The advantage of the latter is that you can't accidentally
apply the wrong operation, e.g. transform_point on something
that's actually a vector.

(There's actually a third way -- use homogeneous coordinates
and represent points as (x, y, z, 1) and vectors as
(x, y, z, 0). But that's really a variation on the "different
types" idea.)

--
Greg


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Eric V. Smith

On 11/27/17 10:51 AM, Guido van Rossum wrote:

Following up on this subthread (inline below).

On Mon, Nov 27, 2017 at 2:56 AM, Eric V. Smith <[email protected]> wrote:

On 11/27/2017 1:04 AM, Nick Coghlan wrote:

On 27 November 2017 at 15:04, Greg Ewing <[email protected]> wrote:

Nick Coghlan wrote:


Perhaps the check could be:

   (type(lhs) == type(rhs) or fields(lhs) == fields(rhs)) and all (individual fields match)



I think the types should *always* have to match, or at least
one should be a subclass of the other. Consider:

@dataclass
class Point3d:
 x: float
 y: float
 z: float

@dataclass
class Vector3d:
 x: float
 y: float
 z: float

Points and vectors are different things, and they should never
compare equal, even if they have the same field names and
values.


And I guess if folks actually want more permissive structure-based
matching, that's one of the features that collections.namedtuple
offers that data classes don't.


And in this case you could also do:
astuple(point) == astuple(vector)


Didn't we at one point have something like

isinstance(other, self.__class__) and fields(other) == fields(self) and <individual fields match>


(plus some optimization if the types are identical)?

That feels ideal, because it means you can subclass Point just to add
some methods and it will stay comparable, but if you add fields it will
always be unequal.


I don't think we had that before, but it sounds right to me. I think it 
could be:


isinstance(other, self.__class__) and len(fields(other)) == len(fields(self)) and <individual fields match>


Since by definition if you're a subclass you'll start with all of the 
same fields. So if the len's match, you won't have added any new fields. 
That should be sufficiently cheap.


Then the optimized version would be:

(self.__class__ is other.__class__) or (isinstance(other, self.__class__) and len(fields(other)) == len(fields(self))) and <individual fields match>


I'd probably further optimize len(fields(obj)), but that's the general idea.
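
As a rough sketch in code (dc_eq and NamedPoint3d are invented names for
illustration of the comparison discussed here, nothing from the PEP itself):

from dataclasses import dataclass, fields

def dc_eq(a, b):
    """Hypothetical helper: the comparison sketched above."""
    same_type = a.__class__ is b.__class__
    compatible_subclass = (isinstance(b, a.__class__)
                           and len(fields(b)) == len(fields(a)))
    if not (same_type or compatible_subclass):
        return NotImplemented
    # Individual fields match, in declaration order.
    return all(getattr(a, f.name) == getattr(b, f.name) for f in fields(a))

@dataclass
class Point3d:
    x: float
    y: float
    z: float

class NamedPoint3d(Point3d):        # adds behaviour but no new fields
    def norm(self):
        return (self.x ** 2 + self.y ** 2 + self.z ** 2) ** 0.5

print(dc_eq(Point3d(1, 2, 3), NamedPoint3d(1, 2, 3)))   # True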

Eric.


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Guido van Rossum
Following up on this subthread (inline below).

On Mon, Nov 27, 2017 at 2:56 AM, Eric V. Smith  wrote:

> On 11/27/2017 1:04 AM, Nick Coghlan wrote:
>
>> On 27 November 2017 at 15:04, Greg Ewing 
>> wrote:
>>
>>> Nick Coghlan wrote:
>>>

 Perhaps the check could be:

(type(lhs) == type(rhs) or fields(lhs) == fields(rhs)) and all
 (individual fields match)

>>>
>>>
>>> I think the types should *always* have to match, or at least
>>> one should be a subclass of the other. Consider:
>>>
>>> @dataclass
>>> class Point3d:
>>>  x: float
>>>  y: float
>>>  z: float
>>>
>>> @dataclass
>>> class Vector3d:
>>>  x: float
>>>  y: float
>>>  z: float
>>>
>>> Points and vectors are different things, and they should never
>>> compare equal, even if they have the same field names and values.
>>>
>>
>> And I guess if folks actually want more permissive structure-based
>> matching, that's one of the features that collections.namedtuple
>> offers that data classes don't.
>>
>
> And in this case you could also do:
> astuple(point) == astuple(vector)
>

Didn't we at one point have something like

isinstance(other, self.__class__) and fields(other) == fields(self) and <individual fields match>


(plus some optimization if the types are identical)?

That feels ideal, because it means you can subclass Point just to add some
methods and it will stay comparable, but if you add fields it will always
be unequal.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Guido van Rossum
Sounds good.

On Nov 27, 2017 8:00 AM, "Eric V. Smith"  wrote:

> On 11/27/17 10:51 AM, Guido van Rossum wrote:
>
>> Following up on this subthread (inline below).
>>
>> On Mon, Nov 27, 2017 at 2:56 AM, Eric V. Smith > > wrote:
>>
>> On 11/27/2017 1:04 AM, Nick Coghlan wrote:
>>
>> On 27 November 2017 at 15:04, Greg Ewing
>> > > wrote:
>>
>> Nick Coghlan wrote:
>>
>>
>> Perhaps the check could be:
>>
>>(type(lhs) == type(rhs) or fields(lhs) ==
>> fields(rhs)) and all
>> (individual fields match)
>>
>>
>>
>> I think the types should *always* have to match, or at least
>> one should be a subclass of the other. Consider:
>>
>> @dataclass
>> class Point3d:
>>  x: float
>>  y: float
>>  z: float
>>
>> @dataclass
>> class Vector3d:
>>  x: float
>>  y: float
>>  z: float
>>
>> Points and vectors are different things, and they should never
>> compare equal, even if they have the same field names and
>> values.
>>
>>
>> And I guess if folks actually want more permissive structure-based
>> matching, that's one of the features that collections.namedtuple
>> offers that data classes don't.
>>
>>
>> And in this case you could also do:
>> astuple(point) == astuple(vector)
>>
>>
>> Didn't we at one point have something like
>>
>> isinstance(other, self.__class__) and fields(other) == fields(self) and
>> 
>>
>> (plus some optimization if the types are identical)?
>>
>> That feels ideal, because it means you can subclass Point just to add
>> some methods and it will stay comparable, but if you add fields it will
>> always be unequal.
>>
>
> I don't think we had that before, but it sounds right to me. I think it
> could be:
>
> isinstance(other, self.__class__) and len(fields(other)) ==
> len(fields(self)) and 
>
> Since by definition if you're a subclass you'll start with all of the same
> fields. So if the len's match, you won't have added any new fields. That
> should be sufficiently cheap.
>
> Then the optimized version would be:
>
> (self.__class__ is other.__class__) or (isinstance(other, self.__class__)
> and len(fields(other)) == len(fields(self))) and  match>
>
> I'd probably further optimize len(fields(obj)), but that's the general
> idea.
>
> Eric.
>


[Python-Dev] generator vs iterator etc. (was: How assignment should work with generators?)

2017-11-27 Thread Koos Zevenhoven
On Mon, Nov 27, 2017 at 3:55 PM, Steven D'Aprano wrote:

> On Mon, Nov 27, 2017 at 12:17:31PM +0300, Kirill Balunov wrote:
> ​​
>
> > 2. Should this work only for generators or for any iterators?
>
> I don't understand why you are even considering singling out *only*
> generators. A generator is a particular implementation of an iterator. I
> can write:
>
> def gen():
>yield 1; yield 2; yield 3
>
> it = gen()
>
> or I can write:
>
> it = iter([1, 2, 3])
>
> and the behaviour of `it` should be identical.
>
>
>
​I can see where this is coming from. The thing is that "iterator" and
"generator" are mostly synonymous, except two things:

(1) Generators are iterators that are produced by a generator function

(2) Generator functions are sometimes referred to as just "generators"

The concept of "generator" thus overlaps with both "iterator" and
"generator function".

Then there's also "iterator" and "iterable", which are two different things:

(3) If `obj` is an *iterable*, then `it = iter(obj)` is an *iterator* (over
the contents of `obj`)

(4) Iterators yield values, for example on explicit calls to next(it).

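A quick illustration of (1)-(4), for what it's worth:

import collections.abc as abc

def gen():            # (2) informally "a generator", strictly a generator function
    yield 1
    yield 2

g = gen()             # (1) g is a generator, and every generator is an iterator
it = iter([1, 2])     # (3) the list is an iterable; `it` is an iterator over it

print(isinstance(g, abc.Generator), isinstance(g, abc.Iterator))    # True True
print(isinstance(it, abc.Generator), isinstance(it, abc.Iterator))  # False True
print(next(g), next(it))    # (4) iterators yield values via next()
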
Personally I have leaned towards keeping a clear distinction between
"generator function" and "generator"​, which leads to the situation that
"generator" and "iterator" are mostly synonymous for me. Sometimes, for
convenience, I use the term "generator" to refer to "iterators" more
generally. This further seems to have a minor benefit that "generators" and
"iterables" are less easily confused with each other than "iterators" and
"iterables".

I thought about this issue some time ago for the `views` package, which has
a separation between sequences (seq) and other iterables (gen):

https://github.com/k7hoven/views

The functionality provided by `views.gen` is not that interesting—it's
essentially a subset of itertools functionality, but with an API that
parallels `views.seq` which works with sequences (iterable, sliceable,
chainable, etc.). I used the name `gen`, because iterator/iterable variants
of the functionality can be implemented with generator functions (although
also with other kinds of iterators/iterables). Calling the thing `iter`
would have conflicted with the builtin `iter`.

HOWEVER, this naming can be confusing for those that lean more towards
using "generator" to also mean "generator function", and for those that are
comfortable with the term "iterator" despite its resemblance to "iterable".

Now I'm actually seriously considering renaming `views.gen` to
`views.iter` when I have time. After all, there's already `views.range`
which "conflicts" with the builtin range.

​Anyway, the point is that the naming is suboptimal.​

SOLUTION: Maybe (a) all iterators should be called iterators or (b) all
iterators should be called generators, regardless of whether they are
somehow a result of a generator function having been called in the past.

(I'm not going into the distinction between things that can receive values
via `send` or any other possible distinctions between different types of
iterators and iterables.)

​—Koos​

​(discussion originated from python-ideas, but cross-posted to python-dev
in case there's more interest there)​


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Guido van Rossum
On Mon, Nov 27, 2017 at 1:12 AM, Antoine Pitrou  wrote:

> On Mon, 27 Nov 2017 00:41:55 -0800
> Nathaniel Smith  wrote:
> >
> > Since most libraries assume that they control both __await__ and the
> > coroutine runner, they don't tend to give great error messages here
> > (though trio does [2] ;-)). I think this is also why the asyncio docs
> > don't talk about this. I guess in asyncio's case it is technically a
> > semi-public API because you need to know how it works if you're the
> > author of a library like tornado or twisted that wants to integrate
> > with asyncio. But most people aren't the authors of tornado or
> > twisted, and the ones who are already know how this works, so the lack
> > of docs isn't a huge deal in practice...
>
> This does seem to mean that it can be difficult to provide a __await__
> method that works with different coroutine runners, though.  For
> example, Tornado Futures implement __await__ for compatibility with the
> asyncio event loop.  But what if Tornado wants to make its Future class
> compatible with an event loop that requires a different __await__
> convention?
>

Someone would have to write a PEP proposing a standard interoperability API
for event loops.

There is already such a PEP (PEP 3156, which standardized the asyncio event
loop, including the interop API) but curio and trio intentionally set out
to invent their own conventions. At least asyncio has an API that allows
overriding the factory for Futures, so if someone comes up with a Future
that is interoperable between asyncio and curio, for example, it might be
possible. But likely curio would have to be modified somewhat too.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Guido van Rossum
On Sun, Nov 26, 2017 at 7:43 PM, Chris Angelico  wrote:

> Honestly, this is one of Python's biggest problems when it comes to
> async functions. I don't know the answer to that question, and I don't
> know where in the docs I'd go looking for it. In JavaScript, async
> functions are built on top of promises, so you can just say "well, you
> return a promise, tada". But in Python, this isn't well documented.
> Snooping the source code for asyncio.sleep() shows that it uses
> @coroutine and yield, and I have no idea what magic @coroutine does,
> nor how you'd use it without yield.
>

The source for sleep() isn't very helpful -- e.g. @coroutine is mostly a
backwards compatibility thing. The heart of it is that it creates a Future
and schedules a callback at a later time to complete that Future and then
awaits it -- this gives control back to the scheduler and when the callback
has made the Future complete, the coroutine will (eventually) be resumed.

(JS Promises are equivalent to Futures, but the event loop in JS is more
built in so things feel more natural there.)

What we need here is not just documentation of how it works, but a good
tutorial showing a pattern for writing ad-hoc event loops using a simple
Future class. A good example would be some kind of parser (similar to
Nathaniel's websockets example). I wish I had the time to write this
example -- I have some interest in parsers that work this way (in fact I
think most parsers can and probably should be written this way). But I've
got a huge list of things to do already... :-(
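
For what it's worth, one minimal shape such an example could take (just a
sketch with invented names, not the parser example mentioned above): a
bare-bones Future whose __await__ yields itself, plus an ad-hoc runner that
answers each yielded Future with the next piece of input.

class Future:
    """Bare-bones Future: awaiting it suspends until the runner sends a value."""
    def __await__(self):
        return (yield self)        # the runner send()s the result back in

async def parse(feed):
    first = await feed()           # each await suspends the coroutine
    second = await feed()
    return first + second

def run(coro, inputs):
    """Ad-hoc coroutine runner: answers each yielded Future with the next input."""
    data, result = iter(inputs), None
    try:
        while True:
            request = coro.send(result)   # resume; the coroutine hands us a Future
            assert isinstance(request, Future)
            result = next(data)           # "complete" it with the next chunk
    except StopIteration as stop:
        return stop.value                 # the coroutine's return value

print(run(parse(Future), ["Hello, ", "world"]))   # Hello, world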

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] Tricky way of of creating a generator via a comprehension expression

2017-11-27 Thread Guido van Rossum
I need to cut this debate short (too much to do already) but I'd like to
press that I wish async/await to be available for general tinkering (like
writing elegant parsers), not just for full fledged event loops.

-- 
--Guido van Rossum (python.org/~guido)


[Python-Dev] Can Python guarantee the order of keyword-only parameters?

2017-11-27 Thread Larry Hastings



First, a thirty-second refresher, so we're all using the same terminology:

   A *parameter* is a declared input variable to a function.
   An *argument* is a value passed into a function.  (*Arguments* are
   stored in *parameters.*)

   So in the example "def foo(clonk): pass; foo(3)", clonk is a
   parameter, and 3 is an argument. ++


Keyword-only arguments were conceived of as being unordered. They're 
stored in a dictionary--by convention called **kwargs--and dictionaries 
didn't preserve order.  But knowing the order of arguments is 
occasionally very useful.  PEP 468 proposed that Python preserve the 
order of keyword-only arguments in kwargs.  This became easy with the 
order-preserving dictionaries added to Python 3.6.  I don't recall the 
order of events, but in the end PEP 468 was accepted, and as of 3.6 
Python guarantees order in **kwargs.
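
For example, on Python 3.6 and later:

def show(**kwargs):
    return list(kwargs)     # PEP 468: **kwargs preserves the call's keyword order

print(show(zebra=1, apple=2, mango=3))   # ['zebra', 'apple', 'mango']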


But that's arguments.  What about parameters?

Although this isn't as directly impactful, the order of keyword-only 
parameters *is* visible to the programmer.  The best way to see a 
function's parameters is with inspect.signature, although there's also 
the deprecated inspect.getfullargspec; in CPython you can also directly 
examine fn.__code__.co_varnames.  Two of these methods present their 
data in a way that preserves order for all parameters, including 
keyword-only parameters--and the third one is deprecated.


Python must (and does) guarantee the order of positional and 
positional-or-keyword parameters, because it uses position to map 
arguments to parameters when the function is called.  But conceptually 
this isn't necessary for keyword-only parameters because their position 
is irrelevant.  I only see one place in the language & library that 
addresses the ordering of keyword-only parameters, by way of omission.  
The PEP for inspect.signature (PEP 362) says that when comparing two 
signatures for equality, their positional and positional-or-keyword 
parameters must be in the same order.  It makes a point of *not* 
requiring that the two functions' keyword-only parameters be in the same 
order.


For every currently supported version of Python 3, inspect.signature and 
fn.__code__.co_varnames preserve the order of keyword-only parameters.  
This isn't surprising; it's basically the same code path implementing 
those as the two types of positional-relevant parameters, so the most 
straightforward implementation would naturally preserve their order.  
It's just not guaranteed.
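
For example, what current CPython already does (observed behaviour, not yet
a documented guarantee):

import inspect

def f(*, kw_z=1, kw_a=2, kw_m=3): pass

print(list(inspect.signature(f).parameters))   # ['kw_z', 'kw_a', 'kw_m']
print(f.__code__.co_varnames)                  # ('kw_z', 'kw_a', 'kw_m')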


I'd like inspect.signature to guarantee that the order of keyword-only 
parameters always matches the order they were declared in.  Technically 
this isn't a language feature, it's a library feature.  But making this 
guarantee would require that CPython internally cooperate, so it's kind 
of a language feature too.


Does this sound reasonable?  Would it need a PEP?  I'm hoping for "yes" 
and "no", respectively.



Three final notes:

 * Yes, I do have a use case.  I'm using inspect.signature metadata to
   mechanically map arguments from an external domain (command-line
   arguments) to a Python function.  Relying on the declaration order
   of keyword-only parameters would elegantly solve one small problem.
 * I asked Armin Rigo about PyPy's support for Python 3.  He said it
   should already maintain the order of keyword-only parameters, and if
   I ever catch it not maintaining them in order I should file a bug. 
   I assert that making this guarantee would be nearly zero effort for
   any Python implementation--I bet they all already behave this way,
   all they need is a test case and some documentation.
 * One can extend this concept to functools.partial and
   inspect.Signature.bind: should its transformations of keyword-only
   parameters also maintain order in a consistent way?  I suspect the
   answer there is much the same--there's an obvious way it should
   behave, it almost certainly already behaves that way, but it doesn't
   guarantee it.  I don't think I need this for my use case.



//arry/

++ Yes, that means "Argument Clinic" should really have been called 
"Parameter Clinic".  But the "Parameter Clinic" sketch is nowhere near 
as funny.


Re: [Python-Dev] Can Python guarantee the order of keyword-only parameters?

2017-11-27 Thread Robert Collins
Plus 1 from me. I'm not 100% sure the signature / inspect backport does
this, but as you say, it should be trivial to do, to whatever extent the
python version we're hosted on does it.

Rob

On 28 Nov. 2017 07:14, "Larry Hastings"  wrote:

>
>
> First, a thirty-second refresher, so we're all using the same terminology:
>
> A *parameter* is a declared input variable to a function.
> An *argument* is a value passed into a function.  (*Arguments* are stored
> in *parameters.*)
>
> So in the example "def foo(clonk): pass; foo(3)", clonk is a parameter,
> and 3 is an argument. ++
>
>
> Keyword-only arguments were conceived of as being unordered.  They're
> stored in a dictionary--by convention called **kwargs--and dictionaries
> didn't preserve order.  But knowing the order of arguments is occasionally
> very useful.  PEP 468 proposed that Python preserve the order of
> keyword-only arguments in kwargs.  This became easy with the
> order-preserving dictionaries added to Python 3.6.  I don't recall the
> order of events, but in the end PEP 468 was accepted, and as of 3.6 Python
> guarantees order in **kwargs.
>
> But that's arguments.  What about parameters?
>
> Although this isn't as directly impactful, the order of keyword-only
> parameters *is* visible to the programmer.  The best way to see a
> function's parameters is with inspect.signature, although there's also the
> deprecated inspect.getfullargspec; in CPython you can also directly examine
> fn.__code__.co_varnames.  Two of these methods present their data in a way
> that preserves order for all parameters, including keyword-only
> parameters--and the third one is deprecated.
>
> Python must (and does) guarantee the order of positional and
> positional-or-keyword parameters, because it uses position to map arguments
> to parameters when the function is called.  But conceptually this isn't
> necessary for keyword-only parameters because their position is
> irrelevant.  I only see one place in the language & library that addresses
> the ordering of keyword-only parameters, by way of omission.  The PEP for
> inspect.signature (PEP 362) says that when comparing two signatures for
> equality, their positional and positional-or-keyword parameters must be in
> the same order.  It makes a point of *not* requiring that the two
> functions' keyword-only parameters be in the same order.
>
> For every currently supported version of Python 3, inspect.signature and
> fn.__code__.co_varnames preserve the order of keyword-only parameters.
> This isn't surprising; it's basically the same code path implementing those
> as the two types of positional-relevant parameters, so the most
> straightforward implementation would naturally preserve their order.  It's
> just not guaranteed.
>
> I'd like inspect.signature to guarantee that the order of keyword-only
> parameters always matches the order they were declared in.  Technically
> this isn't a language feature, it's a library feature.  But making this
> guarantee would require that CPython internally cooperate, so it's kind of
> a language feature too.
>
> Does this sound reasonable?  Would it need a PEP?  I'm hoping for "yes"
> and "no", respectively.
>
>
> Three final notes:
>
>- Yes, I do have a use case.  I'm using inspect.signature metadata to
>mechanically map arguments from an external domain (command-line arguments)
>to a Python function.  Relying on the declaration order of keyword-only
>parameters would elegantly solve one small problem.
>- I asked Armin Rigo about PyPy's support for Python 3.  He said it
>should already maintain the order of keyword-only parameters, and if I ever
>catch it not maintaining them in order I should file a bug.  I assert that
>making this guarantee would be nearly zero effort for any Python
>implementation--I bet they all already behave this way, all they need is a
>test case and some documentation.
>- One can extend this concept to functools.partial and
>inspect.Signature.bind: should its transformations of keyword-only
>parameters also maintain order in a consistent way?  I suspect the answer
>there is much the same--there's an obvious way it should behave, it almost
>certainly already behaves that way, but it doesn't guarantee it.  I don't
>think I need this for my use case.
>
>
>
> */arry*
>
> ++ Yes, that means "Argument Clinic" should really have been called
> "Parameter Clinic".  But the "Parameter Clinic" sketch is nowhere near as
> funny.
>
> ___
> Python-Dev mailing list
> [email protected]
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/
> robertc%40robertcollins.net
>
>
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-

Re: [Python-Dev] Can Python guarantee the order of keyword-only parameters?

2017-11-27 Thread Larry Hastings


On 11/27/2017 12:19 PM, Robert Collins wrote:
Plus 1 from me. I'm not 100% sure the signature / inspect backport 
does this, but as you say, it should be trivial to do, to whatever 
extent the python version we're hosted on does it.


I'm not sure exactly what you mean when you say "signature / inspect 
backport".  If you mean backporting inspect.signature to Python 2, this 
topic is irrelevant, as Python 2 doesn't have keyword-only parameters.



//arry/


Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Greg Ewing

Guido van Rossum wrote:
The source for sleep() isn't very helpful -- e.g. @coroutine is mostly a 
backwards compatibility thing.


So how are you supposed to write that *without* using @coroutine?

--
Greg



Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Guido van Rossum
On Mon, Nov 27, 2017 at 1:58 PM, Greg Ewing wrote:

> Guido van Rossum wrote:
>
>> The source for sleep() isn't very helpful -- e.g. @coroutine is mostly a
>> backwards compatibility thing.
>>
>
> So how are you supposed to write that *without* using @coroutine?
>

A simplified version using async def/await:

async def sleep(delay):
    f = Future()
    # set_result() needs the result to set, hence the extra None argument
    get_event_loop().call_later(delay, f.set_result, None)
    await f

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] Can Python guarantee the order of keyword-only parameters?

2017-11-27 Thread Gregory P. Smith
On Mon, Nov 27, 2017 at 10:13 AM Larry Hastings  wrote:

>
>
> First, a thirty-second refresher, so we're all using the same terminology:
>
> A *parameter* is a declared input variable to a function.
> An *argument* is a value passed into a function.  (*Arguments* are stored
> in *parameters.*)
>
> So in the example "def foo(clonk): pass; foo(3)", clonk is a parameter,
> and 3 is an argument. ++
>
>
> Keyword-only arguments were conceived of as being unordered.  They're
> stored in a dictionary--by convention called **kwargs--and dictionaries
> didn't preserve order.  But knowing the order of arguments is occasionally
> very useful.  PEP 468 proposed that Python preserve the order of
> keyword-only arguments in kwargs.  This became easy with the
> order-preserving dictionaries added to Python 3.6.  I don't recall the
> order of events, but in the end PEP 468 was accepted, and as of 3.6 Python
> guarantees order in **kwargs.
>
> But that's arguments.  What about parameters?
>
> Although this isn't as directly impactful, the order of keyword-only
> parameters *is* visible to the programmer.  The best way to see a
> function's parameters is with inspect.signature, although there's also the
> deprecated inspect.getfullargspec; in CPython you can also directly examine
> fn.__code__.co_varnames.  Two of these methods present their data in a way
> that preserves order for all parameters, including keyword-only
> parameters--and the third one is deprecated.
>
> Python must (and does) guarantee the order of positional and
> positional-or-keyword parameters, because it uses position to map arguments
> to parameters when the function is called.  But conceptually this isn't
> necessary for keyword-only parameters because their position is
> irrelevant.  I only see one place in the language & library that addresses
> the ordering of keyword-only parameters, by way of omission.  The PEP for
> inspect.signature (PEP 362) says that when comparing two signatures for
> equality, their positional and positional-or-keyword parameters must be in
> the same order.  It makes a point of *not* requiring that the two
> functions' keyword-only parameters be in the same order.
>
> For every currently supported version of Python 3, inspect.signature and
> fn.__code__.co_varnames preserve the order of keyword-only parameters.
> This isn't surprising; it's basically the same code path implementing those
> as the two types of positional-relevant parameters, so the most
> straightforward implementation would naturally preserve their order.  It's
> just not guaranteed.
>
> I'd like inspect.signature to guarantee that the order of keyword-only
> parameters always matches the order they were declared in.  Technically
> this isn't a language feature, it's a library feature.  But making this
> guarantee would require that CPython internally cooperate, so it's kind of
> a language feature too.
>
> Does this sound reasonable?  Would it need a PEP?  I'm hoping for "yes"
> and "no", respectively.
>

Seems reasonable to me.  I'm in the "yes" and "no" respectively "just do
it" camp on this if you want to see it happen.  The groundwork was already
laid for this by using the order-preserving dict in 3.6.  Having the
inspect module behave in a similar manner follows naturally from that.

-gps


>
>
> Three final notes:
>
>- Yes, I do have a use case.  I'm using inspect.signature metadata to
>mechanically map arguments from an external domain (command-line arguments)
>to a Python function.  Relying on the declaration order of keyword-only
>parameters would elegantly solve one small problem.
>- I asked Armin Rigo about PyPy's support for Python 3.  He said it
>should already maintain the order of keyword-only parameters, and if I ever
>catch it not maintaining them in order I should file a bug.  I assert that
>making this guarantee would be nearly zero effort for any Python
>implementation--I bet they all already behave this way, all they need is a
>test case and some documentation.
>- One can extend this concept to functools.partial and
>inspect.Signature.bind: should its transformations of keyword-only
>parameters also maintain order in a consistent way?  I suspect the answer
>there is much the same--there's an obvious way it should behave, it almost
>certainly already behaves that way, but it doesn't guarantee it.  I don't
>think I need this for my use case.
>
>
>
> */arry*
>
> ++ Yes, that means "Argument Clinic" should really have been called
> "Parameter Clinic".  But the "Parameter Clinic" sketch is nowhere near as
> funny.
> ___
> Python-Dev mailing list
> [email protected]
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/greg%40krypto.org
>
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/lis

Re: [Python-Dev] Can Python guarantee the order of keyword-only parameters?

2017-11-27 Thread Eric Snow
On Mon, Nov 27, 2017 at 10:05 AM, Larry Hastings  wrote:
> I'd like inspect.signature to guarantee that the order of keyword-only
> parameters always matches the order they were declared in.  Technically this
> isn't a language feature, it's a library feature.  But making this guarantee
> would require that CPython internally cooperate, so it's kind of a language
> feature too.
>
> Does this sound reasonable?  Would it need a PEP?  I'm hoping for "yes" and
> "no", respectively.

+1

There is definitely significant information in source code text that
gets thrown away in some cases, which I'm generally in favor of
preserving (see PEP 468 and PEP 520).  The use case here is unclear to
me, but the desired guarantee is effectively the status quo and it is
a minor guarantee as well, so I don't see the harm.  Furthermore, I
don't see a need for a PEP given the small scale and impact.

-eric


Re: [Python-Dev] PEP 565: show DeprecationWarning in __main__ (round 2)

2017-11-27 Thread Guido van Rossum
I am basically in agreement with this now. Some remarks:

- I would recommend adding a note to the abstract about the recommendation
for test runners to also enable these warnings by default.

- In some sense, simple scripts that are distributed informally (e.g. as
email attachments or via shared drives) are the most likely victims of
unwanted warnings, and originally I wasn't happy with this. But such
scripts are also the most likely victims of other sloppiness on their
authors' part, like not specifying the needed Python version or
dependencies, not checking command line arguments or input data carefully,
and so on. And I now think that warnings just come with the territory.

- Would be nice to know whether IPython/Jupyter is happy with this.

- The sentence "As a result, API deprecation warnings encountered by
development tools written in Python should continue to be hidden by default
for users of those tools" is missing a final period; I also think that the
argument here is stronger if "development" is left out. (Maybe development
tools could be called out in a "for example" clause.)

- I can't quite put my finger on it, but reading the three bullets of
distinct categories of warnings something seems slightly off, perhaps due
to independent editing of various phrases. Perhaps the three bullets could
be rewritten for better correspondence between the various properties and
audiences? And what should test runners do for each?

- Also, is SyntaxWarning worth adding to the list?

- The thing about FutureWarning being present since 2.3 feels odd -- if
your library cares about supporting 2.7 and higher, should it use
FutureWarning or DeprecationWarning?

- "re-enabling deprecation warnings by default in __main__ doesn't help in
  handling cases where software has been factored out into support modules,
but
  those modules still have little or no automated test coverage."
  This and all bullets in the same list should have an initial capital
letter and trailing period. This sentence in particular also reads odd: the
"but" seems to apply to everything that comes before, but actually is meant
to apply only to "cases where ...". Maybe rephrasing this can help the
sentence flow better.

Most of these (the question about IPython/Jupyter approval excepted) are
simple editing comments, so I expect this PEP will be able to move forward
soon. Thanks for your patience, Nick!

--Guido

On Fri, Nov 24, 2017 at 9:33 PM, Nick Coghlan  wrote:

> This is a new version of the proposal to show DeprecationWarning in
> __main__.
>
> The proposal itself hasn't changed (it's still recommending a new
> entry in the default filter list), but there have been several updates
> to the PEP text based on further development work and comments in the
> initial thread:
>
> - there's now a linked issue and reference implementation
> - it turns out we don't currently support the definition of module
> based filters at startup time, so I've explicitly noted the relevant
> enhancement that turned out to be necessary (allowing
> plain-string-or-compiled-regex in stored filter definitions where we
> currently only allow compiled regexes)
> - I've noted the intended changes to the warnings-related documentation
> - I've noted a couple of other relevant changes that Victor already
> implemented for 3.7
> - I've noted that the motivation for the change in 2.7 & 3.1 covered
> all Python applications, not just developer tools (developer tools
> just provide a particularly compelling example of why "revert to the
> Python 2.6 behaviour" isn't a good answer)
>
> Cheers,
> Nick.
>
> =
> PEP: 565
> Title: Show DeprecationWarning in __main__
> Author: Nick Coghlan 
> Status: Draft
> Type: Standards Track
> Content-Type: text/x-rst
> Created: 12-Nov-2017
> Python-Version: 3.7
> Post-History: 12-Nov-2017, 25-Nov-2017
>
>
> Abstract
> 
>
> In Python 2.7 and Python 3.2, the default warning filters were updated to
> hide DeprecationWarning by default, such that deprecation warnings in
> development tools that were themselves written in Python (e.g. linters,
> static analysers, test runners, code generators), as well as any other
> applications that merely happened to be written in Python, wouldn't be
> visible to their users unless those users explicitly opted in to seeing them.
>
> However, this change has had the unfortunate side effect of making
> DeprecationWarning markedly less effective at its primary intended purpose:
> providing advance notice of breaking changes in APIs (whether in CPython,
> the standard library, or in third party libraries) to users of those APIs.
>
> To improve this situation, this PEP proposes a single adjustment to the
> default warnings filter: displaying deprecation warnings attributed to the
> main module by default.
>
> This change will mean that code entered at the interactive prompt and code
> in single file scripts will revert to reporting these warnings by default,
> while they will 

Re: [Python-Dev] Can Python guarantee the order of keyword-only parameters?

2017-11-27 Thread Yury Selivanov
On Mon, Nov 27, 2017 at 12:05 PM, Larry Hastings  wrote:
[..]
> The PEP for
> inspect.signature (PEP 362) says that when comparing two signatures for
> equality, their positional and positional-or-keyword parameters must be in
> the same order.  It makes a point of *not* requiring that the two functions'
> keyword-only parameters be in the same order.

Yes, and I believe Signature.__eq__ should stay that way.

>
> For every currently supported version of Python 3, inspect.signature and
> fn.__code__.co_varnames preserve the order of keyword-only parameters.  This
> isn't surprising; it's basically the same code path implementing those as
> the two types of positional-relevant parameters, so the most straightforward
> implementation would naturally preserve their order.  It's just not
> guaranteed.
>
> I'd like inspect.signature to guarantee that the order of keyword-only
> parameters always matches the order they were declared in.  Technically this
> isn't a language feature, it's a library feature.  But making this guarantee
> would require that CPython internally cooperate, so it's kind of a language
> feature too.

We can update the documentation and say that we preserve the order in
simple cases:

   import inspect

   def foo(*, a=1, b=2): pass

   s = inspect.signature(foo)
   assert list(s.parameters.keys()) == ['a', 'b']

We can't say anything about the order if someone passes a partial
object, or sets custom Signature objects to func.__signature__.
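
For example (an illustrative sketch, not something from the original
message), a custom __signature__ is free to list keyword-only parameters in
any order:

    import inspect

    def foo(*, a=1, b=2):
        pass

    foo.__signature__ = inspect.Signature([
        inspect.Parameter('b', inspect.Parameter.KEYWORD_ONLY, default=2),
        inspect.Parameter('a', inspect.Parameter.KEYWORD_ONLY, default=1),
    ])

    print(list(inspect.signature(foo).parameters))  # ['b', 'a'] -- declaration order is gone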

Yury
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


[Python-Dev] 3.7.0a3 cutoff extended a week to 12-04

2017-11-27 Thread Ned Deily
We are extending the cutoff for the next 3.7 alpha preview (3.7.0a3) a week, 
moving it from today to 12-04 12:00 UTC.  The main reason is a selfish one: I 
have been traveling and mainly offline for the last few weeks and I am still 
catching up with activity.  Since we are getting close to feature code cutoff 
for the 3.7 cycle, it would be better to get things in sooner than later.  
Following alpha 3, we will have one more alpha preview, 3.7.0a4 on 2018-01-08, 
prior to the feature code cutoff with 3.7.0b1 on 2018-01-29.  Note that 12-04 
is also the scheduled date for the release candidate of the next 3.6.x 
maintenance release, 3.6.4rc1.  So I hope you can take advantage of the extra days for 
both release cycles.

Thanks again for all your efforts!
--Ned

--
  Ned Deily
  [email protected] -- []

___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Can Python guarantee the order of keyword-only parameters?

2017-11-27 Thread Larry Hastings



On 11/27/2017 03:58 PM, Yury Selivanov wrote:

We can't say anything about the order if someone passes a partial
object


Sure we could.  We could ensure that functools.partial behaves in a sane 
way, then document and guarantee that behavior.
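
For what it's worth, a quick check (an observation about current CPython,
not a documented guarantee) suggests the straightforward implementation
already preserves declaration order through functools.partial; a pre-bound
keyword-only parameter simply gets a new default:

    import functools
    import inspect

    def draw(*, x=0, y=0, color='black'):
        pass

    p = functools.partial(draw, y=10)
    print(list(inspect.signature(draw).parameters))  # ['x', 'y', 'color']
    print(list(inspect.signature(p).parameters))     # ['x', 'y', 'color'], with y's default now 10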




or sets custom Signature objects to func.__signature__.


Consenting Adults rule applies here.  Obviously we should honor the 
signature they set.



//arry/
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Using async/await in place of yield expression

2017-11-27 Thread Terry Reedy

On 11/27/2017 5:05 PM, Guido van Rossum wrote:
On Mon, Nov 27, 2017 at 1:58 PM, Greg Ewing wrote:


Guido van Rossum wrote:
The source for sleep() isn't very helpful -- e.g. @coroutine is
mostly a backwards compatibility thing.

So how are you supposed to write that *without* using @coroutine?

A simplified version using async def/await:


---

async def sleep(delay):
     f = Future()


This must be asyncio.Future as (by experiment) a 
concurrent.futures.Future cannot be awaited.  The asyncio.Future doc could add this 
as a difference.  Future needs an argument for the keyword-only loop 
parameter; as I remember, the default None gets replaced by the default 
asyncio loop.
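
A minimal self-contained check of that difference (assuming asyncio
behaviour as of 3.6):

    import asyncio
    import concurrent.futures

    async def demo():
        try:
            await concurrent.futures.Future()   # has no __await__
        except TypeError as exc:
            print(exc)   # "object Future can't be used in 'await' expression"
        await asyncio.sleep(0)                  # an asyncio awaitable works fine

    asyncio.get_event_loop().run_until_complete(demo())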



     get_event_loop().call_later(delay, f.set_result)


A result is needed, such as None or delay, to pass to f.set_result.


     await f


I gather that
1. the value of the expression is the result set on the future, which 
would normally be needed, though not here;
2. the purpose of 'await f' here is simply to block exit from the 
coroutine, without blocking the loop, so that users of 'await sleep(n)' 
will actually pause (without blocking other code).


Since a coroutine must be awaited, and not just called, and await can 
only be used in a coroutine, there seems to be a problem of where to 
start.  The asyncio answer, in the PEP, is to wrap a coroutine call in a 
Task, which is, as I remember, done by the loop run methods.


Based on the notes above, and adding some prints, I got this to run:

---
import asyncio
import time

loop = asyncio.get_event_loop()

async def sleep(delay):
    f = asyncio.Future(loop=loop)
    loop.call_later(delay, f.set_result, delay)
    print('start')
    start = time.perf_counter()
    d = await f
    stop = time.perf_counter()
    print(f'requested sleep = {d}; actual = {stop-start:f}')

loop.run_until_complete(sleep(1))
---
This produces:
start
requested sleep = 1; actual = .9--- [usually < 1]

---
Now, the question I've had since async and await were introduced, is how 
to drive async statements with tkinter.  With the help of the working 
example above, I make a start.


---
from asyncio import Future, Task
import tkinter as tk
import time


class ATk(tk.Tk):
    "Enable tkinter program to use async def, etc, and await sleep."

    def __init__(self):
        super().__init__()

    def task(self, coro):
        "Connect async def coroutine to tk loop."
        Task(coro, loop=self)

    def get_debug(self):
        "Internal method required by Future."
        print('debug')
        return False

    def call_soon(self, callback, *args):
        "Internal method required by Task and Future."
        # TaskStep/Wakeup/MethWrapper has no .__name__ attribute.
        # Tk.after requires callbacks to have one (bug, I think).
        print('soon', callback, *args, hasattr(callback, '__name__'))
        def wrap2(): callback(*args)
        return self.after(0, wrap2)


root = ATk()

async def sleep(delay):
    f = Future(loop=root)
    def cb():
        print('cb called')
        f.set_result(delay)
    root.after(int(delay*1000), cb)
    print('start')
    start = time.perf_counter()
    d = await f
    stop = time.perf_counter()
    print(f'requested sleep = {d}; actual = {stop-start:f}')

root.task(sleep(1))
root.mainloop()
---

Output:
debug
soon  False
debug
start
cb called
soon <Future finished result=1> False

requested sleep = 1; actual = 1.01--- [always about 1.01]

Replacing the last two lines with

async def myloop(seconds):
    while True:
        print(f'*** {seconds} ***')
        await sleep(seconds)

root.task(myloop(1.2))
root.task(myloop(.77))
root.mainloop()

prints interleaved 1.2 and .77 lines.

I will next work on animating tk widgets in the loops.

--
Terry Jan Reedy


___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Second post: PEP 557, Data Classes

2017-11-27 Thread Eric V. Smith

On 11/27/2017 10:51 AM, Guido van Rossum wrote:

Following up on this subthread (inline below).



Didn't we at one point have something like

isinstance(other, self.__class__) and fields(other) == fields(self) and 



(plus some optimization if the types are identical)?

That feels ideal, because it means you can subclass Point just to add 
some methods and it will stay comparable, but if you add fields it will 
always be unequal.


One thing this doesn't let you do is compare instances of two different 
subclasses of a base type:


@dataclass
class B:
    i: int

@dataclass
class C1(B): pass

@dataclass
class C2(B): pass

You can't compare C1(0) and C2(0), because neither one is an instance of 
the other's type. The test to get this case to work would be expensive: 
find the common ancestor, and then make sure no fields have been added 
since then. And I haven't thought through multiple inheritance.
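
A hand-rolled sketch of the isinstance-based __eq__ quoted above (not the
code the decorator would actually generate) shows why the sibling classes
end up unequal:

    class B:
        def __init__(self, i):
            self.i = i
        def __eq__(self, other):
            if isinstance(other, self.__class__):
                return (self.i,) == (other.i,)
            return NotImplemented

    class C1(B): pass
    class C2(B): pass

    print(C1(0) == C1(0))  # True: other is an instance of self's class
    print(C1(0) == B(0))   # True: B's reflected __eq__ accepts the subclass instance
    print(C1(0) == C2(0))  # False: both sides return NotImplemented, so identity decides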


I suggest we don't try to support this case.

Eric.

___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 565: show DeprecationWarning in __main__ (round 2)

2017-11-27 Thread Serhiy Storchaka

25.11.17 07:33, Nick Coghlan wrote:

* ``FutureWarning``: always reported by default. The intended audience is users
   of applications written in Python, rather than other Python developers
   (e.g. warning about use of a deprecated setting in a configuration file
   format).

Given its presence in the standard library since Python 2.3, ``FutureWarning``
would then also have a secondary use case for libraries and frameworks that
support multiple Python versions: as a more reliably visible alternative to
``DeprecationWarning`` in Python 2.7 and versions of Python 3.x prior to 3.7.


I think it is worth saying more explicitly that the primary purpose of 
FutureWarning (warning about future behavior changes that will not be 
errors) is kept. It just gains a secondary purpose: a replacement for 
DeprecationWarning if you want to be sure that it is visible to end users.
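
As a purely illustrative example (hypothetical library code, not from the
PEP), the two audiences map to the two categories like this:

    import warnings

    def load_config(settings):
        if 'legacy_mode' in settings:
            # Primary purpose: tell end users about a coming behavior change.
            # Visible under the default filters on all supported versions.
            warnings.warn("the 'legacy_mode' setting will change meaning in the next release",
                          FutureWarning, stacklevel=2)

    def old_helper():
        # Developer-facing deprecation; hidden by default outside __main__.
        warnings.warn("old_helper() is deprecated", DeprecationWarning, stacklevel=2)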


I think that showing DeprecationWarning in __main__ is just a first 
step. In the future we can extend the scope of showing DeprecationWarning.


___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com