Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Greg Ewing

PJ Eby wrote:

I find this a little weird.  Why not just have `with` and `for` inside
a coroutine dynamically check the iterator or context manager, and
either behave sync or async accordingly?  Why must there be a
*syntactic* difference?


It depends on whether you think it's important to
have a syntactic marker for points where the code can
potentially be suspended.

In my original vision for PEP 3152, there was no
"cocall" syntax -- you just wrote an ordinary call,
and whether to make a cocall or not was determined
at run time. But Guido and others felt that it would
be better for suspension points to be explicit, so
I ended up with cocall.

The same reasoning presumably applies to asynchronous
'for' and 'with'. If you think that it's important
to make suspendable calls explicit, you probably want
to mark them as well.


...which, incidentally, highlights one of the things that's been
bothering me about all this "async foo" stuff: "async def" looks like
it *defines the function* asynchronously


That bothers me a bit, too, but my main problem with it
is the way it displaces the function name. "def f() async:"
would solve both of those problems.

--
Greg


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Greg Ewing

Yury Selivanov wrote:

I think there is another way... instead of pushing

GET_ITER
...
YIELD_FROM

opcodes, we'll need to replace GET_ITER with another one:

GET_ITER_SPECIAL
...
YIELD_FROM


I'm lost. What Python code are you suggesting this
would be generated from?

--
Greg


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Greg

On 23/04/2015 6:32 a.m., Andrew Svetlov wrote:

If we forbid calling `async def` from regular code, how should asyncio
work? I'd like to push `async def` everywhere in the asyncio API where
asyncio.coroutine is required.


As I suggested earlier, a way could be provided to mark a
function as callable using either yield from f() or await f().
That would water down the error catching ability a bit, but
it would allow interoperability with existing asyncio code.
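
A rough sketch of what such a marker could look like, assuming the
eventual PEP 492 semantics where a coroutine object's __await__ returns
an iterator (dual_callable and _Dual are made-up names for illustration):

import functools

class _Dual:
    # Thin proxy: awaitable via __await__, drivable by 'yield from' via __iter__.
    def __init__(self, coro):
        self._coro = coro
    def __await__(self):
        return self._coro.__await__()
    def __iter__(self):
        return self._coro.__await__()

def dual_callable(func):
    # Sketch only: mark an 'async def' coroutine function as callable from
    # both 'await f()' and legacy 'yield from f()' code.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return _Dual(func(*args, **kwargs))
    return wrapper

(For the opposite direction -- making a generator-based coroutine
awaitable -- 3.5 eventually grew types.coroutine.)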

--
Greg



Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Greg Ewing

On 04/23/2015 04:18 AM, Yury Selivanov wrote:


2. We'll hack Gen(/ceval.c?) objects to raise an error if they
are called directly and have a 'CO_COROUTINE' flag.


By "Gen", do you mean the generator-function or the
generator-iterator?

That flag has to be on the generator-function, not the
generator-iterator, otherwise by the time ceval sees it,
the call that should have been forbidden has already
been made.

To make this work without flagging the function, it
would be necessary to check the result of every function
call that wasn't immediately awaited and raise an
exception if it were awaitable. But that would mean
awaitable objects not being fully first-class
citizens, since there would be some perfectly
reasonable things that you can't do with them.
I suspect it would make the kernel of a
coroutine-scheduling system such as asyncio very
awkward, perhaps impossible, to write in pure
Python.
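
For what it's worth, in the implementation that eventually shipped in
CPython 3.5 the flag lives on the code object, so it is visible both from
the function before the call and from the coroutine object afterwards; a
small illustration:

import inspect

async def coro():
    pass

# CO_COROUTINE is set on the code object shared by the function and the
# coroutine objects it produces.
print(bool(coro.__code__.co_flags & inspect.CO_COROUTINE))   # True
c = coro()
print(bool(c.cr_code.co_flags & inspect.CO_COROUTINE))       # True
c.close()   # avoid the "coroutine was never awaited" warning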


3. Task(), create_task() and async() will be modified to call
'coro.__await__(..)' if 'coro' has a 'CO_COROUTINE' flag.


Or, as I pointed out earlier, the caller can wrap the
argument in something equivalent to costart().


4. 'await' will require parentheses grammatically. That will
make it different from 'yield' expression. For instance,
I still don't know what would 'await coro(123)()' mean.


In PEP 3152, cocall binds to the nearest set of function-calling
parens, so 'cocall f()()' is parsed as '(cocall f())()'.
If you want it the other way, you have to write it as
'cocall (f())()'.

I know that's a somewhat arbitrary thing to remember, and
it makes chained function calls a bit harder to write and
read. But chaining calls like that is a fairly rare thing
to do, in contrast with using a call expression as an
argument to another call, which is very common.

That's not the only case, either. Just about any unparenthesised
use of yield-from other than the sole contents of the RHS of
an assignment seems to be disallowed. All of these are
currently syntax errors, for example:

   yield from f(x) + yield from g(x)

   x + yield from g(x)

   [yield from f(x)]
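
For comparison, the parenthesised forms are accepted; a small runnable
illustration (f and g below are stand-in generator functions, not taken
from the thread):

def f(x):
    yield x
    return x + 1

def g(x):
    yield x
    return x * 2

def gen(x):
    # Each use becomes valid once the yield-from expression is parenthesised.
    total = (yield from f(x)) + (yield from g(x))
    y = x + (yield from g(x))
    items = [(yield from f(x))]
    return total, y, items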


5. 'await foo(*a, **k)' will be an equivalent to
'yield from type(coro).__await__(coro, *a, **k)'


Again, I'm not sure whether you're proposing to make the
functions the await-able objects rather than the iterators
(which would effectively be PEP 3152 with __cocall__
renamed to __await__) or something else. I won't comment
further on this point until that's clearer.


6. If we ever decide to implement coroutine-generators --
async def functions with 'await' *and* some form of 'yield' --
we'll need to reverse the rule -- allow __call__ and
disallow __await__ on such objects (so that you'll be able
to write 'async for item in coro_gen()' instead of
'async for item in await coro_gen()').


Maybe. I haven't thought that idea through properly yet.
Possibly the answer is that you define such a function
using an ordinary "def", to match the way it's called.
The fact that it's an async generator is then indicated
by the fact that it contains "async yield".

--
Greg



Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Andrew Svetlov
I guess we should raise an exception in the destructor for an async
generator that was never fully unwound, even in non-debug mode.

Debug mode may have more complex info with source_traceback included,
as Victor Stinner does for CoroWrapper.

On Thu, Apr 23, 2015 at 4:27 AM, Yury Selivanov  wrote:
> Greg,
>
> On 2015-04-22 7:47 PM, Greg Ewing wrote:
>>
>> Yury Selivanov wrote:
>>
>>> On the other hand, I hate the idea
>>> of grammatically requiring parentheses for 'await'
>>> expressions.  That feels non-pythonic to me.
>>
>>
>> How is it any different from grammatically requiring
>> parens in an ordinary function call? Nobody ever
>> complained about that.
>
>
> It is different.
>
> 1. Because 'await' keyword might be at a great distance
> from the object you're really calling:
>
> await foo.bar.baz['spam']()
>   +---+
>
> Can I chain the calls:
>
> await foo()() ?
>
> or await foo().bar()?
>
> 2. Because there is no other keyword in python
> with similar behaviour.
>
> 3. Moreover: unless I can write 'await future' - your
> proposal *won't* work with a lot of existing code
> and patterns.  It's going to be radically different
> from all other languages that implement 'await' too.
>
> Yury
>



-- 
Thanks,
Andrew Svetlov


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov

Greg,

On 2015-04-22 7:47 PM, Greg Ewing wrote:

Yury Selivanov wrote:


On the other hand, I hate the idea
of grammatically requiring parentheses for 'await'
expressions.  That feels non-pythonic to me.


How is it any different from grammatically requiring
parens in an ordinary function call? Nobody ever
complained about that.


It is different.

1. Because 'await' keyword might be at a great distance
from the object you're really calling:

await foo.bar.baz['spam']()
  +---+

Can I chain the calls:

await foo()() ?

or await foo().bar()?

2. Because there is no other keyword in python
with similar behaviour.

3. Moreover: unless I can write 'await future' - your
proposal *won't* work with a lot of existing code
and patterns.  It's going to be radically different
from all other languages that implement 'await' too.

Yury


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Andrew Svetlov
On Thu, Apr 23, 2015 at 3:35 AM, Guido van Rossum  wrote:
> On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing 
> wrote:
>>
>> Guido van Rossum wrote:
>>>
>>> OTOH I'm still struggling with what you have to do to
>>> wrap a coroutine in a Task, the way its done in asyncio by the Task()
>>> constructor, the loop.create_task() method, and the async() function
>>
>>
>> That's easy. You can always use costart() to adapt a cofunction
>> for use with something expecting a generator-based coroutine,
>> e.g.
>>
>> codef my_task_func(arg):
>>   ...
>>
>> my_task = Task(costart(my_task_func, arg))
>>
>> If you're willing to make changes, Task() et al could be made to
>> recognise cofunctions and apply costart() where needed.
>
>
> Hm, that feels backwards incompatible (since currently I can write
> Task(my_task_func(arg)) and also a step backwards in elegance (having to
> pass the args separately).
>
> OTOH the benefit is that it's much harder to accidentally forget to wait for
> a coroutine. And maybe the backward compatibility issue is not really a
> problem because you have to opt in by using codef or async def.
>
> So I'm still torn. :-)
>


> Somebody would need to take a mature asyncio app and see how often this is
> used (i.e. how many place would require adding costart() as in the above
> example).
>

I have not found a fresh patch for PEP 3152 to play with, but at least
the aiohttp [1] library very often creates new tasks with an
`async(coro(...))` call. The same goes for aiozmq, aioredis, sockjs
(an aiohttp-based library for sock.js), aiokafka etc.

The applications I've created for my job also have `async(...)` calls or
direct `Task(f(arg))` creations -- between 3 and 10 such lines per
application. Not a big deal to fix them all, but it's a backward
incompatibility.

On the other hand, I've finished an experimental branch [2] of the
aiomysql library (an asyncio driver for MySQL) with support for
`async for` and `async with`.

The main problem with the publicly released version is that it's
impossible to handle transactions (which requires an async context
manager) or to iterate with asynchronous fetching of data from a cursor
(required for server-side cursors, for example).

Now both problems are solved while keeping full backward compatibility.
The library can be used with Python 3.3+, but obviously the new features
are not available on old Pythons.

I use asyncio coroutines, not async functions, e.g.:

class Cursor:

    # ...

    @asyncio.coroutine
    def __aiter__(self):
        return self

    @asyncio.coroutine
    def __anext__(self):
        ret = yield from self.fetchone()
        if ret is not None:
            return ret
        else:
            raise StopAsyncIteration

The whole aiomysql code is correct from a Python 3.3+ perspective. For
testing the new features I use the new syntax in separate test files; the
test runner will skip test modules with syntax errors on old Pythons but
run those modules on a Python built from the PEP 492 branch.

Usage example (table 'tbl' is pre-filled, the DB engine is connected to the server):

async def go(engine):
    async with engine.connect() as conn:
        async with (await conn.begin()) as tr:
            await conn.execute("DELETE FROM tbl WHERE (id % 2) = 0")

            async for row in conn.execute("SELECT * FROM tbl"):
                print(row['id'], row['name'])
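
Roughly, the 'async for' in go() above drives the cursor protocol like
this under the original PEP 492 rules used here (a sketch only; error
handling and the optional 'else' clause are omitted, and it assumes
conn.execute() returns the cursor synchronously, as the loop above
implies):

async def go_desugared(conn):
    cursor = conn.execute("SELECT * FROM tbl")
    iterator = await cursor.__aiter__()        # __aiter__ is a coroutine here
    while True:
        try:
            row = await iterator.__anext__()
        except StopAsyncIteration:
            break
        print(row['id'], row['name'])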


[1] https://github.com/KeepSafe/aiohttp

[2] https://github.com/aio-libs/aiomysql/tree/await

> --
> --Guido van Rossum (python.org/~guido)
>



-- 
Thanks,
Andrew Svetlov


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov



On 2015-04-22 9:04 PM, Guido van Rossum wrote:

On Wed, Apr 22, 2015 at 5:55 PM, Yury Selivanov 
wrote:


On 2015-04-22 8:35 PM, Guido van Rossum wrote:


On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing 
wrote:

  Guido van Rossum wrote:

  OTOH I'm still struggling with what you have to do to

wrap a coroutine in a Task, the way its done in asyncio by the Task()
constructor, the loop.create_task() method, and the async() function

  That's easy. You can always use costart() to adapt a cofunction

for use with something expecting a generator-based coroutine,
e.g.

codef my_task_func(arg):
...

my_task = Task(costart(my_task_func, arg))

If you're willing to make changes, Task() et al could be made to
recognise cofunctions and apply costart() where needed.


Hm, that feels backwards incompatible (since currently I can write
Task(my_task_func(arg)) and also a step backwards in elegance (having to
pass the args separately).

OTOH the benefit is that it's much harder to accidentally forget to wait
for a coroutine. And maybe the backward compatibility issue is not really
a
problem because you have to opt in by using codef or async def.

So I'm still torn. :-)

Somebody would need to take a mature asyncio app and see how often this is
used (i.e. how many place would require adding costart() as in the above
example).


Somewhere in this thread Victor Stinner wrote:

"""A huge part of the asyncio module is based on "yield from fut" where
fut is a Future object."""

So how would we do "await fut" if await requires parentheses?


We could make Future a valid co-callable object.


So you would have to write 'await fut()'?  This is non-intuitive.
To make Greg's proposal work it'd be a *requirement* for 'await'
(enforced by the grammar!) to have '()' after it.


Yury


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Guido van Rossum
On Wed, Apr 22, 2015 at 5:55 PM, Yury Selivanov 
wrote:

> On 2015-04-22 8:35 PM, Guido van Rossum wrote:
>
>> On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing 
>> wrote:
>>
>>  Guido van Rossum wrote:
>>>
>>>  OTOH I'm still struggling with what you have to do to
 wrap a coroutine in a Task, the way its done in asyncio by the Task()
 constructor, the loop.create_task() method, and the async() function

  That's easy. You can always use costart() to adapt a cofunction
>>> for use with something expecting a generator-based coroutine,
>>> e.g.
>>>
>>> codef my_task_func(arg):
>>>...
>>>
>>> my_task = Task(costart(my_task_func, arg))
>>>
>>> If you're willing to make changes, Task() et al could be made to
>>> recognise cofunctions and apply costart() where needed.
>>>
>>
>> Hm, that feels backwards incompatible (since currently I can write
>> Task(my_task_func(arg)) and also a step backwards in elegance (having to
>> pass the args separately).
>>
>> OTOH the benefit is that it's much harder to accidentally forget to wait
>> for a coroutine. And maybe the backward compatibility issue is not really
>> a
>> problem because you have to opt in by using codef or async def.
>>
>> So I'm still torn. :-)
>>
>> Somebody would need to take a mature asyncio app and see how often this is
>> used (i.e. how many place would require adding costart() as in the above
>> example).
>>
>
> Somewhere in this thread Victor Stinner wrote:
>
> """A huge part of the asyncio module is based on "yield from fut" where
> fut is a Future object."""
>
> So how would we do "await fut" if await requires parentheses?
>

We could make Future a valid co-callable object.


> I think that the problem of forgetting 'yield from' is a bit exaggerated.
> Yes, I myself forgot 'yield from' once or twice. But that's it, it has
> never happened since.


Maybe, but it *is* a part of everybody's learning curve.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov

On 2015-04-22 8:35 PM, Guido van Rossum wrote:

On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing 
wrote:


Guido van Rossum wrote:


OTOH I'm still struggling with what you have to do to
wrap a coroutine in a Task, the way its done in asyncio by the Task()
constructor, the loop.create_task() method, and the async() function


That's easy. You can always use costart() to adapt a cofunction
for use with something expecting a generator-based coroutine,
e.g.

codef my_task_func(arg):
   ...

my_task = Task(costart(my_task_func, arg))

If you're willing to make changes, Task() et al could be made to
recognise cofunctions and apply costart() where needed.


Hm, that feels backwards incompatible (since currently I can write
Task(my_task_func(arg)) and also a step backwards in elegance (having to
pass the args separately).

OTOH the benefit is that it's much harder to accidentally forget to wait
for a coroutine. And maybe the backward compatibility issue is not really a
problem because you have to opt in by using codef or async def.

So I'm still torn. :-)

Somebody would need to take a mature asyncio app and see how often this is
used (i.e. how many place would require adding costart() as in the above
example).


Somewhere in this thread Victor Stinner wrote:

"""A huge part of the asyncio module is based on "yield from fut" where 
fut is a Future object."""


So how would we do "await fut" if await requires parentheses?

I think that the problem of forgetting 'yield from' is a bit 
exaggerated. Yes, I myself forgot 'yield from' once or twice. But that's 
it, it has never happened since.


Yury



Re: [Python-Dev] Type hints -- a mediocre programmer's reaction

2015-04-22 Thread Guido van Rossum
On Wed, Apr 22, 2015 at 2:38 PM, Chris Barker  wrote:

>
>
> Oh wait, maybe it won't -- a string IS a sequence of strings. That's why
>> this is an insidious bug in the first place.
>
> On Tue, Apr 21, 2015 at 11:32 PM, Terry Reedy  wrote:
>
>
>> I was just thinking today that for this, typing needs a subtraction
>> (difference) operation in addition to an addition (union) operation:
>> Difference(Iterable(str), str)
>>
>
> Yup -- that might solve it, but it feels a bit odd -- I can take any
> Iterable of string, except a string. -- but what if there are others that
> won't work??? But I guess that's the core of putting type hints on a
> dynamic language.
>
> Still, I tend to think that this particular issue is really a limitation
> with Python's type system -- nothing to do with type hinting.
>
> I can see that a character type seems useless in Python, but there are
> lessons from other places: a numpy array is a collection of (usually)
> numbers that can be treated as a single entity -- much like a string is a
> collection of characters that is treated as a single entity -- in both
> cases, it's core to convenience and performance to do that. But with numpy,
> when you index an array, you get something back with one less dimension:
>
> index into a 3-d array, you get a 2-d array
> index into a 2-d array, you get a 1-d array
> index into a 1-d array, you get a scalar -- NOT a length-one 1-d array
>
> Sometimes this is a pain for generic code, but more often than not it's
> critical to writing dynamic code -- not because you couldn't do the
> operations you want, but because it's important to distinguish between a
> scalar and an array that happens to have only one value.
>
> Anyway, the point is that being able to say "all these types, except this
> one" would solve this particular problem -- but would it solve any others?
> Do we want this to work around a quirk in Python's string type?
>
> NOTE: I know full well that adding a character type to Python is not worth
> it.
>

If you switch to bytes the problem goes away. :-P

More seriously, I doubt there are other important use cases for Difference.

Given that even if Difference existed, and even if we had a predefined type
alias for Difference[Iterable[str], str], you'd still have to remember to
mark up all those functions with that annotation. It almost sounds simpler
to just predefine this function:

def make_string_list(a: Union[str, Iterable[str]]) -> Iterable[str]:
    if isinstance(a, str):
        return [a]
    else:
        return a

and call this in those functions that have an Iterable[str] argument. Now
instead of getting errors for all the places where a caller mistakenly
passes a single str, you've *fixed* all those call sites. Isn't that more
Pythonic? :-)
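
A quick usage sketch of the helper above (print_all is a made-up caller,
and it assumes the make_string_list definition from this message):

from typing import Iterable, Union

def print_all(items: Union[str, Iterable[str]]) -> None:
    # Normalise first, then iterate safely -- a lone str becomes ['...'].
    for item in make_string_list(items):
        print(item)

print_all("hello")               # prints: hello
print_all(["hello", "world"])    # prints: hello, then: world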

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Guido van Rossum
On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing 
wrote:

> Guido van Rossum wrote:
>
>> OTOH I'm still struggling with what you have to do to
>> wrap a coroutine in a Task, the way its done in asyncio by the Task()
>> constructor, the loop.create_task() method, and the async() function
>>
>
> That's easy. You can always use costart() to adapt a cofunction
> for use with something expecting a generator-based coroutine,
> e.g.
>
> codef my_task_func(arg):
>   ...
>
> my_task = Task(costart(my_task_func, arg))
>
> If you're willing to make changes, Task() et al could be made to
> recognise cofunctions and apply costart() where needed.


Hm, that feels backwards incompatible (since currently I can write
Task(my_task_func(arg)) and also a step backwards in elegance (having to
pass the args separately).

OTOH the benefit is that it's much harder to accidentally forget to wait
for a coroutine. And maybe the backward compatibility issue is not really a
problem because you have to opt in by using codef or async def.

So I'm still torn. :-)

Somebody would need to take a mature asyncio app and see how often this is
used (i.e. how many place would require adding costart() as in the above
example).

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] Type hints -- a mediocre programmer's reaction

2015-04-22 Thread Chris Barker
> Oh wait, maybe it won't -- a string IS a sequence of strings. That's why
> this is an insidious bug in the first place.

On Tue, Apr 21, 2015 at 11:32 PM, Terry Reedy  wrote:


> I was just thinking today that for this, typing needs a subtraction
> (difference) operation in addition to an addition (union) operation:
> Difference(Iterable(str), str)
>

Yup -- that might solve it, but it feels a bit odd -- I can take any
Iterable of string, except a string. -- but what if there are others that
won't work??? But I guess that's the core of putting type hints on a
dynamic language.

Still, I tend to think that this particular issue is really a limitation
with Python's type system -- nothing to do with type hinting.

I can see that a character type seems useless in Python, but there are
lessons from other places: a numpy array is a collection of (usually)
numbers that can be treated as a single entity -- much like a string is a
collection of characters that is treated as a single entity -- in both
cases, it's core to convenience and performance to do that. But with numpy,
when you index an array, you get something back with one less dimension:

index into a 3-d array, you get a 2-d array
index into a 2-d array, you get a 1-d array
index into a 1-d array, you get a scalar -- NOT a length-one 1-d array
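
A quick illustration of that dimension-dropping behaviour (assuming numpy
is installed):

import numpy as np

a = np.zeros((2, 3, 4))
print(a[0].shape)        # (3, 4) -- a 2-d array
print(a[0, 0].shape)     # (4,)   -- a 1-d array
print(type(a[0, 0, 0]))  # <class 'numpy.float64'> -- a scalar, not an array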

Sometimes this is a pain for generic code, but more often than not it's
critical to writing dynamic code -- not because you couldn't do the
operations you want, but because it's important to distinguish between a
scalar and an array that happens to have only one value.

Anyway, the point is that being able to say "all these types, except this
one" would solve this particular problem -- but would it solve any others?
Do we want this to work around a quirk in Python's string type?

NOTE: I know full well that adding a character type to Python is not worth
it.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Greg Ewing

Guido van Rossum wrote:
OTOH I'm still struggling with what you have to do to wrap a coroutine
in a Task, the way its done in asyncio by the Task() constructor, the
loop.create_task() method, and the async() function


That's easy. You can always use costart() to adapt a cofunction
for use with something expecting a generator-based coroutine,
e.g.

codef my_task_func(arg):
  ...

my_task = Task(costart(my_task_func, arg))

If you're willing to make changes, Task() et al could be made to
recognise cofunctions and apply costart() where needed.
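
A sketch of what that recognition could look like on the asyncio side,
assuming a PEP 3152-style __cocall__ protocol (ensure_task_compat is a
made-up name, and costart() is approximated here in one line):

import asyncio

def costart(cofunc, *args, **kwds):
    # Approximation of PEP 3152's costart(): produce an iterator that
    # generator-based frameworks can drive with 'yield from'.
    return cofunc.__cocall__(*args, **kwds)

def ensure_task_compat(func_or_coro, *args, loop=None):
    # If given a cofunction, adapt it with costart(); otherwise assume a
    # generator-based coroutine object was passed, as asyncio expects today.
    if hasattr(func_or_coro, '__cocall__'):
        func_or_coro = costart(func_or_coro, *args)
    return asyncio.Task(func_or_coro, loop=loop)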

--
Greg


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Greg Ewing

Yury Selivanov wrote:


On the other hand, I hate the idea
of grammatically requiring parentheses for 'await'
expressions.  That feels non-pythonic to me.


How is it any different from grammatically requiring
parens in an ordinary function call? Nobody ever
complained about that.

In the PEP 3152 way of thinking, a cocall is just
a function call that happens to be suspendable.
The fact that there is an iterator object involved
behind the scenes is an implementation detail. You
don't have to think about it or even know about it
in order to write or understand suspendable code.

It's possible to think about "yield from f(x)" or
"await f(x)" that way, but only by exploiting a kind
of pun in the code, where you think of f(x) as doing
all the work and the rest as a syntactic marker
indicating that the call is suspendable. PEP 3152
removes the pun by making this the *actual*
interpretation of "cocall f(x)".

--
Greg


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Paul Sokolovsky
Hello,

On Wed, 22 Apr 2015 09:53:39 -0700
Rajiv Kumar  wrote:

> I'd like to suggest another way around some of the issues here, with
> apologies if this has already been discussed sometime in the past.
> 
> From the viewpoint of a Python programmer, there are two distinct
> reasons for wanting to suspend execution in a block of code:
> 
> 1. To yield a value from an iterator, as Python generators do today.
> 
> 2. To cede control to the event loop while waiting for an
> asynchronous task to make progress in a coroutine.
> 
> As of today both of these reasons to suspend are supported by the same
> underlying mechanism, i.e. a "yield" at the end of the chain of "yield
> from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the
> bottom of it all there's effectively still a yield as I understand it.
> 
> I think that the fact that these two concepts use the same mechanism
> is what leads to the issues with coroutine-generators that Greg and
> Yury have raised.
> 
> With that in mind, would it be possible to introduce a second form of
> suspension to Python to specifically handle the case of ceding to the
> event loop? 

Barring adding an ad-hoc statement "yield_to_a_main_loop", there's a
generic programming device to do it: symmetric coroutines. But it's
unlikely to help with your sentiment that the same device is used for
different purposes. At least with asymmetric coroutines as currently in
Python, you have next() to "call" a coroutine and yield to "return"
from it. With symmetric coroutines, you don't have a place to return to -
you can only "call" another coroutine, and then have the freedom to call
any (including a main loop), but you need to always know whom you
want to call.

But I guess it already sounds confusing enough for folks who haven't
heard about symmetric coroutines, whereas the call/return paradigm is much
more familiar and understandable. That's certainly why Python
implements the asymmetric model. And having both asymmetric and symmetric
would be quite confusing, especially as symmetric coroutines are more
powerful and asymmetric ones can be easily implemented in terms of
symmetric ones using continuation-passing style. At the last occurrence of
"easily" mere users of course start to run away, shouting that if they had
wanted to use Scheme, they'd have taken classes on it and used it long ago.


So, the real problem with the dichotomy you describe above is not
technical, but rather educational/documentational. And the current
approach asyncio takes is "you should not care whether a coroutine yields
to the main loop, or how it is done". Actually, the current approach is to
forbid and deny you knowledge of how it is done, quoting Victor Stinner
from another mail: "(A) asyncio coroutine in Python 3.4: use yield
from, yield denied". So, just pretend that there's no yield, only
yield from, problem solved. But people know there's yield - they
knew it for a long time before "yield from". And there are valid usages
for yield in a coroutine, like implementing your own, application-level
generation. Currently, any generation ability is usurped by
asyncio's main loop.

A much better approach IMHO is given in David Beazley's presentations on
generators and coroutines, http://www.dabeaz.com/generators/ . He says
that coroutines provided by a framework are essentially "system calls",
and that's why you don't want to know how they work, and shouldn't
care - because users usually don't care how the OS kernel implements
system calls while sitting in the web browser. But if you want to, you can,
and you will discover that they're implemented by yield'ing objects
of a special class. That's why you're *advised* not to use yields in
coroutines - because if you want to catch your own, application-level,
yields, you may at any time also get a system yield object. You would need
to expect that possibility, filter such yields and pass them up
(re-yield). But there's no forbidden magic in all that, and
understanding it helps a lot IMHO.



-- 
Best regards,
 Paul  mailto:pmis...@gmail.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Paul Sokolovsky
Hello,

On Wed, 22 Apr 2015 13:31:18 -0700
Guido van Rossum  wrote:

> On Wed, Apr 22, 2015 at 1:10 PM, Andrew Svetlov
>  wrote:
> 
> > On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby 
> > wrote:
> > > On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov
> > > 
> > wrote:
> > >> It is an error to pass a regular context manager without
> > >> ``__aenter__`` and ``__aexit__`` methods to ``async with``.  It
> > >> is a ``SyntaxError`` to use ``async with`` outside of a
> > >> coroutine.
> > >
> > > I find this a little weird.  Why not just have `with` and `for`
> > > inside a coroutine dynamically check the iterator or context
> > > manager, and either behave sync or async accordingly?  Why must
> > > there be a *syntactic* difference?
> >
> > IIRC Guido always like to have different syntax for calling regular
> > functions and coroutines.
> > That's why we need explicit syntax for asynchronous context managers
> > and iterators.
> >
> 
> To clarify: the philosophy behind asyncio coroutines is that you
> should be able to tell statically where a task may be suspended
> simply by looking for `yield from`. This means that *no* implicit
> suspend points may exist, and it rules out gevent, stackless and
> similar microthreading frameworks.

I always wanted to ask - does that mean that Python could have
symmetric coroutines (in the sense that it would be a Pythonic feature), as
long as the call syntax is different from a function call?

E.g.:

sym def coro1(val):
    while True:
        val = coro2.corocall(val)

sym def coro2(val):
    while True:
        val = coro1.corocall(val)

coro1.call(1)
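
For comparison, the same ping-pong shape can be written today with the
third-party greenlet package, which provides exactly this kind of
symmetric switching (a sketch, not part of the proposal; note how each
side must name whom it switches to):

from greenlet import greenlet

def coro1(val):
    while val < 5:
        val = g2.switch(val + 1)   # transfer control to coro2
    return val                     # finishing switches back to the parent

def coro2(val):
    while True:
        val = g1.switch(val + 1)   # transfer control back to coro1

g1 = greenlet(coro1)
g2 = greenlet(coro2)
print(g1.switch(1))                # prints 5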


-- 
Best regards,
 Paul  mailto:pmis...@gmail.com


Re: [Python-Dev] Committing into which branches?

2015-04-22 Thread Łukasz Langa

> On Apr 22, 2015, at 2:45 PM, Facundo Batista  wrote:
> 
> Hola!
> 
> I just commited a simple improvement to HTTPError repr, and checking
> in the source code page [0], I see that my commit has a small
> "default" besides it; and other commits don't have that, but have 2.7,
> or 3.4, etc...
> 
> So, question: Did I commit in the correct branch? Should I have done
> anything different?

This is an enhancement (http://bugs.python.org/issue23887) so it should go to default. You committed
to the right branch. The mark on the website simply tells you that this is the 
newest commit on default. Otherwise it’s, well, default.

-- 
Best regards,
Łukasz Langa

WWW: http://lukasz.langa.pl/
Twitter: @llanga
IRC: ambv on #python-dev


Re: [Python-Dev] Committing into which branches?

2015-04-22 Thread Brett Cannon
The default branch is going to become 3.5, so you're fine.

On Wed, Apr 22, 2015 at 5:45 PM Facundo Batista 
wrote:

> Hola!
>
> I just commited a simple improvement to HTTPError repr, and checking
> in the source code page [0], I see that my commit has a small
> "default" besides it; and other commits don't have that, but have 2.7,
> or 3.4, etc...
>
> So, question: Did I commit in the correct branch? Should I have done
> anything different?
>
> Thanks!
>
> [0] https://hg.python.org/cpython/
>
> --
> .Facundo
>
> Blog: http://www.taniquetil.com.ar/plog/
> PyAr: http://www.python.org/ar/
> Twitter: @facundobatista


Re: [Python-Dev] Committing into which branches?

2015-04-22 Thread Facundo Batista
On Wed, Apr 22, 2015 at 6:46 PM, Brett Cannon  wrote:

> The default branch is going to become 3.5, so you're fine.

Thanks!!

-- 
.Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/
Twitter: @facundobatista


[Python-Dev] Committing into which branches?

2015-04-22 Thread Facundo Batista
Hola!

I just commited a simple improvement to HTTPError repr, and checking
in the source code page [0], I see that my commit has a small
"default" besides it; and other commits don't have that, but have 2.7,
or 3.4, etc...

So, question: Did I commit in the correct branch? Should I have done
anything different?

Thanks!

[0] https://hg.python.org/cpython/

-- 
.Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/
Twitter: @facundobatista


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov

Ludovic,

On 2015-04-22 5:00 PM, Ludovic Gasc wrote:

Not related, but one of my coworkers asked me whether, with the new syntax,
it will be possible to write an async decorator for coroutines.
If I understand the new grammar in the PEP correctly, it seems the answer
is yes, but could you confirm?


There shouldn't be any problems with writing a decorator.
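
For instance, a decorator along these lines works with the proposed
syntax (log_calls is a made-up example):

import functools

def log_calls(func):
    # Wrap an 'async def' coroutine function; the wrapper is itself a
    # coroutine function and awaits the original.
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        print('calling', func.__name__)
        result = await func(*args, **kwargs)
        print('done', func.__name__)
        return result
    return wrapper

@log_calls
async def fetch():
    return 42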

Yury


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Ludovic Gasc
2015-04-22 22:46 GMT+02:00 Victor Stinner :
>
> Kind (A):
>
> - "yield-from coroutines" or "coroutines based on yield-from"
> - maybe "asyncio coroutines"
> - "legacy coroutines"?
>

"legacy coroutines" name has the advantage to be directly clear it isn't a
good idea to write new source code with that.


> Kind (B):
>
> - "awaitable coroutines" or "coroutines based on await"
> - "asynchronous coroutine" to remember the "async" keyword even if it
> sounds
> wrong to repeat that a coroutine can be interrupted (it's almost the
> definition of a coroutine, no?)
> - or just "asynchronous function" (coroutine function) & "asynchronous
> object" (coroutine object)
>

Personally, if I have a vote, "async coroutine" is just enough, even if
it's a repetition. Or just "coroutine"?
I'm not a fan of a name like "new-style coroutines".
By the way, I hope you don't change how to write async code in Python a
third time, because it will be harder to define a new name.

Not related, but one of my coworkers asked me whether, with the new syntax,
it will be possible to write an async decorator for coroutines.
If I understand the new grammar in the PEP correctly, it seems the answer
is yes, but could you confirm?


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Victor Stinner
Greg Ewing  canterbury.ac.nz> writes:
> I still don't like the idea of hijacking the generic
> term "coroutine" and using it to mean this particular
> type of object.

There are only two hard things in Computer Science: cache invalidation
and naming things.

-- Phil Karlton

:-)

When reviewing Yury's PEP, I read Wikipedia's article of Coroutine because I
didn't know if a "coroutine" is something new in Python nor if it was well
defined.

https://en.wikipedia.org/wiki/Coroutine

Answer: it's not new, and it's implemented in many languages, and it's well
defined.

But coroutines are not always directly called "coroutines" in other
programming languages.

Using a custom name like "cofunction" may confuse users coming from other
programming languages. I prefer to keep "coroutine", but I agree that we
should make some effort to define the different categories of "Python
coroutines". Well, there are two kind kinds of coroutines:

(A) asyncio coroutine in Python 3.4: use yield from, yield denied, decorated
with @asyncio.coroutine
(B) PEP 492 coroutine in Python 3.5: use await, yield & yield from denied,
function definition prefixed by "async"

Yury proposed "generator-based coroutine for the kind (A). Maybe not a great
name, since we can learn in the PEP 492 that the kind (B) is also
(internally) based on generators.

I don't think that we should use distinct names for the two kinds in common
cases. But when we need to clearly use distinct names, I propose the
following names:

Kind (A):

- "yield-from coroutines" or "coroutines based on yield-from"
- maybe "asyncio coroutines"
- "legacy coroutines"?

Kind (B):

- "awaitable coroutines" or "coroutines based on await"
- "asynchronous coroutine" to remember the "async" keyword even if it sounds
wrong to repeat that a coroutine can be interrupted (it's almost the
definition of a coroutine, no?)
- or just "asynchronous function" (coroutine function) & "asynchronous
object" (coroutine object)

Victor



Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Guido van Rossum
On Wed, Apr 22, 2015 at 1:10 PM, Andrew Svetlov 
wrote:

> On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby  wrote:
> > On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov 
> wrote:
> >> It is an error to pass a regular context manager without ``__aenter__``
> >> and ``__aexit__`` methods to ``async with``.  It is a ``SyntaxError``
> >> to use ``async with`` outside of a coroutine.
> >
> > I find this a little weird.  Why not just have `with` and `for` inside
> > a coroutine dynamically check the iterator or context manager, and
> > either behave sync or async accordingly?  Why must there be a
> > *syntactic* difference?
>
> IIRC Guido always like to have different syntax for calling regular
> functions and coroutines.
> That's why we need explicit syntax for asynchronous context managers
> and iterators.
>

To clarify: the philosophy behind asyncio coroutines is that you should be
able to tell statically where a task may be suspended simply by looking for
`yield from`. This means that *no* implicit suspend points may exist, and
it rules out gevent, stackless and similar microthreading frameworks.

In the new PEP this would become `await`, plus specific points dictated by
`async for` and `async with` -- `async for` can suspend (block) at each
iteration step, and `async with` can suspend at the enter and exit points.
The use case for both is database drivers: `async for` may block for the
next record to become available from the query, and `async with` may block
in the implied `finally` clause in order to wait for a commit. (Both may
also suspend at the top, but that's less important.)
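
For illustration, the database-transaction shape described above could
look roughly like this (a sketch; the Transaction class and the
conn.execute calls are invented for the example):

class Transaction:
    def __init__(self, conn):
        self._conn = conn

    async def __aenter__(self):
        await self._conn.execute("BEGIN")
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # The suspension point at exit: wait here for the commit/rollback.
        if exc_type is None:
            await self._conn.execute("COMMIT")
        else:
            await self._conn.execute("ROLLBACK")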

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Victor Stinner
Hi,

Guido van Rossum  python.org> writes:
> I'm slowly warming up to Greg's notion that you can't call a coroutine (or
whatever it's called) without a special keyword.

A huge part of the asyncio module is based on "yield from fut" where fut is
a Future object.

How do you write this using PEP 3152? Do you need to call an artificial
method like "cocall fut.return_self()", where the return_self() method simply
returns fut?


When I discovered Python for the first time, I fell into the trap of trying
to call a function without parentheses: "hello_world" instead of
"hello_world()". I was very surprised that the language didn't protect me
against such an obvious bug. But later, I used this feature in almost all my
applications: passing a callback is just a must-have feature, and Python's
syntax for this is great! (just pass the function without parentheses)

Would it be possible that creating a coroutine object by calling a coroutine
function is a feature, and not a bug? I mean that it may be useful in some
cases. I have worked a lot with asyncio and I saw a lot of hacks to solve some
corner-case issues, in order to be able to propose a nice API at the end.

@asyncio.coroutine currently calls a function and *then* checks whether it
should yield from it or not:

res = func(*args, **kw)
if isinstance(res, futures.Future) or inspect.isgenerator(res):
    res = yield from res

With PEP 3152, it's no longer possible to write such code. I fear that we'd
miss cases where it would be needed.

maybeDeferred() is an important function in Twisted. As expected, users ask
for a similar function in asyncio:

http://stackoverflow.com/questions/20730248/maybedeferred-analog-with-asyncio

Currently, it's possible to implement it using yield from.
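
A minimal sketch of such a helper on top of today's asyncio
(maybe_coroutine is a made-up name, mirroring the @asyncio.coroutine
snippet quoted above):

import asyncio
import inspect

@asyncio.coroutine
def maybe_coroutine(func, *args, **kw):
    # Call func; if it returned something awaitable (a Future or a
    # generator-based coroutine), wait for it, otherwise return the plain
    # value unchanged -- the moral equivalent of Twisted's maybeDeferred().
    res = func(*args, **kw)
    if isinstance(res, asyncio.Future) or inspect.isgenerator(res):
        res = yield from res
    return res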


> OTOH I'm still struggling with what you have to do to wrap a coroutine in
a Task, the way its done in asyncio by the Task() constructor, the
loop.create_task() method, and the async() function (perhaps to be renamed
to ensure_task() to make way for the async keyword).

Logging a warning when a coroutine object is not "consumed" ("yielded
from"?) is only one of the asyncio.CoroWrapper features. It's now also used
to remember where the coroutine object was created: it's very useful to
rebuild the chain of function calls/tasks/coroutines to show where the bug
comes from.

(I still have a project to enhance debugging to create a full stack where a
task was created. Currently, the stack stops at the last "Task._step()", but
it's technically possible to go further (I have a PoC somewhere). I already
introduced BaseEventLoop._current_handle as a first step.)

Oh, and CoroWrapper also provides a better representation. But we might
enhance repr(coroutine_object) directly in Python.

Yury proposed to store the source (filename, line number) of the most recent
frame where a coroutine object was created. But a single frame is not enough
(usually, the interesting frame is at least the 3rd frame, not the most
recent one). Storing more frames would kill performance in debug mode
(and/or create reference cycles if we keep frame objects, not only
filename+line number).

For all these reasons, I'm in favor of keeping the ability of wrapping
coroutine objects. It has a negligible impact in release mode and you can do
whatever you want in debug mode which is very convenient.

Victor



Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov

Hi PJ,

On 2015-04-22 3:44 PM, PJ Eby wrote:

On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov  wrote:

It is an error to pass a regular context manager without ``__aenter__``
and ``__aexit__`` methods to ``async with``.  It is a ``SyntaxError``
to use ``async with`` outside of a coroutine.

I find this a little weird.  Why not just have `with` and `for` inside
a coroutine dynamically check the iterator or context manager, and
either behave sync or async accordingly?  Why must there be a
*syntactic* difference?


One of the things that we try to avoid is to have implicit
places where code execution might be suspended. For that
we use 'yield from' right now, and want to use 'await' with
PEP 492.

To have implicit context switches there is Stackless Python
and greenlets, however, it's harder to reason about the code
written in such a way.  Having explicit 'yield from/await'
is the selling point of asyncio and other frameworks that
use generator-based coroutines.

Hence, we want to stress that 'async with' and 'async for'
do suspend the execution in their protocols.

I don't want to lose control over what kind of iteration
or context manager I'm using.  I don't want to iterate
through a cursor that doesn't do prefetching, I want to
make sure that it does.  This problem is solved by the PEP.



Not only would this simplify the syntax, it would also allow dropping
the need for `async` to be a true keyword, since functions could be
defined via "def async foo():" rather than "async def foo():"

...which, incidentally, highlights one of the things that's been
bothering me about all this "async foo" stuff: "async def" looks like
it *defines the function* asynchronously (as with "async with" and
"async for"), rather than defining an asynchronous function.  ISTM it
should be "def async bar():" or even "def bar() async:".


If we keep 'async with', then we'll have to keep 'async def'
to make it symmetric and easier to remember.  But, in theory,
I'd be OK with 'def async'.

'def name() async' is something that will be extremely hard
to notice in the code.



Also, even that seems suspect to me: if `await` looks for an __await__
method and simply returns the same object (synchronously) if the
object doesn't have an await method, then your code sample that
supposedly will fail if a function ceases to be a coroutine *will not
actually fail*.


It doesn't just do that. In the reference implementation, a
single 'await o' compiles to:

(o) # await arg on top of the stack
GET_AWAITABLE
LOAD_CONST None
YIELD_FROM

Where GET_AWAITABLE does the following:

- If it's a coroutine-object -- return it
- If it's an object with __await__, return iter(object.__await__())
- Raise a TypeError if the two steps above don't return
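
In pure Python the step described above is roughly (a sketch; the real
GET_AWAITABLE lives in the eval loop):

import inspect

def get_awaitable(o):
    if inspect.iscoroutine(o):                # a coroutine object: use as is
        return o
    if hasattr(type(o), '__await__'):
        return iter(type(o).__await__(o))     # an object implementing __await__
    raise TypeError("object %r can't be used in 'await' expression" % (o,))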

If you had code like this:

   await coro()

where coro is

async def coro(): pass

you can then certainly refactor coro to:

def coro(): return future # or some awaitable, please refer to PEP492

And it won't break anything.

So I'm not sure I understand your remark about
"*will not actually fail*".


In my experience working with coroutine systems, making a system
polymorphic (do something appropriate with what's given) and
idempotent (don't do anything if what's wanted is already done) makes
it more robust.  In particular, it eliminates the issue of mixing
coroutines and non-coroutines.


Unfortunately, to completely eliminate the issue of reusing
existing "non-coroutine" code, or of writing "coroutine" code
that can be used with "non-coroutine" code, you have to use
gevent-kind of libraries.



To sum up: I can see the use case for a new `await` distinguished from
`yield`, but I don't see the need to create new syntax for everything;
ISTM that adding the new asynchronous protocols and using them on
demand is sufficient.  Marking a function asynchronous so it can use
asynchronous iteration and context management seems reasonably useful,
but I don't think it's terribly important for the type of function
result.  Indeed, ISTM that the built-in `object` class could just
implement `__await__` as a no-op returning self, and then *all*
results are trivially asynchronous results and can be awaited
idempotently, so that awaiting something that has already been waited
for is a no-op.


I see all objects implementing __await__ returning "self" as
a very error-prone approach.  It's totally OK to write code
like that:

async def coro():
   return fut
future = await coro()

In the above example, if coro ceases to be a coroutine,
'future' will be the result of 'fut', not 'fut' itself.


  (Prior art: the Javascript Promise.resolve() method,
which takes either a promise or a plain value and returns a promise,
so that you can write code which is always-async in the presence of
values that may already be known.)

Finally, if the async for and with operations have to be distinguished
by syntax at the point of use (vs. just always being used in
coroutines), then ISTM that they should be `with async foo:` and `for
async x in bar:`, 

Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Ludovic Gasc
+1 on Andrew Svetlov's proposition: please help us migrate as smoothly as
possible to async/await.

--
Ludovic Gasc (GMLudo)
http://www.gmludo.eu/

2015-04-22 20:32 GMT+02:00 Andrew Svetlov :

> For now I can mix asyncio.coroutines and `async def` functions, I
> mean I can write `await f()` inside async def to call an
> asyncio.coroutine `f` and vice versa: I can use `yield from g()`
> inside an asyncio.coroutine to call `async def g(): ...`.
>
> If we forbid calling `async def` from regular code, how should asyncio
> work? I'd like to push `async def` everywhere in the asyncio API where
> asyncio.coroutine is required.
>
> On Wed, Apr 22, 2015 at 8:13 PM, Yury Selivanov 
> wrote:
> > Hi Rajiv,
> >
> > On 2015-04-22 12:53 PM, Rajiv Kumar wrote:
> >>
> >> I'd like to suggest another way around some of the issues here, with
> >> apologies if this has already been discussed sometime in the past.
> >>
> >>  From the viewpoint of a Python programmer, there are two distinct
> reasons
> >> for wanting to suspend execution in a block of code:
> >>
> >> 1. To yield a value from an iterator, as Python generators do today.
> >>
> >> 2. To cede control to the event loop while waiting for an asynchronous
> >> task
> >> to make progress in a coroutine.
> >>
> >> As of today both of these reasons to suspend are supported by the same
> >> underlying mechanism, i.e. a "yield" at the end of the chain of "yield
> >> from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the
> >> bottom
> >> of it all there's effectively still a yield as I understand it.
> >>
> >> I think that the fact that these two concepts use the same mechanism is
> >> what leads to the issues with coroutine-generators that Greg and Yury
> have
> >> raised.
> >>
> >> With that in mind, would it be possible to introduce a second form of
> >> suspension to Python to specifically handle the case of ceding to the
> >> event
> >> loop? I don't know what the implementation complexity of this would be,
> or
> >> if it's even feasible. But roughly speaking, the syntax for this could
> use
> >> "await", and code would look just like it does in the PEP. The semantics
> >> of
> >> "await " would be analogous to "yield from " today, with the
> >> difference that the Task would go up the chain of "await"s to the
> >> outermost
> >> caller, which would typically be asyncio, with some modifications from
> its
> >> form today. Progress would be made via __anext__ instead of __next__.
> >
> >
> > I think that what you propose is a great idea. However, its
> > implementation will be far more invasive than what PEP 492
> > proposes.  I doubt that we'll be able to make it in 3.5 if
> > we choose this route.
> >
> > BUT: With my latest proposal to disallow for..in loops and
> > iter()/list()-like builtins, the fact that coroutines are
> > based internally on generators is just an implementation
> > detail.
> >
> > There is no way users can exploit the underlying generator
> > object.  Coroutine-objects only provide 'send()' and 'throw()'
> > methods, which they would also have with your implementation
> > idea.
> >
> > This gives us freedom to consider your approach in 3.6 if
> > we decide to add coroutine-generators.  To make this work
> > we might want to patch inspect.py to make isgenerator() family
> > of functions to return False for coroutines/coroutine-objects.
> >
> > Thanks a lot for the feedback!
> >
> > Yury
>
>
>
> --
> Thanks,
> Andrew Svetlov


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Ethan Furman
On 04/22, PJ Eby wrote:

> tl;dr: I like the overall ideas but hate the syntax and type
> segregation involved: declaring a function async at the top is OK to
> enable async with/for semantics and await expressions, but the rest
> seems unnecessary and bad for writing robust code.  (e.g. note that
> requiring different syntax means a function must either duplicate code
> or restrict its input types more, and type changes in remote parts of
> the program will propagate syntax changes throughout.)

Agreed.

--
~Ethan~
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Andrew Svetlov
On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby  wrote:
> On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov  
> wrote:
>> It is an error to pass a regular context manager without ``__aenter__``
>> and ``__aexit__`` methods to ``async with``.  It is a ``SyntaxError``
>> to use ``async with`` outside of a coroutine.
>
> I find this a little weird.  Why not just have `with` and `for` inside
> a coroutine dynamically check the iterator or context manager, and
> either behave sync or async accordingly?  Why must there be a
> *syntactic* difference?

IIRC Guido has always liked having different syntax for calling regular
functions and coroutines.
That's why we need explicit syntax for asynchronous context managers
and iterators.
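
For reference, a minimal asynchronous context manager under the
__aenter__/__aexit__ protocol quoted above might look like the sketch
below (the class and its body are purely illustrative, not asyncio API):

import asyncio

class Transaction:
    # Both hooks are coroutines, so 'async with' can suspend while
    # acquiring and while releasing the resource.
    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        return False  # don't swallow exceptions

async def work():
    async with Transaction():
        pass  # use the acquired resource here

asyncio.get_event_loop().run_until_complete(work())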

>
> Not only would this simplify the syntax, it would also allow dropping
> the need for `async` to be a true keyword, since functions could be
> defined via "def async foo():" rather than "async def foo():"
>
> ...which, incidentally, highlights one of the things that's been
> bothering me about all this "async foo" stuff: "async def" looks like
> it *defines the function* asynchronously (as with "async with" and
> "async for"), rather than defining an asynchronous function.  ISTM it
> should be "def async bar():" or even "def bar() async:".
>
> Also, even that seems suspect to me: if `await` looks for an __await__
> method and simply returns the same object (synchronously) if the
> object doesn't have an await method, then your code sample that
> supposedly will fail if a function ceases to be a coroutine *will not
> actually fail*.
>
> In my experience working with coroutine systems, making a system
> polymorphic (do something appropriate with what's given) and
> idempotent (don't do anything if what's wanted is already done) makes
> it more robust.  In particular, it eliminates the issue of mixing
> coroutines and non-coroutines.
>
> To sum up: I can see the use case for a new `await` distinguished from
> `yield`, but I don't see the need to create new syntax for everything;
> ISTM that adding the new asynchronous protocols and using them on
> demand is sufficient.  Marking a function asynchronous so it can use
> asynchronous iteration and context management seems reasonably useful,
> but I don't think it's terribly important for the type of function
> result.  Indeed, ISTM that the built-in `object` class could just
> implement `__await__` as a no-op returning self, and then *all*
> results are trivially asynchronous results and can be awaited
> idempotently, so that awaiting something that has already been waited
> for is a no-op.  (Prior art: the Javascript Promise.resolve() method,
> which takes either a promise or a plain value and returns a promise,
> so that you can write code which is always-async in the presence of
> values that may already be known.)
>
> Finally, if the async for and with operations have to be distinguished
> by syntax at the point of use (vs. just always being used in
> coroutines), then ISTM that they should be `with async foo:` and `for
> async x in bar:`, since the asynchronousness is just an aspect of how
> the main keyword is executed.
>
> tl;dr: I like the overall ideas but hate the syntax and type
> segregation involved: declaring a function async at the top is OK to
> enable async with/for semantics and await expressions, but the rest
> seems unnecessary and bad for writing robust code.  (e.g. note that
> requiring different syntax means a function must either duplicate code
> or restrict its input types more, and type changes in remote parts of
> the program will propagate syntax changes throughout.)
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: 
> https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com



-- 
Thanks,
Andrew Svetlov
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread PJ Eby
On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov  wrote:
> It is an error to pass a regular context manager without ``__aenter__``
> and ``__aexit__`` methods to ``async with``.  It is a ``SyntaxError``
> to use ``async with`` outside of a coroutine.

I find this a little weird.  Why not just have `with` and `for` inside
a coroutine dynamically check the iterator or context manager, and
either behave sync or async accordingly?  Why must there be a
*syntactic* difference?

Not only would this simplify the syntax, it would also allow dropping
the need for `async` to be a true keyword, since functions could be
defined via "def async foo():" rather than "async def foo():"

...which, incidentally, highlights one of the things that's been
bothering me about all this "async foo" stuff: "async def" looks like
it *defines the function* asynchronously (as with "async with" and
"async for"), rather than defining an asynchronous function.  ISTM it
should be "def async bar():" or even "def bar() async:".

Also, even that seems suspect to me: if `await` looks for an __await__
method and simply returns the same object (synchronously) if the
object doesn't have an await method, then your code sample that
supposedly will fail if a function ceases to be a coroutine *will not
actually fail*.

In my experience working with coroutine systems, making a system
polymorphic (do something appropriate with what's given) and
idempotent (don't do anything if what's wanted is already done) makes
it more robust.  In particular, it eliminates the issue of mixing
coroutines and non-coroutines.

To sum up: I can see the use case for a new `await` distinguished from
`yield`, but I don't see the need to create new syntax for everything;
ISTM that adding the new asynchronous protocols and using them on
demand is sufficient.  Marking a function asynchronous so it can use
asynchronous iteration and context management seems reasonably useful,
but I don't think it's terribly important for the type of function
result.  Indeed, ISTM that the built-in `object` class could just
implement `__await__` as a no-op returning self, and then *all*
results are trivially asynchronous results and can be awaited
idempotently, so that awaiting something that has already been waited
for is a no-op.  (Prior art: the Javascript Promise.resolve() method,
which takes either a promise or a plain value and returns a promise,
so that you can write code which is always-async in the presence of
values that may already be known.)
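
A toy model of that idea, using the __await__ protocol from the PEP: the
class below is purely illustrative, not a proposal for `object` itself;
it only shows how an "idempotent await" of an already-known value would
behave.

import asyncio

class Plain:
    # Awaiting a Plain value hands the value straight back without
    # suspending -- the no-op behaviour described above.
    def __init__(self, value):
        self.value = value

    def __await__(self):
        return self.value
        yield  # unreachable; only makes __await__ a generator

async def demo():
    print(await Plain(42))  # prints 42 with no trip to the event loop

asyncio.get_event_loop().run_until_complete(demo())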

Finally, if the async for and with operations have to be distinguished
by syntax at the point of use (vs. just always being used in
coroutines), then ISTM that they should be `with async foo:` and `for
async x in bar:`, since the asynchronousness is just an aspect of how
the main keyword is executed.

tl;dr: I like the overall ideas but hate the syntax and type
segregation involved: declaring a function async at the top is OK to
enable async with/for semantics and await expressions, but the rest
seems unnecessary and bad for writing robust code.  (e.g. note that
requiring different syntax means a function must either duplicate code
or restrict its input types more, and type changes in remote parts of
the program will propagate syntax changes throughout.)
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Andrew Svetlov
On Wed, Apr 22, 2015 at 10:24 PM, Yury Selivanov
 wrote:
>
>
> On 2015-04-22 2:53 PM, Andrew Svetlov wrote:
>>
>> On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov 
>> wrote:
>
> [...]
>>>
>>>
 If we forbid calling `async def` functions from regular code, how should
 asyncio work? I'd like to use `async def` everywhere in the asyncio API
 where asyncio.coroutine is currently required.
>>>
>>>
>>> You'll have to use a wrapper that will do the following:
>>>
>>> async def foo():
>>>     return 'spam'
>>>
>>> @asyncio.coroutine
>>> def bar():
>>>     what = yield from foo.__await__(foo, *args, **kwargs)
>>>     # OR:
>>>     what = yield from await_call(foo, *args, **kwargs)
>>>
>> If I cannot directly use `yield from f()` with `async def f():`, then
>> almost every `yield from` inside the asyncio library would have to be
>> wrapped in `await_call()`. Every third-party asyncio-based library
>> would have to do the same.
>>
>> Also, I expect performance degradation from the `await_call()` calls.
>>
>

> I think there is another way... instead of pushing
>
> GET_ITER
> ...
> YIELD_FROM
>
> opcodes, we'll need to replace GET_ITER with another one:
>
> GET_ITER_SPECIAL
> ...
> YIELD_FROM
>
>
> Where "GET_ITER_SPECIAL (obj)" (just a working name) would check
> that if the current code object has CO_COROUTINE and the
> object that you will yield-from has it as well, it would
> push to the stack the result of (obj.__await__())
>
GET_ITER_SPECIAL sounds better than a wrapper around the `coro.__await__()` call.

> Yury



-- 
Thanks,
Andrew Svetlov
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov



On 2015-04-22 2:53 PM, Andrew Svetlov wrote:

On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov  wrote:

[...]



If we forbid calling `async def` functions from regular code, how should
asyncio work? I'd like to use `async def` everywhere in the asyncio API
where asyncio.coroutine is currently required.


You'll have to use a wrapper that will do the following:

async def foo():
    return 'spam'

@asyncio.coroutine
def bar():
    what = yield from foo.__await__(foo, *args, **kwargs)
    # OR:
    what = yield from await_call(foo, *args, **kwargs)


If I cannot directly use `yield from f()` with `async def f():`, then
almost every `yield from` inside the asyncio library would have to be
wrapped in `await_call()`. Every third-party asyncio-based library
would have to do the same.

Also, I expect performance degradation from the `await_call()` calls.



I think there is another way... instead of pushing

GET_ITER
...
YIELD_FROM

opcodes, we'll need to replace GET_ITER with another one:

GET_ITER_SPECIAL
...
YIELD_FROM


Where "GET_ITER_SPECIAL (obj)" (just a working name) would check
that if the current code object has CO_COROUTINE and the
object that you will yield-from has it as well, it would
push to the stack the result of (obj.__await__())
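
For reference, the opcode pair in question can be seen by running dis on
any 'yield from' delegation; GET_ITER_SPECIAL is only a working name, not
an existing opcode, and the exact opcode names vary by CPython version:

import dis

def delegate(sub):
    result = yield from sub
    return result

dis.dis(delegate)  # on current CPython: GET_ITER ... LOAD_CONST ... YIELD_FROM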

Yury
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Andrew Svetlov
On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov  wrote:
> Andrew,
>
> On 2015-04-22 2:32 PM, Andrew Svetlov wrote:
>>
>> For now I can mix asyncio.coroutines and `async def` functions: I can
>> write `await f()` inside an async def to call an asyncio.coroutine `f`,
>> and vice versa I can use `yield from g()` inside an asyncio.coroutine to
>> call `async def g(): ...`.
>
>
> That's another good point that I forgot to add to the list.
> Thanks for bringing this up.
>
>>
>> If we forbid calling `async def` functions from regular code, how should
>> asyncio work? I'd like to use `async def` everywhere in the asyncio API
>> where asyncio.coroutine is currently required.
>
>

> You'll have to use a wrapper that will do the following:
>
> async def foo():
>     return 'spam'
>
> @asyncio.coroutine
> def bar():
>     what = yield from foo.__await__(foo, *args, **kwargs)
>     # OR:
>     what = yield from await_call(foo, *args, **kwargs)
>
If I cannot directly use `yield from f()` with `async def f():`, then
almost every `yield from` inside the asyncio library would have to be
wrapped in `await_call()`. Every third-party asyncio-based library
would have to do the same.

Also, I expect performance degradation from the `await_call()` calls.

> Thanks,
> Yury



-- 
Thanks,
Andrew Svetlov
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov

Andrew,

On 2015-04-22 2:32 PM, Andrew Svetlov wrote:

For now I can mix asyncio.coroutines and `async def` functions: I can
write `await f()` inside an async def to call an asyncio.coroutine `f`,
and vice versa I can use `yield from g()` inside an asyncio.coroutine to
call `async def g(): ...`.


That's another good point that I forgot to add to the list.
Thanks for bringing this up.



If we forbid calling `async def` functions from regular code, how should
asyncio work? I'd like to use `async def` everywhere in the asyncio API
where asyncio.coroutine is currently required.


You'll have to use a wrapper that will do the following:

async def foo():
    return 'spam'

@asyncio.coroutine
def bar():
    what = yield from foo.__await__(foo, *args, **kwargs)
    # OR:
    what = yield from await_call(foo, *args, **kwargs)
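
For concreteness, one possible shape for that await_call() helper -- a
rough sketch only, assuming the coroutine object can still be created by
calling func() and exposes an __await__ method, which is one of the open
questions above:

import asyncio

@asyncio.coroutine
def await_call(func, *args, **kwargs):
    coro = func(*args, **kwargs)          # create the coroutine object
    return (yield from coro.__await__())  # delegate to its awaitable form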

Thanks,
Yury
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Andrew Svetlov
For now I can mix asyncio.coroutines and `async def` functions: I can
write `await f()` inside an async def to call an asyncio.coroutine `f`,
and vice versa I can use `yield from g()` inside an asyncio.coroutine to
call `async def g(): ...`.

If we forbid calling `async def` functions from regular code, how should
asyncio work? I'd like to use `async def` everywhere in the asyncio API
where asyncio.coroutine is currently required.

On Wed, Apr 22, 2015 at 8:13 PM, Yury Selivanov  wrote:
> Hi Rajiv,
>
> On 2015-04-22 12:53 PM, Rajiv Kumar wrote:
>>
>> I'd like to suggest another way around some of the issues here, with
>> apologies if this has already been discussed sometime in the past.
>>
>>  From the viewpoint of a Python programmer, there are two distinct reasons
>> for wanting to suspend execution in a block of code:
>>
>> 1. To yield a value from an iterator, as Python generators do today.
>>
>> 2. To cede control to the event loop while waiting for an asynchronous
>> task
>> to make progress in a coroutine.
>>
>> As of today both of these reasons to suspend are supported by the same
>> underlying mechanism, i.e. a "yield" at the end of the chain of "yield
>> from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the
>> bottom
>> of it all there's effectively still a yield as I understand it.
>>
>> I think that the fact that these two concepts use the same mechanism is
>> what leads to the issues with coroutine-generators that Greg and Yury have
>> raised.
>>
>> With that in mind, would it be possible to introduce a second form of
>> suspension to Python to specifically handle the case of ceding to the
>> event
>> loop? I don't know what the implementation complexity of this would be, or
>> if it's even feasible. But roughly speaking, the syntax for this could use
>> "await", and code would look just like it does in the PEP. The semantics
>> of
>> "await " would be analogous to "yield from " today, with the
>> difference that the Task would go up the chain of "await"s to the
>> outermost
>> caller, which would typically be asyncio, with some modifications from its
>> form today. Progress would be made via __anext__ instead of __next__.
>
>
> I think that what you propose is a great idea. However, its
> implementation will be far more invasive than what PEP 492
> proposes.  I doubt that we'll be able to make it in 3.5 if
> we choose this route.
>
> BUT: With my latest proposal to disallow for..in loops and
> iter()/list()-like builtins, the fact that coroutines are
> based internally on generators is just an implementation
> detail.
>
> There is no way users can exploit the underlying generator
> object.  Coroutine-objects only provide 'send()' and 'throw()'
> methods, which they would also have with your implementation
> idea.
>
> This gives us freedom to consider your approach in 3.6 if
> we decide to add coroutine-generators.  To make this work
> we might want to patch inspect.py so that the isgenerator() family
> of functions returns False for coroutines/coroutine-objects.
>
> Thanks a lot for the feedback!
>
> Yury
> ___
> Python-Dev mailing list
> Python-Dev@python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com



-- 
Thanks,
Andrew Svetlov
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] typeshed for 3rd party packages

2015-04-22 Thread Guido van Rossum
On Wed, Apr 22, 2015 at 9:26 AM, Ian Cordasco 
wrote:

> As the other maintainer of requests, I think having hints *might* help
> some developers, but looking at what Cory generated (which looks to be
> valid), I'm wondering about something else with Type Hints.
>
> I've heard several people say "Just create an aliased type for the hint so
> it's shorter!" but doesn't that mean we then have to document that alias
> for our users? I mean if the IDE suggests that the developer use XYZ for
> this parameter and there's no explanation of what XYZ actually is (in the IDE),
> doesn't this just add a lot more maintenance to adding hints? Maintainers
> now have to:
>
> - Keep the stubs up-to-date
> - Document the stubs  (and if the stubs are in typeshed, does $MyPackage
> link to the docs in typeshed to avoid users filing bugs on $MyPackage's
> issue tracker?)
> - Version the stubs (assuming they're maintained in a third-party
> location, e.g., typeshed)
>
> Don't get me wrong. I really like the idea of moving towards Type Hints.
> I'm not even particularly against adding type hints for Requests to
> typeshed. I'm just skeptical that it will be easy to make them entirely
> accurate.
>

To be useful for the users of a package, type aliases need to be exported
by the package, which means that the package itself grows a dependency on
typing.py. You could probably make that a conditional dependency, e.g.

try:
  from typing import Union, Tuple, AnyStr, Optional
  HeaderTuple = Union[Tuple[AnyStr, AnyStr],
                      Tuple[AnyStr, AnyStr, Optional[AnyStr]]]
  # etc.
except ImportError:
  pass  # Don't define type aliases

and use a stub file for the actual signatures. User code that itself has a
hard dependency on typing could import and use the type aliases
unconditionally; user code with a conditional dependency on typing should
stick to stubs (or similar import hacks).

If you use type hints this way you should probably maintain the stubs as
part of your package (as .pyi files living alongside the .py files) so that
you don't have to deal with typeshed being out of date.
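
For illustration, such a stub might look roughly like this; every name
below is made up for the example and is not Requests' actual API:

# sessions.pyi -- shipped alongside sessions.py
from typing import AnyStr, Iterable, Optional, Tuple, Union

HeaderTuple = Union[Tuple[AnyStr, AnyStr],
                    Tuple[AnyStr, AnyStr, Optional[AnyStr]]]

def request(method: str, url: str,
            headers: Optional[Iterable[HeaderTuple]] = ...) -> object: ...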

There are many other possible workflows; we haven't discovered the best
one(s) yet. It's a work in progress.

-- 
--Guido van Rossum (python.org/~guido)
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] typeshed for 3rd party packages

2015-04-22 Thread Guido van Rossum
On Wed, Apr 22, 2015 at 9:12 AM, Skip Montanaro 
wrote:

>
> On Wed, Apr 22, 2015 at 10:43 AM, Guido van Rossum 
> wrote:
>
>> For Requests, it looks like it may be better not to have stubs at all.
>
>
> Can you expand on this? Why would Requests be any different than any other
> module/package?
>

Did you see the crazy union that Cory came up with for one of the
parameters of get() in another thread? Requests assumes duck typing for
most of its arguments *and* it handles different forms of many arguments
(e.g. you can specify an iterable or a mapping).

Requests is different for a number of reasons; it has evolved for years to
support many different use cases with a few methods, so it is using every
trick in the book from dynamic typing.

Type checking (using the current state of the tools) is probably more
useful for "business logic" code than for "library" code. We'll iterate
over the next few Python versions (and even while Python 3.5 is out we can
tweak typing.py because it's provisional in the PEP 411 sense) and
eventually it may be possible to provide stubs for Requests. But since PEP
484 doesn't address duck typing it's simply too early. (As I said
elsewhere, duck typing is an important next step, but we need PEP 484 as
the foundation first).


>
> As for versioning, I think stub files would absolutely have to declare the
> appropriate version(s) to which they apply (probably via embedded
> directives), so type checkers can ignore them when necessary. That also
> means that type checkers must be able to figure out the version of the
> package used by the application being analyzed.
>
> Not sure I'm being too clear, so I will provide an example. I have app
> "myapp" which imports module "yourmod" v 1.2.7. The yourmod author hasn't
> yet provided type annotations, so someone else contributed a stub to the
> typeshed. Time passes and a new version of "yourmod" comes out, v 2.0.0.
> (Semantic versioning tells us the API has changed in an incompatible way
> because of the major version bump.) I decide I need some of its new
> features and update "myapp". There is no new stub file in the typeshed yet.
> When I run my fancy type checker (someone suggested I will shortly have 50
> to choose from!), it needs to recognize that the stub no longer matches the
> version of "yourmod" I am using, and must ignore it.
>
> Does that suggest the typeshed needs some sort of structure which allows
> all versions of stubs for the same package to be gathered together?
>
> My apologies if I'm following along way behind the curve.
>

No, I think you can start filing (or adding your view to) issues with the
typeshed tracker: https://github.com/JukkaL/typeshed/issues

-- 
--Guido van Rossum (python.org/~guido)
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov

Hi Rajiv,

On 2015-04-22 12:53 PM, Rajiv Kumar wrote:

I'd like to suggest another way around some of the issues here, with
apologies if this has already been discussed sometime in the past.

 From the viewpoint of a Python programmer, there are two distinct reasons
for wanting to suspend execution in a block of code:

1. To yield a value from an iterator, as Python generators do today.

2. To cede control to the event loop while waiting for an asynchronous task
to make progress in a coroutine.

As of today both of these reasons to suspend are supported by the same
underlying mechanism, i.e. a "yield" at the end of the chain of "yield
from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom
of it all there's effectively still a yield as I understand it.

I think that the fact that these two concepts use the same mechanism is
what leads to the issues with coroutine-generators that Greg and Yury have
raised.

With that in mind, would it be possible to introduce a second form of
suspension to Python to specifically handle the case of ceding to the event
loop? I don't know what the implementation complexity of this would be, or
if it's even feasible. But roughly speaking, the syntax for this could use
"await", and code would look just like it does in the PEP. The semantics of
"await " would be analogous to "yield from " today, with the
difference that the Task would go up the chain of "await"s to the outermost
caller, which would typically be asyncio, with some modifications from its
form today. Progress would be made via __anext__ instead of __next__.


I think that what you propose is a great idea. However, its
implementation will be far more invasive than what PEP 492
proposes.  I doubt that we'll be able to make it in 3.5 if
we choose this route.

BUT: With my latest proposal to disallow for..in loops and
iter()/list()-like builtins, the fact that coroutines are
based internally on generators is just an implementation
detail.

There is no way users can exploit the underlying generator
object.  Coroutine-objects only provide 'send()' and 'throw()'
methods, which they would also have with your implementation
idea.

This gives us freedom to consider your approach in 3.6 if
we decide to add coroutine-generators.  To make this work
we might want to patch inspect.py so that the isgenerator() family
of functions returns False for coroutines/coroutine-objects.
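
A sketch of that inspect.py tweak, assuming coroutine code objects carry
a CO_COROUTINE flag as proposed (the flag value below is a placeholder,
since nothing is fixed yet):

import inspect

CO_COROUTINE = 0x0080  # assumed value, purely for illustration

def isgenerator_not_coroutine(obj):
    # What a patched inspect.isgenerator() might check: a generator
    # object whose code object is *not* flagged as a coroutine.
    return (inspect.isgenerator(obj)
            and not obj.gi_code.co_flags & CO_COROUTINE)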

Thanks a lot for the feedback!
Yury
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available

2015-04-22 Thread Steve Dower
Whoops, sorry. Yeah, I knew about that behavior, but in hindsight it's 
obviously going to be a surprise for others :)

One plain zip file coming up for the next release. All the binaries will be 
signed and paranoid people can check the GPG sig and embed their own hashes if 
not the files themselves.

Cheers,
Steve

Top-posted from my Windows Phone

From: Paul Moore
Sent: 4/22/2015 5:48
To: Steve Dower
Cc: Larry Hastings; Python Dev
Subject: Re: [python-committers] [RELEASED] Python 3.5.0a4 is now available

On 22 April 2015 at 13:45, Paul Moore  wrote:
> On 21 April 2015 at 23:05, Steve Dower  wrote:
>> I made it a self-extracting RAR file so it could be signed, but I've already 
>> had multiple people query it so the next release will probably just be a 
>> plain ZIP file. I just need to figure out some reliable way of validating 
>> the download other than GPG, since I'd like installers to be able to do the 
>> download transparently and ideally without hard-coding hash values. I might 
>> add a CSV of SHA hashes to the zip too.
>
> You could probably just leave it as is (or make it a self-extracting
> zip file) and just describe it on the web page as "Windows amd64
> embeddable self-extracting archive". People are (I think) pretty used
> to the idea that they can open a self-extracting archive in tools like
> 7-zip, so those who didn't want to run the exe could do that (and
> would know they could). Obviously extracting that way you don't get
> the signature check, but that's to be expected.

Whoops, no - I changed my mind. If you double click on the downloaded
file (which I just did) it unpacks it into the directory you
downloaded the exe to, with no option to put it anywhere else, and no
UI telling you what it's doing. That's going to annoy people badly.
Better make it a simple zipfile in that case.

Paul (off to tidy up his download directory :-()
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Rajiv Kumar
I'd like to suggest another way around some of the issues here, with
apologies if this has already been discussed sometime in the past.

From the viewpoint of a Python programmer, there are two distinct reasons
for wanting to suspend execution in a block of code:

1. To yield a value from an iterator, as Python generators do today.

2. To cede control to the event loop while waiting for an asynchronous task
to make progress in a coroutine.

As of today both of these reasons to suspend are supported by the same
underlying mechanism, i.e. a "yield" at the end of the chain of "yield
from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom
of it all there's effectively still a yield as I understand it.

I think that the fact that these two concepts use the same mechanism is
what leads to the issues with coroutine-generators that Greg and Yury have
raised.

With that in mind, would it be possible to introduce a second form of
suspension to Python to specifically handle the case of ceding to the event
loop? I don't know what the implementation complexity of this would be, or
if it's even feasible. But roughly speaking, the syntax for this could use
"await", and code would look just like it does in the PEP. The semantics of
"await " would be analogous to "yield from " today, with the
difference that the Task would go up the chain of "await"s to the outermost
caller, which would typically be asyncio, with some modifications from its
form today. Progress would be made via __anext__ instead of __next__.

Again, this might be impossible to do, but the mental model for the Python
programmer becomes cleaner, I think. Most of the issues around combining
generators and coroutines would go away - you could freely use "await"
inside a generator since it cedes control to the event loop, not the caller
of the generator. All of the "async def"/"await" examples in PEP 492 would
work as is. It might also make it easier in the future to add support for
async calls insider __getattr__ etc.

Thanks for reading!
Rajiv
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] typeshed for 3rd party packages

2015-04-22 Thread Ian Cordasco
On Wed, Apr 22, 2015 at 11:30 AM, Skip Montanaro 
wrote:

>
>
> On Wed, Apr 22, 2015 at 11:26 AM, Ian Cordasco  wrote:
>
>> On a separate thread Cory provided an example of what the hints would
>> look like for *part* of one function in the requests public functional API.
>>
>
> Thanks. That encouraged me to look around for recent posts from Cory.
> Wow...
>

You're welcome! And yeah, that union Cory posted was for *one* parameter,
if I remember correctly. I won't speak for Cory, but I'm not against the
type hints in PEP 484; they will just be difficult for us as a project.
They'll be marginally less difficult for me in a different project of mine.

I also wonder about importing type definitions from other packages. The
Requests-Toolbelt adds a few features that are enhanced versions of what's
already in Requests. I can think of a few type hints that we might create
to represent certain parameters, but I don't want to have to copy those for
the features in the Requests-Toolbelt. I would expect this to "Just Work",
but I wonder if anyone else has considered the possibility of this being a
need.

Cheers,
Ian
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] typeshed for 3rd party packages

2015-04-22 Thread Ian Cordasco
On Wed, Apr 22, 2015 at 11:12 AM, Skip Montanaro 
wrote:

>
> On Wed, Apr 22, 2015 at 10:43 AM, Guido van Rossum 
> wrote:
>
>> For Requests, it looks like it may be better not to have stubs at all.
>
>
> Can you expand on this? Why would Requests be any different than any other
> module/package?
>
>
On a separate thread Cory provided an example of what the hints would look
like for *part* of one function in the requests public functional API.
While our API is outwardly simple, the values we accept in certain cases
are actually non-trivially represented. Getting the hints *exactly* correct
would be extraordinarily difficult.


> As for versioning, I think stub files would absolutely have to declare the
> appropriate version(s) to which they apply (probably via embedded
> directives), so type checkers can ignore them when necessary. That also
> means that type checkers must be able to figure out the version of the
> package used by the application being analyzed.
>
> Not sure I'm being too clear, so I will provide an example. I have app
> "myapp" which imports module "yourmod" v 1.2.7. The yourmod author hasn't
> yet provided type annotations, so someone else contributed a stub to the
> typeshed. Time passes and a new version of "yourmod" comes out, v 2.0.0.
> (Semantic versioning tells us the API has changed in an incompatible way
> because of the major version bump.) I decide I need some of its new
> features and update "myapp". There is no new stub file in the typeshed yet.
> When I run my fancy type checker (someone suggested I will shortly have 50
> to choose from!), it needs to recognize that the stub no longer matches the
> version of "yourmod" I am using, and must ignore it.
>
>
Which of course also assumes that the author of that library is even using
Semantic Versioning (which is not a universal release strategy, even in the
Ruby community). I understand where you're coming from, but I think this is
a reason why typeshed as a catch-all for third-party type hints may
not be feasible.


>
> Does that suggest the typeshed needs some sort of structure which allows
> all versions of stubs for the same package to be gathered together?
>
> My apologies if I'm following along way behind the curve.
>

No need to apologize. =)

As the other maintainer of requests, I think having hints *might* help some
developers, but looking at what Cory generated (which looks to be valid),
I'm wondering about something else with Type Hints.

I've heard several people say "Just create an aliased type for the hint so
it's shorter!" but doesn't that mean we then have to document that alias
for our users? I mean if the IDE suggests that the developer use XYZ for
this parameter and there's no explanation of what XYZ actually is (in the IDE),
doesn't this just add a lot more maintenance to adding hints? Maintainers
now have to:

- Keep the stubs up-to-date
- Document the stubs  (and if the stubs are in typeshed, does $MyPackage
link to the docs in typeshed to avoid users filing bugs on $MyPackage's
issue tracker?)
- Version the stubs (assuming they're maintained in a third-party location,
e.g., typeshed)

Don't get me wrong. I really like the idea of moving towards Type Hints.
I'm not even particularly against adding type hints for Requests to
typeshed. I'm just skeptical that it will be easy to make them entirely
accurate.

Cheers,
Ian
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov

Hi Guido,

On 2015-04-22 11:50 AM, Guido van Rossum wrote:

On Wed, Apr 22, 2015 at 8:40 AM, Yury Selivanov 
wrote:

On the one hand I like your idea to disallow calling
coroutines without a special keyword (await in case of
PEP 492).  It has downsides, but there is some
elegance in it.  On the other hand, I hate the idea
of grammatically requiring parentheses for 'await'
expressions.  That feels non-Pythonic to me.

I'd be happy to hear others opinion on this topic.


I'm slowly warming up to Greg's notion that you can't call a coroutine (or
whatever it's called) without a special keyword. This makes a whole class
of bugs obvious the moment the code is executed.

OTOH I'm still struggling with what you have to do to wrap a coroutine in a
Task, the way it's done in asyncio by the Task() constructor, the
loop.create_task() method, and the async() function (perhaps to be renamed
to ensure_task() to make way for the async keyword).


If we apply Greg's ideas to PEP 492 we will have the following
(Greg, correct me if I'm wrong):

1. '_typeobject' struct will get a new field 'tp_await'. We can
reuse 'tp_reserved' for that probably.

2. We'll hack Gen(/ceval.c?) objects to raise an error if they
are called directly and have a 'CO_COROUTINE' flag.

3. Task(), create_task() and async() will be modified to call
'coro.__await__(..)' if 'coro' has a 'CO_COROUTINE' flag.

4. 'await' will require parentheses grammatically. That will
make it different from a 'yield' expression. For instance,
I still don't know what 'await coro(123)()' would mean.

5. 'await foo(*a, **k)' will be an equivalent to
'yield from type(coro).__await__(coro, *a, **k)'

6. If we ever decide to implement coroutine-generators --
async def functions with 'await' *and* some form of 'yield' --
we'll need to reverse the rule -- allow __call__ and
disallow __await__ on such objects (so that you'll be able
to write 'async for item in coro_gen()' instead of
'async for item in await coro_gen()').


To be honest, that's a lot of steps and hacks to make this
concept work.  I think that 'set_coroutine_wrapper()'
solves all these problems while keeping the grammar and
implementation simpler.  Moreover, it allows adding
some additional features to the wrapped coroutines,
such as nicer repr() in debug mode (CoroWrapper in
asyncio already does that) and other runtime checks.

Thanks,
Yury
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] typeshed for 3rd party packages

2015-04-22 Thread Skip Montanaro
On Wed, Apr 22, 2015 at 10:43 AM, Guido van Rossum  wrote:

> For Requests, it looks like it may be better not to have stubs at all.


Can you expand on this? Why would Requests be any different than any other
module/package?

As for versioning, I think stub files would absolutely have to declare the
appropriate version(s) to which they apply (probably via embedded
directives), so type checkers can ignore them when necessary. That also
means that type checkers must be able to figure out the version of the
package used by the application being analyzed.

Not sure I'm being too clear, so I will provide an example. I have app
"myapp" which imports module "yourmod" v 1.2.7. The yourmod author hasn't
yet provided type annotations, so someone else contributed a stub to the
typeshed. Time passes and a new version of "yourmod" comes out, v 2.0.0.
(Semantic versioning tells us the API has changed in an incompatible way
because of the major version bump.) I decide I need some of its new
features and update "myapp". There is no new stub file in the typeshed yet.
When I run my fancy type checker (someone suggested I will shortly have 50
to choose from!), it needs to recognize that the stub no longer matches the
version of "yourmod" I am using, and must ignore it.

Does that suggest the typeshed needs some sort of structure which allows
all versions of stubs for the same package to be gathered together?

My apologies if I'm following along way behind the curve.

Skip
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Guido van Rossum
On Wed, Apr 22, 2015 at 8:40 AM, Yury Selivanov 
wrote:
>
> On the one hand I like your idea to disallow calling
> coroutines without a special keyword (await in case of
> PEP 492).  It has downsides, but there is some
> elegance in it.  On the other hand, I hate the idea
> of grammatically requiring parentheses for 'await'
> expressions.  That feels non-Pythonic to me.
>
> I'd be happy to hear others opinion on this topic.
>

I'm slowly warming up to Greg's notion that you can't call a coroutine (or
whatever it's called) without a special keyword. This makes a whole class
of bugs obvious the moment the code is executed.

OTOH I'm still struggling with what you have to do to wrap a coroutine in a
Task, the way it's done in asyncio by the Task() constructor, the
loop.create_task() method, and the async() function (perhaps to be renamed
to ensure_task() to make way for the async keyword).

-- 
--Guido van Rossum (python.org/~guido)
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] typeshed for 3rd party packages

2015-04-22 Thread Guido van Rossum
I definitely think that we shouldn't jump the gun here, and that we should
tread carefully.
Both Marc-André and Cory brought up good things to watch out for.

For closed-source software the only way to obtain stubs is presumably from
the author, if they care. As Gregory Smith said in another thread, the
tooling will have to prove itself to the point where library developers
*want* to use it.

For Requests, it looks like it may be better not to have stubs at all. You
can assign any and all tracker issues asking for stubs to me, I'll gladly
explain why in Requests' case it's better to live without stubs.

On Wed, Apr 22, 2015 at 4:22 AM, Cory Benfield  wrote:

> On 22 April 2015 at 11:46, M.-A. Lemburg  wrote:
> > Unlike with translations, where missing or poor ones don't have
> > much effect on the usefulness of the software, a type checker
> > would complain loudly and probably show lots of false positives
> > (if you read a type bug as "positive"), causing the usual complaints
> > by users to the software authors.
> >
> > I don't really think that users would instead complain to the type
> > checker authors or find the actual source of the problem which are
> > the broken stub files.
>
> This is my expectation as well.
>
> Requests receives bug reports for bugs that were introduced by
> downstream packagers, or for bugs that are outright in unrelated
> projects. I field IRC questions about 'requests bugs' that are
> actually bugs in the web application on the other end of the HTTP
> connection! I can *guarantee* that if a stub file is bad, I'll get
> told, not the author of the stub file.
>
>
> > OTOH, if the type checkers are written in a way where they can
> > detect authoritative stubs compared to non-authoritative ones
> > and point users to the possible type stub file problem, this
> > could be resolved, I guess.
> >
> > The stub files would then need an "authoritative" flag and
> > probably also be versioned to get this working.
>
> This would be great: +1.
>
> > As I've explained above, in my experience, people (*) often first go
> > to the authors of the software and not do research to find out
> > that the tool they were using has a problem (via the non-authoritative
> > stub files it's using).
> >
> > (*) More experienced users of pylint like tools will probably think
> > twice due to the many false positives these tools tend to generate.
> > I'm not sure whether people using type checkers would have the same
> > approach, though, esp. not if they are coming from the land of
> > statically typed languages.
>
> I can back this up, as can a search through Requests' past GitHub
> issues. pylint in particular has caused me pain due to at least one
> GitHub issue that was nothing more than a dump of pylint output when
> run over Requests, where *every single line* was a false positive.
>
> This ends up having negative network effects as well, because the next
> time someone opens a GitHub issue with the word 'pylint' in it I'm
> simply going to close it, rather than waste another 45 minutes
> visually confirming that every line is a false positive. I worry that
> stub files will have the same problems.
>



-- 
--Guido van Rossum (python.org/~guido)
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] async/await in Python; v2

2015-04-22 Thread Yury Selivanov

Hi Greg,

On 2015-04-22 2:05 AM, Greg Ewing wrote:

Yury Selivanov wrote:

1. CO_ASYNC flag was renamed to CO_COROUTINE;

2. sys.set_async_wrapper() was renamed to
   sys.set_coroutine_wrapper();

3. New function: sys.get_coroutine_wrapper();

4. types.async_def() renamed to types.coroutine();


I still don't like the idea of hijacking the generic
term "coroutine" and using it to mean this particular
type of object.


In my opinion it's OK.  We propose a new dedicated syntax
to define coroutines.  The old approach of using generators to
implement coroutines will eventually become obsolete.

That's why, in PEP 492, we call 'async def' functions
"coroutines", and the ones that are defined with generators
"generator-based coroutines".  You can also have
"greenlets-based coroutines" and "stackless coroutines",
but those aren't native Python concepts.

I'm not sure if you can apply the term "cofunctions" to
coroutines in PEP 492.  I guess we can call them
"async functions".




2. I propose to disallow the use of 'for..in' loops,
   and builtins like 'list()', 'iter()', 'next()',
   'tuple()' etc. on coroutines.


PEP 3152 takes care of this automatically from the fact
that you can't make an ordinary call to a cofunction,
and cocall combines a call and a yield-from. You have
to go out of your way to get hold of the underlying
iterator to use in a for-loop, etc.



On the one hand I like your idea to disallow calling
coroutines without a special keyword (await in case of
PEP 492).  It has downsides, but there is some
elegance in it.  On the other hand, I hate the idea
of grammatically requiring parentheses for 'await'
expressions.  That feels non-Pythonic to me.

I'd be happy to hear others opinion on this topic.

Thanks!
Yury
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available

2015-04-22 Thread Paul Moore
On 22 April 2015 at 13:45, Paul Moore  wrote:
> On 21 April 2015 at 23:05, Steve Dower  wrote:
>> I made it a self-extracting RAR file so it could be signed, but I've already 
>> had multiple people query it so the next release will probably just be a 
>> plain ZIP file. I just need to figure out some reliable way of validating 
>> the download other than GPG, since I'd like installers to be able to do the 
>> download transparently and ideally without hard-coding hash values. I might 
>> add a CSV of SHA hashes to the zip too.
>
> You could probably just leave it as is (or make it a self-extracting
> zip file) and just describe it on the web page as "Windows amd64
> embeddable self-extracting archive". People are (I think) pretty used
> to the idea that they can open a self-extracting archive in tools like
> 7-zip, so those who didn't want to run the exe could do that (and
> would know they could). Obviously extracting that way you don't get
> the signature check, but that's to be expected.

Whoops, no - I changed my mind. If you double click on the downloaded
file (which I just did) it unpacks it into the directory you
downloaded the exe to, with no option to put it anywhere else, and no
UI telling you what it's doing. That's going to annoy people badly.
Better make it a simple zipfile in that case.

Paul (off to tidy up his download directory :-()
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available

2015-04-22 Thread Paul Moore
On 21 April 2015 at 23:05, Steve Dower  wrote:
> I made it a self-extracting RAR file so it could be signed, but I've already 
> had multiple people query it so the next release will probably just be a 
> plain ZIP file. I just need to figure out some reliable way of validating the 
> download other than GPG, since I'd like installers to be able to do the 
> download transparently and ideally without hard-coding hash values. I might 
> add a CSV of SHA hashes to the zip too.

You could probably just leave it as is (or make it a self-extracting
zip file) and just describe it on the web page as "Windows amd64
embeddable self-extracting archive". People are (I think) pretty used
to the idea that they can open a self-extracting archive in tools like
7-zip, so those who didn't want to run the exe could do that (and
would know they could). Obviously extracting that way you don't get
the signature check, but that's to be expected.
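
As an aside, the "CSV of SHA hashes" idea quoted above is simple enough
to sketch; the paths and the choice of SHA-256 here are assumptions:

import csv
import hashlib
import pathlib

def write_hashes(root, out_csv):
    # Record a relative path and SHA-256 digest for every file under root.
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        for path in sorted(pathlib.Path(root).rglob("*")):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                writer.writerow([path.relative_to(root).as_posix(), digest])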

Paul
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] typeshed for 3rd party packages

2015-04-22 Thread Cory Benfield
On 22 April 2015 at 11:46, M.-A. Lemburg  wrote:
> Unlike with translations, where missing or poor ones don't have
> much effect on the usefulness of the software, a type checker
> would complain loudly and probably show lots of false positives
> (if you read a type bug as "positive"), causing the usual complaints
> by users to the software authors.
>
> I don't really think that users would instead complain to the type
> checker authors or find the actual source of the problem which are
> the broken stub files.

This is my expectation as well.

Requests receives bug reports for bugs that were introduced by
downstream packagers, or for bugs that are outright in unrelated
projects. I field IRC questions about 'requests bugs' that are
actually bugs in the web application on the other end of the HTTP
connection! I can *guarantee* that if a stub file is bad, I'll get
told, not the author of the stub file.


> OTOH, if the type checkers are written in a way where they can
> detect authoritative stubs compared to non-authoritative ones
> and point users to the possible type stub file problem, this
> could be resolved, I guess.
>
> The stub files would then need an "authoritative" flag and
> probably also be versioned to get this working.

This would be great: +1.

> As I've explained above, in my experience, people (*) often first go
> to the authors of the software and not do research to find out
> that the tool they were using has a problem (via the non-authoritative
> stub files it's using).
>
> (*) More experienced users of pylint like tools will probably think
> twice due to the many false positives these tools tend to generate.
> I'm not sure whether people using type checkers would have the same
> approach, though, esp. not if they are coming from the land of
> statically typed languages.

I can back this up, as can a search through Requests' past GitHub
issues. pylint in particular has caused me pain due to at least one
GitHub issue that was nothing more than a dump of pylint output when
run over Requests, where *every single line* was a false positive.

This ends up having negative network effects as well, because the next
time someone opens a GitHub issue with the word 'pylint' in it I'm
simply going to close it, rather than waste another 45 minutes
visually confirming that every line is a false positive. I worry that
stub files will have the same problems.
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Type hints -- a mediocre programmer's reaction

2015-04-22 Thread Cory Benfield
On 21 April 2015 at 17:59, Guido van Rossum  wrote:
> For me, PEP 484 is a stepping stone. Among the authors of PEP 484 there was
> much discussion about duck typing, and mypy even has some limited support
> for duck typing (I think you can still find it by searching the mypy code
> for "protocol"). But we ran out of time getting all the details written up
> and agreed upon, so we decided to punt -- for now. But duck typing still
> needs to have a way to talk about things like "seek method with this type
> signature" (something like `def seek(self, offset: int, whence:
> int=SEEK_SET) -> int`) so the current proposal gets us part of the way
> there.
>
> The hope is that once 3.5 is out (with PEP 484's typing.py included
> *provisional* mode) we can start working on the duck typing specification.
> The alternative would have been to wait until 3.6, but we didn't think that
> there would be much of an advantage to postponing the more basic type
> hinting syntax (it would be like refusing to include "import" until you've
> sorted out packages). During the run of 3.5 we'll hopefully get feedback on
> where duck typing is most needed and how to specify it -- valuable input
> that would be much harder to obtain of *no* part of the type hints notation
> were standardized.

This makes a lot of sense.

If PEP 484 is intended to be a stepping stone (or compromise, or beta,
or whatever word one wants to use), then it is easy to forgive it its
limitations, and I'm looking forward to seeing it improve.
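
As a concrete illustration of the gap: the "seek method with this type
signature" example from the quoted message can only be spelled nominally
today, e.g. via an ABC (SupportsSeek is a made-up name, not part of
typing.py):

from abc import ABCMeta, abstractmethod
from io import SEEK_SET

class SupportsSeek(metaclass=ABCMeta):
    @abstractmethod
    def seek(self, offset: int, whence: int = SEEK_SET) -> int: ...

def rewind(f: SupportsSeek) -> int:
    # A type checker will only accept declared subclasses (or classes
    # registered with SupportsSeek.register); accepting "any object with
    # a compatible seek()" is the duck-typing piece PEP 484 defers.
    return f.seek(0, SEEK_SET)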
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Type hints -- a mediocre programmer's reaction

2015-04-22 Thread Cory Benfield
On 21 April 2015 at 18:12, Steven D'Aprano  wrote:
> I expect that dealing with duck typing will be very high on the list
> of priorities for the future. In the meantime, for this specific use-case,
> you're probably not going to be able to statically check this type hint.
> Your choices would be:
>
> - don't type check anything;

To be clear, for the moment this is what requests will do, unless the
other maintainers strongly disagree with me (they don't). I am quite
convinced that PEP 484 is insufficiently powerful to make it
worthwhile for requests to provide 'official' type hints.

I suspect someone will provide hints to typeshed, and I certainly hope
they're good, because if they're bad we'll definitely field bug
reports about them (more on this in a different thread I think).
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] typeshed for 3rd party packages

2015-04-22 Thread M.-A. Lemburg
On 21.04.2015 18:08, Guido van Rossum wrote:
> On Tue, Apr 21, 2015 at 12:33 AM, M.-A. Lemburg  wrote:
> 
>> On 21.04.2015 05:37, Guido van Rossum wrote:
>>> On Mon, Apr 20, 2015 at 4:41 PM, Jack Diederich 
>> wrote:
 * Uploading stubs for other people's code is a terrible idea. Who do I
 contact when I update the interface to my library? The random Joe who
 "helped" by uploading annotations three months ago and then quit the
 internet? I don't even want to think about people maliciously adding
>> stubs
 to PyPI.

>>>
>>> We're certainly not planning to let arbitrary people upload stubs for
>>> arbitrary code to PyPI that will automatically be used by the type
>>> checkers. (We can't stop people from publishing their stubs, just as you
>>> can't stop people from writing blog posts or stackoverflow answers with
>>> examples for your library.)
>>>
>>> The actual plan is to have a common repository of stubs (a prototype
>> exists
>>> at https://github.com/JukkaL/typeshed) but we also plan some kind of
>>> submission review. I've proposed that when submitting stubs for a package
>>> you don't own, the typeshed owners ask the package owner what their
>>> position is, and we expect the answers to fall on the following spectrum:
>>>
>>> - I don't want stubs uploaded for my package
>>> - I'll write the stubs myself
>>> - I want to review all stubs that are uploaded for my package before they
>>> are accepted
>>> - Please go ahead and add stubs for my package and let me know when
>> they're
>>> ready
>>> - Go ahead, I trust you
>>>
>>> This seems a reasonable due diligence policy that avoids the scenarios
>>> you're worried about. (Of course if you refuse stubs a black market for
>>> stubs might spring into existence. That sounds kind of exciting... :-)
>>
>> Hmm, that's the first time I've heard about this. I agree with
>> Jack that it's a terrible idea to allow this for 3rd party
>> packages.
>>
>> If people want to contribute stubs, they should contribute them
>> to the package in the usual ways, not in a side channel. The important
>> part missing in the above setup is maintenance and to some extent
>> an external change of the API definitions.
>>
>> Both require active participation in the package project,
>> not the separated setup proposed above, to be effective and
>> actually work out in the long run.
>>
>> For the stdlib, typeshed looks like a nice idea to spread the
>> workload.
>>
> 
> I hesitate to speak for others, but IIUC the reason why typeshed was
> started is that companies like PyCharm and Google (and maybe others) are
> *already* creating their own stubs for 3rd party packages, because they
> have a need to type-check code that *uses* 3rd party packages. Their use
> cases are otherwise quite different (the user code type-checked by PyCharm
> is that of PyCharm users, and the code type-checked by Google is their own
> proprietary code) but they both find themselves needing stubs for commonly
> used 3rd party packages. mypy finds itself in a similar position.
> 
> Think of it this way. Suppose you wrote an app that downloaded some files
> from the web using the popular Requests package. Now suppose you wanted to
> run mypy over your app. You're willing to do the work of adding signatures
> to your own app, and presumably there are stubs for those parts of the
> stdlib that you're using, but without stubs for Requests, mypy won't do a
> very good job type-checking your calls into Requests. It's not rocket
> science to come up with stubs for Requests (there aren't that many classes
> and methods) but the Requests package is in maintenance mode, and while
> they respond quickly to security issues they might take their time to
> release a new version that includes your stub files, and until there are a
> lot of people clamoring for stubs for Requests, they might not care at all.

For projects in maintenance mode, it does make sense indeed.

For active ones, I think you'd quickly run into a situation similar
to translation projects: there are always parts which haven't been
translated yet or where the translation no longer matches the original
intent.

Unlike with translations, where missing or poor ones don't have
much effect on the usefulness of the software, a type checker
would complain loudly and probably show lots of false positives
(if you read a type bug as "positive"), causing the usual complaints
by users to the software authors.

I don't really think that users would instead complain to the type
checker authors or find the actual source of the problem which are
the broken stub files.

OTOH, if the type checkers are written in a way where they can
detect authoritative stubs compared to non-authoritative ones
and point users to the possible type stub file problem, this
could be resolved, I guess.

The stub files would then need an "authoritative" flag and
probably also be versioned to get this working.

> So what does Requests have to lose if, instead of including the