On 5 March 2015 at 19:38, Guido van Rossum <[email protected]> wrote:
> Well, TimeoutError is raised when the timeout (optionally) passed to the
> as_completed() call is reached. CancelledError means that cancel() was
> called, which can be for a different reason (e.g. an impatient user or a
> disconnected socket).
>
> Note that on TimeoutError, the tasks being waited for are *not* cancelled
> (if you want that to happen you should catch TimeoutError and cancel them
> yourself).
>
OK, I take it you are no longer swayed by my argument.
Not saying it's unreasonable, but to be clear, this is what needs to happen
in the example:
@asyncio.coroutine
def process_as_results_come_in():
    loop = asyncio.get_event_loop()
    # Wrap the coroutines in tasks so that they can be cancelled later.
    tasks = [loop.create_task(get_url(url)) for url in ['URL1', 'URL2', 'URL3']]
    try:
        for future in asyncio.as_completed(tasks):
            url, wait_time = yield from future
            print('Coroutine for {} is done'.format(url))
    except asyncio.CancelledError:  # "probably" timeout
        for task in tasks:
            task.cancel()
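And if an explicit timeout is passed to as_completed(), the same thing with
TimeoutError looks roughly like this (just a sketch, assuming the same
hypothetical get_url() coroutine):

import asyncio

@asyncio.coroutine
def process_with_timeout():
    loop = asyncio.get_event_loop()
    tasks = [loop.create_task(get_url(url)) for url in ['URL1', 'URL2', 'URL3']]
    try:
        # as_completed() raises TimeoutError from the yielded future when
        # not all results arrive within the given timeout.
        for future in asyncio.as_completed(tasks, timeout=5.0):
            url, wait_time = yield from future
            print('Coroutine for {} is done'.format(url))
    except asyncio.TimeoutError:
        # The pending tasks are *not* cancelled automatically, so cancel
        # them here, as you suggest.
        for task in tasks:
            task.cancel()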
I just think it would be nicer if the waited-for tasks were cancelled by
default. I claim this is usually what is wanted (but without providing any
evidence!).
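And for the cases where cancelling by default would not be wanted, shield(),
which I suggested earlier, would be the way to opt out; a rough sketch (again
assuming the hypothetical get_url()):

@asyncio.coroutine
def fetch_protected():
    loop = asyncio.get_event_loop()
    task = loop.create_task(get_url('URL1'))
    try:
        # shield() protects `task` from cancellation coming from outside:
        # if fetch_protected() is cancelled, CancelledError is raised here,
        # but `task` itself keeps running.
        url, wait_time = yield from asyncio.shield(task)
    except asyncio.CancelledError:
        # `task` is still pending and can be awaited later.
        raise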
>
> On Thu, Mar 5, 2015 at 10:35 AM, Gustavo Carneiro <[email protected]>
> wrote:
>
>>
>>
>> On 5 March 2015 at 18:10, Guido van Rossum <[email protected]> wrote:
>>
>>> On Thu, Mar 5, 2015 at 9:04 AM, Gustavo Carneiro <[email protected]>
>>> wrote:
>>>
>>>>
>>>>
>>>>
>>>> On 3 March 2015 at 19:01, Gustavo Carneiro <[email protected]>
>>>> wrote:
>>>>
>>>>>
>>>>>
>>>>> On 2 March 2015 at 10:12, Victor Stinner <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> To answer Luciano's question on as_completed(), I read the source code
>>>>>> of the function. It's not clear to me what happens on timeout.
>>>>>>
>>>>>>
>>>>>> Should we abort something on timeout? The documentation doesn't
>>>>>> explain if tasks are cancelled or if an exception is set. In the code,
>>>>>> it's clear that the task is not touched. It's probably just a
>>>>>> documentation issue.
>>>>>>
>>>>>
>>>>> I think it should cancel the wrapped task. I think this is what the
>>>>> programmer will want most of the time, and in case he doesn't want it he
>>>>> can use shield() to prevent it.
>>>>>
>>>>
>>>> I wasn't very clear: I think it should cancel the still-running,
>>>> uncompleted tasks AND raise an exception, both. That would be consistent
>>>> with the way wait_for() behaves.
>>>>
>>>
>>> *What* exactly should raise an exception? Note that we're speaking of:
>>>
>>> @asyncio.coroutine
>>> def blah():
>>>     for cf in as_completed(...):
>>>         r = yield from cf  # <---------- HERE
>>>         <do something with r>
>>>
>>> and what's happening is that blah is being cancelled when it is waiting
>>> for cf at the line marked "HERE". The cancellation mechanism already works
>>> by sending an exception (CancelledError) into the blocked task (i.e. cf,
>>> which represents _wait_for_one() in as_completed()). This exception
>>> normally comes right out, and that's the desired behavior here. I don't
>>> think a different exception should be raised, unless I am missing
>>> something?
>>>
>>
>> You're not missing anything, I was. Anyway, as long as there is an
>> exception somewhere, I'm happy.
>>
>> Well, I could be slightly happier if the exception was TimeoutError
>> instead of CancelledError, but I'm happy enough :-)
>>
>> Cheers.
>>
>> --
>> Gustavo J. A. M. Carneiro
>> Gambit Research
>> "The universe is always one step beyond logic." -- Frank Herbert
>>
>
>
>
> --
> --Guido van Rossum (python.org/~guido)
>
--
Gustavo J. A. M. Carneiro
Gambit Research
"The universe is always one step beyond logic." -- Frank Herbert