I have never used as_completed() and I am still learning how to use it, so I
cannot explain how I plan to use it :-)

Victor
On 3 March 2015 at 20:34, "Guido van Rossum" <[email protected]> wrote:

> I looked a bit more. The only thing that ever waits is the caller in the
> for-loop (for f in as_completed(...): x = yield from f; ...). It will
> always wait for _wait_for_one(), which in turn will always wait for
> done.get(). So if the above yield-from is cancelled, _wait_for_one() is
> cancelled, which in turn cancels the Queue.get() call. If this is
> cancelled, the corresponding put_nowait() call may well be cancelled (I
> stopped following the code carefully) but this is called from the
> completion callback for a completed future, so that future isn't
> cancellable, and there's nothing that cancels the futures still in the todo
> queue.
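
[Editor's note: the consumption pattern Guido describes can be sketched as
follows, using modern await syntax rather than the yield-from form of 2015;
the task names and delays are illustrative only, not from the thread.]

```python
import asyncio

async def work(name, delay):
    # Stand-in for a real work item; name and delay are made up.
    await asyncio.sleep(delay)
    return name

async def main():
    # The caller in the for-loop awaits each yielded future in turn.
    # Internally each await goes through _wait_for_one(), which blocks
    # on an internal "done" queue fed by completion callbacks.
    tasks = [asyncio.ensure_future(work(n, d))
             for n, d in [("a", 0.0), ("b", 0.05)]]
    results = []
    for f in asyncio.as_completed(tasks):
        results.append(await f)
    return results

print(asyncio.run(main()))
```

Cancelling the `await f` inside the loop cancels that one wait on the done
queue, but, as described above, nothing here cancels the futures still
pending in the todo queue.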
>
> I'm skeptical that it's desirable to cancel the original futures though.
> Have you thought through the use case? (Admittedly I haven't.)
>
> On Tue, Mar 3, 2015 at 11:01 AM, Gustavo Carneiro <[email protected]>
> wrote:
>
>>
>>
>> On 2 March 2015 at 10:12, Victor Stinner <[email protected]>
>> wrote:
>>
>>> Hi,
>>>
>>> To answer Luciano's question on as_completed(), I read the source code
>>> of the function. It's not clear to me what happens on timeout.
>>>
>>>
>>> Should we abort something on timeout? The documentation doesn't
>>> explain if tasks are cancelled or if an exception is set. In the code,
>>> it's clear that the task is not touched. It's probably just a
>>> documentation issue.
>>>
>>
>> I think it should cancel the wrapped task.  I think this is what the
>> programmer will want most of the time, and if they don't want it, they
>> can use shield() to prevent it.
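
[Editor's note: a minimal sketch of the shield() escape hatch Gustavo is
referring to, in modern syntax; the values are illustrative. Cancelling the
shielded wrapper does not propagate to the inner task, which is how a caller
could opt out if as_completed() were changed to cancel on timeout.]

```python
import asyncio

async def important(result):
    # Work that should survive an outer cancellation.
    await asyncio.sleep(0.05)
    return result

async def main():
    inner = asyncio.ensure_future(important(42))
    outer = asyncio.shield(inner)
    # Cancelling the shield's outer future does not cancel the inner task.
    outer.cancel()
    assert outer.cancelled()
    assert not inner.cancelled()
    return await inner

print(asyncio.run(main()))  # prints 42
```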
>>
>> Also it makes it consistent with wait_for:
>>
>> https://code.google.com/p/tulip/issues/detail?id=107
>> http://bugs.python.org/issue23219
>>
>>
>>>
>>>
>>> Coroutine objects are wrapped into tasks: on timeout, these temporary
>>> tasks may be destroyed before their completion. A warning is emitted in
>>> this case. The warning is useless, since the caller doesn't have
>>> access to all these temporary tasks. As with gather(), we should probably
>>> set the _log_destroy_pending attribute of these tasks to False. What
>>> do you think?
>>
>>
>>> Victor
>>>
>>
>>
>>
>> --
>> Gustavo J. A. M. Carneiro
>> Gambit Research
>> "The universe is always one step beyond logic." -- Frank Herbert
>>
>
>
>
> --
> --Guido van Rossum (python.org/~guido)
>
