> To be honest, I see "async with" being abused everywhere in asyncio,
> lately. I like to have objects with start() and stop() methods, but
> everywhere I see async context managers.
>
> Fine, add nursery or whatever, but please also have a simple start() /
> stop() public API.
On Fri, 15 Jun 2018 at 09:18, Michel Desmoulin wrote:
On 14/06/2018 at 04:09, Nathaniel Smith wrote:
> How about:
>
> async def wait_to_run(async_fn, *args):
>     await wait_for_something()
>     return await async_fn(*args)
>
> task = loop.create_task(wait_to_run(myfunc, ...))
>
It's quite elegant, although figuring out the
>
> The strict API compatibility requirements of core Python stdlib, coupled
> with the very long feature release life-cycles of Python, make me think
> this sort of thing perhaps is better built in a utility library on top
> of asyncio, rather than inside asyncio itself? 18 months is a long
> long time to iterate on these features.
On Thu, Jun 14, 2018 at 8:14 PM, Chris Barker via Python-Dev <python-dev@python.org> wrote:
> Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to
> understand the problem here.
>
> So why do queries fail with 1000 tasks? Or ANY number? If the async DB
> access code is
On Thu, Jun 14, 2018 at 3:31 PM, Tin Tvrtković wrote:
> * my gut feeling is spawning a thousand tasks and having them all fighting
> over the same semaphore and scheduling is going to be much less efficient
> than a small number of tasks draining a queue.
Fundamentally, a Semaphore is a queue.
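The queue-draining variant Tin describes can be sketched as a small, fixed pool of worker tasks pulling jobs off an asyncio.Queue. This is a minimal illustration, not code from the thread; the doubling step stands in for a real query:

```python
import asyncio

async def worker(queue, results):
    # Each worker drains jobs from the shared queue until it sees the sentinel.
    while True:
        item = await queue.get()
        if item is None:
            queue.task_done()
            return
        await asyncio.sleep(0)      # stand-in for the actual query
        results.append(item * 2)
        queue.task_done()

async def main(n_jobs=100, n_workers=4):
    queue = asyncio.Queue()
    results = []
    for i in range(n_jobs):
        queue.put_nowait(i)
    for _ in range(n_workers):
        queue.put_nowait(None)      # one shutdown sentinel per worker
    workers = [asyncio.ensure_future(worker(queue, results))
               for _ in range(n_workers)]
    await queue.join()              # wait until every job is processed
    await asyncio.gather(*workers)
    return results

results = asyncio.run(main())
```

Only `n_workers` coroutines ever contend for scheduling here, no matter how many jobs are queued, which is the efficiency argument being made above.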
On Thu, Jun 14, 2018 at 10:03 PM Steve Dower wrote:
> I often use
> semaphores for this when I need it, and it looks like
> asyncio.Semaphore() is sufficient for this:
>
>
> import asyncio
> task_limiter = asyncio.Semaphore(4)
>
> async def my_task():
>     await task_limiter.acquire()
>
Other folks have already chimed in, so I'll be to the point. Try writing a
simple asyncio web scraper (using maybe the aiohttp library) and create
5000 tasks for scraping different sites. My prediction is a whole lot of
them will time out due to various reasons.
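The scraper experiment above can be approximated without the network: the sketch below bounds in-flight fetches with a semaphore and puts a timeout on each one. The fetch function is a stub (a real scraper would use aiohttp's `session.get`); all names and the URL scheme are illustrative:

```python
import asyncio

async def fetch(url):
    # Stand-in for a real aiohttp request, e.g.
    # `async with session.get(url) as resp: return await resp.text()`.
    await asyncio.sleep(0.001)
    return "body of " + url

async def fetch_limited(sem, url, timeout=5.0):
    # Bound the number of in-flight fetches so thousands of URLs don't
    # all open connections at once; time out individual slow fetches.
    async with sem:
        try:
            return await asyncio.wait_for(fetch(url), timeout)
        except asyncio.TimeoutError:
            return None

async def main(n=50, limit=10):
    sem = asyncio.Semaphore(limit)
    urls = ["https://example.invalid/%d" % i for i in range(n)]
    return await asyncio.gather(*(fetch_limited(sem, u) for u in urls))

pages = asyncio.run(main())
```

With a real network the un-throttled version is where the predicted timeouts come from: 5000 concurrent connections exhaust sockets and server goodwill long before the event loop is the bottleneck.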
Other responses inline.
On 14Jun2018 1214, Chris Barker via Python-Dev wrote:
> Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying
> to understand the problem here.
> But if I have this right:
> I've been using asyncio a lot lately and have encountered this
> problem several times. Imagine you want
On Thu, Jun 14, 2018 at 9:17 PM Chris Barker via Python-Dev <
python-dev@python.org> wrote:
> Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to
> understand the problem here.
>
Vocabulary-wise 'queue depth' might be a suitable mental aid for what
people actually want to
Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to
understand the problem here.
But if I have this right:
> I've been using asyncio a lot lately and have encountered this problem
> several times. Imagine you want to do a lot of queries against a database,
> spawning 1000 tasks in parallel will probably cause a lot of them to fail.
this sort of thing perhaps is better built in a utility library on top of
asyncio, rather than inside asyncio itself? 18 months is a long long time
to iterate on these features. I can't wait for Python 3.8...
On Thu, Jun 14, 2018 at 12:40 PM Tin Tvrtković wrote:
>
> Hi,
>
> I've been using asyncio a lot lately and have encountered this problem
> several times. Imagine you want to do a lot of queries against a database,
> spawning 1000 tasks in parallel will probably cause a lot of them to fail.
>
Date: Wed, 13 Jun 2018 22:45:22 +0200
> From: Michel Desmoulin
> To: python-dev@python.org
> Subject: [Python-Dev] A more flexible task creation
> Message-ID:
> Content-Type: text/plain; charset=utf-8
>
> I was working on a concurrency limiting code for asyncio, so the user
> may submit as many tasks as one wants, but only a max number of tasks
> will be submitted to the event loop at the same time.
How about:
async def wait_to_run(async_fn, *args):
    await wait_for_something()
    return await async_fn(*args)

task = loop.create_task(wait_to_run(myfunc, ...))
Whatever strategy you use, you should also think about what semantics you
want if one of these delayed tasks is cancelled.
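One concrete answer to the cancellation question: if the wrapper task is cancelled while it is still waiting, the wrapped function never starts at all. A sketch of that behaviour, where an asyncio.Event gate is an assumption standing in for the unspecified wait_for_something():

```python
import asyncio

started = []

async def wait_to_run(gate, async_fn, *args):
    # The Event stands in for "wait_for_something()" in the sketch above.
    await gate.wait()
    return await async_fn(*args)

async def work(x):
    started.append(x)
    return x

async def main():
    gate = asyncio.Event()
    task = asyncio.ensure_future(wait_to_run(gate, work, 1))
    await asyncio.sleep(0)   # let the wrapper task start waiting on the gate
    task.cancel()            # cancelled while still gated
    try:
        await task
    except asyncio.CancelledError:
        pass
    return task.cancelled()

was_cancelled = asyncio.run(main())
```

Since `gate` was never set, `work()` never ran, so the delayed work is simply discarded; a design that instead wanted "already submitted" semantics would have to shield or re-queue it.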
On Wed, Jun 13, 2018 at 4:47 PM Michel Desmoulin wrote:
>
> I was working on a concurrency limiting code for asyncio, so the user
> may submit as many tasks as one wants, but only a max number of tasks
> will be submitted to the event loop at the same time.
What does that "concurrency limiting
I was working on a concurrency limiting code for asyncio, so the user
may submit as many tasks as one wants, but only a max number of tasks
will be submitted to the event loop at the same time.
However, I wanted passing an awaitable to always return a task,
no matter if the task was
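A minimal sketch of that design, with a hypothetical helper name (this is not Michel's actual code): the awaitable is wrapped in a gated coroutine, so the caller always receives a Task immediately, while a semaphore caps how many are actually running on the loop:

```python
import asyncio

def create_limited_task(coro, sem):
    # Hypothetical helper: the caller gets a Task back immediately, but
    # the wrapped coroutine only starts running once the semaphore
    # grants it a slot.
    async def gated():
        async with sem:
            return await coro
    return asyncio.ensure_future(gated())

async def job(i):
    await asyncio.sleep(0.001)
    return i

async def main():
    sem = asyncio.Semaphore(2)          # at most 2 jobs running at once
    tasks = [create_limited_task(job(i), sem) for i in range(6)]
    # Every submission is already a real Task, whether running or gated:
    assert all(isinstance(t, asyncio.Task) for t in tasks)
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
```

This gives the "always a task" property: callers can cancel, await, or add done-callbacks uniformly, without knowing whether the work has started yet.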