Guido:
> It seems a little disingenuous to claim discussions about annotations don’t
> concern you when you’re actively using them (for typing, no less, in the
> case of pydantic). And I am sure a project as popular (by their own
> description) as pydantic will find a way forward if PEP 649 is […]
[…]d say this style of programming is quite readable and understandable.
On Tue, Oct 27, 2020 at 1:16 AM Tin Tvrtković wrote:
> Hello,
>
> Go channels are indeed very similar to asyncio Queues, with some added
> features like channels being closable. (There is also special syntax in the […]
> […] In any case, perhaps it's not the match statement that needs to change,
> but rather the asyncio API that needs to be enhanced.
>
>
> On Sun, 25 Oct 2020 at 01:14, Nick Coghlan wrote:
>
>> On Sat., 24 Oct. 2020, 4:21 am Guido van Rossum,
>> wrote:
>>
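As a sketch of the parallel drawn above, a closable Go-style channel can be approximated with an asyncio.Queue plus a sentinel object; all names below are illustrative, not from the thread:

```python
import asyncio

CLOSED = object()  # sentinel marking a "closed" channel


async def producer(queue: asyncio.Queue) -> None:
    # Send a few items, then "close" the channel with the sentinel.
    for i in range(3):
        await queue.put(i)
    await queue.put(CLOSED)


async def consumer(queue: asyncio.Queue) -> list:
    # Drain items until the sentinel arrives, roughly like
    # `for v := range ch` over a closed Go channel.
    received = []
    while (item := await queue.get()) is not CLOSED:
        received.append(item)
    return received


async def channel_demo() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    _, items = await asyncio.gather(producer(queue), consumer(queue))
    return items
```

The sentinel stands in for the close signal that asyncio.Queue lacks natively; a real closable channel would also need to handle multiple consumers.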
Hi,
first of all, I'm a big fan of the changes being proposed here since in my
code I prefer the 'union' style of logic over the OO style.
I was curious, though, if there are any plans for the match operator to
support async stuff. I'm interested in the problem of waiting on multiple
asyncio […]
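The question above is cut off, but waiting on multiple asyncio awaitables is typically spelled with asyncio.wait today; a minimal sketch (function names are illustrative):

```python
import asyncio


async def delayed(value: str, delay: float) -> str:
    await asyncio.sleep(delay)
    return value


async def first_result() -> str:
    # Race two tasks and take whichever finishes first, cancelling
    # the loser -- roughly what Go's `select` statement expresses.
    fast = asyncio.create_task(delayed("fast", 0.01))
    slow = asyncio.create_task(delayed("slow", 1.0))
    done, pending = await asyncio.wait(
        {fast, slow}, return_when=asyncio.FIRST_COMPLETED
    )
    for task in pending:
        task.cancel()
    return done.pop().result()
```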
Hi,
PEP 544 specifies this address as "Discussions-To" so I hope I'm at the
right address.
I think protocols as defined in the PEP are a very interesting idea and I'm
thinking of ways of applying them. The first use case is in the context of
attrs.
attrs has a number of functions that work only […]
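For context, a minimal sketch of a PEP 544 protocol, assuming nothing about how attrs actually ended up using them (the protocol and class names are invented for illustration):

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class HasName(Protocol):
    # Structural typing: anything with a `name: str` attribute matches.
    name: str


class User:
    # Note: no inheritance from HasName is needed.
    def __init__(self, name: str) -> None:
        self.name = name


def greet(obj: HasName) -> str:
    return f"hello, {obj.name}"
```

With `@runtime_checkable`, `isinstance(User("x"), HasName)` also works at runtime by checking for the attribute's presence.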
On Thu, Jun 14, 2018 at 10:03 PM Steve Dower wrote:
> I often use
> semaphores for this when I need it, and it looks like
> asyncio.Semaphore() is sufficient for this:
>
>
> import asyncio
> task_limiter = asyncio.Semaphore(4)
>
> async def my_task():
> await task_limiter.acquire()
> […]
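The quoted snippet is truncated mid-function; a complete version of the same idea might look like the following, using `async with` so the semaphore is released even if the body raises (the sleep is a stand-in for real work):

```python
import asyncio


async def my_task(limiter: asyncio.Semaphore, i: int) -> int:
    # `async with` pairs acquire() with release() automatically.
    async with limiter:
        await asyncio.sleep(0.01)  # stand-in for real work
        return i


async def run_limited() -> list:
    # At most 4 tasks run inside the semaphore at once.
    task_limiter = asyncio.Semaphore(4)
    return await asyncio.gather(*(my_task(task_limiter, i) for i in range(10)))
```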
Other folks have already chimed in, so I'll be to the point. Try writing a
simple asyncio web scraper (using maybe the aiohttp library) and create
5000 tasks for scraping different sites. My prediction is a whole lot of
them will time out due to various reasons.
Other responses inline.
On Thu, […]
Hi,
I've been using asyncio a lot lately and have encountered this problem
several times. Imagine you want to do a lot of queries against a database,
spawning 1[…] tasks in parallel will probably cause a lot of them to fail.
What you need is a task pool of sorts, to limit concurrency and do only […]
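One way to sketch such a task pool is a fixed number of workers draining a shared asyncio.Queue, so concurrency is capped at the pool size (names and the pool size below are illustrative):

```python
import asyncio


async def worker(queue: asyncio.Queue, results: list) -> None:
    # Pull jobs until the queue is drained; only `pool_size` workers
    # exist, so at most that many jobs run concurrently.
    while True:
        try:
            item = queue.get_nowait()
        except asyncio.QueueEmpty:
            return
        await asyncio.sleep(0.001)  # stand-in for a database query
        results.append(item * 2)


async def run_pool(items: list, pool_size: int = 4) -> list:
    queue: asyncio.Queue = asyncio.Queue()
    for item in items:
        queue.put_nowait(item)
    results: list = []
    await asyncio.gather(*(worker(queue, results) for _ in range(pool_size)))
    return results
```

Completion order is nondeterministic, which is why results are collected rather than relied on positionally.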
Thank you to everyone who participated (Kirill, Raymond, Nick, Naoki). I've
decided there are too many caveats for this approach to be worthwhile and
I'm giving up on it.
Kind regards,
Tin
On Sat, Mar 24, 2018 at 3:18 PM Tin Tvrtković <tinches...@gmail.com> wrote:
> Hi Python-dev,
On Sun, Mar 25, 2018 at 5:23 AM Nick Coghlan wrote:
> That depends on what you mean by "safe" :)
>
> It won't crash, but it will lose any existing entries that a metaclass,
> subclass, or __new__ method implementation might have added to the instance
> dictionary before […]
That's reassuring, thanks.
On Sat, Mar 24, 2018 at 5:20 PM Raymond Hettinger <
raymond.hettin...@gmail.com> wrote:
> This should work. I've seen it done in other production tools without any
> ill effect.
>
> The dict can be replaced during __init__() and still get benefits of
> key-sharing.
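A minimal sketch of the optimization under discussion, replacing the instance dict wholesale in `__init__` instead of assigning attributes one by one (the Point class is invented for illustration):

```python
class Point:
    # The optimization: build the instance dict in one step rather
    # than issuing one `self.x = ...` store per attribute. Per the
    # reply above, the replaced dict still benefits from key-sharing.
    def __init__(self, x: int, y: int) -> None:
        self.__dict__ = {"x": x, "y": y}
```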
Hi Python-dev,
I'm one of the core attrs contributors, and I'm contemplating applying an
optimization to our generated __init__s. Before someone warns me python-dev
is for the development of the language itself, there are two reasons I'm
posting this here:
1) it's a very low-level question that […]
Hello,
I'm one of the attrs contributors, and the person who initially wrote the
slots functionality there.
We've given up on returning a new class always since this can conflict with
certain metaclasses (have you noticed you can't make a slots attrs class
inheriting from Generic[T]?) and with […]
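A rough sketch of why a replacement class is needed at all: `__slots__` only takes effect at class creation time, so a decorator has to build a new type rather than mutate the old one. This is a simplified stand-in, not attrs' actual implementation:

```python
def add_slots(slot_names):
    def wrap(cls: type) -> type:
        cls_dict = dict(cls.__dict__)
        # Drop descriptors tied to the old, dict-based class.
        cls_dict.pop("__dict__", None)
        cls_dict.pop("__weakref__", None)
        cls_dict["__slots__"] = tuple(slot_names)
        # Build a *new* class -- the step that can clash with
        # certain metaclasses, as mentioned above.
        return type(cls.__name__, cls.__bases__, cls_dict)
    return wrap


@add_slots(("x", "y"))
class Coord:
    def __init__(self, x: int, y: int) -> None:
        self.x = x
        self.y = y
```

Instances of the rebuilt class have no `__dict__`, so assigning an undeclared attribute raises AttributeError.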
>
> Date: Fri, 13 Oct 2017 08:57:00 -0700
> From: Guido van Rossum
> To: Martin Teichmann
> Cc: Python-Dev
> Subject: Re: [Python-Dev] What is the design purpose of metaclasses vs
> code generating decorators? (was Re: […]