Thanks Daniel,
I found my answer here (using your link):
https://docs.python.org/3/reference/datamodel.html#preparing-the-class-namespace
"""
When a new class is created by type.__new__, the object provided as the
namespace parameter is copied to a new ordered mapping and the original
object is
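A minimal sketch of the documented behavior (class and attribute names are
illustrative, not from the docs): because type.__new__ copies the namespace
mapping, mutating the original mapping afterwards does not affect the class.

```python
# Demonstrates that type.__new__ copies the namespace mapping:
# mutating the original mapping after class creation has no effect.
class Meta(type):
    def __new__(mcs, name, bases, namespace):
        cls = super().__new__(mcs, name, bases, namespace)
        namespace["added_later"] = 1          # mutate the original mapping...
        assert "added_later" not in cls.__dict__  # ...the class is unaffected
        return cls

class C(metaclass=Meta):
    x = 1

print(C.x)                         # 1
print(hasattr(C, "added_later"))   # False
```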
I use this a lot in my code.
Since `setdefault_call` does not exist, here is how I do it:
d = {}
lookup_d = d.get
provide_d = d.setdefault
for i in range(100): # some large loop
    l = lookup_d(somekey) or provide_d(somekey, [])
    l.append(somevalue)
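Here is a self-contained version of that pattern, with concrete data standing
in for `somekey`/`somevalue` (the `pairs` list is purely illustrative):

```python
# Group values by key, caching the bound methods once outside the loop
# so each iteration avoids a repeated attribute lookup on d.
pairs = [("a", 1), ("b", 2), ("a", 3)]  # illustrative data
d = {}
lookup_d = d.get
provide_d = d.setdefault
for somekey, somevalue in pairs:
    # get() avoids setdefault's needless [] allocation when the key exists;
    # setdefault() only runs on the first occurrence of a key.
    l = lookup_d(somekey) or provide_d(somekey, [])
    l.append(somevalue)

print(d)  # {'a': [1, 3], 'b': [2]}
```

Note the `or` fallback is safe here because the stored value is always a list
that has at least one element appended before the next lookup.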
I am not arguing for or against
On Thu, Jul 26, 2018 at 1:09 AM, Steven D'Aprano wrote:
> On Thu, Jul 26, 2018 at 01:02:47AM -0400, Amit Green wrote:
>
> > 3. The problem is way deeper than simply adding '?.' and other
> operators.
> > For real use cases, you also need to say "how far"
On Thu, Jul 26, 2018 at 12:25 AM, Raymond Hettinger <raymond.hettin...@gmail.com> wrote:
> It probably is the wrong time and probably can hurt (by introducing
> divisiveness when we most need to be focusing on coming together).
>
> This PEP also shares some traits with PEP 572 in that it
On Mon, Jan 8, 2018 at 10:34 PM, Nick Coghlan wrote:
> It could be useful to include a recipe in the documentation that shows a
> generator with suitable error handling (taking the generic connection
> errors and adapting them to app specific ones) while also showing how to
>
An argument against this API is that any caller of recv should be doing
error handling (i.e., catching exceptions from the socket).
Changing it into an iterator makes it less likely that error handling will
be properly coded, and makes the error handling more obscure.
Thus although the API would
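The concern above can be sketched as follows (`recv_chunks` and `read_all`
are hypothetical names for illustration, not part of any proposed API): in
the iterator form, OSError escapes from the for-loop machinery rather than
from a visible recv() call, while the explicit form keeps the error handling
at the call site.

```python
import socket

def recv_chunks(sock, bufsize=4096):
    """Hypothetical iterator wrapper around recv. Any OSError now
    surfaces out of the for-loop that drives this generator, which
    makes the error handling less visible at the call site."""
    while True:
        data = sock.recv(bufsize)
        if not data:
            return
        yield data

def read_all(sock):
    """Explicit style: the caller sees exactly where the socket
    error can occur and can adapt it to an app-specific error."""
    chunks = []
    while True:
        try:
            data = sock.recv(4096)
        except OSError:
            break  # app-specific error handling would go here
        if not data:
            break
        chunks.append(data)
    return b"".join(chunks)
```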
On Sun, Oct 15, 2017 at 9:44 AM, Koos Zevenhoven wrote:
> So, see below for some more discussion between (it would be useful if some
> people could reply to this email and say if and why they agree or disagree
> with something below -- also non-experts that roughly understand
it:
>>> def f(): yield 1
...
SyntaxError: the 'yield' keyword can only be used in a generator;
please be sure to use @generator before the definition of the generator
>>> @generator
... def f(): yield 1
...
>>> f
Just the
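The SyntaxError in the transcript above is a hypothetical language change;
the closest runtime approximation today is a decorator that merely verifies,
after the fact, that the decorated object really is a generator function
(the name `generator` here is the sketch's assumption, not an existing API):

```python
import inspect

def generator(func):
    """Hypothetical decorator: verify func is a generator function.
    Unlike the proposed syntax, this can only check *after* the
    definition has already been compiled."""
    if not inspect.isgeneratorfunction(func):
        raise TypeError(f"{func.__name__} does not use 'yield'; "
                        "@generator requires a generator function")
    return func

@generator
def f():
    yield 1

print(list(f()))  # [1]
```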
Once again, I think Paul Moore gets to the heart of the issue.
Generators are simply confusing, and async even more so.
Per my earlier email, the fact that generators look like functions, but are
not functions, is at the root of the confusion.
This next part really gets to the heart of the matter:
I really like what Paul Moore wrote here, as it matches a *LOT* of what I
have been feeling as I have been reading this whole discussion;
specifically:
- I find the example, and discussion, really hard to follow.
- I also don't understand async, but I do understand generators very
name the args. What do you think about infile, outfile,
> and errfile?
>
>
> FWIW, I did consider "in", "out", and "err", but "in" is a keyword, and I
> didn't think those quite captured the full meaning.
>
>
> wt
> ---
I'm fine with the idea in general of extra keyword parameters to the input
function.
A few points:
Your example code needs try/except to match what input with these parameters
does -- and yes, it's way nicer to be able to use the example you have
shown than to play games with try/except (Personally
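For concreteness, this is the kind of try/except game callers write today;
the helper name and its `default` parameter are assumptions for illustration,
not the API proposed in the thread:

```python
def input_with_default(prompt, default=None):
    """Hypothetical helper: roughly the behavior that extra keyword
    parameters to input() might provide directly. The 'default'
    parameter name is this sketch's assumption."""
    try:
        return input(prompt)
    except EOFError:
        # Without the helper, every call site needs this handler.
        return default
```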
NOTE: This is my first post to this mailing list, I'm not really sure
how to post a message, so I'm attempting a reply-all.
I like Nathaniel's idea for __iterclose__.
I suggest the following changes to deal with a few of the complex issues
he discussed.
1. Missing __iterclose__, or a