On 10/05/2018 09:32, Terry Reedy wrote:
On 5/9/2018 11:33 PM, Guido van Rossum wrote:
I now think that the best way out is to rule `:=` in the top level
expression of an expression statement completely
I would like to be able to interactively enter
>>> a := f(2,4)
to have 'a' echoed
[Tim]
>> """
>> An assignment expression binds the target, except in a function F
>> synthesized to implement a list comprehension or generator expression
>> (see XXX). In the latter case[1], the target is bound in the block
>> containing F, and errors may be detected: If the target also appears
[Greg Ewing]
> Can someone explain to me why it was considered a bad
> thing that for-clauses leaked names in comprehensions,
Because you can't write a list comprehension or generator expression
AT ALL without specifying a `for` loop header, so whether its
target(s) leak is an issue in virtually
I think we're approaching this from the wrong direction.
My point is, expression assignments don't have the complex case as
their purpose - most coders won't try to maximize line information density.
If you're doing magic, you might as well spell it out over multiple
lines, because neither := nor given wi
Tim Peters wrote:
Because you never _need_ to use an assignment expression to write a
listcomp/genexp.
This whole discussion started because someone wanted a way
to bind a temporary result for use *within* a comprehension.
Those use cases don't require leakage.
Otherwise it's essentially impo
[Greg Ewing]
> This whole discussion started because someone wanted a way
> to bind a temporary result for use *within* a comprehension.
It's been noted several times recently that the example PEP 572 gives
as _not_ working:
total = 0
progressive_sums = [total := total + value for value in data]
Following up some of the discussions about the problems of adding keywords
and Guido's proposal of making tokenization context-dependent, I wanted to
propose an alternate way to go around the problem.
My proposal essentially boils down to:
1. The character "$" can be used as a prefix of identi
The topic reminded me of Stephan Houben's streamtee.
https://github.com/stephanh42/streamtee/blob/master/streamtee.py
It was an attempt at a memory-efficient tee, but it turned out tee was
efficient enough. It uses a thunk-like method and recursion to remove
the need for an explicit linked list.
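For reference, the standard-library tee the post compares against can be used like this (a minimal sketch of plain `itertools.tee`; streamtee's thunk-and-recursion internals differ):

```python
import itertools

# itertools.tee splits one iterator into independent iterators that
# buffer only items one consumer has seen but the other has not yet.
source = iter(range(5))
a, b = itertools.tee(source)

first = list(a)    # consuming `a` does not exhaust `b`
second = list(b)
```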
On Mon, May 14, 2018 at 09:17:03PM +1200, Greg Ewing wrote:
> Tim Peters wrote:
> >Because you never _need_ to use an assignment expression to write a
> >listcomp/genexp.
>
> This whole discussion started because someone wanted a way
> to bind a temporary result for use *within* a comprehension.
>
On 14/05/18 10:17, Greg Ewing wrote:
Tim Peters wrote:
Because you never _need_ to use an assignment expression to write a
listcomp/genexp.
This whole discussion started because someone wanted a way
to bind a temporary result for use *within* a comprehension.
I still don't find that argument
On 12/05/18 01:41, Juancarlo Añez wrote:
    while (cmd := get_command()).token != CMD_QUIT:
        cmd.do_something()

    while get_command() as cmd:
        if cmd.token == CMD_QUIT:
            break
        cmd.do_something()
This would be exactly one of the patterns Python currently forces on me
that
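For comparison, the walrus form of that loop runs as written in Python 3.8+ (the `while ... as cmd:` spelling was never added); `Cmd`, `get_command`, and `CMD_QUIT` are hypothetical stand-ins for the names in the thread:

```python
CMD_QUIT = "quit"

class Cmd:
    def __init__(self, token):
        self.token = token
    def do_something(self):
        return self.token.upper()

commands = iter([Cmd("open"), Cmd("close"), Cmd(CMD_QUIT)])

def get_command():
    return next(commands)

handled = []
# Bind and test the command in the loop header, avoiding loop-and-a-half.
while (cmd := get_command()).token != CMD_QUIT:
    handled.append(cmd.do_something())
```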
On 13/05/18 19:19, Guido van Rossum wrote:
The idea I had (not for the first time :-) is that in many syntactic
positions we could just treat keywords as names, and that would free up
these keywords.
I'm not familiar with the innards of the parser and it's way too long
since I sat through a
On 2018-05-14 12:35, Rhodri James wrote:
> On 13/05/18 19:19, Guido van Rossum wrote:
>> The idea I had (not for the first time:-) is that in many syntactic
>> positions we could just treat keywords as names, and that would free up
>> these keywords.
>
> I'm not familiar with the innards of the p
On 2018-05-14 05:02, Nick Coghlan wrote:
> The same grammar adjustment that I believe will allow "given" to be used as
> both a postfix keyword and as a regular name would also work for "where".
> However, "where" still has the problem of semantically conflicting with
> SQL's use of it to introduce
On Sat, May 12, 2018 at 08:13:01PM -0400, Juancarlo Añez wrote:
> A new thread just to suggest taking the discussion about PEP572 well beyond
> python-ideas (PyConn is good for that).
>
> The least anyone should want is a language change that immediately gets
> tagged on the networks as "don't use
On 14 May 2018 at 06:10, Tim Peters wrote:
> [Greg Ewing]
> > This whole discussion started because someone wanted a way
> > to bind a temporary result for use *within* a comprehension.
>
> It's been noted several times recently that the example PEP 572 gives
> as _not_ working:
>
> total =
On 14 May 2018 at 08:24, Ed Kellett wrote:
> On 2018-05-14 05:02, Nick Coghlan wrote:
> > The same grammar adjustment that I believe will allow "given" to be used
> as
> > both a postfix keyword and as a regular name would also work for "where".
> > However, "where" still has the problem of seman
Tim Peters wrote:
total = 0
progressive_sums = [total := total + value for value in data]
I'm skeptical that it's a good idea to encourage this
kind of thing in the first place.
--
Greg
> On 2018 May 14 , at 6:47 a, Daniel Moisset wrote:
>
> Following up some of the discussions about the problems of adding keywords
> and Guido's proposal of making tokenization context-dependent, I wanted to
> propose an alternate way to go around the problem.
My main objection to what follow
Steven D'Aprano wrote:
To reiterate what Tim already pointed out, that original usecase
required a way to feed values *into* the comprehension.
https://mail.python.org/pipermail/python-ideas/2018-February/048971.html
There's no need for dedicated syntax for that if we can just set an
variable
On 14 May 2018 at 15:02, Clint Hepner wrote:
>
> > On 2018 May 14 , at 6:47 a, Daniel Moisset
> wrote:
> >
> > Following up some of the discussions about the problems of adding
> keywords and Guido's proposal of making tokenization context-dependent, I
> wanted to propose an alternate way to go
On 2018-05-14 06:47 AM, Daniel Moisset wrote:
> Following up some of the discussions about the problems of adding
> keywords and Guido's proposal of making tokenization
> context-dependent, I wanted to propose an alternate way to go around
> the problem.
>
> My proposal essentially boils down to:
>
> So I was thinking: why not define the methods
> like: "def self.whatevermethod(par1, par2, etc)" instead of "def
> whatevermethod(self, par1, par2, etc)"?
>
because "self" in this case is a class instance, passed in at method call
time.
but "whatevermethod" is a class attribute.
note th
On Thu, May 10, 2018 at 6:13 PM, Alexander Belopolsky <
alexander.belopol...@gmail.com> wrote:
> > Is there interest in a PEP for extending time, datetime / timedelta for
> arbitrary or extended precision fractional seconds?
>
> Having seen the utter disaster that similar ideas brought to numpy, I
On Tue, May 15, 2018 at 2:05 AM, Chris Barker via Python-ideas
wrote:
> But my question is whether high precision timedeltas belongs with "calendar
> time" at all.
>
> What with UTC and leap seconds, and all that, it gets pretty ugly, when down
> to the second or sub-second, what a given datetime
>> total = 0
>> progressive_sums = [total := total + value for value in data]
[Greg Ewing]
> I'm skeptical that it's a good idea to encourage this kind of thing
> in the first place.
Adding a feature is creating possibility for its use, not encouraging
its use. I'd hate to, e.g., see p
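For the record, under PEP 572 as eventually accepted (Python 3.8+), the walrus target in a comprehension binds in the enclosing scope, so the example under discussion does work:

```python
data = [1, 2, 3, 4]
total = 0
# Each iteration rebinds `total` in the enclosing scope, so the
# comprehension builds a list of running sums.
progressive_sums = [total := total + value for value in data]
```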
On 5/13/2018 2:19 PM, Guido van Rossum wrote:
The idea I had (not for the first time :-) is that in many syntactic
positions we could just treat keywords as names, and that would free up
these keywords.
This trades the simplicity of 'using a keyword as an identifier always
fails immediately'
On 5/14/2018 10:02 AM, Clint Hepner wrote:
On 2018 May 14 , at 6:47 a, Daniel Moisset wrote:
Following up some of the discussions about the problems of adding keywords and
Guido's proposal of making tokenization context-dependent, I wanted to propose
an alternate way to go around the proble
On 5/10/2018 3:58 PM, stefano wrote:
I know that "self" parameter have been discussed a lot, but still I didn't
find this proposal. If it was instead take my sincere apologies and please
forget this mail.
The disturbing part of the "self parameter" is the asymmetry of the
definition and the call
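The asymmetry in question: `self` appears explicitly in the definition but not at the call site, because a method is just a class attribute looked up on an instance (`Greeter` is a made-up example):

```python
class Greeter:
    def greet(self, name):      # 'self' is explicit in the definition...
        return "hello " + name

g = Greeter()
result = g.greet("world")       # ...but implicit at the call site
# The call above is sugar for looking the function up on the class:
same = Greeter.greet(g, "world")
```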
[Terry Reedy]
> ...
> The impact on real-time syntax coloring should be considered. An re-based
> colorizer, like IDLE's, tags every appearance of keywords outside of strings
> and comments, syntactically correct or not. For instance, the first two
> 'and's in "a and b.and # and error" are tagged
On 05/13/2018 10:57 PM, Greg Ewing wrote:
Rob Cliffe via Python-ideas wrote:
If you forbid redefining keywords, you remove the whole point of this proposal:
I mean the keywords that are in the language as of now.
There will never be a need to redefine those, since no
current code uses them a
I can only think of three ways to reference a name defined in a different
file: In an import
statement, as properties of objects and as keyword arguments.
Import statements are implicit assignments, so if Python allowed the
following grammar,
you could still import the odd thing that had a reserve
Just to be clear, if `foo` was introduced as a new infix operator, projects
that used `foo`
as a name would not be able to also use `foo` as an infix operator in the
file that defines
`foo` as a name, but could use the operator throughout the rest of their
project.
-- Carl Smith
carl.in...@gmail.c
Sorry to think out loud, but if the lexer marked `foo` as a generic `Word`
token, that
could be a keyword or a name, then the parser could look at the value of
each `Word`
token, and if the context is `import foo`, `class foo...`, `def foo...` or
`foo = ...`,
then `foo` is a name there and thereaft
NumPy and scikit-learn also use the trailing underscore convention.
Albeit, sklearn uses it for (almost) all the model attributes, not just
those it thinks might clash.
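The convention in miniature (modelled on scikit-learn's estimators; the `Scaler` class is a made-up illustration, not sklearn's API):

```python
class Scaler:
    """Attributes learned in fit() get a trailing underscore, so they
    cannot clash with keywords or with constructor parameters."""
    def fit(self, data):
        self.mean_ = sum(data) / len(data)  # learned attribute
        return self

mean = Scaler().fit([1.0, 2.0, 3.0]).mean_
```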
On Mon, May 14, 2018, 2:12 PM Terry Reedy wrote:
> On 5/14/2018 10:02 AM, Clint Hepner wrote:
> >
> >> On 2018 May 14 , at 6:
>
> UTC and leap seconds aren't a problem.
Of course they are a problem— why else would they not be implemented
in datetime?
But my point is that a given datetime stamp or calculation could be off
by a second or so depending on whether and how leap seconds are
implemented.
It just doesn’t seem like
Chris is certainly right. A program that deals with femtosecond intervals
should almost surely start by defining a "start of experiment" epoch where
microseconds are fine. Then within that epoch, events should be monotonic
integers for when measured or calculated times are marked.
I can easily see
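A minimal sketch of that scheme: keep sub-microsecond event times as plain integers relative to an experiment epoch, and convert to `datetime` (whose resolution bottoms out at 1 microsecond) only for the coarse part. The femtosecond unit and epoch value are illustrative assumptions:

```python
from datetime import datetime, timedelta

FS_PER_US = 1_000_000_000  # femtoseconds per microsecond

epoch = datetime(2018, 5, 15, 12, 0, 0)  # hypothetical experiment start
event_fs = 2_500_000_000                 # event at 2.5 us after epoch, in fs

# Integer arithmetic keeps full precision: datetime gets only the whole
# microseconds, the sub-microsecond remainder stays as an exact integer.
whole_us, remainder_fs = divmod(event_fs, FS_PER_US)
event_time = epoch + timedelta(microseconds=whole_us)
```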
From "[Python-Dev] PEP 564: Add new time functions with nanosecond
resolution" (2017-10-16 hh:mm ss[...] -Z)
https://groups.google.com/forum/m/#!topic/dev-python/lLJuW_asYa0 :
> Maybe that's why we haven't found any CTCs (closed timelike curves) yet.
>
> Aligning simulation data in context to oth
Terry Reedy wrote:
the first two 'and's in "a and b.and # and error" are tagged.
To not tag the second would require full parsing,
That particular case could be handled by just not colouring
any word following a dot.
Some other situations might result in false positives, but
there are cases li
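Greg's refinement can be sketched with two regexes: the second adds a negative lookbehind so a keyword immediately after a dot is not tagged. (Like the naive tagger described, this sketch ignores strings and comments, so the `and` inside the comment is still hit.)

```python
import keyword
import re

kw_alt = "|".join(keyword.kwlist)
tag_all = re.compile(r"\b(?:%s)\b" % kw_alt)
# Skip any keyword directly preceded by a dot (an attribute name).
tag_not_after_dot = re.compile(r"(?<!\.)\b(?:%s)\b" % kw_alt)

line = "a and b.and # and error"
all_hits = [m.start() for m in tag_all.finditer(line)]           # tags b.and too
filtered = [m.start() for m in tag_not_after_dot.finditer(line)]
```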
I'm hoping that the arguments for assignment expressions will be over by
Christmas *wink* so as a partial (and hopefully less controversial)
alternative, what do people think of the idea of flagging certain
expressions as "pure functions" so the compiler can automatically cache
results from it?
On Tue, May 15, 2018 at 10:35 AM, Steven D'Aprano wrote:
> We would need to flag which expression can be cached because it is PURE,
> and tag how far the CACHE operates over:
>
>
>
> func(arg)
>
> + func(arg)*2 + func(arg)**2
>
>
> This would tell th
The time machine is used again! We HAVE a spelling: @functools.lru_cache()
The problem is that there's still a time/space trade-off. If you might call
a (pure) function with millions of different values, you likely don't want
to cache them all. The current spelling makes this configurable, but
the
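The existing spelling referred to, with its configurable space bound:

```python
import functools

calls = 0

@functools.lru_cache(maxsize=128)  # bound the cache; maxsize=None is unbounded
def square(x):
    global calls
    calls += 1  # count actual executions, so cache hits are observable
    return x * x

results = [square(3), square(3), square(4)]  # second call is a cache hit
```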
[Steven D'Aprano]
> I'm hoping that the arguments for assignment expressions will be over by
> Christmas *wink* so as a partial (and hopefully less controversial)
> alternative, what do people think of the idea of flagging certain
> expressions as "pure functions" so the compiler can automatically
On Mon, May 14, 2018 at 8:35 PM, Steven D'Aprano wrote:
> I'm hoping that the arguments for assignment expressions will be over by
> Christmas *wink* so as a partial (and hopefully less controversial)
> alternative, what do people think of the idea of flagging certain
> expressions as "pure functi
On 5/14/2018 6:58 PM, Greg Ewing wrote:
Terry Reedy wrote:
the first two 'and's in "a and b.and # and error" are tagged. To not
tag the second would require full parsing,
That particular case could be handled by just not colouring
any word following a dot.
OK, more parsing, even if not compl
Just noting some real code I typed today where `given` works great if
it allows unpacking syntax, and assignment expressions don't:
    while True:
        head, matched, s = s.partition(sep)
        if not matched:
            break
Using `given`:
    while matched given head, matched, s = s.p
On Tue, 15 May 2018 10:35:58 +1000, Steven D'Aprano wrote:
> We would need to flag which expression can be cached because it is
> PURE, and tag how far the CACHE operates over:
>
>
>
> func(arg)
>
> + func(arg)*2 + func(arg)**2
>
>
> This would
[Tim]
>> It's been noted several times recently that the example PEP 572 gives
>> as _not_ working:
>>
>> total = 0
>> progressive_sums = [total := total + value for value in data]
>>
>> was the original use case that prompted work on the PEP. You gotta
>> admit that's ironic ;-)
[Nick]
>
The syntax for formatted string literals is given here:
https://docs.python.org/3/reference/lexical_analysis.html#f-strings
If you were to examine this carefully, you would see that a format_spec (the
part within the braces but after the colon) can be empty or it can consist of
literal chara
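In particular, the format_spec may itself contain replacement fields, so widths and precisions can be computed at run time:

```python
value = 3.14159
width = 8
# The inner {width} is substituted first, yielding the spec "8.3f":
# right-aligned in 8 characters, 3 digits after the decimal point.
formatted = f"{value:{width}.3f}"
```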