On Fri, 17 Jun 2022 at 12:52, Steve Jorgensen <stevec...@gmail.com> wrote:
>
> Restarting this with an improved title "Bare" vs "Raw", and I will try not to 
> digress so much in the new thread.
>
> My suggestion is to allow a bare asterisk at the end of a destructuring 
> expression to indicate that additional elements are to be ignored if present 
> and not iterated over if the rhs is being evaluated by iterating.
>
>     (first, second, *) = items
>
> This provides a way of using destructuring from something that will be 
> processed by iterating and for which the number of items might be very large 
> and/or accessing of successive items is expensive.
>

Important point: This is distinctly different from putting a dummy
variable there:

first, second, *_ = items

as this will iterate over the rest of items. What you're proposing is
actually a *removal* of a normal check - after unpacking two elements
from items and assigning them to first and second, the interpreter
normally queries the iterator once more and raises an error if that
call doesn't raise StopIteration. So for generators, adding the
trailing asterisk will mean it doesn't try to pump them further.

> As Paul Moore pointed out in the original thread, itertools.islice can be 
> used to limit the number of items iterated over. That's a nice solution, but 
> it requires knowing or thinking of the solution, an additional import, and 
> repetition of the count of items to be destructured at the outermost nesting 
> level on the lhs.
>
> What are people's impressions of this idea? Is it valuable enough to pursue 
> writing a PEP?
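
For reference, the islice workaround mentioned above looks something
like this (expensive_items is just an illustrative stand-in for a
costly source):

```python
from itertools import islice

def expensive_items():
    # Stand-in for a source where producing each item is costly.
    yield from ("a", "b", "c", "d")

# Note that the count of targets (2) must be repeated as islice's
# stop argument - the repetition the proposal would eliminate.
first, second = islice(expensive_items(), 2)
print(first, second)  # a b
```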

I think it's a valuable idea, though I don't think it needs a PEP yet.
When the time comes, I'd be happy to help out with that aspect of
things.

> If so, then what should I do in writing the PEP to make sure that it's 
> somewhat close to something that can potentially be accepted? Perhaps, there 
> is a guide for doing that?

Before you get to that point, how comfortable are you with "kicking
the tires" on this by putting together a basic proof-of-concept
implementation? Sometimes, the best way to find out the potential
problems is to just try doing it. Syntactically and semantically, this
looks pretty straightforward, but it's always possible for something
weird to sneak past your notice. For instance, are there any bizarre
situations in which this could become ambiguous? Currently, "a, b, =
x" is perfectly valid, and "a, b, *= x" errors out saying that
augmented assignment doesn't make sense with a tuple target, so I
think you're fine; safest to check though.
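
A quick sanity check of those two edge cases as they stand today (not
an exhaustive ambiguity search):

```python
a, b, = [1, 2]   # a trailing comma in a tuple target is already valid
print(a, b)      # 1 2

# "a, b, *= x" is currently rejected at compile time, so a trailing
# bare "*" in an assignment target would not collide with it:
try:
    compile("a, b, *= x", "<check>", "exec")
except SyntaxError as e:
    print("SyntaxError:", e.msg)
```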

Will this syntax be supported in a match/case statement? It's probably
less important there (since "*_" doesn't actually bind, and since
sequence patterns only match sequences, not arbitrary iterables), but
it might be worth supporting to maintain the parallel.

ChrisA
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/PFZIVMBR77A2W2IAMVJMPHMCOSQUM7DW/
Code of Conduct: http://python.org/psf/codeofconduct/