Re: variable length tuple assignment

2009-02-25 Thread Chris Rebert
On Wed, Feb 25, 2009 at 1:16 AM, Helmut Jarausch
jarau...@igpm.rwth-aachen.de wrote:
 Sorry if this is too simple, but I couldn't find it.

 I vaguely remember there is a means to assign a variable-length tuple
 and catch the 'rest', like

 S = 'a,b,c,d'

 (A, B, list of remaining items) = S.split(',')

In Python 3.0 (IIRC):

A, B, *rest = S.split(',')
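
For example (a quick sketch; the sample string just mirrors the pseudocode above):

S = 'a,b,c,d'
A, B, *rest = S.split(',')
# A == 'a', B == 'b', rest == ['c', 'd']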

Cheers,
Chris

-- 
Follow the path of the Iguana...
http://rebertia.com
--
http://mail.python.org/mailman/listinfo/python-list


Re: variable length tuple assignment

2009-02-25 Thread Tim Chase

Chris Rebert wrote:

On Wed, Feb 25, 2009 at 1:16 AM, Helmut Jarausch
jarau...@igpm.rwth-aachen.de wrote:

Sorry if this is too simple, but I couldn't find it.

I vaguely remember there is a means to assign a variable-length tuple
and catch the 'rest', like

S = 'a,b,c,d'

(A, B, list of remaining items) = S.split(',')


In Python 3.0 (IIRC):

A, B, *rest = S.split(',')


As an aside, as of the last time I read the PEP[1] on this, I 
believe it exhausts (or attempts to exhaust) any iterator.  IMHO 
this exhaustion is a bad idea because it prevents things like


  def numbers(start=0):
      i = start
      while True:
          yield i
          i += 1

  CONST_A, CONST_B, CONST_C, *rest = numbers()

which will hang in current Py3.0 until you blow a stack or 
overrun your heap somewhere, because it will try to exhaust the 
infinite generator.  It also changes the type of the remaining 
content from iterator to list (not as grievous).



I agree that the internal star usage needs to exhaust the iterator:

  A, *rest, C, D = s.split(',')

but the terminal-star syntax should be a little more gracious 
with iterators.
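
A lazy workaround is already possible by slicing the head off the 
iterator by hand with itertools.islice (just a sketch; the names 
are illustrative):

  from itertools import islice

  def numbers(start=0):
      i = start
      while True:
          yield i
          i += 1

  nums = numbers()
  CONST_A, CONST_B, CONST_C = islice(nums, 3)  # take only the first three
  rest = nums  # the rest stays a lazy iterator; nothing is exhausted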


Anyways, my $0.02 (minus taxes, money for multiple bailouts, and 
deflated by economic conditions -- so not worth much :).


-tkc

[1]
http://www.python.org/dev/peps/pep-3132/

--
http://mail.python.org/mailman/listinfo/python-list


Re: variable length tuple assignment

2009-02-25 Thread Steven D'Aprano
Tim Chase wrote:

 As an aside, as of the last time I read the PEP[1] on this, I
 believe it exhausts (or attempts to exhaust) any iterator.  IMHO
 this exhaustion is a bad idea because it prevents things like

   def numbers(start=0):
       i = start
       while True:
           yield i
           i += 1

   CONST_A, CONST_B, CONST_C, *rest = numbers()

 which will hang in current Py3.0 until you blow a stack or
 overrun your heap somewhere, because it will try to exhaust the
 infinite generator.

But what else can it do? Do you expect Python to read your mind and
magically know when you intend to use rest and when you intend to just
throw it away?

Perhaps Python could do that, via static analysis -- if rest is never used
again, don't bother exhausting the iterator. But that will lead to
differences in behaviour for iterators that have side effects. It will also
introduce a subtle difference in behaviour here:

it = iter(range(5))  # for example
a, b, c, *rest = it
L = list(it)  # L is now [3, 4], because rest is never used

versus

it = iter(range(5))
a, b, c, *rest = it
parrot(rest)  # rest is actually used here
L = list(it)  # L is now []
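
(For reference, the behaviour PEP 3132 actually specifies -- with no such
static analysis -- always exhausts the iterator and binds rest to a list:

it = iter(range(5))
a, b, c, *rest = it
# rest == [3, 4] and the iterator is fully consumed: list(it) == []
)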


 It also changes the type of the remaining content from iterator
 to list (not as grievous).

But that's what unpacking does. It would be a major semantic change for 

x, *s = some_iterator

to make s an alias of some_iterator. And why would you want it to? If you
do, just do this:

x, s = next(some_iterator), some_iterator
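
A quick illustration of that idiom (the sample values are illustrative):

it = iter(['head', 'middle', 'tail'])
x, s = next(it), it  # x takes the first item; s is still the live iterator
# x == 'head'; list(s) == ['middle', 'tail']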



-- 
Steven

--
http://mail.python.org/mailman/listinfo/python-list