Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Paul Moore
On 4/29/05, Shane Hathaway [EMAIL PROTECTED] wrote:
 I think this concept can be explained clearly.  I'd like to try
 explaining PEP 340 to someone new to Python but not new to programming.
 I'll use the term block iterator to refer to the new type of
 iterator.  This is according to my limited understanding.
[...]
 Is it understandable so far?

I like it.
Paul.


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Luis P Caamano
On 4/29/05, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
 
 Message: 2
 Date: Thu, 28 Apr 2005 21:56:42 -0600
 From: Shane Hathaway [EMAIL PROTECTED]
 Subject: Re: [Python-Dev] Re: anonymous blocks
 To: [EMAIL PROTECTED]
 Cc: Ka-Ping Yee [EMAIL PROTECTED],  Python Developers List
python-dev@python.org
 Message-ID: [EMAIL PROTECTED]
 Content-Type: text/plain; charset=ISO-8859-1
 
 
 I think this concept can be explained clearly.  I'd like to try
 explaining PEP 340 to someone new to Python but not new to programming.
 I'll use the term block iterator to refer to the new type of
 iterator.  This is according to my limited understanding.
 
 Good programmers move commonly used code into reusable functions.
 Sometimes, however, patterns arise in the structure of the functions
 rather than the actual sequence of statements.  For example, many
 functions acquire a lock, execute some code specific to that function,
 and unconditionally release the lock.  Repeating the locking code in
 every function that uses it is error prone and makes refactoring difficult.
 
 Block statements provide a mechanism for encapsulating patterns of
 structure.  Code inside the block statement runs under the control of an
 object called a block iterator.  Simple block iterators execute code
 before and after the code inside the block statement.  Block iterators
 also have the opportunity to execute the controlled code more than once
 (or not at all), catch exceptions, or receive data from the body of the
 block statement.
 
 A convenient way to write block iterators is to write a generator.  A
 generator looks a lot like a Python function, but instead of returning a
 value immediately, generators pause their execution at yield
 statements.  When a generator is used as a block iterator, the yield
 statement tells the Python interpreter to suspend the block iterator,
 execute the block statement body, and resume the block iterator when the
 body has executed.
 
 The Python interpreter behaves as follows when it encounters a block
 statement based on a generator.  First, the interpreter instantiates the
 generator and begins executing it.  The generator does setup work
 appropriate to the pattern it encapsulates, such as acquiring a lock,
 opening a file, starting a database transaction, or starting a loop.
 Then the generator yields execution to the body of the block statement
 using a yield statement.  When the block statement body completes,
 raises an uncaught exception, or sends data back to the generator using
 a continue statement, the generator resumes.  At this point, the
 generator can either clean up and stop or yield again, causing the block
 statement body to execute again.  When the generator finishes, the
 interpreter leaves the block statement.
 
 Is it understandable so far?
 

I've been skipping most of the anonymous block discussion and thus,
I only had a very vague idea of what it was about until I read this
explanation.

Yes, it is understandable -- assuming it's correct :-)

Mind you though, I'm not new to python and I've been writing system
software for 20+ years.

-- 
Luis P Caamano
Atlanta, GA USA


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Luis Bruno
Hello,

Shane Hathaway wrote:
 Is it understandable so far?

Definitely yes! I had the structure upside-down; your explanation is
right on target.

Thanks!
-- 
Luis Bruno


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread Shane Hathaway
Luis P Caamano wrote:
 I've been skipping most of the anonymous block discussion and thus,
 I only had a very vague idea of what it was about until I read this
 explanation.
 
 Yes, it is understandable -- assuming it's correct :-)

To my surprise, the explanation is now in the PEP.  (Thanks, Guido!)

Shane


Re: [Python-Dev] Re: anonymous blocks

2005-04-29 Thread John J Lee
On Thu, 28 Apr 2005, Shane Hathaway wrote:
[...]
 I think this concept can be explained clearly.  I'd like to try
 explaining PEP 340 to someone new to Python but not new to programming.
[...snip explanation...]
 Is it understandable so far?

Yes, excellent.  Speaking as somebody who scanned the PEP and this thread
and only half-understood either, that was quite painless to read.

Still not sure whether thunks or PEP 340 are better, but I'm at least
confused on a higher level now.


John


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Greg Ewing
Guido van Rossum wrote:
And surely you exaggerate.  How about this then:
"The with-statement is similar to the for-loop.  Until you've
learned about the differences in detail, the only time you should
write a with-statement is when the documentation for the function
you are calling says you should."
I think perhaps I'm not expressing myself very well.
What I'm after is a high-level explanation that actually
tells people something useful, and *doesn't* cop out by
just saying you're not experienced enough to understand
this yet.
If such an explanation can't be found, I strongly suspect
that this doesn't correspond to a cohesive enough concept
to be made into a built-in language feature. If you can't
give a short, understandable explanation of it, then it's
probably a bad idea.
Greg


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Ka-Ping Yee
On Thu, 28 Apr 2005, Greg Ewing wrote:
 If such an explanation can't be found, I strongly suspect
 that this doesn't correspond to a cohesive enough concept
 to be made into a built-in language feature. If you can't
 give a short, understandable explanation of it, then it's
 probably a bad idea.

In general, i agree with the sentiment of this -- though it's
also okay if there is a way to break the concept down into
concepts that *are* simple enough to have short, understandable
explanations.


-- ?!ng


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Steven Bethard
On 4/28/05, Steven Bethard [EMAIL PROTECTED] wrote:
 however, the iterable object is notified whenever a 'continue',
 'break', or 'return' statement is executed inside the block-statement.

This should read:

however, the iterable object is notified whenever a 'continue',
'break' or 'return' statement is executed *or an exception is raised*
inside the block-statement.

Sorry!

STeVe
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Nick Coghlan
Brett C. wrote:
Guido van Rossum wrote:
Yet another alternative would be for the default behaviour to be to raise
Exceptions, and continue with anything else, and have the third argument be
raise_exc=True and set it to False to pass an exception in without raising it.

You've lost me there. If you care about this, can you write it up in
more detail (with code samples or whatever)? Or we can agree on a 2nd
arg to __next__() (and a 3rd one to next()).
Channeling Nick, I think he is saying that the raising argument should be
made True by default and be named 'raise_exc'.
Pretty close, although I'd say 'could' rather than 'should', as it was an idle 
thought, rather than something I actually consider a good idea.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Steven Bethard
On 4/28/05, Greg Ewing [EMAIL PROTECTED] wrote:
 Neil Schemenauer wrote:
 
  The translation of a block-statement could become:
 
  itr = EXPR1
  arg = None
  while True:
      try:
          VAR1 = next(itr, arg)
      except StopIteration:
          break
      try:
          arg = None
          BLOCK1
      except Exception, exc:
          err = getattr(itr, '__error__', None)
          if err is None:
              raise exc
          err(exc)
 
 That can't be right. When __error__ is called, if the iterator
 catches the exception and goes on to do another yield, the
 yielded value needs to be assigned to VAR1 and the block
 executed again. It looks like your version will ignore the
 value from the second yield and only execute the block again
 on the third yield.

Could you do something like:
itr = EXPR1
arg = None
next_func = next
while True:
    try:
        VAR1 = next_func(itr, arg)
    except StopIteration:
        break
    try:
        arg = None
        next_func = next
        BLOCK1
    except Exception, arg:
        try:
            next_func = type(itr).__error__
        except AttributeError:
            raise arg


?

STeVe

-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Nick Coghlan
Brett C. wrote:
I'm surprisingly close to agreeing with you, actually. I've worked out
that it isn't the looping that I object to, it's the inability to get
out of the loop without exhausting the entire iterator.

'break' isn't enough for you as laid out by the proposal?  The raising of
StopIteration, which is what 'break' does according to the standard, should be
enough to stop the loop without exhausting things.  Same way you stop a 'for'
loop from executing entirely.
The StopIteration exception effectively exhausted the generator, though. 
However, I've figured out how to deal with that, and my reservations about PEP 
340 are basically gone.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Guido van Rossum
[Greg Ewing]
 I think perhaps I'm not expressing myself very well.
 What I'm after is a high-level explanation that actually
 tells people something useful, and *doesn't* cop out by
 just saying you're not experienced enough to understand
 this yet.
 
 If such an explanation can't be found, I strongly suspect
 that this doesn't correspond to a cohesive enough concept
 to be made into a built-in language feature. If you can't
 give a short, understandable explanation of it, then it's
 probably a bad idea.

[Ping]
 In general, i agree with the sentiment of this -- though it's
 also okay if there is a way to break the concept down into
 concepts that *are* simple enough to have short, understandable
 explanations.

I don't know. What exactly is the audience supposed to be of this
high-level statement? It would be pretty darn impossible to explain
even the for-statement to people who are new to programming, let alone
generators. And yet explaining the block-statement *must* involve a
reference to generators. I'm guessing most introductions to Python,
even for experienced programmers, put generators off until the
advanced section, because this is pretty wild if you're not used to
a language that has something similar. (I wonder how you'd explain
Python generators to an experienced Ruby programmer -- their mind has
been manipulated to the point where they'd be unable to understand
Python's yield no matter how hard they tried. :-)

If I weren't limited to newbies (either to Python or to programming in
general) but simply had to explain it to Python programmers
pre-Python-2.5, I would probably start with a typical example of the
try/finally idiom for acquiring and releasing a lock, then explain how
for software engineering reasons you'd want to templatize that, and
show the solution with a generator and block-statement.
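
To make that progression concrete, here is a rough sketch (the function names
are invented for illustration, and the "block" lines use the syntax proposed
in PEP 340, so they appear only as comments):

    import threading

    balance_lock = threading.Lock()

    # The try/finally idiom every caller currently has to repeat correctly:
    def update_balance(account, amount):
        balance_lock.acquire()
        try:
            account.balance += amount
        finally:
            balance_lock.release()

    # Templatized: the acquire/release structure lives in one generator...
    def locking(lock):
        lock.acquire()
        try:
            yield            # the body of the block statement runs here
        finally:
            lock.release()

    # ...and each use site would shrink to (proposed PEP 340 syntax):
    #
    #     block locking(balance_lock):
    #         account.balance += amount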

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Shane Hathaway
Guido van Rossum wrote:
 I don't know. What exactly is the audience supposed to be of this
 high-level statement? It would be pretty darn impossible to explain
 even the for-statement to people who are new to programming, let alone
 generators. And yet explaining the block-statement *must* involve a
 reference to generators. I'm guessing most introductions to Python,
 even for experienced programmers, put generators off until the
 advanced section, because this is pretty wild if you're not used to
 a language that has something similar. (I wonder how you'd explain
 Python generators to an experienced Ruby programmer -- their mind has
 been manipulated to the point where they'd be unable to understand
 Python's yield no matter how hard they tried. :-)

I think this concept can be explained clearly.  I'd like to try
explaining PEP 340 to someone new to Python but not new to programming.
 I'll use the term block iterator to refer to the new type of
iterator.  This is according to my limited understanding.

Good programmers move commonly used code into reusable functions.
Sometimes, however, patterns arise in the structure of the functions
rather than the actual sequence of statements.  For example, many
functions acquire a lock, execute some code specific to that function,
and unconditionally release the lock.  Repeating the locking code in
every function that uses it is error prone and makes refactoring difficult.

Block statements provide a mechanism for encapsulating patterns of
structure.  Code inside the block statement runs under the control of an
object called a block iterator.  Simple block iterators execute code
before and after the code inside the block statement.  Block iterators
also have the opportunity to execute the controlled code more than once
(or not at all), catch exceptions, or receive data from the body of the
block statement.

A convenient way to write block iterators is to write a generator.  A
generator looks a lot like a Python function, but instead of returning a
value immediately, generators pause their execution at yield
statements.  When a generator is used as a block iterator, the yield
statement tells the Python interpreter to suspend the block iterator,
execute the block statement body, and resume the block iterator when the
body has executed.

The Python interpreter behaves as follows when it encounters a block
statement based on a generator.  First, the interpreter instantiates the
generator and begins executing it.  The generator does setup work
appropriate to the pattern it encapsulates, such as acquiring a lock,
opening a file, starting a database transaction, or starting a loop.
Then the generator yields execution to the body of the block statement
using a yield statement.  When the block statement body completes,
raises an uncaught exception, or sends data back to the generator using
a continue statement, the generator resumes.  At this point, the
generator can either clean up and stop or yield again, causing the block
statement body to execute again.  When the generator finishes, the
interpreter leaves the block statement.
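
To see that sequence end to end, here is a tiny hand-driven sketch, written
for a current Python (the generator is invented for the example, and it is
driven manually with next() because the block syntax itself is only a
proposal):

    def transaction():
        print("setup: begin transaction")    # runs before the block body
        yield                                # the block body runs here
        print("cleanup: commit")             # runs after the body finishes

    gen = transaction()
    next(gen)                            # interpreter: run setup, stop at the yield
    print("...the block statement body would run here...")
    try:
        next(gen)                        # interpreter: resume the generator after the body
    except StopIteration:
        pass                             # generator finished; leave the block statement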

Is it understandable so far?

Shane


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Greg Ewing
Guido van Rossum wrote:
I don't know. What exactly is the audience supposed to be of this
high-level statement? It would be pretty darn impossible to explain
even the for-statement to people who are new to programming, let alone
generators.
If the use of block-statements becomes common for certain
tasks such as opening files, it seems to me that people are
going to encounter their use around about the same time
they encounter for-statements. We need *something* to
tell these people to enable them to understand the code
they're reading.
Maybe it would be sufficient just to explain the meanings
of those particular uses, and leave the full general
explanation as an advanced topic.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Greg Ewing
Shane Hathaway wrote:
Block statements provide a mechanism for encapsulating patterns of
structure.  Code inside the block statement runs under the control of an
object called a block iterator.  Simple block iterators execute code
before and after the code inside the block statement.  Block iterators
also have the opportunity to execute the controlled code more than once
(or not at all), catch exceptions, or receive data from the body of the
block statement.
That actually looks pretty reasonable.
Hmmm. Patterns of structure. Maybe we could call it a
struct statement.
    struct opening(foo) as f:
        ...
Then we could confuse both C *and* Ruby programmers at
the same time! :-)
[No, I don't really mean this. I actually prefer block
to this.]
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+


Re: [Python-Dev] Re: anonymous blocks

2005-04-28 Thread Guido van Rossum
 If the use of block-statements becomes common for certain
 tasks such as opening files, it seems to me that people are
 going to encounter their use around about the same time
 they encounter for-statements. We need *something* to
 tell these people to enable them to understand the code
 they're reading.
 
 Maybe it would be sufficient just to explain the meanings
 of those particular uses, and leave the full general
 explanation as an advanced topic.

Right. The block statement is a bit like a chameleon: it adapts its
meaning to the generator you supply. (Or maybe it's like a sewer: what
you get out of it depends on what you put into it. :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).

Some highlights:

- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration instance
- use continue EXPR to pass a value to the generator
- generator exception handling explained

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Jim Fulton
Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some highlights:
- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration instance
- use continue EXPR to pass a value to the generator
- generator exception handling explained
This looks pretty cool.
Some observations:
1. It looks to me like a bare return or a return with an EXPR3 that happens
   to evaluate to None inside a block simply exits the block, rather
   than exiting a surrounding function. Did I miss something, or is this
   a bug?
2. I assume it would be a hack to try to use block statements to implement
   something like interfaces or classes, because doing so would require
   significant local-variable manipulation.  I'm guessing that
   either implementing interfaces (or implementing a class statement
   in which the class was created before execution of a suite)
   is not a use case for this PEP.
Jim
--
Jim Fulton   mailto:[EMAIL PROTECTED]   Python Powered!
CTO  (540) 361-1714http://www.python.org
Zope Corporation http://www.zope.com   http://www.zope.org


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Duncan Booth
Jim Fulton [EMAIL PROTECTED] wrote in news:[EMAIL PROTECTED]:

 Guido van Rossum wrote:
 I've written a PEP about this topic. It's PEP 340: Anonymous Block
 Statements (http://python.org/peps/pep-0340.html).
 
 Some observations:
 
 1. It looks to me like a bare return or a return with an EXPR3 that happens
    to evaluate to None inside a block simply exits the block, rather
    than exiting a surrounding function. Did I miss something, or is
    this a bug?
 

No, the return sets a flag and raises StopIteration which should make the 
iterator also raise StopIteration at which point the real return happens.

If the iterator fails to re-raise the StopIteration exception (the spec
only says it should, not that it must), I think the return would be ignored
but a subsequent exception would then get converted into a return value. I
think the flag needs to be reset to avoid this case.

Also, I wonder whether other exceptions from next() shouldn't be handled a 
bit differently. If BLOCK1 throws an exception, and this causes the 
iterator to also throw an exception then one exception will be lost. I 
think it would be better to propagate the original exception rather than 
the second exception.

So something like (added lines to handle both of the above):

    itr = EXPR1
    exc = arg = None
    ret = False
    while True:
        try:
            VAR1 = next(itr, arg)
        except StopIteration:
            if exc is not None:
                if ret:
                    return exc
                else:
                    raise exc   # XXX See below
            break
+       except:
+           if ret or exc is None:
+               raise
+           raise exc # XXX See below
+       ret = False
        try:
            exc = arg = None
            BLOCK1
        except Exception, exc:
            arg = StopIteration()


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Samuele Pedroni
Jim Fulton wrote:
Duncan Booth wrote:
Jim Fulton [EMAIL PROTECTED] wrote in news:[EMAIL PROTECTED]:

Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some observations:
1. It looks to me like a bare return or a return with an EXPR3 that
   happens to evaluate to None inside a block simply exits the block,
   rather than exiting a surrounding function. Did I miss something, or is
   this a bug?


No, the return sets a flag and raises StopIteration which should make 
the iterator also raise StopIteration at which point the real return 
happens.

Only if "exc is not None".
The only return in the pseudocode is inside "if exc is not None".
Is there another return that's not shown? ;)
I agree that we leave the block, but it doesn't look like we
leave the surrounding scope.
that we are having this discussion at all seems a signal that the 
semantics are likely too subtle.



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Duncan Booth
Jim Fulton [EMAIL PROTECTED] wrote in news:[EMAIL PROTECTED]:

 No, the return sets a flag and raises StopIteration which should make
 the iterator also raise StopIteration at which point the real return
 happens. 
 
 Only if exc is not None
 
 The only return in the pseudocode is inside if exc is not None.
 Is there another return that's not shown? ;)
 

Ah yes, I see now what you mean. 

I would think that the relevant pseudo-code should look more like:

    except StopIteration:
        if ret:
            return exc
        if exc is not None:
            raise exc   # XXX See below
        break


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 12:30 AM 4/27/05 -0700, Guido van Rossum wrote:
I've written a PEP about this topic. It's PEP 340: Anonymous Block
Statements (http://python.org/peps/pep-0340.html).
Some highlights:
- temporarily sidestepping the syntax by proposing 'block' instead of 'with'
- __next__() argument simplified to StopIteration or ContinueIteration 
instance
- use continue EXPR to pass a value to the generator
- generator exception handling explained
Very nice.  It's not clear from the text, btw, if normal exceptions can be 
passed into __next__, and if so, whether they can include a traceback.  If 
they *can*, then generators can also be considered co-routines now, in 
which case it might make sense to call blocks coroutine blocks, because 
they're basically a way to interleave a block of code with the execution of 
a specified coroutine.



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Josiah Carlson

Guido van Rossum [EMAIL PROTECTED] wrote:
 
 I've written a PEP about this topic. It's PEP 340: Anonymous Block
 Statements (http://python.org/peps/pep-0340.html).
 
 Some highlights:
 
 - temporarily sidestepping the syntax by proposing 'block' instead of 'with'
 - __next__() argument simplified to StopIteration or ContinueIteration 
 instance
 - use continue EXPR to pass a value to the generator
 - generator exception handling explained

Your code for the translation of a standard for loop is flawed.  From
the PEP:

for VAR1 in EXPR1:
    BLOCK1
else:
    BLOCK2

will be translated as follows:

itr = iter(EXPR1)
arg = None
while True:
    try:
        VAR1 = next(itr, arg)
    finally:
        break
    arg = None
    BLOCK1
else:
    BLOCK2


Note that in the translated version, BLOCK2 can only ever execute if
next raises a StopIteration in the call, and BLOCK1 will never be
executed because of the 'break' in the finally clause.

Unless it is too early for me, I believe what you wanted is...

itr = iter(EXPR1)
arg = None
while True:
    VAR1 = next(itr, arg)
    arg = None
    BLOCK1
else:
    BLOCK2

 - Josiah



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
 I would think that the relevant pseudo-code should look more like:
 
     except StopIteration:
         if ret:
             return exc
         if exc is not None:
             raise exc   # XXX See below
         break

Thanks! This was a bug in the PEP due to a last-minute change in how I
wanted to handle return; I've fixed it as you show (also renaming
'exc' to 'var' since it doesn't always hold an exception).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Steven Bethard
On 4/27/05, Guido van Rossum [EMAIL PROTECTED] wrote:
 I've written a PEP about this topic. It's PEP 340: Anonymous Block
 Statements (http://python.org/peps/pep-0340.html).

So block-statements would be very much like for-loops, except:

(1) iter() is not called on the expression
(2) the fact that break, continue, return or a raised Exception
occurred can all be intercepted by the block-iterator/generator,
though break, return and a raised Exception all look the same to the
block-iterator/generator (they are signaled with a StopIteration)
(3) the while loop can only be broken out of by next() raising a
StopIteration, so all well-behaved iterators will be exhausted when
the block-statement is exited

Hope I got that mostly right.

I know this is looking a little far ahead, but is the intention that
even in Python 3.0 for-loops and block-statements will still be
separate statements?  It seems like there's a pretty large section of
overlap.  Playing with for-loop semantics right now isn't possible due
to backwards compatibility, but when that limitation is removed in
Python 3.0, are we hoping that these two similar structures will be
expressed in a single statement?

STeVe
-- 
You can wordify anything if you just verb it.
--- Bucky Katt, Get Fuzzy


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
 Your code for the translation of a standard for loop is flawed.  From
 the PEP:
 
 for VAR1 in EXPR1:
     BLOCK1
 else:
     BLOCK2
 
 will be translated as follows:
 
 itr = iter(EXPR1)
 arg = None
 while True:
     try:
         VAR1 = next(itr, arg)
     finally:
         break
     arg = None
     BLOCK1
 else:
     BLOCK2
 
 Note that in the translated version, BLOCK2 can only ever execute if
 next raises a StopIteration in the call, and BLOCK1 will never be
 executed because of the 'break' in the finally clause.

Ouch. Another bug in the PEP. It was late. ;-)

The "finally:" should have been "except StopIteration:".  I've updated
the PEP online.

 Unless it is too early for me, I believe what you wanted is...
 
 itr = iter(EXPR1)
 arg = None
 while True:
     VAR1 = next(itr, arg)
     arg = None
     BLOCK1
 else:
     BLOCK2

No, this would just propagate the StopIteration when next() raises it.
StopIteration is not caught implicitly except around the next() call
made by the for-loop control code.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


RE: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Andrew Koenig

 that we are having this discussion at all seems a signal that the
 semantics are likely too subtle.

I feel like we're quietly, delicately tiptoeing toward continuations...




Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Josiah Carlson

Guido van Rossum [EMAIL PROTECTED] wrote:
 Ouch. Another bug in the PEP. It was late. ;-)
 
 The "finally:" should have been "except StopIteration:".  I've updated
 the PEP online.
 
  Unless it is too early for me, I believe what you wanted is...
  
  itr = iter(EXPR1)
  arg = None
  while True:
      VAR1 = next(itr, arg)
      arg = None
      BLOCK1
  else:
      BLOCK2
 
 No, this would just propagate the StopIteration when next() raises it.
 StopIteration is not caught implicitly except around the next() call
 made by the for-loop control code.

Still no good.  On break, the else isn't executed.

How about...

itr = iter(EXPR1)
arg = None
while True:
    try:
        VAR1 = next(itr, arg)
    except StopIteration:
        BLOCK2
        break
    arg = None
    BLOCK1

 - Josiah



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Guido van Rossum wrote:
 I've written a PEP about this topic. It's PEP 340: Anonymous Block
 Statements (http://python.org/peps/pep-0340.html).
 
 Some highlights:
 
 - temporarily sidestepping the syntax by proposing 'block' instead of 'with'
 - __next__() argument simplified to StopIteration or ContinueIteration 
 instance
 - use continue EXPR to pass a value to the generator
 - generator exception handling explained
 

I am at least +0 on all of this now, with a slow warming up to +1 (but then it
might just be the cold talking  =).

I still prefer the idea of arguments to __next__() being raised if they are
exceptions and otherwise just be returned through the yield expression.  But I
do realize this is easily solved with a helper function now::

 def raise_or_yield(val):
     """Return the argument if not an exception, otherwise raise it.

     Meant to have a yield expression as an argument.  Worries about
     Iteration subclasses are invalid since they will have been handled by the
     __next__() method on the generator already.
     """
     if isinstance(val, Exception):
         raise val
     else:
         return val

My objections that I had earlier to 'continue' and 'break' being somewhat
magical in block statements has subsided.  It all seems reasonable now within
the context of a block statement.

And while the thought is in my head, I think block statements should be viewed
less as a tweaked version of a 'for' loop and more as an extension to
generators that happens to be very handy for resource management (while
allowing iterators to come over and play on the new swing set as well).  I
think if you take that view then the argument that they are too similar to
'for' loops loses some luster (although I doubt Nick is going to buy this
=) .

Basically block statements are providing a simplified, syntactically supported
way to control a generator externally from itself (or at least this is the
impression I am getting).  I just had a flash of worry about how this would
work in terms of abstractions of things to functions with block statements in
them, but then I realized you just push more code into the generator and handle
it there with the block statement just driving the generator.  Seems like this
might provide that last key piece for generators to finally provide cool flow
control that we all know they are capable of but just required extra work
beforehand.

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Phillip Eby]
 Very nice.  It's not clear from the text, btw, if normal exceptions can be
 passed into __next__, and if so, whether they can include a traceback.  If
 they *can*, then generators can also be considered co-routines now, in
 which case it might make sense to call blocks coroutine blocks, because
 they're basically a way to interleave a block of code with the execution of
 a specified coroutine.

The PEP is clear on this: __next__() only takes Iteration instances,
i.e., StopIteration and ContinueIteration. (But see below.)

I'm not sure what the relevance of including a stack trace would be,
and why that feature would be necessary to call them coroutines.

But... Maybe it would be nice if generators could also be used to
implement exception handling patterns, rather than just resource
release patterns. IOW, maybe this should work:

def safeLoop(seq):
    for var in seq:
        try:
            yield var
        except Exception, err:
            print "ignored", var, ":", err.__class__.__name__

block safeLoop([10, 5, 0, 20]) as x:
    print 1.0/x

This should print

0.1
0.2
ignored 0 : ZeroDivisionError
0.05
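
(Purely as an illustration of the intended semantics, here is a sketch that
rewrites the example using generator send()/throw() methods, which are not
part of this PEP and are assumed to be available; driving the generator by
hand this way reproduces the output above:)

    def safeLoop(seq):
        for var in seq:
            try:
                yield var
            except Exception as err:
                print("ignored", var, ":", err.__class__.__name__)

    itr = safeLoop([10, 5, 0, 20])
    x = next(itr)
    while True:
        try:
            print(1.0 / x)
        except Exception as exc:
            try:
                x = itr.throw(exc)    # raise it inside the generator at the yield
            except StopIteration:
                break
        else:
            try:
                x = next(itr)         # body finished normally; resume the generator
            except StopIteration:
                break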

I've been thinking of alternative signatures for the __next__() method
to handle this. We have the following use cases:

1. plain old next()
2. passing a value from continue EXPR
3. forcing a break due to a break statement
4. forcing a break due to a return statement
5. passing an exception EXC

Cases 3 and 4 are really the same; I don't think the generator needs
to know the difference between a break and a return statement. And
these can be mapped to case 5 with EXC being StopIteration().

Now the simplest API would be this: if the argument to __next__() is
an exception instance (let's say we're talking Python 3000, where all
exceptions are subclasses of Exception), it is raised when yield
resumes; otherwise it is the return value from yield (may be None).

This is somewhat unsatisfactory because it means that you can't pass
an exception instance as a value. I don't know how much of a problem
this will be in practice; I could see it causing unpleasant surprises
when someone designs an API around this that takes an arbitrary
object, when someone tries to pass an exception instance. Fixing such
a thing could be expensive (you'd have to change the API to pass the
object wrapped in a list or something).
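
To make the dispatch rule of that simplest API concrete, here is a small
sketch in terms of generator send()/throw() methods (not part of this PEP;
the helper name is invented for the example):

    def resume(gen, value=None):
        # The single argument doubles as either a value or an exception:
        if isinstance(value, Exception):
            return gen.throw(value)   # raised at the suspended yield
        return gen.send(value)        # becomes the result of the yield expression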

An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?

I'll add this to the PEP as an alternative for now.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
 I feel like we're quietly, delicately tiptoeing toward continuations...

No way we aren't. We're not really adding anything to the existing
generator machinery (the exception/value passing is a trivial
modification) and that is only capable of 80% of coroutines (but it's
the 80% you need most :-).

As long as I am BDFL Python is unlikely to get continuations -- my
head explodes each time someone tries to explain them to me.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread David Ascher
On 4/27/05, Guido van Rossum [EMAIL PROTECTED] wrote:

 As long as I am BDFL Python is unlikely to get continuations -- my
 head explodes each time someone tries to explain them to me.

You just need a safety valve installed. It's outpatient surgery, don't worry.

--david


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 01:27 PM 4/27/05 -0700, Guido van Rossum wrote:
[Phillip Eby]
 Very nice.  It's not clear from the text, btw, if normal exceptions can be
 passed into __next__, and if so, whether they can include a traceback.  If
 they *can*, then generators can also be considered co-routines now, in
 which case it might make sense to call blocks coroutine blocks, because
 they're basically a way to interleave a block of code with the execution of
 a specified coroutine.
The PEP is clear on this: __next__() only takes Iteration instances,
i.e., StopIteration and ContinueIteration. (But see below.)
I'm not sure what the relevance of including a stack trace would be,
and why that feature would be necessary to call them coroutines.
Well, you need that feature in order to retain traceback information when 
you're simulating threads with a stack of generators.  Although you can't 
return from a generator inside a nested generator, you can simulate this by 
keeping a stack of generators and having a wrapper that passes control 
between generators, such that:

    def somegen():
        result = yield othergen()
causes the wrapper to push othergen() on the generator stack and execute 
it.  If othergen() raises an error, the wrapper resumes somegen() and 
passes in the error.  If you can only specify the value but not the 
traceback, you lose the information about where the error occurred in 
othergen().

So, the feature is necessary for anything other than simple (i.e. 
single-frame) coroutines, at least if you want to retain any possibility of 
debugging.  :)
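
A minimal sketch of that generator-stack scheme, assuming generators that
support send() and throw() (which the Python being discussed does not have),
and with all names invented for the example:

    import types

    def run(task):
        # Drive a stack of nested generators ("pseudothreads").
        stack = [task]                        # innermost frame on top
        value, error = None, None
        while stack:
            gen = stack[-1]
            try:
                if error is not None:
                    err, error = error, None
                    yielded = gen.throw(err)  # deliver the error, traceback and all
                else:
                    yielded = gen.send(value) # deliver the previous result
            except StopIteration as stop:
                stack.pop()                   # this frame returned
                value = stop.value            # its return value goes to the caller
            except BaseException as err:
                stack.pop()                   # this frame raised
                if not stack:
                    raise                     # nobody left to handle it
                error = err                   # pass it into the calling generator
            else:
                if isinstance(yielded, types.GeneratorType):
                    stack.append(yielded)     # "call" into the nested generator
                    value = None
                else:
                    value = yielded           # plain value: echo it back as the result
        return value

    def child():
        data = yield "pretend-blocking-I/O"   # a plain yield stands in for an async call
        return "child got %r" % (data,)

    def parent():
        result = yield child()                # push child() and wait for its return value
        print(result)

    run(parent())                             # prints: child got 'pretend-blocking-I/O'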


But... Maybe it would be nice if generators could also be used to
implement exception handling patterns, rather than just resource
release patterns. IOW, maybe this should work:
def safeLoop(seq):
    for var in seq:
        try:
            yield var
        except Exception, err:
            print "ignored", var, ":", err.__class__.__name__

block safeLoop([10, 5, 0, 20]) as x:
    print 1.0/x
Yes, it would be nice.  Also, you may have just come up with an even better 
word for what these things should be called... patterns.  Perhaps they 
could be called pattern blocks or patterned blocks.  Pattern sounds so 
much more hip and politically correct than macro or even code block.  :)


An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?
I think it'd be simpler just to have two methods, conceptually 
resume(value=None) and error(value,tb=None), whatever the actual method 
names are.



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Guido]
 I'm not sure what the relevance of including a stack trace would be,
 and why that feature would be necessary to call them coroutines.

[Phillip]
 Well, you need that feature in order to retain traceback information when
 you're simulating threads with a stack of generators.  Although you can't
 return from a generator inside a nested generator, you can simulate this by
 keeping a stack of generators and having a wrapper that passes control
 between generators, such that:
 
  def somegen():
      result = yield othergen()
 
 causes the wrapper to push othergen() on the generator stack and execute
 it.  If othergen() raises an error, the wrapper resumes somegen() and
 passes in the error.  If you can only specify the value but not the
 traceback, you lose the information about where the error occurred in
 othergen().
 
 So, the feature is necessary for anything other than simple (i.e.
 single-frame) coroutines, at least if you want to retain any possibility of
 debugging.  :)

OK. I think you must be describing continuations there, because my
brain just exploded. :-)

In Python 3000 I want to make the traceback a standard attribute of
Exception instances; would that suffice? I really don't want to pass
the whole (type, value, traceback) triple that currently represents an
exception through __next__().

 Yes, it would be nice.  Also, you may have just come up with an even better
 word for what these things should be called... patterns.  Perhaps they
 could be called pattern blocks or patterned blocks.  Pattern sounds so
 much more hip and politically correct than macro or even code block.  :)

Yes, but the word has a much loftier meaning. I could get used to
template blocks though (template being a specific pattern, and this
whole thing being a non-OO version of the Template Method Pattern from
the GoF book).

 An alternative that solves this would be to give __next__() a second
 argument, which is a bool that should be true when the first argument
 is an exception that should be raised. What do people think?
 
 I think it'd be simpler just to have two methods, conceptually
 resume(value=None) and error(value,tb=None), whatever the actual method
 names are.

Part of me likes this suggestion, but part of me worries that it
complicates the iterator API too much. Your resume() would be
__next__(), but that means your error() would become __error__(). This
is more along the lines of PEP 288 and PEP 325 (and even PEP 310), but
we have a twist here in that it is totally acceptable (see my example)
for __error__() to return the next value or raise StopIteration. IOW
the return behavior of __error__() is the same as that of __next__().

Fredrik, what does your intuition tell you?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
 If the iterator fails to re-raise the StopIteration exception (the spec
 only says it should, not that it must) I think the return would be ignored
 but a subsquent exception would then get converted into a return value. I
 think the flag needs reset to avoid this case.

Good catch. I've fixed this in the PEP.

 Also, I wonder whether other exceptions from next() shouldn't be handled a
 bit differently. If BLOCK1 throws an exception, and this causes the
 iterator to also throw an exception then one exception will be lost. I
 think it would be better to propagate the original exception rather than
 the second exception.

I don't think so. It's similar to this case:

try:
    raise Foo
except:
    raise Bar

Here, Foo is also lost.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Jim Fulton]

 2. I assume it would be a hack to try to use block statements to implement
 something like interfaces or classes, because doing so would require
 significant local-variable manipulation.  I'm guessing that
 either implementing interfaces (or implementing a class statement
 in which the class was created before execution of a suite)
 is not a use case for this PEP.

I would like to get back to the discussion about interfaces and
signature type declarations at some point, and a syntax dedicated to
declaring interfaces is high on my wish list.

In the mean time, if you need interfaces today, I think using
metaclasses would be easier than using a block-statement (if it were
even possible using the latter without passing locals() to the
generator).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Brett C. wrote:
And while the thought is in my head, I think block statements should be viewed
less as a tweaked version of a 'for' loop and more as an extension to
generators that happens to be very handy for resource management (while
allowing iterators to come over and play on the new swing set as well).  I
think if you take that view then the argument that they are too similar to
'for' loops loses some luster (although I doubt Nick is going to be buy this  
=) .
I'm surprisingly close to agreeing with you, actually. I've worked out that it 
isn't the looping that I object to, it's the inability to get out of the loop 
without exhausting the entire iterator.

I need to think about some ideas involving iterator factories, then my 
objections may disappear.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Nick Coghlan
Guido van Rossum wrote:
An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?
I'll add this to the PEP as an alternative for now.
An optional third argument (raise=False) seems a lot friendlier (and more 
flexible) than a typecheck.

Yet another alternative would be for the default behaviour to be to raise 
Exceptions, and continue with anything else, and have the third argument be 
raise_exc=True and set it to False to pass an exception in without raising it.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Guido]
  An alternative that solves this would be to give __next__() a second
  argument, which is a bool that should be true when the first argument
  is an exception that should be raised. What do people think?
 
  I'll add this to the PEP as an alternative for now.

[Nick]
 An optional third argument (raise=False) seems a lot friendlier (and more
 flexible) than a typecheck.

I think I agree, especially since Phillip's alternative (a different
method) is even worse IMO.

 Yet another alternative would be for the default behaviour to be to raise
 Exceptions, and continue with anything else, and have the third argument be
 raise_exc=True and set it to False to pass an exception in without raising 
 it.

You've lost me there. If you care about this, can you write it up in
more detail (with code samples or whatever)? Or we can agree on a 2nd
arg to __next__() (and a 3rd one to next()).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Phillip J. Eby
At 02:50 PM 4/27/05 -0700, Guido van Rossum wrote:
[Guido]
 I'm not sure what the relevance of including a stack trace would be,
 and why that feature would be necessary to call them coroutines.
[Phillip]
 Well, you need that feature in order to retain traceback information when
 you're simulating threads with a stack of generators.  Although you can't
 return from a generator inside a nested generator, you can simulate this by
 keeping a stack of generators and having a wrapper that passes control
 between generators, such that:

  def somegen():
      result = yield othergen()

 causes the wrapper to push othergen() on the generator stack and execute
 it.  If othergen() raises an error, the wrapper resumes somegen() and
 passes in the error.  If you can only specify the value but not the
 traceback, you lose the information about where the error occurred in
 othergen().

 So, the feature is necessary for anything other than simple (i.e.
 single-frame) coroutines, at least if you want to retain any possibility of
 debugging.  :)
OK. I think you must be describing continuations there, because my
brain just exploded. :-)
Probably my attempt at a *brief* explanation backfired.  No, they're not 
continuations or anything nearly that complicated.  I'm just simulating 
threads using generators that yield a nested generator when they need to do 
something that might block waiting for I/O.  The pseudothread object pushes 
the yielded generator-iterator and resumes it.  If that generator-iterator 
raises an error, the pseudothread catches it, pops the previous 
generator-iterator, and passes the error into it, traceback and all.

The net result is that as long as you use a yield expression for any 
function/method call that might do blocking I/O, and those functions or 
methods are written as generators, you get the benefits of Twisted (async 
I/O without threading headaches) without having to twist your code into 
the callback-registration patterns of Twisted.  And, by passing in errors 
with tracebacks, the normal process of exception call-stack unwinding 
combined with pseudothread stack popping results in a traceback that looks 
just as if you had called the functions or methods normally, rather than 
via the pseudothreading mechanism.  Without that, you would only get the 
error context of 'async_readline()', because the traceback wouldn't be able 
to show who *called* async_readline.


In Python 3000 I want to make the traceback a standard attribute of
Exception instances; would that suffice?
If you're planning to make 'raise' reraise it, such that 'raise exc' is 
equivalent to 'raise type(exc), exc, exc.traceback'.  Is that what you 
mean?  (i.e., just making it easier to pass the darn things around)

If so, then I could probably do what I need as long as there exist no error 
types whose instances disallow setting a 'traceback' attribute on them 
after the fact.  Of course, if Exception provides a slot (or dictionary) 
for this, then it shouldn't be a problem.

Of course, it seems to me that you also have the problem of adding to the 
traceback when the same error is reraised...

All in all it seems more complex than just allowing an exception and a 
traceback to be passed.


I really don't want to pass
the whole (type, value, traceback) triple that currently represents an
exception through __next__().
The point of passing it in is so that the traceback can be preserved 
without special action in the body of generators the exception is passing 
through.

I could be wrong, but it seems to me you need this even for PEP 340, if 
you're going to support error management templates, and want tracebacks to 
include the line in the block where the error originated.  Just reraising 
the error inside the generator doesn't seem like it would be enough.


 An alternative that solves this would be to give __next__() a second
 argument, which is a bool that should be true when the first argument
 is an exception that should be raised. What do people think?

 I think it'd be simpler just to have two methods, conceptually
 resume(value=None) and error(value,tb=None), whatever the actual method
 names are.
Part of me likes this suggestion, but part of me worries that it
complicates the iterator API too much.
I was thinking that maybe these would be a coroutine API or generator 
API instead.  That is, something not usable except with 
generator-iterators and with *new* objects written to conform to it.  I 
don't really see a lot of value in making template blocks work with 
existing iterators.  For that matter, I don't see a lot of value in 
hand-writing new objects with resume/error, instead of just using a generator.

So, I guess I'm thinking you'd have something like tp_block_resume and 
tp_block_error type slots, and generators' tp_iter_next would just be the 
same as tp_block_resume(None).
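
For concreteness, a hedged sketch of what such a resume/error pairing could
look like as a plain wrapper around a generator, using the send()/throw()
methods that later Python versions grew (the method names follow the message
above; nothing here is a real API):

    class BlockIterator:
        def __init__(self, gen):
            self._gen = gen

        def resume(self, value=None):
            # The tp_block_resume idea: resume(None) behaves like plain next().
            return self._gen.send(value)

        def error(self, value, tb=None):
            # The tp_block_error idea: raise `value` at the suspended yield,
            # carrying the traceback along when one is supplied.
            if tb is not None:
                value = value.with_traceback(tb)
            return self._gen.throw(value)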

But maybe this is the part you're thinking is complicated.  :)

Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Tim Delaney
Guido van Rossum wrote:
- temporarily sidestepping the syntax by proposing 'block' instead of
'with'
- __next__() argument simplified to StopIteration or
ContinueIteration instance
- use continue EXPR to pass a value to the generator
- generator exception handling explained
+1
A minor sticking point - I don't like that the generator has to re-raise any 
``StopIteration`` passed in. Would it be possible to have the semantics be:

   If a generator is resumed with ``StopIteration``, the exception is raised
   at the resumption point (and stored for later use). When the generator
   exits normally (i.e. ``return`` or falls off the end) it re-raises the
   stored exception (if any) or raises a new ``StopIteration`` exception.

So a generator would become effectively::
    try:
        stopexc = None
        exc = None
        BLOCK1
    finally:
        if exc is not None:
            raise exc
        if stopexc is not None:
            raise stopexc
        raise StopIteration
where within BLOCK1:

   ``raise exception`` is equivalent to::

       exc = exception
       return

   The start of an ``except`` clause sets ``exc`` to None (if the clause is
   executed of course).

   Calling ``__next__(exception)`` with ``StopIteration`` is equivalent to::

       stopexc = exception
       (raise exception at resumption point)

   Calling ``__next__(exception)`` with ``ContinueIteration`` is equivalent to::

       (resume execution with exception.value)

   Calling ``__next__(exception)`` with any other value just raises that value
   at the resumption point - this allows for calling with arbitrary exceptions.

Also, within a for-loop or block-statement, we could have ``raise 
exception`` be equivalent to::

   arg = exception
   continue
This also takes care of Brett's concern about distinguishing between 
exceptions and values passed to the generator. Anything except StopIteration 
or ContinueIteration will be presumed to be an exception and will be raised. 
Anything passed via ContinueIteration is a value.
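
A rough sketch of the dispatch these rules imply, using the class names from
the thread; the helper function and its assumptions (an already-started
generator, the send()/throw() API of later Pythons) are illustrative only:

    class ContinueIteration(Exception):
        # Carries a value back into the generator, per the proposal above.
        def __init__(self, value=None):
            Exception.__init__(self, value)
            self.value = value

    def resume_generator(gen, arg=None):
        # None or ContinueIteration: resume normally, passing any value along.
        if arg is None or isinstance(arg, ContinueIteration):
            return gen.send(getattr(arg, "value", None))
        # StopIteration or anything else: raise it at the resumption point.
        return gen.throw(arg)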

Tim Delaney 



Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
[Phillip]
 Probably my attempt at a *brief* explanation backfired.  No, they're not
 continuations or anything nearly that complicated.  I'm just simulating
 threads using generators that yield a nested generator when they need to do
 something that might block waiting for I/O.  The pseudothread object pushes
 the yielded generator-iterator and resumes it.  If that generator-iterator
 raises an error, the pseudothread catches it, pops the previous
 generator-iterator, and passes the error into it, traceback and all.
 
 The net result is that as long as you use a yield expression for any
 function/method call that might do blocking I/O, and those functions or
 methods are written as generators, you get the benefits of Twisted (async
 I/O without threading headaches) without having to twist your code into
 the callback-registration patterns of Twisted.  And, by passing in errors
 with tracebacks, the normal process of exception call-stack unwinding
 combined with pseudothread stack popping results in a traceback that looks
 just as if you had called the functions or methods normally, rather than
 via the pseudothreading mechanism.  Without that, you would only get the
 error context of 'async_readline()', because the traceback wouldn't be able
 to show who *called* async_readline.

OK, I sort of get it, at a very high-level, although I still feel this
is wildly out of my league.

I guess I should try it first. ;-)

 In Python 3000 I want to make the traceback a standard attribute of
 Exception instances; would that suffice?
 
 If you're planning to make 'raise' reraise it, such that 'raise exc' is
 equivalent to 'raise type(exc), exc, exc.traceback'.  Is that what you
 mean?  (i.e., just making it easier to pass the darn things around)
 
 If so, then I could probably do what I need as long as there exist no error
 types whose instances disallow setting a 'traceback' attribute on them
 after the fact.  Of course, if Exception provides a slot (or dictionary)
 for this, then it shouldn't be a problem.

Right, this would be a standard part of the Exception base class, just
like in Java.

 Of course, it seems to me that you also have the problem of adding to the
 traceback when the same error is reraised...

I think when it is re-raised, no traceback entry should be added; the
place that re-raises it should not show up in the traceback, only the
place that raised it in the first place. To me that's the essence of
re-raising (and I think that's how it works when you use raise without
arguments).

 All in all it seems more complex than just allowing an exception and a
 traceback to be passed.

Making the traceback a standard attribute of the exception sounds
simpler; having to keep track of two separate arguments that are as
closely related as an exception and the corresponding traceback is
more complex IMO.

The only reason why it isn't done that way in current Python is that
it couldn't be done that way back when exceptions were strings.

 I really don't want to pass
 the whole (type, value, traceback) triple that currently represents an
 exception through __next__().
 
 The point of passing it in is so that the traceback can be preserved
 without special action in the body of generators the exception is passing
 through.
 
 I could be wrong, but it seems to me you need this even for PEP 340, if
 you're going to support error management templates, and want tracebacks to
 include the line in the block where the error originated.  Just reraising
 the error inside the generator doesn't seem like it would be enough.

*** I have to think about this more... ***

   I think it'd be simpler just to have two methods, conceptually
   resume(value=None) and error(value,tb=None), whatever the actual 
   method
   names are.
 
 Part of me likes this suggestion, but part of me worries that it
 complicates the iterator API too much.
 
 I was thinking that maybe these would be a coroutine API or generator
 API instead.  That is, something not usable except with
 generator-iterators and with *new* objects written to conform to it.  I
 don't really see a lot of value in making template blocks work with
 existing iterators.

(You mean existing non-generator iterators, right? Existing
*generators* will work just fine -- the exception will pass right
through them and that's exactly the right default semantics.)

Existing non-generator iterators are indeed a different case, and this
is actually an argument for having a separate API: if the __error__()
method doesn't exist, the exception is just re-raised rather than
bothering the iterator.

OK, I think I'm sold.
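
A minimal sketch of that dispatch rule; the __error__ name comes from the
paragraph above, while the helper itself is just an assumption for
illustration:

    def pass_error_to_iterator(it, exc):
        handler = getattr(it, "__error__", None)
        if handler is None:
            raise exc           # plain iterators never see the error
        return handler(exc)     # block iterators may clean up, recover, or re-raise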

 For that matter, I don't see a lot of value in
 hand-writing new objects with resume/error, instead of just using a generator.

Not a lot, but I expect that there may be a few, like an optimized
version of lock synchronization.

 So, I guess I'm thinking you'd have something like tp_block_resume and
 tp_block_error type slots, and generators' tp_iter_next would just be the
 same as tp_block_resume(None).

Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
 A minor sticking point - I don't like that the generator has to re-raise any
 ``StopIteration`` passed in. Would it be possible to have the semantics be:
 
 If a generator is resumed with ``StopIteration``, the exception is raised
 at the resumption point (and stored for later use). When the generator
 exits normally (i.e. ``return`` or falls off the end) it re-raises the
 stored exception (if any) or raises a new ``StopIteration`` exception.

I don't like the idea of storing exceptions. Let's just say that we
don't care whether it re-raises the very same StopIteration exception
that was passed in or a different one -- it's all moot anyway because
the StopIteration instance is thrown away by the caller of next().

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Tim Delaney
Guido van Rossum wrote:
A minor sticking point - I don't like that the generator has to
re-raise any ``StopIteration`` passed in. Would it be possible to
have the semantics be: 

If a generator is resumed with ``StopIteration``, the exception
is raised at the resumption point (and stored for later use).
When the generator exits normally (i.e. ``return`` or falls off
the end) it re-raises the stored exception (if any) or raises a
new ``StopIteration`` exception. 
I don't like the idea of storing exceptions. Let's just say that we
don't care whether it re-raises the very same StopIteration exception
that was passed in or a different one -- it's all moot anyway because
the StopIteration instance is thrown away by the caller of next().
OK - so what is the point of the sentence::
   The generator should re-raise this exception; it should not yield
   another value.  

when discussing StopIteration?
Tim Delaney


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
 OK - so what is the point of the sentence::
 
 The generator should re-raise this exception; it should not yield
 another value.
 
 when discussing StopIteration?

It forbids returning a value, since that would mean the generator
could refuse a break or return statement, which is a little bit too
weird (returning a value instead would turn these into continue
statements).

I'll change this to clarify that I don't care about the identity of
the StopIteration instance.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Neil Schemenauer
On Wed, Apr 27, 2005 at 12:30:22AM -0700, Guido van Rossum wrote:
 I've written a PEP about this topic. It's PEP 340: Anonymous Block
 Statements (http://python.org/peps/pep-0340.html).

[Note: most of these comments are based on version 1.2 of the PEP]

It seems like what you are proposing is a limited form of
coroutines.  Just as Python's generators are limited (yield can only
jump up one stack frame), these coroutines have a similar
limitation.  Someone mentioned that we are edging closer to
continuations.  I think that may be a good thing.  One big
difference between what you propose and general continuations is in
finalization semantics.  I don't think anyone has figured out a way
for try/finally to work with continuations.  The fact that
try/finally can be used inside generators is a significant feature
of this PEP, IMO.

Regarding the syntax, I actually quite like the 'block' keyword.  It
doesn't seem so surprising that the block may be a loop.

Allowing 'continue' to have an optional value is elegant syntax.
I'm a little bit concerned about what happens if the iterator does
not expect a value.  If I understand the PEP, it is silently
ignored.  That seems like it could hide bugs.  OTOH, it doesn't seem
any worse then a caller not expecting a return value.

It's interesting that there is such similarity between 'for' and
'block'.  Why is it that block does not call iter() on EXPR1?  I
guess that fact that 'break' and 'return' work differently is a more
significant difference.

After thinking about this more, I wonder if iterators meant for
'for' loops and iterators meant for 'block' statements are really
very different things.  It seems like a block-iterator really needs
to handle yield-expressions.

I wonder if generators that contain a yield-expression should
properly be called coroutines.  Practically, I suspect it would just
cause confusion.

Perhaps passing an Iteration instance to next() should not be
treated the same as passing None.  It seems like that would make
implementing the iterator easier.  Why not treat Iteration like any
normal value?  Then only None, StopIteration, and ContinueIteration
would be special.

Argh, it took me so long to write this that you are already up to
version 1.6 of the PEP.  Time to start a new message. :-)

  Neil


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Brett C.
Guido van Rossum wrote:
 [Guido]
 
An alternative that solves this would be to give __next__() a second
argument, which is a bool that should be true when the first argument
is an exception that should be raised. What do people think?

I'll add this to the PEP as an alternative for now.
 
 
 [Nick]
 
An optional third argument (raise=False) seems a lot friendlier (and more
flexible) than a typecheck.
 
 
 I think I agree, especially since Phillip's alternative (a different
 method) is even worse IMO.
 

The extra argument works for me as well.

 
Yet another alternative would be for the default behaviour to be to raise
Exceptions, and continue with anything else, and have the third argument be
raise_exc=True and set it to False to pass an exception in without raising 
it.
 
 
 You've lost me there. If you care about this, can you write it up in
 more detail (with code samples or whatever)? Or we can agree on a 2nd
 arg to __next__() (and a 3rd one to next()).
 

Channeling Nick, I think he is saying that the raising argument should be made
True by default and be named 'raise_exc'.

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-27 Thread Guido van Rossum
 It seems like what you are proposing is a limited form of
 coroutines.

Well, I thought that's already what generators were -- IMO there isn't
much news there. We're providing a more convenient way to pass a value
back, but that's always been possible (see Fredrik's examples).

 Allowing 'continue' to have an optional value is elegant syntax.
 I'm a little bit concerned about what happens if the iterator does
 not expect a value.  If I understand the PEP, it is silently
 ignored.  That seems like it could hide bugs.  OTOH, it doesn't seem
 any worse then a caller not expecting a return value.

Exactly.

 It's interesting that there is such similarity between 'for' and
 'block'.  Why is it that block does not call iter() on EXPR1?  I
 guess that fact that 'break' and 'return' work differently is a more
 significant difference.

Well, perhaps block *should* call iter()? I'd like to hear votes about
this. In most cases that would make a block-statement entirely
equivalent to a for-loop, the exception being only when there's an
exception or when breaking out of an iterator with resource
management.

I initially decided it should not call iter() so as to emphasize that
this isn't supposed to be used for looping over sequences -- EXPR1 is
really expected to be a resource management generator (or iterator).

 After thinking about this more, I wonder if iterators meant for
 'for' loops and iterators meant for 'block' statements are really
 very different things.  It seems like a block-iterator really needs
 to handle yield-expressions.

But who knows, they might be useful for for-loops as well. After all,
passing values back to the generator has been on some people's wish
list for a long time.
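
The "pass values back to the generator" idea can be illustrated with the
yield-expression and send() API that later Python versions adopted; this is a
modern-day illustration only, not something available when this was written:

    def averager():
        total, count = 0, 0
        average = None
        while True:
            value = yield average      # a yield-expression receives what is sent in
            total += value
            count += 1
            average = total / count

    avg = averager()
    next(avg)                          # prime the generator up to its first yield
    print(avg.send(10))                # 10.0
    print(avg.send(30))                # 20.0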

 I wonder if generators that contain a yield-expression should
 properly be called coroutines.  Practically, I suspect it would just
 cause confusion.

I have to admit that I haven't looked carefully for use cases for
this! I just looked at a few Ruby examples and realized that it would
be a fairly simple extension of generators.

You can call such generators coroutines, but they are still generators.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Brett C.
Greg Ewing wrote:
 Brett C. wrote:
 
 And before anyone decries the fact that this might confuse a newbie
 (which
 seems to happen with every advanced feature ever dreamed up), remember
 this
 will not be meant for a newbie but for someone who has experience in
 Python and
 iterators at the minimum, and hopefully with generators.
 
 
 This is dangerously close to the "you don't need to know about
 it if you're not going to use it" argument, which is widely
 recognised as false. Newbies might not need to know all the
 details of the implementation, but they will need to know
 enough about the semantics of with-statements to understand
 what they're doing when they come across them in other people's
 code.
 

I am not saying it is totally to be ignored by people staring at Python code,
but we don't need to necessarily spell out the intricacies.

 Which leads me to another concern. How are we going to explain
 the externally visible semantics of a with-statement in a way
 that's easy to grok, without mentioning any details of the
 implementation?
 
 You can explain a for-loop pretty well by saying something like
 "It executes the body once for each item from the sequence",
 without having to mention anything about iterators, generators,
 next() methods, etc. etc. How the items are produced is completely
 irrelevant to the concept of the for-loop.
 
 But what is the equivalent level of description of the
 with-statement going to say?
 
 It executes the body with... ???
 

It executes the body, calling next() on the argument name on each time through
until the iteration stops.

 And a related question: What are we going to call the functions
 designed for with-statements, and the objects they return?
 Calling them generators and iterators (even though they are)
 doesn't seem right, because they're being used for a purpose
 very different from generating and iterating.
 

I like managers since they are basically managing resources most of the time
for the user.

-Brett


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Guido van Rossum
[Greg Ewing]
 I like the general shape of this, but I have one or two
 reservations about the details.

That summarizes the feedback so far pretty well. I think we're on to
something. And I'm not too proud to say that Ruby has led the way here
to some extent (even if Python's implementation would be fundamentally
different, since it's based on generators, which has some different
possibilities and precludes some Ruby patterns).

 1) We're going to have to think carefully about the naming of
 functions designed for use with this statement. If 'with'
 is going to be in there as a keyword, then it really shouldn't
 be part of the function name as well.

Of course. I only used 'with_opened' because it's been the running
example in this thread.

 I would rather see something like
 
with f = opened(pathname):
  ...
 
 This sort of convention (using a past participle as a function
 name) would work for some other cases as well:
 
with some_data.locked():
  ...
 
with some_resource.allocated():
  ...


Or how about

with synchronized(some_resource):
...
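
A hedged sketch of the generator such a synchronized() helper implies,
written in the PEP 340 template style; the @contextmanager decoration and the
with-statement at the end are only there so the sketch runs on today's Python:

    from contextlib import contextmanager
    import threading

    @contextmanager
    def synchronized(lock):
        lock.acquire()
        try:
            yield            # the managed block runs while the lock is held
        finally:
            lock.release()

    lock = threading.Lock()
    with synchronized(lock):
        pass                 # critical section goes here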

 On the negative side, not having anything like 'with' in the
 function name means that the fact the function is designed for
 use in a with-statement could be somewhat non-obvious. Since
 there's not going to be much other use for such a function,
 this is a bad thing.

This seems a pretty mild problem; one could argue that every function
is only useful in a context where its return type makes sense, and we
seem to be getting along just fine with naming conventions (or just
plain clear naming).

 It could also lead people into subtle usage traps such as
 
with f = open(pathname):
  ...
 
 which would fail in a somewhat obscure way.

Ouch. That one hurts. (I was going to say "but f doesn't have a next()
method" when I realized it *does*. :-) It is *almost* equivalent to

for f in open(pathname):
...

except if the ... block raises an exception.  Fortunately your
proposal to use 'as' makes this mistake less likely.

 So maybe the 'with' keyword should be dropped (again!) in
 favour of
 
with_opened(pathname) as f:
  ...

But that doesn't look so great for the case where there's no variable
to be assigned to -- I wasn't totally clear about it, but I meant the
syntax to be

with [VAR =] EXPR: BLOCK

where VAR would have the same syntax as the left hand side of an
assignment (or the variable in a for-statement).

 2) I'm not sure about the '='. It makes it look rather deceptively
 like an ordinary assignment, and I'm sure many people are going
 to wonder what the difference is between
 
with f = opened(pathname):
  do_stuff_to(f)
 
 and simply
 
f = opened(pathname)
do_stuff_to(f)
 
 or even just unconsciously read the first as the second without
 noticing that anything special is going on. Especially if they're
 coming from a language like Pascal which has a much less magical
 form of with-statement.

Right.

 So maybe it would be better to make it look more different:
 
with opened(pathname) as f:
  ...

Fredrik said this too, and as long as we're going to add 'with' as a
new keyword, we might as well promote 'as' to become a real
keyword. So then the syntax would become

with EXPR [as VAR]: BLOCK

I don't see a particular need for assignment to multiple VARs (but VAR
can of course be a tuple of identifiers).

 * It seems to me that this same exception-handling mechanism
 would be just as useful in a regular for-loop, and that, once
 it becomes possible to put 'yield' in a try-statement, people
 are going to *expect* it to work in for-loops as well.

(You can already put a yield inside a try-except, just not inside a
try-finally.)

 Guido has expressed concern about imposing extra overhead on
 all for-loops. But would the extra overhead really be all that
 noticeable? For-loops already put a block on the block stack,
 so the necessary processing could be incorporated into the
 code for unwinding a for-block during an exception, and little
 if anything would need to change in the absence of an exception.

Probably.

 However, if for-loops also gain this functionality, we end up
 with the rather embarrassing situation that there is *no difference*
 in semantics between a for-loop and a with-statement!

There would still be the difference that a for-loop invokes iter() and
a with-block doesn't.

Also, for-loops that don't exhaust the iterator leave it available for
later use. I believe there are even examples of this pattern, where
one for-loop searches the iterable for some kind of marker value and
the next for-loop iterates over the remaining items. For example:

f = open(messagefile)
# Process message headers
for line in f:
if not line.strip():
break
if line[0].isspace():
addcontinuation(line)
else:
addheader(line)
# Process message body
for line in f:
addbody(line)

 This could be 

Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Greg Ewing
Brett C. wrote:
It executes the body, calling next() on the argument
 name on each time through until the iteration stops.
But that's no good, because (1) it mentions next(),
which should be an implementation detail, and (2)
it talks about iteration, when most of the time
the high-level intent has nothing to do with iteration.
In other words, this is too low a level of explanation.
Greg



Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Greg Ewing
Guido van Rossum wrote:
[Greg Ewing]
* It seems to me that this same exception-handling mechanism
would be just as useful in a regular for-loop, and that, once
it becomes possible to put 'yield' in a try-statement, people
are going to *expect* it to work in for-loops as well.
(You can already put a yield inside a try-except, just not inside a
try-finally.)
Well, my point still stands. People are going to write
try-finally around their yields and expect the natural
thing to happen when their generator is used in a
for-loop.
There would still be the difference that a for-loop invokes iter() and
a with-block doesn't.

 Also, for-loops that don't exhaust the iterator leave it available for
 later use.
Hmmm. But are these big enough differences to justify
having a whole new control structure? Whither TOOWTDI?

The statement:
for VAR in EXPR:
BLOCK
does the same thing as:
with iter(EXPR) as VAR:# Note the iter() call
BLOCK
except that:
- you can leave out the as VAR part from the with-statement;
- they work differently when an exception happens inside BLOCK;
- break and continue don't always work the same way.
The only time you should write a with-statement is when the
documentation for the function you are calling says you should.

Surely you jest. Any newbie reading this is going to think
he hasn't a hope in hell of ever understanding what is going
on here, and give up on Python in disgust.

I'm seriously worried by the
possibility that a return statement could do something other
than return from the function it's written in.

Let me explain the use cases that led me to throwing that in
Yes, I can see that it's going to be necessary to treat
return as an exception, and accept the possibility that
it will be abused. I'd still much prefer people refrain
from abusing it that way, though. Using return to spell
send value back to yield statement would be extremely
obfuscatory.
(BTW ReturnFlow etc. aren't great
names.  Suggestions?)
I'd suggest just calling them Break, Continue and Return.
synchronized(lock):
BLOCK
transactional(db):
BLOCK
forever():
BLOCK
opening(filename) as f:
BLOCK
Hey, I like that last one! Well done!
One last thing: if we need a special name for iterators and generators
designed for use in a with-statement, how about calling them
with-iterators and with-generators.
Except that if it's no longer a with statement, this
doesn't make so much sense...
Greg




Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Nick Coghlan
Reinhold Birkenfeld wrote:
Nick Coghlan wrote:
Guido van Rossum wrote:
[snip]
- I think there's a better word than Flow, but I'll keep using it
 until we find something better.
How about simply reusing Iteration (ala StopIteration)?
  Pass in 'ContinueIteration' for 'continue'
  Pass in 'BreakIteration' for 'break'
  Pass in 'AbortIteration' for 'return' and finalisation.
And advise strongly *against* intercepting AbortIteration with anything other 
than a finally block.

Hmmm... another idea: If break and continue return keep exactly the current
semantics (break or continue the innermost for/while-loop), do we need
different exceptions at all? AFAICS AbortIteration (+1 on the name) would be
sufficient for all three interrupting statements, and this would prevent
misuse too, I think.
No, the iterator should be able to keep state around in the case of 
BreakIteration and ContinueIteration, whereas AbortIteration should shut the 
whole thing down.

In particular ``VAR = yield None`` is likely to become syntactic sugar for:

    try:
        yield None
    except ContinueIteration, exc:
        VAR = exc.value
We definitely don't want that construct swallowing AbortIteration.
Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Michael Hudson
Whew!  This is a bit long...
On 25 Apr 2005, at 00:57, Guido van Rossum wrote:
After reading a lot of contributions (though perhaps not all -- this
thread seems to bifurcate every time someone has a new idea :-)
I haven't read all the posts around the subject, I'll have to admit.  
I've read the one I'm replying and its followups to pretty carefully, 
though.

I'm back to liking yield for the PEP 310 use case. I think maybe it was
Doug Landauer's post mentioning Beta, plus scanning some more examples
of using yield in Ruby. Jim Jewett's post on defmacro also helped, as
did Nick Coghlan's post explaining why he prefers 'with' for PEP 310
and a bare expression for the 'with' feature from Pascal (and other
languages :-).
The history of iterators and generators could be summarized by saying 
that an API was invented, then it turned out that in practice one way 
of implementing them -- generators -- was almost universally useful.

This proposal seems a bit like an effort to make generators good at 
doing something that they aren't really intended -- or dare I say 
suited? -- for.  The tail wagging the dog so to speak.

It seems that the same argument that explains why generators are so
good for defining iterators, also applies to the PEP 310 use case:
it's just much more natural to write
def with_file(filename):
    f = open(filename)
    try:
        yield f
    finally:
        f.close()
This is a syntax error today, of course.  When does the finally: clause 
execute  with your proposal? [I work this one out below :)]

than having to write a class with __entry__ and __exit__ and
__except__ methods (I've lost track of the exact proposal at this
point).

At the same time, having to use it as follows:
for f in with_file(filename):
    for line in f:
        print process(line)
is really ugly,
This is a non-starter, I hope.  I really meant what I said in PEP 310 
about loops being loops.

so we need new syntax, which also helps with keeping
'for' semantically backwards compatible. So let's use 'with', and then
the using code becomes again this:
with f = with_file(filename):
    for line in f:
        print process(line)
Now let me propose a strawman for the translation of the latter into
existing semantics. Let's take the generic case:
with VAR = EXPR:
    BODY
This would translate to the following code:
    it = EXPR
    err = None
    while True:
        try:
            if err is None:
                VAR = it.next()
            else:
                VAR = it.next_ex(err)
        except StopIteration:
            break
        try:
            err = None
            BODY
        except Exception, err:  # Pretend "except Exception:" == "except:"
            if not hasattr(it, "next_ex"):
                raise

(The variables 'it' and 'err' are not user-visible variables, they are
internal to the translation.)
This looks slightly awkward because of backward compatibility; what I
really want is just this:
    it = EXPR
    err = None
    while True:
        try:
            VAR = it.next(err)
        except StopIteration:
            break
        try:
            err = None
            BODY
        except Exception, err:  # Pretend "except Exception:" == "except:"
            pass

but for backwards compatibility with the existing argument-less next()
API
More than that: if I'm implementing an iterator for, uh, iterating, why 
would one dream of needing to handle an 'err' argument in the next() 
method?

I'm introducing a new iterator API next_ex() which takes an
exception argument.  If that argument is None, it should behave just
like next().  Otherwise, if the iterator is a generator, this will
raise that exception in the generator's frame (at the point of the
suspended yield).  If the iterator is something else, the something
else is free to do whatever it likes; if it doesn't want to do
anything, it can just re-raise the exception.
Ah, this answers my 'when does finally' execute question above.
Finally, I think it would be cool if the generator could trap
occurrences of break, continue and return occurring in BODY.  We could
introduce a new class of exceptions for these, named ControlFlow, and
(only in the body of a with statement), break would raise BreakFlow,
continue would raise ContinueFlow, and return EXPR would raise
ReturnFlow(EXPR) (EXPR defaulting to None of course).
Well, this is quite a big thing.
So a block could return a value to the generator using a return
statement; the generator can catch this by catching ReturnFlow.
(Syntactic sugar could be VAR = yield ... like in Ruby.)
With a little extra magic we could also get the behavior that if the
generator doesn't handle ControlFlow exceptions but re-raises them,
they would affect the code containing the with statement; this means
that the generator can decide whether return, break and continue are
handled locally or passed through to the containing block.
Note that EXPR doesn't have to return a 

Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Michael Hudson
Samuele Pedroni [EMAIL PROTECTED] writes:

 Michael Hudson wrote:

 The history of iterators and generators could be summarized by
 saying that an API was invented, then it turned out that in practice
 one way of implementing them -- generators -- was almost universally
 useful.

 This proposal seems a bit like an effort to make generators good at
 doing something that they aren't really intended -- or dare I say
 suited? -- for.  The tail wagging the dog so to speak.

 it is fun because the two of us sort of already had this discussion in
 compressed form a lot of time ago:

Oh yes.  That was the discussion that led to PEP 310 being written.

 http://groups-beta.google.com/groups?q=with+generators+pedronishl=en

At least I'm consistent :)

 not that I was really conviced about my idea at the time which was
 very embrional,  and in fact I'm bit skeptical right now about how
 much bending or not of generators makes sense, especially for a
 learnability point of view.

As am I, obviously.

Cheers,
mwh

-- 
  Agh, the braindamage!  It's not unlike the massively
  non-brilliant decision to use the period in abbreviations
  as well as a sentence terminator.  Had these people no
  imagination at _all_? -- Erik Naggum, comp.lang.lisp


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread ron adam
Hi, this is my first post here and I've been following this very 
interesting discussion as is has developed. 

A really short intro about me,  I was trained as a computer tech in the 
early 80's... ie. learned transistors, gates, logic etc...  And so my 
focus tends to be from that of a troubleshooter.  I'm medically retired 
now (not a subject for here) and am looking for something meaningful and 
rewarding that I can contribute to with my free time.

I will not post often at first as I am still getting up to speed with 
CVS and how Pythons core works.  Hopefully I'm not lagging this 
discussion too far or adding unneeded noise to it.  :-)

So maybe the 'with' keyword should be dropped (again!) in
favour of
  with_opened(pathname) as f:
...
But that doesn't look so great for the case where there's no variable
to be assigned to -- I wasn't totally clear about it, but I meant the
syntax to be
   with [VAR =] EXPR: BLOCK
where VAR would have the same syntax as the left hand side of an
assignment (or the variable in a for-statement).
I keep wanting to read it as:
  with OBJECT [from EXPR]: BLOCK
2) I'm not sure about the '='. It makes it look rather deceptively
like an ordinary assignment, and I'm sure many people are going
to wonder what the difference is between
  with f = opened(pathname):
do_stuff_to(f)
and simply
  f = opened(pathname)
  do_stuff_to(f)
or even just unconsciously read the first as the second without
noticing that anything special is going on. Especially if they're
coming from a language like Pascal which has a much less magical
form of with-statement.
Below is what gives me the clearest picture so far.  To me there is 
nothing 'anonymous' going on here.  Which is good I think. :-)

After playing around with Guido's example a bit, it looks to me the role 
of a 'with' block is to define the life of a resource object.  so with 
OBJECT: BLOCK seems to me to be the simplest and most natural way to 
express this.

def with_file(filename, mode):
    """Create a file resource."""
    f = open(filename, mode)
    try:
        yield f                # use yield here
    finally:
        # Do at exit of 'with resource: block'
        f.close()
# Get a resource/generator object and use it.
f_resource = with_file('resource.py', 'r')
with f_resource:
    f = f_resource.next()      # get values from yields
    for line in f:
        print line,
# Generator resource with yield loop.
def with_file(filename):
    """Create a file line resource."""
    f = open(filename, 'r')
    try:
        for line in f:
            yield line
    finally:
        f.close()
# Print lines in this file.
f_resource = with_file('resource.py')
with f_resource:
    while 1:
        line = f_resource.next()
        if line == "":
            break
        print line,
The life of an object used with a 'with' block is shorter than that of 
the function it is called from, but if the function is short, the life 
could be the same as the function. Then the 'with' block could be 
optional if the resource object's __exit__ method is called when the
function exits, but that may require some way to tag a resource as being
different from other classes and generators to keep from evaluating
__exit__ methods of other objects.
As far as looping behaviors go, I prefer the loop to be explicitly 
defined in the resource  or the body of the 'with', because it looks to 
be more flexible.

Ron_Adam
# The right question is a good start to finding the correct answer.


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Aahz
On Tue, Apr 26, 2005, Guido van Rossum wrote:

 Now there's one more twist, which you may or may not like.  Presumably
 (barring obfuscations or bugs) the handling of BreakFlow and
 ContinueFlow by an iterator (or generator) is consistent for all uses
 of that particular iterator.  For example synchronized(lock) and
 transactional(db) do not behave as loops, and forever() does.  Ditto
 for handling ReturnFlow.  This is why I've been thinking of leaving
 out the 'with' keyword: in your mind, these calls would become new
 statement types, even though the compiler sees them all the same:
 
 synchronized(lock):
 BLOCK
 
 transactional(db):
 BLOCK
 
 forever():
 BLOCK
 
 opening(filename) as f:
 BLOCK

That's precisely why I think we should keep the ``with``: the point of
Python is to have a restricted syntax and requiring a prefix for these
constructs makes it easier to read the code.  You'll soon start to gloss
over the ``with`` but it will be there as a marker for your subconscious.
-- 
Aahz ([EMAIL PROTECTED])   * http://www.pythoncraft.com/

It's 106 miles to Chicago.  We have a full tank of gas, a half-pack of
cigarettes, it's dark, and we're wearing sunglasses.  Hit it.


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Nick Coghlan
Phillip J. Eby wrote:
At 09:12 PM 4/24/05 -0600, Steven Bethard wrote:
I guess it would be helpful to see example where the looping
with-block is useful.

Automatically retry an operation a set number of times before hard failure:

    with auto_retry(times=3):
        do_something_that_might_fail()

Process each row of a database query, skipping and logging those that
cause a processing error:

    with x,y,z = log_errors(db_query()):
        do_something(x,y,z)
You'll notice, by the way, that some of these runtime macros may be 
stackable in the expression.
These are also possible by combining a normal for loop with a non-looping with 
(but otherwise using Guido's exception injection semantics):

def auto_retry(attempts):
    success = [False]
    failures = [0]
    exc = [None]
    def block():
        try:
            yield None
        except:
            failures[0] += 1
        else:
            success[0] = True
    while not success[0] and failures[0] < attempts:
        yield block()
    if not success[0]:
        raise Exception # You'd actually propagate the last inner failure
for attempt in auto_retry(3):
    with attempt:
        do_something_that_might_fail()
The non-looping version of with seems to give the best of both worlds - 
multipart operation can be handled by multiple with statements, and repeated use 
of the same suite can be handled by nesting the with block inside iteration over 
an appropriate generator.

Cheers,
Nick.
--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.skystorm.net


Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Paul Moore
On 4/26/05, Jim Jewett [EMAIL PROTECTED] wrote:
 I'm not sure I understand this.  The preferred way would be
 to just stick the keyword before the call.  Using 'collapse', it
 would look like:
 
 def foo(b):
     c = a
 def bar():
     a = a1
     collapse foo(b1)
     print b, c      # prints b1, a1
     a = a2
     foo(b2)         # Not collapsed this time
     print b, c      # still prints b1, a1

*YUK* I spent a long time staring at this and wondering "where did b come from?"

You'd have to come up with a very compelling use case to get me to like this.

Paul.


Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Guido van Rossum
[Jim Jewett]
  (2)  Add a way to say "Make this function I'm calling use *my* locals
  and globals."  This seems to meet all the agreed-upon-as-good use
  cases, but there is disagreement over how to sensibly write it.  The
  calling function is the place that could get surprised, but people
  who want thunks seem to want the specialness in the called function.

[Guido]
  I think there are several problems with this. First, it looks
  difficult to provide semantics that cover all the corners for the
  blending of two namespaces. What happens to names that have a
  different meaning in each scope?

[Jim]
 Programming error.  Same name == same object.

Sounds like a recipe for bugs to me. At the very least it is a total
breach of abstraction, which is the fundamental basis of the
relationship between caller and callee in normal circumstances. The
more I understand your proposal the less I like it.

 If a function is using one of _your_ names for something incompatible,
 then don't call that function with collapsed scope.  The same problem
 happens with globals today.  Code in module X can break if module Y
 replaces (not shadows, replaces) a builtin with an incompatible object.
 
 Except ...
  (E.g. 'self' when calling a method of
  another object; or any other name clash.)
 
 The first argument of a method *might* be a special case.  It seems
 wrong to unbind a bound method.  On the other hand, resource
 managers may well want to use unbound methods for the called
 code.

Well, what would you pass in as the first argument then?

  Are the globals also blended?  How?
 
 Yes.  The callee does not even get to see its normal namespace.
 Therefore, the callee does not get to use its normal name resolution.

Another breach of abstraction: if a callee wants to use an imported
module, the import should be present in the caller, not in the callee.

This seems to me to repeat all the mistakes of the dynamic scoping of
early Lisps (including GNU Emacs Lisp I believe).

It really strikes me as an endless source of errors that these
blended-scope callees (in your proposal) are ordinary
functions/methods, which means that they can *also* be called without
blending scopes. Having special syntax to define a callee intended for
scope-blending seems much more appropriate (even if there's also
special syntax at the call site).

 If the name normally resolves in locals (often inlined to a tuple, today),
 it looks in the shared scope, which is owned by the caller.  This is
 different from a free variable only because the callee can write to this
 dictionary.

Aha! This suggests that a blend-callee needs to use different bytecode
to avoid doing lookups in the tuple of optimized locals, since the
indices assigned to locals in the callee and the caller won't match up
except by miracle.

 If the name is free in that shared scope, (which implies that the
 callee does not bind it, else it would be added to the shared scope)
 then the callee looks up the caller's nested stack and then to the
 caller's globals, and then the caller's builtins.
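
Purely to make the intended effect concrete, Jim's earlier foo/bar example
can be simulated with exec over a shared dict; this is not the proposed
mechanism, just an illustration of what "same name == same object" would mean:

    FOO_BODY = compile("c = a", "<collapsed foo>", "exec")   # foo's body: c = a

    def bar():
        a = "a1"
        shared = dict(locals())       # stand-in for the caller-owned shared scope
        shared["b"] = "b1"            # the argument bound by 'collapse foo(b1)'
        exec(FOO_BODY, {}, shared)    # foo reads a and writes c in bar's names
        print(shared["b"], shared["c"])   # prints: b1 a1

    bar()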
 
  Second, this construct only makes sense for all callables;

(I meant this to read "does not make sense for all callables".)

 Agreed.

(And I presume you read it that way. :-)

 But using it on a non-function may cause surprising results
 especially if bound methods are not special-cased.
 
 The same is true of decorators, which is why we have (at least
 initially) function decorators instead of callable decorators.

Not true. It is possible today to write decorators that accept things
other than functions -- in fact, this is often necessary if you want
to write decorators that combine properly with other decorators that
don't return function objects (such as staticmethod and classmethod).

  it makes no sense when the callable is implemented as
  a C function,
 
 Or rather, it can't be implemented, as the compiler may well
 have optimized the variables names right out.  Stack frame
 transitions between C and python are already special.

Understatement of the year. There just is no similarity between C and
Python stack frames. How much do you really know about Python's
internals???

  or is a class, or an object with a __call__ method.
 
 These are just calls to __init__ (or __new__) or __call__.

No they're not. Calling a class *first* creates an instance (calling
__new__ if it exists) and *then* calls __init__ (if it exists).

 These may be foolish things to call (particularly if the first
 argument to a method isn't special-cased), but ... it isn't
 a problem if the class is written appropriately.  If the class
 is not written appropriately, then don't call it with collapsed
 scope.

That's easy for you to say. Since the failure behavior is so messy I'd
rather not get started.

  Third, I expect that if we solve the first two
  problems, we'll still find that for an efficient implementation we
  need to modify the bytecode of the called function.
 
 Absolutely.  Even giving up the XXX_FAST 

Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Guido van Rossum
[Paul Moore]
 *YUK* I spent a long time staring at this and wondering "where did b come
 from?"
 
 You'd have to come up with a very compelling use case to get me to like this.

I couldn't have said it better.

I said it longer though. :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks vs scope-collapse

2005-04-26 Thread Greg Ewing
I don't think this proposal has any chance as long as
it's dynamically scoped.
It mightn't be so bad if it were lexically scoped,
i.e. a special way of defining a function so that
it shares the lexically enclosing scope. This
would be implementable, since the compiler has
all the necessary information about both scopes
available.
Although it might be better to have some sort of
outer declaration for rebinding in the enclosing
scope, instead of doing it on a whole-function basis.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+


Re: [Python-Dev] Re: anonymous blocks

2005-04-26 Thread Guido van Rossum
  [Greg Ewing]
 * It seems to me that this same exception-handling mechanism
 would be just as useful in a regular for-loop, and that, once
 it becomes possible to put 'yield' in a try-statement, people
 are going to *expect* it to work in for-loops as well.

[Guido]
  (You can already put a yield inside a try-except, just not inside a
  try-finally.)

[Greg]
 Well, my point still stands. People are going to write
 try-finally around their yields and expect the natural
 thing to happen when their generator is used in a
 for-loop.

Well, the new finalization semantics should take care of that when
their generator is finalized -- its __next__() will be called with
some exception.  But as long you hang on to the generator, it will not
be finalized, which is distinctly different from the desired
with-statement semantics.

  There would still be the difference that a for-loop invokes iter()
  and a with-block doesn't.
  
   Also, for-loops that don't exhaust the iterator leave it
   available for later use.
 
 Hmmm. But are these big enough differences to justify
 having a whole new control structure? Whither TOOWTDI?

Indeed, but apart from declaring that henceforth the with-statement
(by whatever name) is the recommended looping construct and a
for-statement is just a backwards compatibility macro, I just don't
see how we can implement the necessary immediate cleanup semantics of
a with-statement.  In order to serve as a resource cleanup statement
it *must* have stronger cleanup guarantees than the for-statement can
give (if only for backwards compatibility reasons).

  
  The statement:
 
  for VAR in EXPR:
  BLOCK
 
  does the same thing as:
 
  with iter(EXPR) as VAR:# Note the iter() call
  BLOCK
 
  except that:
 
  - you can leave out the as VAR part from the with-statement;
  - they work differently when an exception happens inside BLOCK;
  - break and continue don't always work the same way.
 
  The only time you should write a with-statement is when the
  documentation for the function you are calling says you should.
  
 
 Surely you jest. Any newbie reading this is going to think
 he hasn't a hope in hell of ever understanding what is going
 on here, and give up on Python in disgust.

And surely you exaggerate.  How about this then:

The with-statement is similar to the for-loop.  Until you've
learned about the differences in detail, the only time you should
write a with-statement is when the documentation for the function
you are calling says you should.

 I'm seriously worried by the
 possibility that a return statement could do something other
 than return from the function it's written in.
 
  Let me explain the use cases that led me to throwing that in
 
 Yes, I can see that it's going to be necessary to treat
 return as an exception, and accept the possibility that
 it will be abused. I'd still much prefer people refrain
 from abusing it that way, though. Using return to spell
 send value back to yield statement would be extremely
 obfuscatory.

That depends on where you're coming from.  To Ruby users it will look
completely natural because that's what Ruby uses.  (In fact it'll be a
while before they appreciate the deep differences between yield in
Python and in Ruby.)

But I accept that in Python we might want to use a different keyword
to pass a value to the generator.  I think using 'continue' should
work; continue with a value has no precedent in Python, and continue
without a value happens to have exactly the right semantics anyway.

  (BTW ReturnFlow etc. aren't great
  names.  Suggestions?)
 
 I'd suggest just calling them Break, Continue and Return.

Too close to break, continue and return IMO.

  One last thing: if we need a special name for iterators and
  generators designed for use in a with-statement, how about calling
  them with-iterators and with-generators.
 
 Except that if it's no longer a with statement, this
 doesn't make so much sense...

Then of course we'll call it after whatever the new statement is going
to be called.  If we end up calling it the foible-statement, they will
be foible-iterators and foible-generators.

Anyway, I think I'll need to start writing a PEP.  I'll ask the PEP
editor for a number.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Rodrigo Dias Arruda Senra
[ Simon Percivall ]:
 [ Terry Reedy ]:
  with target as value:
 
  would parallel the for-statement header and read smoother to me.
 
  for target as value:
 
  would not need new keyword, but would require close reading to  
  distinguish
  'as' from 'in'.
 
 But it also moves the value to the right, removing focus. Wouldn't  
 from
 be a good keyword to overload here?
 
 in/with/for/ value from target:
  BODY

 I do not have strong feelings about this issue, but for
 completeness sake...

 Mixing both suggestions:

 from target as value:
 BODY

 That resembles an import statement which some
 may consider good (syntax/keyword reuse) or
 very bad (confusion?, value focus).

 cheers,
 Senra

-- 
Rodrigo Senra 
--
MSc Computer Engineerrodsenra(at)gpr.com.br  
GPr Sistemas Ltdahttp://www.gpr.com.br/ 
Personal Blog http://rodsenra.blogspot.com/



Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Tim Delaney
Paul Moore wrote:
Hmm, it took me a while to get this, but what you're saying is that
if you modify Guido's "what I really want" solution to use
   VAR = next(it, exc)
then this builtin next makes API v2 stuff using __next__ work while
remaining backward compatible with old-style API v1 stuff using
0-arg next() (as long as old-style stuff isn't used in a context where
an exception gets passed back in).
Yes, but it could also be used (almost) anywhere an explicit obj.next() is 
used.

it = iter(seq)
while True:
   print next(it)
for loops would also change to use builtin next() rather than calling 
it.next() directly.
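
A minimal sketch of the builtin being described here: a one-line shim over
the renamed __next__ method (in the Python that eventually shipped, next()
really is a builtin, so defining it like this is illustration only):

    def next(obj, *args):
        # Shadows the later real builtin; shown only to make the idea concrete.
        return obj.__next__(*args)

    it = iter([1, 2, 3])
    while True:
        try:
            print(next(it))
        except StopIteration:
            break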

I'd suggest that the new builtin have a magic name (__next__ being
the obvious one :-)) to make it clear that it's an internal
implementation detail.
There aren't many builtins that have magic names, and I don't think this 
should be one of them - it has obvious uses other than as an implementation 
detail.

PS The first person to replace builtin __next__ in order to implement
a next hook of some sort, gets shot :-)
Damn! There goes the use case ;)
Tim Delaney 



Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Greg Ewing
Tim Delaney wrote:
There aren't many builtins that have magic names, and I don't think this 
should be one of them - it has obvious uses other than as an 
implementation detail.
I think there's some confusion here. As I understood the
suggestion, __next__ would be the Python name of the method
corresponding to the tp_next typeslot, analogously with
__len__, __iter__, etc.
There would be a builtin function next(obj) which would
invoke obj.__next__(), for use by Python code. For loops
wouldn't use it, though; they would continue to call the
tp_next typeslot directly.
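In other words (sketch only), an iterator written in Python would grow a
__next__ method, and user code would call it via the proposed builtin:

    class Countdown:
        def __init__(self, n):
            self.n = n
        def __iter__(self):
            return self
        def __next__(self):            # new-style spelling of next()
            if self.n <= 0:
                raise StopIteration
            self.n -= 1
            return self.n

    c = Countdown(3)
    print next(c)                      # the builtin dispatches to c.__next__()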
Paul Moore wrote: 
PS The first person to replace builtin __next__ in order to implement
a next hook of some sort, gets shot :-)
I think he meant next(), not __next__. And it wouldn't
work anyway, since as I mentioned above, C code would
bypass next() and call the typeslot directly.
I'm +1 on moving towards __next__, BTW. IMO, that's the
WISHBDITFP. :-)
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Greg Ewing
Guido van Rossum wrote:
    with VAR = EXPR:
        BODY

This would translate to the following code:

    it = EXPR
    err = None
    while True:
        try:
            if err is None:
                VAR = it.next()
            else:
                VAR = it.next_ex(err)
        except StopIteration:
            break
        try:
            err = None
            BODY
        except Exception, err:  # Pretend "except Exception:" == "except:"
            if not hasattr(it, "next_ex"):
                raise
I like the general shape of this, but I have one or two
reservations about the details.
1) We're going to have to think carefully about the naming of
functions designed for use with this statement. If 'with'
is going to be in there as a keyword, then it really shouldn't
be part of the function name as well. Instead of
    with f = with_file(pathname):
        ...

I would rather see something like

    with f = opened(pathname):
        ...

This sort of convention (using a past participle as a function
name) would work for some other cases as well:

    with some_data.locked():
        ...

    with some_resource.allocated():
        ...
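For instance, opened() might be written as a generator along these lines
(only a sketch, and it assumes yield inside try/finally becomes legal,
which is what PEP 325 is about):

    def opened(pathname, mode="r"):
        f = open(pathname, mode)
        try:
            yield f          # the block body runs here, with f bound
        finally:
            f.close()        # always runs when the block is left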
On the negative side, not having anything like 'with' in the
function name means that the fact the function is designed for
use in a with-statement could be somewhat non-obvious. Since
there's not going to be much other use for such a function,
this is a bad thing.
It could also lead people into subtle usage traps such as
    with f = open(pathname):
        ...

which would fail in a somewhat obscure way (under the proposed
translation, the file object's own next() supplies the values, so f
would be bound to successive lines rather than the file itself, and
nothing would ever close it).
So maybe the 'with' keyword should be dropped (again!) in
favour of
    with_opened(pathname) as f:
        ...
2) I'm not sure about the '='. It makes it look rather deceptively
like an ordinary assignment, and I'm sure many people are going
to wonder what the difference is between
    with f = opened(pathname):
        do_stuff_to(f)

and simply

    f = opened(pathname)
    do_stuff_to(f)
or even just unconsciously read the first as the second without
noticing that anything special is going on. Especially if they're
coming from a language like Pascal which has a much less magical
form of with-statement.
So maybe it would be better to make it look more different:
    with opened(pathname) as f:
        ...
* It seems to me that this same exception-handling mechanism
would be just as useful in a regular for-loop, and that, once
it becomes possible to put 'yield' in a try-statement, people
are going to *expect* it to work in for-loops as well.
Guido has expressed concern about imposing extra overhead on
all for-loops. But would the extra overhead really be all that
noticeable? For-loops already put a block on the block stack,
so the necessary processing could be incorporated into the
code for unwinding a for-block during an exception, and little
if anything would need to change in the absence of an exception.
However, if for-loops also gain this functionality, we end up
with the rather embarrassing situation that there is *no difference*
in semantics between a for-loop and a with-statement!
This could be fixed by making the with-statement not loop,
as has been suggested. That was my initial thought as well,
but having thought more deeply, I'm starting to think that
Guido was right in the first place, and that a with-statement
should be capable of looping. I'll elaborate in another post.
So a block could return a value to the generator using a return
statement; the generator can catch this by catching ReturnFlow.
(Syntactic sugar could be VAR = yield ... like in Ruby.)
This is a very elegant idea, but I'm seriously worried by the
possibility that a return statement could do something other
than return from the function it's written in, especially if
for-loops also gain this functionality. Intercepting break
and continue isn't so bad, since they're already associated
with the loop they're in, but return has always been an
unconditional get-me-out-of-this-function. I'd feel uncomfortable
if this were no longer true.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-25 Thread Greg Ewing
Brett C. wrote:
And before anyone decries the fact that this might confuse a newbie (which
seems to happen with every advanced feature ever dreamed up), remember this
will not be meant for a newbie but for someone who has experience in Python and
iterators at the minimum, and hopefully with generators.
This is dangerously close to the you don't need to know about
it if you're not going to use it argument, which is widely
recognised as false. Newbies might not need to know all the
details of the implementation, but they will need to know
enough about the semantics of with-statements to understand
what they're doing when they come across them in other people's
code.
Which leads me to another concern. How are we going to explain
the externally visible semantics of a with-statement in a way
that's easy to grok, without mentioning any details of the
implementation?
You can explain a for-loop pretty well by saying something like
It executes the body once for each item from the sequence,
without having to mention anything about iterators, generators,
next() methods, etc. etc. How the items are produced is completely
irrelevant to the concept of the for-loop.
But what is the equivalent level of description of the
with-statement going to say?
It executes the body with... ???
And a related question: What are we going to call the functions
designed for with-statements, and the objects they return?
Calling them generators and iterators (even though they are)
doesn't seem right, because they're being used for a purpose
very different from generating and iterating.
--
Greg Ewing, Computer Science Dept, +--+
University of Canterbury,  | A citizen of NewZealandCorp, a   |
Christchurch, New Zealand  | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]  +--+
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-24 Thread Brett C.
Guido van Rossum wrote:
[SNIP]
 Now let me propose a strawman for the translation of the latter into
 existing semantics. Let's take the generic case:
 
  with VAR = EXPR:
      BODY
 
 This would translate to the following code:
[SNIP]
 
  it = EXPR
  err = None
  while True:
      try:
          VAR = it.next(err)
      except StopIteration:
          break
      try:
          err = None
          BODY
      except Exception, err:  # Pretend "except Exception:" == "except:"
          pass
 
 but for backwards compatibility with the existing argument-less next()
 API I'm introducing a new iterator API next_ex() which takes an
 exception argument.

Can I suggest the name next_exc() instead?  Everything in the sys module uses
exc as the abbreviation for exception.  I realize you might be suggesting
using the ex as the suffix because of the use of that as the suffix in the C
API for an extended API, but that usage is not prominent in the stdlib.

Also, would this change in Python 3000 so that both next_ex() and next() are
merged into a single method?

As for an opinion on the need for 'with', I am on the fence, leaning towards
liking it.  To make sure I am understanding the use case, it is to help
encapsulate typical resource management with proper cleanup in another function
instead of having to constantly paste boilerplate into your code, right?
So the hope is to be able to create factory functions, typically implemented as
a generator, that encapsulate the obtaining, temporary lending out, and cleanup
of a resource?
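Something along these lines, say for a lock (a sketch only, and again
assuming yield becomes legal inside try/finally):

    def locked(lock):
        lock.acquire()
        try:
            yield None       # the block body runs here
        finally:
            lock.release()   # released no matter how the body exits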

Is there some other use that I am totally missing that is obvious?

  If that argument is None, it should behave just
 like next().  Otherwise, if the iterator is a generator, this will
 raised that exception in the generator's frame (at the point of the
 suspended yield).  If the iterator is something else, the something
 else is free to do whatever it likes; if it doesn't want to do
 anything, it can just re-raise the exception.
 
 Also note that, unlike the for-loop translation, this does *not*
 invoke iter() on the result of EXPR; that's debatable but given that
 the most common use case should not be an alternate looping syntax
 (even though it *is* technically a loop) but a more general macro
 statement expansion, I think we can expect EXPR to produce a value
 that is already an iterator (rather than merely an iterable).
 
 Finally, I think it would be cool if the generator could trap
 occurrences of break, continue and return occurring in BODY.  We could
 introduce a new class of exceptions for these, named ControlFlow, and
 (only in the body of a with statement), break would raise BreakFlow,
 continue would raise ContinueFlow, and return EXPR would raise
 ReturnFlow(EXPR) (EXPR defaulting to None of course).
 
 So a block could return a value to the generator using a return
 statement; the generator can catch this by catching ReturnFlow.
 (Syntactic sugar could be VAR = yield ... like in Ruby.)
 
 With a little extra magic we could also get the behavior that if the
 generator doesn't handle ControlFlow exceptions but re-raises them,
 they would affect the code containing the with statement; this means
 that the generator can decide whether return, break and continue are
 handled locally or passed through to the containing block.
 

Honestly, I am not very comfortable with this magical meaning of 'break',
'continue', and 'return' in a 'with' block.  I realize 'return' already has
special meaning in a generator, but I don't think that is really needed
either.  It leads to this odd dichotomy where a non-exception-related statement
directly triggers an exception in other code.  It seems like code doing
something behind my back; remember, it looks like a 'continue', but it really
is a method call with a specific exception instance.  Surprise!

Personally, what I would rather see is to have next_ex(), for a generator,
check if the argument is a subclass of Exception.  If it is, raise it as
such.  If not, have the 'yield' statement return the passed-in argument.
That usage would also fit the next_ex() name.
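The same dispatch rule, sketched on a hand-written iterator rather than a
generator (names are made up; produce() stands in for however the object
computes its next value):

    class Thing:
        def next(self):
            return self.next_ex(None)
        def next_ex(self, arg=None):
            if isinstance(arg, Exception):
                raise arg                # the block's exception surfaces here
            return self.produce(arg)     # otherwise hand the value back in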

Then again I guess having exceptions triggering a method call instead of
hitting an 'except' statement is already kind of surprise semantics anyway.
=)  Still, I would like to minimize the surprises that we could spring.

And before anyone decries the fact that this might confuse a newbie (which
seems to happen with every advanced feature ever dreamed up), remember this
will not be meant for a newbie but for someone who has experience in Python and
iterators at the minimum, and hopefully with generators.  Not exactly meant for
someone for whom raw_input() still holds a wow factor.  =)

 Note that EXPR doesn't have to return a generator; it could be any
 object that implements next() and next_ex().  (We could also require
 next_ex() or even next() with an argument; perhaps this is better.)
 

Yes, that requirement would 

Re: [Python-Dev] Re: anonymous blocks

2005-04-24 Thread Phillip J. Eby
At 04:57 PM 4/24/05 -0700, Guido van Rossum wrote:
So a block could return a value to the generator using a return
statement; the generator can catch this by catching ReturnFlow.
(Syntactic sugar could be VAR = yield ... like in Ruby.)
[uncontrolled drooling, followed by much rejoicing]
If this were available to generators in general, you could untwist 
Twisted.  I'm basically simulating this sort of exception/value passing in 
peak.events to do exactly that, except I have to do:

yield somethingBlocking(); result=events.resume()
where events.resume() magically receives a value or exception from outside 
the generator and either returns or raises it.  If next()-with-argument and 
next_ex() are available normally on generators, this would allow you to 
simulate co-routines without the events.resume() magic; the above would 
simply read:

result = yield somethingBlocking()
The rest of the peak.events coroutine simulation would remain around to 
manage the generator stack and scheduling, but the syntax would be cleaner 
and the operation of it entirely unmagical.
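For what it's worth, the kind of driver loop this would enable looks roughly
like this (completely hypothetical -- next(value)/next_ex(exc) as discussed
in this thread, with perform() standing in for whatever actually services a
request):

    def run(gen):
        value, exc = None, None
        while True:
            try:
                if exc is not None:
                    request = gen.next_ex(exc)   # raise inside the generator
                else:
                    request = gen.next(value)    # resume the yield with a value
            except StopIteration:
                return
            try:
                value, exc = perform(request), None
            except Exception, e:
                value, exc = None, e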

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-24 Thread Bob Ippolito
On Apr 24, 2005, at 11:32 PM, Phillip J. Eby wrote:
At 04:57 PM 4/24/05 -0700, Guido van Rossum wrote:
So a block could return a value to the generator using a return
statement; the generator can catch this by catching ReturnFlow.
(Syntactic sugar could be VAR = yield ... like in Ruby.)
[uncontrolled drooling, followed by much rejoicing]
If this were available to generators in general, you could untwist 
Twisted.  I'm basically simulating this sort of exception/value 
passing in peak.events to do exactly that, except I have to do:

yield somethingBlocking(); result=events.resume()
where events.resume() magically receives a value or exception from 
outside the generator and either returns or raises it.  If 
next()-with-argument and next_ex() are available normally on 
generators, this would allow you to simulate co-routines without the 
events.resume() magic; the above would simply read:

result = yield somethingBlocking()
The rest of the peak.events coroutine simulation would remain around 
to manage the generator stack and scheduling, but the syntax would be 
cleaner and the operation of it entirely unmagical.
Only if result = yield somethingBlocking() could also raise an 
exception.

-bob
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-21 Thread Paul Moore
On 4/20/05, Samuele Pedroni [EMAIL PROTECTED] wrote:
 
 
 
  def do():
      print "setup"
      try:
          yield None
      finally:
          print "tear down"
 
   doesn't quite work (if it did, all you would need is syntactic sugar
   for "for dummy in").
 
 PEP325 is about that

And, of course, PEP 310 is all about encapsulating before/after
(acquire/release) actions.
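For reference, PEP 310's expansion is roughly the following (from memory, so
details may be off):

    VAR = EXPR
    if hasattr(VAR, "__enter__"):
        VAR.__enter__()
    try:
        BODY
    finally:
        VAR.__exit__()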

Paul.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-21 Thread Bob Ippolito
On Apr 21, 2005, at 6:28 AM, Fredrik Lundh wrote:
Glyph Lefkowitz wrote:
Despite being guilty of propagating this style for years myself, I 
have to disagree.  Consider the
following network-conversation using Twisted style (which, I might 
add, would be generalizable to
other Twisted-like systems if they existed ;-)):

    def strawman(self):
        def sayGoodbye(mingleResult):
            def goAway(goodbyeResult):
                self.loseConnection()
            self.send("goodbye").addCallback(goAway)
        def mingle(helloResult):
            self.send("nice weather we're having").addCallback(sayGoodbye)
        self.send("hello").addCallback(mingle)

    def iterman(self):
        yield "hello"
        yield "nice weather we're having"
        yield "goodbye"
Which more or less works for a literal translation of the straw-man
above.  However, you're missing the point.  These deferred operations
actually return results.  Generators offer no sane way to pass results 
back in.  If they did, then this use case could be mostly served by 
generators.
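In other words, if yield could receive a value (or an exception) from the
caller, the generator version could actually use the results -- something
like this (hypothetical, of course):

    def iterman(self):
        hello_result = yield self.send("hello")
        mingle_result = yield self.send("nice weather we're having")
        goodbye_result = yield self.send("goodbye")
        self.loseConnection()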

-bob
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks (off topic: match)

2005-04-20 Thread Shannon -jj Behrens
 PS. a side effect of the for-in pattern is that I'm beginning to feel that 
 Python
 might need a nice switch statement based on dictionary lookups, so I can
 replace multiple callbacks with a single loop body, without writing too many
 if/elif clauses.

That's funny.  I keep wondering if match from the ML world would
make sense in Python.  I keep thinking it'd be a really nice thing to
have.
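The dictionary-lookup style Fredrik mentions is of course already spellable
by hand today, something like (names made up):

    def handle_start(arg):
        print "starting", arg

    def handle_stop(arg):
        print "stopping", arg

    handlers = {
        "start": handle_start,
        "stop": handle_stop,
    }

    token, payload = "start", 42
    handlers.get(token, lambda arg: None)(payload)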

-jj

-- 
I have decided to switch to Gmail, but messages to my Yahoo account will
still get through.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-20 Thread Aahz
On Wed, Apr 20, 2005, [EMAIL PROTECTED] wrote:

 As students keep on asking me about the differences between languages
 and the pros and cons, I think I may claim some familiarity with
 other languages too, especially Python's self-declared antithesis,
 Ruby. 

That seems a little odd to me.  To the extent that Python has an
antithesis, it would be either C++ or Perl.  Ruby is antithetical to some
of Python's core ideology because it borrows from Perl, but Ruby is much
more similar to Python than Perl is.
-- 
Aahz ([EMAIL PROTECTED])   * http://www.pythoncraft.com/

The joy of coding Python should be in seeing short, concise, readable
classes that express a lot of action in a small amount of clear code -- 
not in reams of trivial code that bores the reader to death.  --GvR
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-20 Thread A.M. Kuchling
On Wed, Apr 20, 2005 at 08:18:11AM -0700, Aahz wrote:
 antithesis, it would be either C++ or Perl.  Ruby is antithetical to some
 of Python's core ideology because it borrows from Perl, but Ruby is much
 more similar to Python than Perl is.

I'm not that familiar with the Ruby community; might it be that they
consider Ruby to be Python's antithesis, in that it returns to
bracketing instead of Python's indentation?

--amk
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Re: anonymous blocks

2005-04-19 Thread Shane Hathaway
Fredrik Lundh wrote:
 Brian Sabbey wrote:
  doFoo(**):
      def func1(a, b):
          return a + b
      def func2(c, d):
          return c + d

 That is, a suite can be used to define keyword arguments.
 
 
 umm.  isn't that just an incredibly obscure way to write
 
    def func1(a, b):
        return a + b
    def func2(c, d):
        return c + d
    doFoo(func1, func2)
 
 but with more indentation?

Brian's suggestion makes the code read more like an outline.  In Brian's
example, the high-level intent stands out from the details, while in
your example, there is no visual cue that distinguishes the details from
the intent.  Of course, lambdas are even better, when it's possible to
use them:

doFoo((lambda a, b: a + b), (lambda c, d: c + d))

Shane
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com