Whew!  This is a bit long...

On 25 Apr 2005, at 00:57, Guido van Rossum wrote:

> After reading a lot of contributions (though perhaps not all -- this
> thread seems to bifurcate every time someone has a new idea :-)

I haven't read all the posts around the subject, I'll admit. I've read the one I'm replying to, and its followups, pretty carefully, though.


> I'm back to liking yield for the PEP 310 use case. I think maybe it was
> Doug Landauer's post mentioning Beta, plus scanning some more examples
> of using yield in Ruby. Jim Jewett's post on defmacro also helped, as
> did Nick Coghlan's post explaining why he prefers 'with' for PEP 310
> and a bare expression for the 'with' feature from Pascal (and other
> languages :-).

The history of iterators and generators could be summarized by saying that an API was invented, then it turned out that in practice one way of implementing them -- generators -- was almost universally useful.


This proposal seems a bit like an effort to make generators good at doing something that they aren't really intended -- or dare I say suited? -- for. The tail wagging the dog so to speak.

> It seems that the same argument that explains why generators are so
> good for defining iterators, also applies to the PEP 310 use case:
> it's just much more natural to write
>
>     def with_file(filename):
>         f = open(filename)
>         try:
>             yield f
>         finally:
>             f.close()

This is a syntax error today, of course. When does the finally: clause execute with your proposal? [I work this one out below :)]


> than having to write a class with __entry__ and __exit__ and
> __except__ methods (I've lost track of the exact proposal at this
> point).
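For contrast, the class-based version being compared against might look something like the following sketch. The method names (__entry__, __exit__, __except__) shifted around during the thread, so these are illustrative only:

```python
# Hedged sketch of the class-based alternative: acquire on entry,
# release on both normal and exceptional exit. Method names are
# illustrative -- the thread never settled on them.
class WithFile(object):
    def __init__(self, filename):
        self.filename = filename

    def __entry__(self):
        # acquire the resource on entry to the block
        self.f = open(self.filename)
        return self.f

    def __exit__(self):
        # release it on normal exit
        self.f.close()

    def __except__(self, exc_type, exc_value, traceback):
        # release it on exceptional exit too
        self.f.close()
```

The verbosity relative to the five-line generator above is exactly the point being made.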

> At the same time, having to use it as follows:
>
>     for f in with_file(filename):
>         for line in f:
>             print process(line)
>
> is really ugly,

This is a non-starter, I hope. I really meant what I said in PEP 310 about loops being loops.


> so we need new syntax, which also helps with keeping
> 'for' semantically backwards compatible. So let's use 'with', and then
> the using code becomes again this:
>
>     with f = with_file(filename):
>         for line in f:
>             print process(line)

> Now let me propose a strawman for the translation of the latter into
> existing semantics. Let's take the generic case:
>
>     with VAR = EXPR:
>         BODY
>
> This would translate to the following code:
>
>     it = EXPR
>     err = None
>     while True:
>         try:
>             if err is None:
>                 VAR = it.next()
>             else:
>                 VAR = it.next_ex(err)
>         except StopIteration:
>             break
>         try:
>             err = None
>             BODY
>         except Exception, err:   # Pretend "except Exception:" == "except:"
>             if not hasattr(it, "next_ex"):
>                 raise


> (The variables 'it' and 'err' are not user-visible variables, they are
> internal to the translation.)
>
> This looks slightly awkward because of backward compatibility; what I
> really want is just this:
>
>     it = EXPR
>     err = None
>     while True:
>         try:
>             VAR = it.next(err)
>         except StopIteration:
>             break
>         try:
>             err = None
>             BODY
>         except Exception, err:   # Pretend "except Exception:" == "except:"
>             pass
>
> but for backwards compatibility with the existing argument-less next()
> API

More than that: if I'm implementing an iterator for, uh, iterating, why would one dream of needing to handle an 'err' argument in the next() method?


> I'm introducing a new iterator API next_ex() which takes an
> exception argument. If that argument is None, it should behave just
> like next(). Otherwise, if the iterator is a generator, this will
> raise that exception in the generator's frame (at the point of the
> suspended yield). If the iterator is something else, the something
> else is free to do whatever it likes; if it doesn't want to do
> anything, it can just re-raise the exception.

Ah, this answers my 'when does finally' execute question above.
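With hindsight, the translation quoted above can be exercised in real Python: generators eventually grew a throw() method, which behaves like the next_ex() described here, and yield became legal inside try/finally. A sketch, with BODY modelled as a callable (the names run_with and body are mine, not the proposal's):

```python
# Runnable sketch of the "what I really want" translation above.
# gen.throw() plays the role of the proposed next_ex(): it raises
# the exception inside the generator at the suspended yield.
def run_with(gen, body):
    err = None
    while True:
        try:
            if err is None:
                var = next(gen)
            else:
                var = gen.throw(err)
        except StopIteration:
            break
        try:
            err = None
            body(var)           # the with-block body
        except Exception as e:
            err = e             # feed it back into the generator

def with_file(filename):
    f = open(filename)
    try:
        yield f
    finally:
        f.close()               # runs on resume *or* on throw()
```

When body() succeeds, resuming the generator runs the finally clause and raises StopIteration; when body() raises, throw() runs the finally clause before the exception propagates -- which is exactly the "when does finally execute" answer.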

> Finally, I think it would be cool if the generator could trap
> occurrences of break, continue and return occurring in BODY. We could
> introduce a new class of exceptions for these, named ControlFlow, and
> (only in the body of a with statement), break would raise BreakFlow,
> continue would raise ContinueFlow, and return EXPR would raise
> ReturnFlow(EXPR) (EXPR defaulting to None of course).

Well, this is quite a big thing.

> So a block could return a value to the generator using a return
> statement; the generator can catch this by catching ReturnFlow.
> (Syntactic sugar could be "VAR = yield ..." like in Ruby.)
>
> With a little extra magic we could also get the behavior that if the
> generator doesn't handle ControlFlow exceptions but re-raises them,
> they would affect the code containing the with statement; this means
> that the generator can decide whether return, break and continue are
> handled locally or passed through to the containing block.
>
> Note that EXPR doesn't have to return a generator; it could be any
> object that implements next() and next_ex(). (We could also require
> next_ex() or even next() with an argument; perhaps this is better.)
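To illustrate that last point, a hypothetical non-generator object satisfying the next()/next_ex() protocol might look like this (the class name and its behaviour are my invention, not part of the proposal):

```python
# Hypothetical non-generator implementation of the proposed
# next()/next_ex() protocol: hands out an open file exactly once,
# closing it whether the block finishes normally or raises.
class FileResource(object):
    def __init__(self, filename):
        self.f = open(filename)
        self.entered = False

    def next(self):
        return self.next_ex(None)

    def next_ex(self, err):
        if err is not None or self.entered:
            self.f.close()
            if err is not None:
                raise err        # "just re-raise the exception"
            raise StopIteration  # block finished: end the loop
        self.entered = True
        return self.f            # first call: enter the block
```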

My main objection to all this is that it conflates iteration with a more general kind of execution control. I suppose iteration is itself a kind of execution control, but I contend that it's a sufficiently common case to deserve special treatment, and that names like 'for' and 'next' are only really applicable to iteration.


So, here's a counterproposal!

with expr as var:
   ... code ...

is roughly:

def _(var):
    ... code ...
__private = expr
__private(_)

(var is optional, as in the other proposals).

so one might write:

def auto_closing(f):
    def inner(block):
        try:
            block(f)
        finally:
            f.close()
    return inner

and have

with auto_closing(open("/tmp/foo")) as f:
    f.write('bob')
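Under the translation sketched above, that example desugars into plain function calls. Spelled out as a runnable sketch (using io.StringIO in place of a real file for self-containedness, and repeating the helper, named auto_closing to match the usage):

```python
import io

# The helper: wraps an open file-like object so the block runs with
# it, and it is closed afterwards no matter what.
def auto_closing(f):
    def inner(block):
        try:
            block(f)
        finally:
            f.close()
    return inner

# with auto_closing(open("/tmp/foo")) as f:
#     f.write('bob')
# desugars (roughly) to:
buf = io.StringIO()
written = []
def _block(f):
    f.write('bob')
    written.append(f.getvalue())  # capture contents before the close
auto_closing(buf)(_block)
# buf is closed by the time the call returns
```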

The translation above can only be approximate: you'd want assignments in '...code...' to affect the scope they're written in, and you might also want breaks and continues to be handled as at the end of your proposal. And grudgingly, I guess you'd need to make returns behave like that anyway.

Has something like this been argued out somewhere in this thread?

As another example, here's how you'd implement something very like a for loop:

def as_for_loop(thing):
    it = iter(thing)
    def inner(thunk):
        while 1:
            try:
                v = it.next()
            except StopIteration:
                break
            try:
                thunk(v)
            except Continue:
                continue
            except Break:
                break
    return inner

so

for x in s:

and

with as_for_loop(s) as x:

are now equivalent (I hope :).
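Filling in the pieces the sketch leaves implicit -- the Break and Continue exception classes (BreakFlow/ContinueFlow in Guido's naming) and the return inner that the translation needs -- gives a runnable version:

```python
# Runnable version of as_for_loop. Break/Continue stand in for the
# BreakFlow/ContinueFlow control-flow exceptions proposed above.
class Break(Exception): pass
class Continue(Exception): pass

def as_for_loop(thing):
    it = iter(thing)
    def inner(thunk):
        while True:
            try:
                v = next(it)
            except StopIteration:
                break
            try:
                thunk(v)         # the with-block body
            except Continue:
                continue
            except Break:
                break
    return inner

# "with as_for_loop(s) as x: if x == 3: break; out.append(x)"
# desugars to:
out = []
def _block(x):
    if x == 3:
        raise Break
    out.append(x)
as_for_loop([1, 2, 3, 4])(_block)
# out is now [1, 2]
```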

Cheers,
mwh
