On Sun, Nov 23, 2014 at 7:30 AM, Raymond Hettinger
<raymond.hettin...@gmail.com> wrote:
> Legitimate Use Cases for Raising StopIteration in a Generator
> ------------------------------------------------------------
>
> In a producer/consumer generator chain, the input generator signals
> it is done by raising StopIteration and the output generator signals
> that it is done by raising StopIteration (this isn't in dispute).
>
> That use case is currently implemented something like this:
>
>     def middleware_generator(source_generator):
>         it = source_generator()
>         input_value = next(it)
>         output_value = do_something_interesting(input_value)
>         yield output_value

Does your middleware_generator work with just a single element,
yielding either one output value or none? Or is it more likely to be
iterating over the source generator:

def middleware_generator(source_generator):
    for input_value in source_generator():
        yield do_something_interesting(input_value)

MUCH tidier code, plus it's safe against unexpected StopIterations.
Even if you can't iterate like this and really do want just a single
element, all you need is to stick a 'break' at the end of the loop,
if the try/except is so abhorrent (see the sketch below). It wouldn't
be the first time I've seen a loop with a hard break at the end of it.
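
Roughly, that could look like this (do_something_interesting being the
same placeholder as in your example):

def middleware_generator(source_generator):
    # Pull at most one value; if the source is empty, the loop body
    # never runs and the generator simply finishes.
    for input_value in source_generator():
        yield do_something_interesting(input_value)
        break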

> This doesn't make the code better in any way.  The new code
> is wordy, slow, and unnecessarily convoluted:
>
>     def middleware_generator(source_generator):
>         it = source_generator()
>         try:
>             input_value = next(it)
>         except StopIteration:
>             return             # This causes StopIteration to be reraised
>         output_value = do_something_interesting(input_value)
>         yield output_value

The raising of StopIteration here is an implementation detail; you
might just as well have a comment saying "This causes the function to
set an exception state and return NULL", which is what happens at the
C level.

What happens if do_something_interesting itself raises StopIteration?
Will you be surprised that this looks identical to the source
generator yielding nothing? That's current behaviour: the implicit
handling covers the whole generator body, so you don't have the option
of narrowing the try/except scope the way you've done above.
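
A quick sketch of that silent truncation under today's pre-PEP-479
semantics; do_something_interesting here is a made-up stand-in for any
code that accidentally lets a StopIteration escape:

def do_something_interesting(value):
    return next(iter([]))  # stand-in: leaks StopIteration

def middleware_generator(source_generator):
    it = source_generator()
    try:
        input_value = next(it)
    except StopIteration:
        return
    yield do_something_interesting(input_value)

def source():
    yield 42

print(list(middleware_generator(source)))
# Prints [] -- indistinguishable from source() having yielded nothing.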

> Is next() Surprising?
> ---------------------
>
> If someone in this thread says that they were surprised that next() could
> raise StopIteration, I don't buy it.

Agreed, I don't think that's surprising to anyone.

> Being able to consume a value from an iterator stream is a fundamental
> skill and not hard to learn (when I teach iterators and generators, the
> operation of next() has never been a stumbling block).

In anything other than a generator, you're expected to cope with two
possible results from next(): a returned value or a raised
StopIteration. Suppose you want to read a file with a header - you'd
need to do something like this:

def process_file(f):
    f = iter(f)
    try:
        header = next(f)
    except StopIteration:
        return  # empty file -- cope somehow, maybe just return
    for line in f:
        process_line_with_headers(line, header)

Currently, *if and only if* you're writing a generator, you have an
implicit "except StopIteration: return" there. Anywhere else, you need
to catch that exception.
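
For comparison, here's roughly the same thing written as a generator
(yielding the processed lines purely so that it *is* a generator;
process_line_with_headers is the same placeholder as above). Today,
the bare next() needs no protection at all:

def process_file_gen(f):
    f = iter(f)
    header = next(f)  # empty file: StopIteration just ends the generator
    for line in f:
        yield process_line_with_headers(line, header)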

Iterators are an implementation detail of generators. There is no
particular reason for a generator author to be aware of the iterator
protocol, any more than this class needs to be:

class X:
    def __iter__(self):
        return iter([1, 2, 3, 4])

It's perfectly iterable, just as a generator is, and it knows nothing
about StopIteration.
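
To spell that out, consuming the class above and consuming an
equivalent generator look identical, and neither author writes a word
about StopIteration -- the for loop (or list(), etc.) absorbs it for
them:

def gen():
    # an equivalent generator, just for illustration
    yield from [1, 2, 3, 4]

print(list(X()))    # [1, 2, 3, 4]
print(list(gen()))  # [1, 2, 3, 4]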

ChrisA