[Raymond]
> > Likewise, is it correct that "yield" is anti-parallel to the current
> > meaning?  Inside a generator, it returns control upwards to the caller.
> > But inside a block-iterator, it pushes control downwards (?inwards) to
> > the block it controls.

[Guido van Rossum]
> I have a hard time visualizing the difference.  They feel the same to
> me, and the implementation (from the generator's POV) is identical:
> yield suspends the current frame, returning to the previous frame from
> the call to next() or __next__(), and the suspended frame can be
> resumed by calling next() / __next__() again.

This concept ought to be highlighted in the PEP because it explains
clearly what "yield" does and it may help transition from a non-Dutch
mental model.  I expect that many folks (me included) think in terms of
caller vs callee with a parallel spatial concept of enclosing vs
enclosed.  In that model, the keywords "continue", "break", "yield",
and "return" all imply a control transfer from the enclosed back to
the encloser.  In contrast, the new use of yield differs in that the
suspended frame transfers control from the encloser to the enclosed.
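
To make the asymmetry concrete, here is a rough sketch of how I picture
the two directions (my own toy example, not PEP syntax: a plain for-loop
stands in for the proposed block-statement driver, and it assumes the
PEP's relaxation that lets yield appear inside try/finally):

    import threading

    def counting():
        # Ordinary generator: each yield suspends this frame and hands
        # control back *up* to the caller, along with a value.
        yield 1
        yield 2

    for value in counting():
        pass    # after each yield, the caller's frame is in control here

    def locking(lock):
        # Block-iterator style: the yield marks the spot where the
        # *enclosed* suite is meant to run, inside the try/finally.
        lock.acquire()
        try:
            yield    # control is handed "down" to the block's body
        finally:
            lock.release()

    lock = threading.Lock()
    for _ in locking(lock):
        assert lock.locked()    # the body runs while the lock is held

In the first case, the encloser (the caller) regains control at every
yield; in the second, the yield is precisely the point where the
enclosed suite gets to run.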
> > Are there some good use cases that do not involve resource locking?
> > IIRC, that same use case was listed as a prime motivating example for
> > decorators (i.e. @synchronized).  TOOWTDI suggests that a single use
> > case should not be used to justify multiple, orthogonal control
> > structures.

> Decorators don't need @synchronized as a motivating use case; there
> are plenty of other use cases.

No doubt about that.

> Anyway, @synchronized was mostly a demonstration toy; whole method
> calls are rarely the right granularity of locking.

Agreed.  Since that is the case, there should be some effort to shift
some of the examples towards real use cases where a block-iterator is
the appropriate solution.  It need not hold up releasing the PEP to
comp.lang.python, but it would go a long way towards improving the
quality of the subsequent discussion.

> (BTW in the latest version of PEP 340 I've renamed synchronized to
> locking; many people complained about the strange Javaesque term.)

That was diplomatic.  Personally, I find it amusing when there is an
early focus on naming rather than on functionality, implementation
issues, use cases, usability, and goodness-of-fit within the language.

> > It would be great if we could point to some code in the standard
> > library or in a major Python application that would be better
> > (cleaner, faster, or clearer) if re-written using blocks and
> > block-iterators

> look more closely at Queue, and you'll find that the two such methods
> use different locks!

I don't follow this one.  Tim's uses of not_empty and not_full are
orthogonal (pertaining to pending gets at one end of the queue and to
pending puts at the other end).  The other use of the mutex is
independent of either pending puts or gets; instead, it is a weak
attempt to minimize what can happen to the queue during a size query.
While the try/finallys could get factored out into separate blocks, I
do not see how the code could be considered better off.  There is a
slight worsening of all metrics of merit: line counts, total number of
function defs, number of calls, and number of steps executed outside
the lock (important given that the value of a query result declines
rapidly once the lock is released).
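
For concreteness, the size-query pattern at issue looks roughly like
this (a from-memory paraphrase, not the actual Queue source):

    import threading

    class SimpleQueue:
        def __init__(self):
            self.mutex = threading.Lock()
            self.queue = []

        def qsize(self):
            # The existing idiom: acquire, do one cheap step, release.
            self.mutex.acquire()
            try:
                return len(self.queue)
            finally:
                self.mutex.release()

A block-iterator version would replace those few lines with a
block-statement plus a separate locking() generator definition and an
extra call on every invocation, which is where the extra lines, defs,
and calls in the tally above come from.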
> Also the use case for closing a file upon leaving a block, while
> clearly a resource allocation use case, doesn't work well with a
> decorator.

Right.

> I just came across another use case that is fairly common in the
> standard library: redirecting sys.stdout.  This is just a beauty (in
> fact I'll add it to the PEP):
>
> def saving_stdout(f):
>     save_stdout = sys.stdout
>     try:
>         sys.stdout = f
>         yield
>     finally:
>         sys.stdout = save_stdout

This is the strongest example so far.  When adding it to the PEP, it
would be useful to contrast the code with simpler alternatives like
PEP 288's g.throw() or PEP 325's g.close().  On the plus side, the
block-iterator approach factors out code common to multiple callers.
On the minus side, the other PEPs involve simpler mechanisms and their
learning curve would be nearly zero.  These pluses and minuses are
important because they apply equally to all examples using blocks for
initialization/finalization.


Raymond