Re: [Python-Dev] code blocks using 'for' loops and generators
Greg Ewing <[EMAIL PROTECTED]> wrote:
> Josiah Carlson wrote:
> > Since PEP 310 was already mentioned, can we just say that the discussion
> > can be boiled down to different ways of spelling __enter__/__exit__ from
> > PEP 310?
>
> It's not quite the same thing. PEP 310 suggests a mechanism
> with a fixed control flow -- do something on entry, do the
> code block, do something on exit. A general block-passing
> mechanism would give complete control to the function as
> to when and how to call the block.

I would like to ask a question. Does Python want or need Ruby code blocks? I ask because that is what is being offered to cure what ails us. Don't get me wrong, I'm sure code blocks can solve quite a few problems (and is used as such in Ruby), but I'm not convinced that it is the solution for Python.

Any manual on Ruby will invariably discuss code blocks as one of the most powerful features Ruby has to offer. Sounds great. Sounds like a great big sledgehammer that can be used to do a bunch of things... so what is currently being proposed as a use for them, and what can't they do (that would be really nice)?

They are being offered, right now, as a setup and finalization mechanism -- essentially a way of allowing people to wrap their own blocks of code in custom try/finally code, or whatever their minds can think up. Ok, __enter__/__exit__ offers that. What else?

If you were to pass your generator as a code block, then you could finalize a generator [1], and even raise exceptions in your code block, but it still wouldn't allow one to pass exceptions into a currently running generator (a long-standing problem; if we could, then we would get generator finalization for free [2]). What else? Let's dig back into the python-dev archives...

http://mail.python.org/pipermail/python-dev/2003-February/032739.html

Guido:
> - synchronized-like, where the block should connect to its environment
>
> - property-like, where the block should introduce a new scope, and the
>   caller should be able to inspect or control that scope's namespace

The synchronized-like thing is the custom try/finally, aka __enter__/__exit__ as specified in PEP 310.

The property-like thing was perhaps to be an easier way to generate properties, which fairly quickly fell by the wayside in discussion, seemingly because people didn't see a need to add thunks for this.

Later, XML DOM parsing came into the discussion, and was quickly dismissed as not possible due to the Python parser's limited lookahead. Someone else mentioned that thunks could be used to generate a switch statement, but no elaboration was done, and no PEP was actually written (switch has also had its own PEP, and even talk of a peephole optimization for certain if/elif/else blocks to be made into dictionary lookups...).

So where has all this reading gotten me? To the point that I believe previous discussion had concluded that Ruby-style code blocks have little use in Python. *shrug*

Greg:
> However, it's possible that if generators were enhanced
> with some means of allowing yield inside try-finally,
> then generators plus PEP 310 would cover most use cases:
> for-loops and generators when you want to loop, and
> PEP 310 when you don't.
>
> So rather than coming up with new syntax at this point,
> perhaps it would be better to concentrate on the problem
> of yield inside try-finally. Even if the finally can't
> be guaranteed to run under all conditions, I think it
> would be useful if it could be arranged so that
>
>     for x in somegenerator():
>         ...
>         raise Blather
>         ...
>
> would cause any finallies that the generator was suspended
> inside to be executed. Then the semantics would be the
> same as if the for-loop-over-generator were implemented by
> passing a thunk to a function that calls it repeatedly.

PEP 288 covers this: using gen.throw() to trigger early finalization, with try/finally allowed.

 - Josiah

[1] thunk limitations with generator exceptions:

    def finalize(thunk, seq):
        #setup
        try:
            thunk(seq)
        finally:
            #finalize

    def foo(seq):
        with s=finalize(seq):
            for i in s:
                #do something with i in finalize's namespace...
                yield f(i)
                #raise an exception if desired

    for j in foo(seq):
        #do something with j
    #can't raise an exception in foo or finalize's executing code

[2] PEP 288 goodies, with try/finally allowed (with code duplication, or
    refactoring, this can be done with try/except/else):

    def gen():
        #setup
        try:
            #yields here
        finally:
            #finalization
            raise

    a = gen()
    for i in a:
        if some condition:
            #stop the generator and call cleanup code
            a.throw(StopIteration)
        #body
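For readers who haven't looked at PEP 310: the __enter__/__exit__ protocol referred to above has roughly the following shape. This is only a sketch of the idea using a lock (the PEP's own motivating example); the exact spelling and statement syntax are the PEP's, not mine.

    class locked(object):
        # Setup happens on entry, cleanup is guaranteed on exit --
        # the fixed control flow Greg describes above.  'lock' is any
        # object with acquire()/release(), e.g. threading.Lock().
        def __init__(self, lock):
            self.lock = lock
        def __enter__(self):
            self.lock.acquire()
        def __exit__(self):
            self.lock.release()

    # Under the proposed statement, roughly:
    #     with locked(some_lock):
    #         ...critical section...
    # __enter__ runs before the block and __exit__ runs afterwards,
    # even if the block raises.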
[Python-Dev] itertools.walk()
Some folks on comp.lang.python have been pushing for itertools to include a flatten() operation. Unless you guys have some thoughts on the subject, I'm inclined to accept the request.

Rather than calling it flatten(), it would be called "walk" and provide a generalized capability to descend through nested iterables (similar to what os.walk does for directories). The one wrinkle is having a stoplist argument to specify types that should be considered atomic even though they might be iterable (strings, for example).

Raymond Hettinger
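A minimal sketch of the proposed interface, just to make the stoplist idea concrete (the argument name and the recursive strategy here are assumptions for illustration, not the actual patch):

    def walk(iterable, stoplist=(basestring,)):
        # Descend through nested iterables, yielding leaves; anything
        # whose type is in 'stoplist' is treated as atomic.
        for item in iterable:
            if isinstance(item, stoplist):
                yield item
            else:
                try:
                    subiter = iter(item)
                except TypeError:
                    yield item
                else:
                    for leaf in walk(subiter, stoplist):
                        yield leaf

    # list(walk([1, [2, (3, 'abc')], 'de'])) -> [1, 2, 3, 'abc', 'de']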
Re: [Python-Dev] (no subject)
Phillip J. Eby wrote:
> At 10:36 PM 3/15/05 +1000, Nick Coghlan wrote:
> > Does deciding whether or not to supply the function really need to be
> > dependent on whether or not a format for __signature__ has been chosen?
>
> Um, no. Why would you think that?

Pronoun confusion. I interpreted an 'it' in your message as referring to update_meta, but I now realise you only meant __signature__ :)

Cheers,
Nick

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
http://boredomandlaziness.skystorm.net
Re: [Python-Dev] code blocks using 'for' loops and generators
Josiah Carlson wrote:
> Greg Ewing <[EMAIL PROTECTED]> wrote:
> > Josiah Carlson wrote:
> > > Since PEP 310 was already mentioned, can we just say that the discussion
> > > can be boiled down to different ways of spelling __enter__/__exit__ from
> > > PEP 310?
> >
> > It's not quite the same thing. PEP 310 suggests a mechanism
> > with a fixed control flow -- do something on entry, do the
> > code block, do something on exit. A general block-passing
> > mechanism would give complete control to the function as
> > to when and how to call the block.
>
> I would like to ask a question. Does Python want or need Ruby code
> blocks? I ask because that is what is being offered to cure what ails
> us. [...]
>
> So where has all this reading gotten me? To the point that I believe
> previous discussion had concluded that Ruby-style code blocks have
> little use in Python. *shrug*

well, I think some people desire a more streamlined way of writing code like:

    def f(...)
        ...
    def g(...)
        ...
    x = h(...,f,g)

[property, setting up callbacks etc. are cases of this]

where f, g etc. definitions would appear inline and the whole has a statement flavor; because this is Python a form that does not involve a lot of parentheses would be nice. Of course if the functions then are allowed to change the surrounding bindings this could be used for resource release issues etc.

Notice that decorators basically support a special case of this.

But yes, apart from the issue of rebinding (and if one wants non-local returns), this is strictly about sugar.
Re: [Python-Dev] itertools.walk()
On Mar 16, 2005, at 6:19, Raymond Hettinger wrote:
Some folks on comp.lang.python have been pushing for itertools to
include a flatten() operation. Unless you guys have some thoughts on
the subject, I'm inclined to accept the request.
Rather than calling it flatten(), it would be called "walk" and provide
a generalized capability to descend through nested iterables (similar
to
what os.walk does for directories). The one wrinkle is having a
stoplist argument to specify types that should be considered atomic
even though they might be iterable (strings, for example).
You could alternatively give them a way to supply their own "iter"
function, like the code I demonstrate below:
from itertools import chain

def nostring_iter(obj):
    if isinstance(obj, basestring):
        raise TypeError
    return iter(obj)

def uniqueiter_factory(iterfunc=nostring_iter):
    def uniqueiter(obj, uniques={}):
        iterable = iterfunc(obj)
        if id(obj) in uniques:
            raise TypeError
        uniques[id(obj)] = obj
        return iterable
    return uniqueiter

# maybe there should be a bfswalk too?
def walk(iterable, iterfunc=nostring_iter):
    iterables = iter((iterable,))
    while True:
        for obj in iterables:
            try:
                iterable = iterfunc(obj)
            except TypeError:
                yield obj
            else:
                iterables = chain(iterable, iterables)
                break
        else:
            break
>>> data = [('foo', 'bar'), 'baz', 5]
>>> list(walk(data))
['foo', 'bar', 'baz', 5]
>>> list(walk(data, uniqueiter_factory(iter)))
['f', 'o', 'o', 'b', 'a', 'r', 'b', 'a', 'z', 5]
>>> data.append(data)
>>> list(walk(data, uniqueiter_factory()))
['foo', 'bar', 'baz', 5, [('foo', 'bar'), 'baz', 5, [...]]]
-bob
Re: [Python-Dev] Rationale for sum()'s design?
Guido van Rossum wrote:
> > 2. How would the initial value that forms the basis of summation be
> > built for non-empty sequences?
>
> Here you're way off. There's never any use of "+=", so never any need
> to create a new object. The algorithm I had in mind was:
> - if empty, return 2nd arg
> - if one item, return that
> - if more than one item (A, B, C, ...) return (...((A + B) + C) + ...)
>
> > There I go again, missing the obvious and thinking things are more
> > complicated than they really are. . .
>
> But I'm not so sure now. Thinking ahead to generic types, I'd like the
> full signature to be:
>
>     def sum(seq: sequence[T], initial: T = 0) -> T.
>
> and that's exactly what it is today. Conclusion: sum() is perfect after all!

So the official verdict is "sum() is mainly intended for numbers, but can be used with other types by supplying a default argument"?

I guess that leaves Alex's question of whether or not supplying a string of some description as the initial value can be legitimately translated to:

    if isinstance(initial, basestring):
        return initial + type(initial)().join(seq)

rather than raising the current TypeError that suggests the programmer may want to rewrite their code.

Cheers,
Nick.

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
http://boredomandlaziness.skystorm.net
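For concreteness, the translation Nick is asking about would behave like this; note that this is the *proposed* behaviour -- today's builtin simply raises TypeError for any string start value:

    def sum_with_string_start(seq, initial):
        # Hypothetical special case under discussion, not the real builtin.
        if isinstance(initial, basestring):
            return initial + type(initial)().join(seq)
        raise TypeError("only the string branch is sketched here")

    # sum_with_string_start(['a', 'b', 'c'], 'x') -> 'xabc'
    # sum(['a', 'b', 'c'], 'x')                   -> TypeError today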
Re: [Python-Dev] itertools.walk()
Bob Ippolito wrote:
> On Mar 16, 2005, at 6:19, Raymond Hettinger wrote:
> > Some folks on comp.lang.python have been pushing for itertools to
> > include a flatten() operation. Unless you guys have some thoughts on
> > the subject, I'm inclined to accept the request.
> >
> > Rather than calling it flatten(), it would be called "walk" and provide
> > a generalized capability to descend through nested iterables (similar
> > to what os.walk does for directories). The one wrinkle is having a
> > stoplist argument to specify types that should be considered atomic
> > even though they might be iterable (strings, for example).
>
> You could alternatively give them a way to supply their own "iter"
> function, like the code I demonstrate below:

I think the extra flexibility ends up making the function harder to comprehend and use. Here's a version with a simple stoplist:

    from itertools import chain

    def walk(iterable, atomic_types=(basestring,)):
        itr = iter(iterable)
        while True:
            for item in itr:
                if isinstance(item, atomic_types):
                    yield item
                    continue
                try:
                    subitr = iter(item)
                except TypeError:
                    yield item
                else:
                    itr = chain(walk(subitr), itr)
                    break
            else:
                break

This makes it easy to reclassify certain things like dictionaries or tuples as atomic elements.

> # maybe there should be a bfswalk too?

By putting the 'walk(subitr)' after the current itr when chaining?

If Raymond does decide to go for the flexible approach rather than the simple one, then I'd vote for a full-featured approach like:

    def walk(iterable, depth_first=True,
             atomic_types=(basestring,), iter_factory=iter):
        itr = iter(iterable)
        while True:
            for item in itr:
                if isinstance(item, atomic_types):
                    yield item
                    continue
                try:
                    subitr = iter_factory(item)
                except TypeError:
                    yield item
                else:
                    if depth_first:
                        itr = chain(walk(subitr), itr)
                    else:
                        itr = chain(itr, walk(subitr))
                    break
            else:
                break

Cheers,
Nick.

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
http://boredomandlaziness.skystorm.net
[Python-Dev] BRANCH FREEZE for 2.4.1rc2 at 2005-03-18 0000 UTC
The release24-maint branch should be considered FROZEN as at UTC on 2005-03-18 - in other words, in about 11 hours time. Allegedly this is around 1900 on the 17th for the US East Coast. I'll post again once it's unfrozen.

From here, we'll be aiming at a 2.4.1 final for the 29th - straight after PyCon. I'll post again when the branch is available for checkins.

And I'll repeat my request for people to be ultra-conservative with checkins to the branch until 2.4.1 final is out. If in doubt, ping me first. Those contact details again:

AIM: anthonyatekit
Jabber: [EMAIL PROTECTED]
IRC: #python-dev on Freenode.

Thanks,
Anthony

--
Anthony Baxter <[EMAIL PROTECTED]>
It's never too late to have a happy childhood.
Re: [Python-Dev] BRANCH FREEZE for 2.4.1rc2 at 2005-03-**17** 0000 UTC
On Thursday 17 March 2005 00:28, Anthony Baxter wrote:
> The release24-maint branch should be considered FROZEN as at UTC
> on 2005-03-18

That should of course be 2005-03-17.

--
Anthony Baxter <[EMAIL PROTECTED]>
It's never too late to have a happy childhood.
Re: [Python-Dev] itertools.walk()
On Mar 16, 2005, at 8:37 AM, Nick Coghlan wrote:
> Bob Ippolito wrote:
> > On Mar 16, 2005, at 6:19, Raymond Hettinger wrote:
> > > Some folks on comp.lang.python have been pushing for itertools to
> > > include a flatten() operation. [...]
> >
> > You could alternatively give them a way to supply their own "iter"
> > function, like the code I demonstrate below:
>
> I think the extra flexibility ends up making the function harder to
> comprehend and use. Here's a version with a simple stoplist:
> ...
> This makes it easy to reclassify certain things like dictionaries or
> tuples as atomic elements.
>
> > # maybe there should be a bfswalk too?
>
> By putting the 'walk(subitr)' after the current itr when chaining?

Yeah

> If Raymond does decide to go for the flexible approach rather than the
> simple one, then I'd vote for a full-featured approach like:

I don't mind that at all. It's certainly convenient to have an easy stoplist. The problem with the way you have implemented it is that basestring will cause infinite recursion if you use the built-in iter, so if you provide your own atomic_types, you damn well better remember to add in basestring.

>     def walk(iterable, depth_first=True,
>              atomic_types=(basestring,), iter_factory=iter):
>         itr = iter(iterable)
>         while True:
>             for item in itr:
>                 if isinstance(item, atomic_types):
>                     yield item
>                     continue
>                 try:
>                     subitr = iter_factory(item)
>                 except TypeError:
>                     yield item
>                 else:
>                     if depth_first:
>                         itr = chain(walk(subitr), itr)
>                     else:
>                         itr = chain(itr, walk(subitr))
>                     break
>             else:
>                 break

I'm not sure why it's useful to explode the stack with all that recursion? Mine didn't do that. The control flow is nearly identical, but it looks more fragile (and you would get some really evil stack trace if iter_factory(foo) happened to raise something other than TypeError).

-bob
Re: [Python-Dev] Rationale for sum()'s design?
On Tue, 15 Mar 2005 07:47:20 -0800, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> But I'm not so sure now. Thinking ahead to generic types, I'd like the
> full signature to be:
>
>     def sum(seq: sequence[T], initial: T = 0) -> T.

Would this _syntax_ work with generic types:

    def sum(seq: sequence[T], initial: T = T()) -> T.

Cheers,
Michael
RE: [Python-Dev] Lambda & deferred evaluation (was: Adding any() and all())
> >>Lambda will be more difficult. Eric Raymond adapted an anti-gun control
> >>slogan and said "you can pry lambda out of my cold dead hands." A bunch
> >>of folks will sorely miss the ability to create anonymous functions on
> >>the fly. When lambda is used for deferred argument evaluation (a la PEP
> >>312), the def syntax is a crummy substitute.
> > Yeah, I'm with you here. As warty as lambda is, it just is so damn
> > convenient some times. I've recently been using it as a companion to
> > property(), providing concise definitions of read-only attributes.
I'm with you on lambda: I've found a few places where losing it would be a
major inconvenience.
One example is in a library I wrote to implement the Snobol4 algorithm for
recursive-descent pattern matching. Using that algorithm relies heavily on
the ability to specify subexpressions for "deferred evaluation" -- their
values must not be computed until they are actually needed. For example,
here is a mini-specification for arithmetic expressions in Snobol4:
id = any(letters) span(letters digits)
primary = id | '(' *expr ')'
factor = primary arbno(any("*/") primary)
expr = factor arbno(any("+-") factor)
This should be fairly straightforward once you realize that "arbno" is like
a regular-expression *, blank is used for concatenation, and | (meaning
"or") binds less tightly than blank.
The whole definition depends on *expr, which is a request for deferred
evaluation of expr. In other words, it refers to the value of expr when
encountered during matching, rather than the (null) value it has when the
assignment to primary is evaluated.
Through appropriate use of overloading, I have been able to translate this
code into the following Python:
id = any(letters) + span(letters + digits)
primary = id | '(' + defer(lambda: expr) + ')'
factor = primary + arbno(any("*/") + primary)
expr = factor + arbno(any("+-") + factor)
I do not want to have to rewrite this code as:
def defer_expr():
    return expr
id = any(letters) + span(letters + digits)
primary = id | '(' + defer(defer_expr) + ')'
factor = primary + arbno(any("*/") + primary)
expr = factor + arbno(any("+-") + factor)
because I do not want to have to make up a new name, and because in
practice, I've seen patterns in which there are many deferred evaluations,
not just one as in this example. In other words, I want deferred evaluation
to remain a conceptually inexpensive operation, and lambda is the only way
of doing this.
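For readers who don't know the library being described: the defer() used above could plausibly be as small as the following (the name comes from the example; the implementation is my guess, purely for illustration):

    class defer(object):
        # Wrap a zero-argument callable so that evaluation happens only
        # when the pattern matcher asks for the value during matching.
        def __init__(self, thunk):
            self.thunk = thunk
        def value(self):
            # called at match time, not at pattern-definition time
            return self.thunk()

    # defer(lambda: expr) delays looking up 'expr' until matching, so
    # 'expr' may still be unbound (or rebound) when 'primary' is built.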
Re: [Python-Dev] Rationale for sum()'s design?
> I guess that leaves Alex's question of whether or not supplying a string
> of some description as the initial value can be legitimately translated to:
>
>     if isinstance(initial, basestring):
>         return initial + type(initial)().join(seq)

If you're trying to get people in the habit of writing sum(x, "") instead of "".join(x), I fear that they'll try sum(x, " ") instead of " ".join(x), and be sadly disappointed.

Frankly, I've had enough of this exploration of alternative definitions for sum(), and think it is perfect as it is.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] Rationale for sum()'s design?
> > Thinking ahead to generic types, I'd like the full signature to be:
> >
> >     def sum(seq: sequence[T], initial: T = 0) -> T.
>
> Would this _syntax_ work with generic types:
>
>     def sum(seq: sequence[T], initial: T = T()) -> T.

Maybe, but it would preclude union types; continuing with the (bad) example of strings, what should one choose for T when seq == ['a', u'b']? The general case is a sequence of objects of different types that are mutually addable. This can be made to work with the (hypothetical) type system by using unions, but you can't instantiate an instance of a union without being more specific.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] Rationale for sum()'s design?
Michael Walter wrote:
> On Tue, 15 Mar 2005 07:47:20 -0800, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> > But I'm not so sure now. Thinking ahead to generic types, I'd like the
> > full signature to be:
> >
> >     def sum(seq: sequence[T], initial: T = 0) -> T.
>
> Would this _syntax_ work with generic types:
>
>     def sum(seq: sequence[T], initial: T = T()) -> T.

This doesn't make sense with existing semantics because default arguments are evaluated when the function is defined, but T() can't be evaluated until the function is called. I'm not sure there's a way around that problem without turning default arguments into a trap for the unwary.

jw
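A quick illustration of that evaluation-time point in present-day Python (no generic-type syntax involved):

    def make_default():
        print "building default"      # runs once, at 'def' time
        return []

    def f(x=make_default()):           # "building default" printed here
        return x

    # Every call f() returns the very same list object; the default was
    # built exactly once, when f was defined -- which is why a default
    # like 'initial: T = T()' cannot wait until the call to discover T.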
Re: [Python-Dev] code blocks using 'for' loops and generators
Samuele Pedroni <[EMAIL PROTECTED]> wrote:
> Josiah Carlson wrote:
> > Greg Ewing <[EMAIL PROTECTED]> wrote:
> >
> >>Josiah Carlson wrote:
> >>
> >>
> >>>Since PEP 310 was already mentioned, can we just say that the discussion
> >>>can be boiled down to different ways of spelling __enter__/__exit__ from
> >>>PEP 310?
> >>
> >>It's not quite the same thing. PEP 310 suggests a mechanism
> >>with a fixed control flow -- do something on entry, do the
> >>code block, do something on exit. A general block-passing
> >>mechanism would give complete control to the function as
> >>to when and how to call the block.
> >
> >
> > I would like to ask a question. Does Python want or need Ruby code
> > blocks? I ask because that is what is being offered to cure what ails
> > us. Don't get me wrong, I'm sure code blocks can solve quite a few
> > problems (and is used as such in Ruby), but I'm not convinced that it is
> > the solution for Python.
> >
> > Any manual on Ruby will invariably discuss code blocks as one of the
> > most powerful features Ruby has to offer. Sounds great. Sounds like a
> > great big sledgehammer that can be used to do a bunch of things...so
> > what is currently being proposed as a use for them, and what can't they
> > do (that would be really nice)?
[snip myself]
> > The synchronized-like thing is the custom try/finally, aka
> > __enter__/__exit__ as specified in PEP 310.
> >
> > The property-like thing was perhaps to be an easier way to generate
> > properties, which fairly quickly fell to the wayside in discussion,
> > seemingly because people didn't see a need to add thunks for this.
[snip myself]
Samuele:
> well, I think some people desire a more streamlined way of writing code
> like:
>
> def f(...)
>     ...
> def g(...)
>     ...
> x = h(...,f,g)
>
> [property, setting up callbacks etc are cases of this]
I think properties are the most used case where this kind of thing would
be nice. Though the only gripe I've ever had with properties is the
trailing property() call - which is why I wrote a property helper
decorator (a use can be seen in [1]). But my needs are small, so maybe
this kind of thing isn't sufficient for those who write hundreds of
properties.
> where f, g etc. definitions would appear inline and the whole has a
> statement flavor; because this is Python a form that does not involve a
> lot of parentheses would be nice. Of course if the functions then are
> allowed to change the surrounding bindings this could be used for
> resource release issues etc.
Resource release, right - that has already been discussed and is already
a known use case. Now, people keep saying that using code blocks can make
things like properties easier to construct. Ok, I'll bite, someone
write properties using code blocks. I tried to, but I prefer original
properties to what I wrote. Maybe someone who is a thunk advocate can
write something better. Now's your chance, show the world how thunks
make properties prettier and easier to understand. Feel free to use
whatever syntax is your favorite (if it may be ambiguous to third party,
document it). If someone can make a thunk that looks better than my
example [1] below, then I will agree that helping properties is a
legitimate use case (I have had serious doubts from the beginning), but
if not, then code blocks as a solution to the 'property problem' should
be taken off the table.
> Notice that decorators basically support a special case of this.
>
> But yes, apart from the issue of rebinding (and if one wants non-local
> returns), this is strictly about sugar.
But the sugar isn't all that sweet. It solves the problem of resource
acquisition and release (also solved by PEP 310), and may help with
property-like things (helped by a decorator). Any other use cases for
one of the most powerful features of Ruby, in Python?
- Josiah
[1] example use of my property helper decorator
class foo(object):
    prop = property_maker("put documentation here")

    @prop  # getter
    def a(self):
        return 1

    @prop  # setter
    def a(self, v):
        pass

    @prop  # deleter
    def a(self):
        pass
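property_maker itself isn't shown in the message. One way such a helper *could* be written to match the usage above (my reconstruction for illustration, not Josiah's actual code):

    def property_maker(doc=None):
        # Each use of the returned decorator registers the next accessor
        # in getter/setter/deleter order and rebinds the attribute to the
        # property built so far, so the final 'a' is the full property.
        accessors = []
        def prop(func):
            accessors.append(func)
            fget, fset, fdel = (accessors + [None, None, None])[:3]
            return property(fget, fset, fdel, doc)
        return prop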
Re: [Python-Dev] rationale for the no-new-features approach
On Fri, Mar 11, 2005 at 06:47:11PM -0500, Bob Ippolito wrote:
>
> On Mar 11, 2005, at 2:26 PM, Skip Montanaro wrote:
>
> >
> >Bob> try:
> >Bob>     set
> >Bob> except NameError:
> >Bob>     from sets import Set as set
> >
> >Bob> You don't need the rest.
> >
> >Sure, but then pychecker bitches about a statement that appears to
> >have no
> >effect. ;-)
>
> Well then fix PyChecker to look for this pattern :)
>
> -bob
or make it even uglier to hide from pychecker by writing that as:
exec("""
try:
set
except NameError:
from sets import Set as set
""")
RE: [Python-Dev] rationale for the no-new-features approach
[Bob Ippolito]
try:
    set
except NameError:
    from sets import Set as set

You don't need the rest.
[Skip Montanaro]
>>> Sure, but then pychecker bitches about a statement that appears to
>>> have no effect. ;-)
[Bob Ippolito]
>> Well then fix PyChecker to look for this pattern :)
+1.
[Gregory P. Smith]
> or make it even uglier to hide from pychecker by writing that as:
>
> exec("""
> try:
> set
> except NameError:
> from sets import Set as set
> """)
I presume that was somewhat tongue-in-cheek, but if it wasn't, please
reconsider. Modulefinder isn't able to realise that set (or sets.Set) is
needed with the latter (a problem of this very nature was just fixed with
bsddb), which causes trouble for people later on.
=Tony.Meyer
Re: [Python-Dev] Problems with definition of _POSIX_C_SOURCE
[Jack Jansen]
> On a platform I won't mention here I'm running into problems compiling
> Python, because of it defining _POSIX_C_SOURCE.
> ...
> Does anyone know what the real meaning of this define is?

LOL. Here's the Official Story:

    http://www.opengroup.org/onlinepubs/009695399/functions/xsh_chap02_02.html

Look near the top, under "The _POSIX_C_SOURCE Feature Test Macro". This will tell you:

    When an application includes a header described by IEEE Std 1003.1-2001,
    and when this feature test macro is defined to have the value 200112L:

    yadda yadda yadda yadda yadda yadda yadda yadda

Then again, every journey of a million miles begins with 200112L small steps ...
Re: [Python-Dev] rationale for the no-new-features approach
> [Gregory P. Smith]
> > or make it even uglier to hide from pychecker by writing that as:
> >
> > exec("""
> > try:
> > set
> > except NameError:
> > from sets import Set as set
> > """)
>
> I presume that was somewhat tongue-in-cheek, but if it wasn't, please
> reconsider. Modulefinder isn't able to realise that set (or sets.Set) is
> needed with the latter (a problem of this very nature was just fixed with
> bsddb), which causes trouble for people later on.
>
> =Tony.Meyer
hehe yes sorry, i left off the :P
[Python-Dev] Problems with definition of _POSIX_C_SOURCE
On a platform I won't mention here I'm running into problems compiling Python, because of it defining _POSIX_C_SOURCE. It turns out that on this platform the definition causes all sorts of declarations in sys/types.h to be skipped (presumably because they're not official Posix identifiers), which in turn causes platform-specific headers to fail.

The comment in pyconfig.h suggests that defining _POSIX_C_SOURCE may enable certain features, but the actual system headers appear to work the other way around: it seems that defining this will disable features that are not strict Posix.

Does anyone know what the real meaning of this define is? Because if it is the former then Python is right, but if it is the latter Python really has no business defining it: in general Python isn't 100% posix-compliant because it'll use all sorts of platform-dependent (and, thus, potentially non-posix-compliant) code...

This problem is currently stopping Python 2.4.1 from compiling on this platform, so if anyone can provide any insight that would be very helpful...

--
Jack Jansen, <[EMAIL PROTECTED]>, http://www.cwi.nl/~jack
If I can't dance I don't want to be part of your revolution -- Emma Goldman
[Python-Dev] thunks (was: code blocks using 'for' loops and generators)
Jim Jewett wrote:
It may be time to PEP (or re-PEP), if only to clarify what people are
actually asking for.
I will PEPify this, unless someone does not think I am the correct person
to do so. The PEP is probably a better place to try to address questions
you raise, as well as give the rationale that Josiah Carlson was looking
for.
But, in short:
Brian Sabbey's example from message
http://mail.python.org/pipermail/python-dev/2005-March/052202.html
*seems* reasonably clear, but I don't see how it relates in any way to
"for" loops or generators, except as one (but not the only) use case.
The original post in this thread was an idea about using 'for' loops and
generators, but that idea has since been replaced with something else.
(1) Calls for "Ruby blocks" or "thunks" are basically calls for
placeholders in a function. These placeholders will be filled
with code from someplace else, but will execute in the function's
own local namespace.
It wasn't my intention that the thunk would execute in the function's
namespace ("function" here is to mean the function that takes the thunk as
an argument). I was thinking that scope rules for the thunk would mimic
the rules for control flow structures.
-Brian
Re: [Python-Dev] code blocks using 'for' loops and generators
Samuele Pedroni wrote:
> well, I think some people desire a more streamlined way of writing code
> like:
>
>     def f(...)
>         ...
>     def g(...)
>         ...
>     x = h(...,f,g)

Using the recently-proposed 'where' facility, this could be written

    x = h(..., f, g) where:
        def f(...):
            ...
        def g(...):
            ...

> Of course if the functions then are allowed to change the surrounding
> bindings this could be used for resource release issues etc.

Yes, rebinding in the surrounding scope is the one thing that style wouldn't give you.

--
Greg Ewing, Computer Science Dept, University of Canterbury,
Christchurch, New Zealand          [EMAIL PROTECTED]
A citizen of NewZealandCorp, a wholly-owned subsidiary of USA Inc.
Re: [Python-Dev] Rationale for sum()'s design?
Tim Peters wrote:
> I can't say it bothers me to specify an appropriate identity element
> when 0 is inappropriate.

Maybe instead of a single sum() function, each summable type should have a sum() static method which uses an identity appropriate to that type. So to sum a list of integers you would use int.sum(x), to sum floats you would use float.sum(x), and to sum timedeltas you would use timedelta.sum(x).

This would also neatly solve the problem of people trying to sum strings, because there would be no str.sum() method. Or alternatively, there could be a str.sum(x) which was equivalent to "".join(x). Although it might be better to call it str.concat() instead of str.sum().

--
Greg Ewing, Computer Science Dept, University of Canterbury,
Christchurch, New Zealand          [EMAIL PROTECTED]
A citizen of NewZealandCorp, a wholly-owned subsidiary of USA Inc.
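No built-in type grows such a method today, but a toy illustration of the "each type knows its own identity element" idea might look like this (hypothetical class, purely for illustration):

    class Money(object):
        # hypothetical summable type with its own zero
        def __init__(self, cents=0):
            self.cents = cents
        def __add__(self, other):
            return Money(self.cents + other.cents)
        @staticmethod
        def sum(items):
            total = Money(0)           # identity appropriate to Money
            for item in items:
                total = total + item
            return total

    # Money.sum([]).cents == 0
    # Money.sum([Money(5), Money(7)]).cents == 12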
Re: [Python-Dev] Rationale for sum()'s design?
John Williams wrote:
> Michael Walter wrote:
> > Would this _syntax_ work with generic types:
> >
> >     def sum(seq: sequence[T], initial: T = T()) -> T.
>
> This doesn't make sense with existing semantics because default arguments
> are evaluated when the function is defined, but T() can't be evaluated
> until the function is called.

Not to mention that if the seq is empty, there's no way of knowing what T to instantiate...

--
Greg Ewing, Computer Science Dept, University of Canterbury,
Christchurch, New Zealand          [EMAIL PROTECTED]
A citizen of NewZealandCorp, a wholly-owned subsidiary of USA Inc.
Re: [Python-Dev] Rationale for sum()'s design?
On Wed, 16 Mar 2005 08:28:22 -0800, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> > > Thinking ahead to generic types, I'd like the full signature to be:
> > >
> > >     def sum(seq: sequence[T], initial: T = 0) -> T.
> >
> > Would this _syntax_ work with generic types:
> >
> >     def sum(seq: sequence[T], initial: T = T()) -> T.
>
> Maybe, but it would preclude union types; continuing with the (bad)
> example of strings, what should one choose for T when seq == ['a', u'b']?
> The general case is a sequence of objects of different types that are
> mutually addable. This can be made to work with the (hypothetical) type
> system by using unions, but you can't instantiate an instance of a union
> without being more specific.

Continuing that hypothetical thought, it would be perfectly acceptable to require an argument for union types T. Maybe T() should only be valid for non-union types. Several questions like "when should T() be evaluated" [1], "how can we avoid ': T = T()' leading to a type error" and "how about optional parameters in front of ': T = T()'" just popped up in my mind.

Michael

[1] Thanks, John!
[Python-Dev] Rationale for sum()'s design?
On Thu, 17 Mar 2005 14:34:23 +1300, Greg Ewing <[EMAIL PROTECTED]> wrote:
> Not to mention that if the seq is empty, there's no
> way of knowing what T to instantiate...

You just use the same T as inferred for seq: sequence[T].

Michael
Re: [Python-Dev] RE: code blocks using 'for' loops and generators
Jim Jewett wrote:
> (2) A function as a parameter isn't good enough, because the passed-in
> function can't see bindings in the surrounding larger function. (It
> still sees the lexical scope in which it was defined.)

That sounds confused, because the lexical scope in which it was defined is exactly what it *should* see.

> (4) A thunk could be used today by creating a string (rather than a
> pre-compiled function) and substituting in the thunk's string

Again, you seem to be under a misapprehension about how code blocks should work. They should be lexically scoped, not dynamically scoped.

> (7) A __leave__ or __exit__ special method really turns into another
> name for __del__.

Not really. A PEP-310-style __exit__ method is explicitly invoked at well-defined times, not left until the object is reclaimed. It doesn't suffer from any of the problems of __del__.

--
Greg Ewing, Computer Science Dept, University of Canterbury,
Christchurch, New Zealand          [EMAIL PROTECTED]
A citizen of NewZealandCorp, a wholly-owned subsidiary of USA Inc.
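The lexical-scoping point is easy to demonstrate with ordinary present-day functions:

    def caller(callback):
        x = "caller's x"               # invisible to the callback
        return callback()

    def make_callback():
        x = "definition-site x"
        def callback():
            return x                   # sees the scope where it was defined
        return callback

    # caller(make_callback()) -> "definition-site x"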
