Re: [Python-Dev] problem with genexp
On 10/16/05, Neal Norwitz <[EMAIL PROTECTED]> wrote:
> On 10/10/05, Neal Norwitz <[EMAIL PROTECTED]> wrote:
> > There's a problem with genexps that I think really needs to get
> > fixed.  See http://python.org/sf/1167751 the details are below.  This code:
> >
> >   >>> foo(a = i for i in range(10))
> >
> > I agree with the bug report that the code should either raise a
> > SyntaxError or do the right thing.
>
> The change to Grammar/Grammar below seems to fix the problem and all
> the tests pass.  Can anyone comment on whether this fix is
> correct/appropriate?  Is there a better way to fix the problem?

Since no one responded other than Jiwon, I checked in this change.  I
did *not* backport it since what was syntactically correct in 2.4.2
would raise an error in 2.4.3.  I'm not sure which is worse.  I'll
leave it up to Anthony whether this should be backported.

BTW, the change was the same regardless of old code vs. new AST code.

n
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
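For reference, CPython did settle on the SyntaxError route: an unparenthesized generator expression can no longer be a keyword-argument value. A quick sketch of the two spellings (`foo` is a hypothetical stand-in for the function in the bug report):

```python
def foo(a):
    # hypothetical stand-in for the function in the bug report
    return list(a)

# The ambiguous form from http://python.org/sf/1167751 is rejected at
# compile time in modern CPython:
try:
    compile("foo(a = i for i in range(10))", "<example>", "eval")
    is_syntax_error = False
except SyntaxError:
    is_syntax_error = True

# Parenthesizing the generator expression makes the intent explicit:
result = foo(a=(i for i in range(10)))
```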
Re: [Python-Dev] Pre-PEP: Task-local variables
At 07:57 PM 10/20/2005 -0700, Guido van Rossum wrote:
>(Sorry for the long post -- there just wasn't anything you said that I
>felt could be left unquoted. :-)

Wow.  You've brought up an awful lot of stuff I want to respond to,
about the nature of frameworks, AOP, Chandler, PEP 342, software
deployment, etc.  But I know you're busy, and the draft I was working
on in reply to this has gotten simply huge and still unfinished, so I
think I should just turn it all into a blog article on "Why Frameworks
Are Evil And What We Can Do To Stop Them".  :)

I don't think I've exaggerated anything, though.  I think maybe you're
perceiving more vehemence than I actually have on the issue.  Context
variables are a very small thing and I've not been arguing that
they're a big one.  In the scope of the coming Global War On
Frameworks, they are pretty small potatoes.  :)
[Python-Dev] Questionable AST wibbles
Jeremy,

There are a bunch of mods from the AST branch that got integrated into
head.  Hopefully, by doing this on python-dev more people will get
involved.  I'll describe high-level things first, but there will be a
ton of details later on.  If people don't want to see this crap on
python-dev, I can take this offline.

High-level overview of code size (rough decrease of 300 C source lines):

 * Python/compile.c -2733 (was 6822, now 4089)
 * Python/Python-ast.c +2281 (new file)
 * Python/asdl.c +92 (new file)
 * plus other minor mods

I was very glad to see that ./python compileall.py Lib took virtually
the same time before and after AST.  Yeah!  Unfortunately, I can't say
the same for memory usage for running compileall:

 Before AST: [10120 refs]
 After AST:  [916096 refs]

I believe there aren't that many true memory leaks from running
valgrind.  Though there are likely some ref leaks.  Most of this is
probably stuff that we are just hanging on to that is not required.  I
will continue to run valgrind to find more problems.

A bunch of APIs changed and there is some additional name pollution.
Since these are pretty internal APIs, I'm not sure that part is a big
deal.  I will try to find more name pollution and eliminate it by
prefixing with Py.

One API change which I think was a mistake was _Py_Mangle() losing 2
parameters (I think this was how it was a long time ago).  See
typeobject.c, Python.h, compile.c.

pythonrun.h has a bunch of changes.  I think a lot of the APIs
changed, but there might be backwards-compatible macros.  I'm not
sure.  I need to review closely.

symtable.h has lots of changes to structs and APIs.  Not sure what
needs to be doc'ed.  Some #defines are history (I think they are in
the enum now): TYPE_*.

code.h was added, but it mostly contains stuff from compile.h.  Should
we remove code.h and just put everything in compile.h?  This would
remove lots of little changes.
code.h & compile.h are tightly coupled.  If we keep them separate, I
would like to see some other changes.  This probably is not a big
deal, but I was surprised by this change:

 +++ test_repr.py  20 Oct 2005 19:59:24 -  1.20
 @@ -123,7 +123,7 @@
      def test_lambda(self):
          self.failUnless(repr(lambda x: x).startswith(
 -"
Re: [Python-Dev] Coroutines, generators, function calling
> so the new syntax would not be useful, unless it was something that
> provided access to the index item as a variable, like:
>
>     yield foo(i) for i in x
>
> which barely saves you anything (a colon, a newline, and an indent).

Not even that, because you can omit the newline and indent:

    for i in x: yield foo(i)

There's a bigger difference between

    for i in x:
        yield i

and

    yield from x

Moreover, I can imagine optimization opportunities for "yield from"
that would not make sense in the context of comprehensions.
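The "yield from" spelling discussed here later landed as PEP 380 (Python 3.3); for plain iteration the two forms are equivalent, a minimal sketch:

```python
def gen_explicit(x):
    # spelled out with a loop
    for i in x:
        yield i

def gen_delegating(x):
    # delegation via "yield from" (PEP 380, Python 3.3+); beyond simple
    # iteration it also forwards send()/throw() to the subgenerator
    yield from x

assert list(gen_explicit(range(3))) == [0, 1, 2]
assert list(gen_delegating(range(3))) == [0, 1, 2]
```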
Re: [Python-Dev] AST branch is in?
On 10/20/05, Anthony Baxter <[EMAIL PROTECTED]> wrote:
> So it looks like the AST branch has landed. Wooo! Well done to all who
> were involved - it seems like it's been a huge amount of work.

Hear, hear.  Great news!  Thanks to Jeremy, Neil and all the others.
I can't wait to check it out!

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] Pre-PEP: Task-local variables
On 10/20/05, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> At 08:57 AM 10/20/2005 -0700, Guido van Rossum wrote:
> >Whoa, folks!  Can I ask the gentlemen to curb their enthusiasm?
> >
> >PEP 343 is still (back) on the drawing table, PEP 342 has barely been
> >implemented (did it survive the AST-branch merge?), and already you
> >are talking about adding more stuff.  Please put on the brakes!
>
> Sorry.  I thought that 343 was just getting a minor tune-up.

Maybe, but the issues on the table are naming issues -- is __with__
the right name, or should it be __context__?  Should the decorator be
applied implicitly?  Should the decorator be called @context or
@contextmanager?

> In the months since the discussion and approval (and implementation;
> Michael Hudson actually had a PEP 343 patch out there),

Which he described previously as "a hack" and apparently didn't feel
comfortable checking in.  At least some of it will have to be redone,
(a) for the AST code, and (b) for the revised PEP.

> I've been doing a lot of thinking about how they will be used in
> applications, and thought that it would be a good idea to promote
> people using task-specific variables in place of globals or
> thread-locals.

That's clear, yes. :-)  I still find it unlikely that a lot of people
will be using trampoline frameworks.  You and Twisted, that's all I
expect.

> The conventional wisdom is that global variables are bad, but the
> truth is that they're very attractive because they allow you to have
> one less thing to pass around and think about in every line of code.

Which doesn't make them less bad -- they're still there and perhaps
more likely to trip you up when you least expect it.  I think there's
a lot of truth in that conventional wisdom.
> Without globals, you would sooner or later end up with every function
> taking twenty arguments to pass through states down to other code, or
> else trying to cram all this data into some kind of "context" object,
> which then won't work with code that doesn't know about *your*
> definition of what a context is.

Methinks you are exaggerating for effect.

> Globals are thus extremely attractive for practical software
> development.  If they weren't so useful, it wouldn't be necessary to
> warn people not to use them, after all. :)
>
> The problem with globals, however, is that sometimes they need to be
> changed in a particular context.  PEP 343 makes it safer to use
> globals because you can always offer a context manager that changes
> them temporarily, without having to hand-write a try-finally block.
> This will make it even *more* attractive to use globals, which is not
> a problem as long as the code has no multitasking of any sort.

Hm.  There are different kinds of globals.  Most globals don't need to
be context-managed at all, because they can safely be shared between
threads, tasks or coroutines.  Caches usually fall in this category
(e.g. the compiled regex cache).  A little locking is all it takes.

The globals that need to be context-managed are the pernicious kind of
which you can never have too few. :-)  They aren't just accumulating
global state, they are implicit parameters, thereby truly invoking the
reasons why globals are frowned upon.

> Of course, the multithreading scenario is usually fixed by using
> thread-locals.  All I'm proposing is that we replace thread locals
> with task locals, and promote the use of task-local variables for
> managed contexts (such as the decimal context) *that would otherwise
> be a global or a thread-local variable*.  This doesn't seem to me
> like a very big deal; just an encouragement for people to make their
> stuff easy to use with PEP 342 and 343.
I'm all for encouraging people to make their stuff easy to use with
these PEPs, and with multi-threading use.  But IMO the best way to
accomplish those goals is to refrain from global (or thread-local or
task-local) context as much as possible, for example by passing along
explicit context.  The mere existence of a standard library module to
make handling task-specific contexts easier sends the wrong signal; it
suggests that it's a good pattern to use, which it isn't -- it's a
last-resort pattern, when all other solutions fail.

If it weren't for Python's operator overloading, the decimal module
would have used explicit contexts (like the Java version); but since
it would be really strange to have such a fundamental numeric type
without the ability to use the conventional operator notation, we
resorted to per-thread context.  Even that doesn't always do the right
thing -- handling decimal contexts is surprisingly subtle (as Nick can
testify based on his experiences attempting to write a decimal context
manager for the with-statement!).  Yes, coroutines make it even
subtler.  But I haven't seen the use case yet for mixing coroutines
with changes to decimal context settings; somehow it doesn't strike me
as a likely use case (not that you can't construct one, so don't
bother -- I can imagine i
[Python-Dev] A solution to the evils of static typing and interfaces?
Hi,

I was thinking why not have a separate file for all the proposed
optional meta-information (in particular interfaces, static types)?
Something along the lines of IDLs in CORBA (with pythonic syntax, of
course).  This way most of the benefits are retained without
"contaminating" the actual syntax (dare I be so pretentious to even
hope making both sides happy?).

For the sole purpose of illustration, let meta-files have extension
.pym and linking to source-files be name based:

 parrot.py
 parrot.pym
 (parrot.pyc)

With some utilities like a prototype generator (to and from
meta-files) and a synchronization tool, the time penalty on
development for having two separate files could be kept within reason.

We could even go as far as introducing a syntax allowing custom
meta-information to be added.  For example something akin to
decorators.

parrot.pym:

 @sharedinstance
 class Parrot:
     # Methods
     # note these are only prototypes, so no semicolon or suite is needed
     @cache
     def playDead(a : int, b : int) -> None

     # Attributes
     @const
     name : str

where sharedinstance, cache and const are custom meta-information.

This opens up countless possibilities for third-party interpreter
enhancements and/or optimisations by providing fully portable (as all
meta-information is optional) language extensions.

P.S. my sincerest apologies if I am reopening a can of worms here
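For what it's worth, the shape of such a meta-file can be expressed with ordinary annotation syntax; a sketch using the message's own example names (Parrot, playDead, name), with the custom decorators omitted since they would be user-defined:

```python
# parrot.pym (sketch) -- type-only prototypes for parrot.py, written
# with ordinary Python annotation syntax; bodies are elided with "..."
class Parrot:
    # Attributes
    name: str

    # Methods (prototypes only)
    def playDead(self, a: int, b: int) -> None: ...
```

This is valid Python, so the annotations are introspectable at runtime, which is roughly the "fully portable, optional" property the proposal asks for.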
Re: [Python-Dev] AST branch is in?
On 10/20/05, Anthony Baxter <[EMAIL PROTECTED]> wrote:
> Could someone involved give a short email laying out what concrete (no
> pun intended) advantages this new compiler gives us?  Does it just
> allow us to do new and interesting manipulations of the code during
> compilation?  Cleaner, easier to maintain, or the like?

The Grammar is (was at one point at least) shared between Jython and
would allow more tools to be able to share infrastructure.  The idea
is to eventually be able to have [JP]ython output the same AST to
tools.

There is quite a bit of generated code based on the Grammar.  So some
stuff should be easier.  Other stuff is just moved.  You still need to
convert from the AST to the byte code.  Hopefully it will be easier to
do various sorts of optimization and general manipulation of an AST
rather than what existed before.

Only time will tell if we can achieve many of the benefits, so it
would be good if people could review the code and see if things look
more complex/complicated and suggest improvements.  I'm not all that
familiar with the structure, I'm more of a hopeful consumer of it.

HTH,
n
[Python-Dev] AST branch is in?
So it looks like the AST branch has landed.  Wooo!  Well done to all
who were involved - it seems like it's been a huge amount of work.

Could someone involved give a short email laying out what concrete (no
pun intended) advantages this new compiler gives us?  Does it just
allow us to do new and interesting manipulations of the code during
compilation?  Cleaner, easier to maintain, or the like?

Anthony

--
Anthony Baxter <[EMAIL PROTECTED]>
It's never too late to have a happy childhood.
Re: [Python-Dev] Pre-PEP: Task-local variables
On 10/20/05, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> At 04:04 PM 10/20/2005 -0400, Jeremy Hylton wrote:
> >On 10/20/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> > > Whoa, folks!  Can I ask the gentlemen to curb their enthusiasm?
> > >
> > > PEP 343 is still (back) on the drawing table, PEP 342 has barely
> > > been implemented (did it survive the AST-branch merge?), and
> > > already you are talking about adding more stuff.  Please put on
> > > the brakes!
> >
> >Yes.  PEP 342 survived the merge of the AST branch.  I wonder,
> >though, if the Grammar for it can be simplified at all.  I haven't
> >read the PEP closely, but I found the changes a little hard to
> >follow.  That is, why was the grammar changed the way it was -- or
> >how would you describe the intent of the changes?
>
> The intent was to make it so that '(yield optional_expr)' always
> works, and also that [lvalue =] yield optional_expr works.  If you
> can find another way to hack the grammar so that both of 'em work,
> it's certainly okay by me.  The changes I made were just the simplest
> things I could figure out to do.

Right.

> I seem to recall that the hard part was the need for 'yield expr,expr'
> to be interpreted as '(yield expr,expr)', not '(yield expr),expr',
> for backward compatibility reasons.

But only at the statement level.  These should be errors IMO:

 foo(yield expr, expr)
 foo(expr, yield expr)
 foo(1 + yield expr)
 x = yield expr, expr
 x = expr, yield expr
 x = 1 + yield expr

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
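The rules being hashed out here are the ones modern Python ended up with: statement-level `yield expr, expr` yields the tuple, expression use requires parentheses, and the forms Guido lists are indeed SyntaxErrors. A small sketch:

```python
def g():
    yield 1, 2              # statement level: yields the tuple (1, 2)
    received = (yield 3)    # expression use must be parenthesized
    yield received

it = g()
assert next(it) == (1, 2)
assert next(it) == 3
assert it.send("hi") == "hi"

# An unparenthesized yield inside a larger expression is rejected:
try:
    compile("def f():\n    x = 1 + yield 5\n", "<example>", "exec")
    bad_form_compiles = True
except SyntaxError:
    bad_form_compiles = False
assert bad_form_compiles is False
```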
Re: [Python-Dev] bool(iter([])) changed between 2.3 and 2.4
"Guido van Rossum" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED] >> [Fred] >> > think iterators shouldn't have length at all: >> > they're *not* containers and shouldn't act that way. >> >> Some iterators can usefully report their length with the invariant: >>len(it) == len(list(it)). > >I still consider this an erroneous hypergeneralization of the concept >of iterators. Iterators should be pure iterators and not also act as >containers. Which other object type implements __len__ but not >__getitem__? Too late, and probably irrelevant by now; the answer though is set([1,2,3]) ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Pre-PEP: Task-local variables
At 08:57 AM 10/20/2005 -0700, Guido van Rossum wrote:
>Whoa, folks!  Can I ask the gentlemen to curb their enthusiasm?
>
>PEP 343 is still (back) on the drawing table, PEP 342 has barely been
>implemented (did it survive the AST-branch merge?), and already you
>are talking about adding more stuff.  Please put on the brakes!

Sorry.  I thought that 343 was just getting a minor tune-up.  In the
months since the discussion and approval (and implementation; Michael
Hudson actually had a PEP 343 patch out there), I've been doing a lot
of thinking about how they will be used in applications, and thought
that it would be a good idea to promote people using task-specific
variables in place of globals or thread-locals.

The conventional wisdom is that global variables are bad, but the
truth is that they're very attractive because they allow you to have
one less thing to pass around and think about in every line of code.
Without globals, you would sooner or later end up with every function
taking twenty arguments to pass through states down to other code, or
else trying to cram all this data into some kind of "context" object,
which then won't work with code that doesn't know about *your*
definition of what a context is.  Globals are thus extremely
attractive for practical software development.  If they weren't so
useful, it wouldn't be necessary to warn people not to use them, after
all. :)

The problem with globals, however, is that sometimes they need to be
changed in a particular context.  PEP 343 makes it safer to use
globals because you can always offer a context manager that changes
them temporarily, without having to hand-write a try-finally block.
This will make it even *more* attractive to use globals, which is not
a problem as long as the code has no multitasking of any sort.  Of
course, the multithreading scenario is usually fixed by using
thread-locals.
All I'm proposing is that we replace thread locals with task locals,
and promote the use of task-local variables for managed contexts (such
as the decimal context) *that would otherwise be a global or a
thread-local variable*.  This doesn't seem to me like a very big deal;
just an encouragement for people to make their stuff easy to use with
PEP 342 and 343.

By the way, I don't know if you do much with Java these days, but a
big part of the whole J2EE fiasco and the rise of the so-called
"lightweight containers" in Java has all been about how to manage
implicit context so that you don't get stuck with either the
inflexibility of globals or the deadweight of passing tons of
parameters around.  One of the big selling points of AspectJ is that
it lets you implicitly funnel parameters from point A to point B
without having to modify all the call signatures in between.  In other
words, its use is promoted for precisely the sort of thing that 'with'
plus a task variable would be ideal for.  As far as I can tell, 'with'
plus a task variable is *much* easier to explain, use, and understand
than an aspect-oriented programming tool is!  (Especially from the "if
the implementation is easy to explain, it may be a good idea"
perspective.)

>I know that somewhere in the proto-PEP Phillip argues that the context
>API needs to be made a part of the standard library so that his
>trampoline can efficiently swap implicit contexts required by
>arbitrary standard and third-party library code.  My response to that
>is that library code (whether standard or third-party) should not
>depend on implicit context unless it assumes it can assume complete
>control over the application.

I think maybe there's some confusion here, at least on my part.
:)  I see two ways to read your statement, one of which seems to be
saying that we should get rid of the decimal context (because it
doesn't have complete control over the application), and the other way
of reading it doesn't seem connected to what I proposed.

Anything that's a global variable is an "implicit context".  Because
of that, I spent considerable time and effort in PEAK trying to
utterly stamp out global variables.  *Everything* in PEAK has an
explicit context.  But that then becomes more of a pain to *use*,
because you are now stuck with managing it, even if you cram it into a
Zope-style acquisition tree so there's only one "context" to deal
with.  Plus, it assumes that everything the developer wants to do can
be supplied by *one* framework, be it PEAK, Zope, or whatever, which
is rarely the case but still forces framework developers to duplicate
everybody else's stuff.

In other words, I've come to realize that the path of the major Python
application frameworks is not really Pythonic.  A Pythonic framework
shouldn't load you down with new management burdens and keep you from
using other frameworks.  It should make life easier, and make your
code *more* interoperable, not less.  Indeed, I've pretty much come to
agreement with the part of the Python developer community tha
Re: [Python-Dev] list splicing
"Greg Ewing" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED] > Karl Chen wrote: >> Hi, has anybody considered adding something like this: >> a = [1, 2] >> [ 'x', *a, 'y'] >> >> as syntactic sugar for >> a = [1, 2] >> [ 'x' ] + a + [ 'y' ]. > > You can write that as > a = [1, 2] > a[1:1] = a I'm sure you meant to write: a = [1, 2] b = ['x', 'y'] b[1:1] = a Occasional absence of mind makes other people feel useful! PS actually one *can* write a = [1, 2] ['x', 'y'][1:1] = a since this is not actually an assignment but rather syntactic sugar for a function call, but I don't know how one would use the modified list, since b = ['x','y'][1:1] = a doesn't quite fulfill the initial requirement ;) ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Pre-PEP: Task-local variables
At 04:04 PM 10/20/2005 -0400, Jeremy Hylton wrote:
>On 10/20/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> > Whoa, folks!  Can I ask the gentlemen to curb their enthusiasm?
> >
> > PEP 343 is still (back) on the drawing table, PEP 342 has barely
> > been implemented (did it survive the AST-branch merge?), and
> > already you are talking about adding more stuff.  Please put on the
> > brakes!
>
>Yes.  PEP 342 survived the merge of the AST branch.  I wonder, though,
>if the Grammar for it can be simplified at all.  I haven't read the
>PEP closely, but I found the changes a little hard to follow.  That
>is, why was the grammar changed the way it was -- or how would you
>describe the intent of the changes?

The intent was to make it so that '(yield optional_expr)' always
works, and also that [lvalue =] yield optional_expr works.  If you can
find another way to hack the grammar so that both of 'em work, it's
certainly okay by me.  The changes I made were just the simplest
things I could figure out to do.

I seem to recall that the hard part was the need for 'yield expr,expr'
to be interpreted as '(yield expr,expr)', not '(yield expr),expr', for
backward compatibility reasons.
Re: [Python-Dev] Pre-PEP: Task-local variables
On 10/20/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> Whoa, folks!  Can I ask the gentlemen to curb their enthusiasm?
>
> PEP 343 is still (back) on the drawing table, PEP 342 has barely been
> implemented (did it survive the AST-branch merge?), and already you
> are talking about adding more stuff.  Please put on the brakes!

Yes.  PEP 342 survived the merge of the AST branch.  I wonder, though,
if the Grammar for it can be simplified at all.  I haven't read the
PEP closely, but I found the changes a little hard to follow.  That
is, why was the grammar changed the way it was -- or how would you
describe the intent of the changes?

It was hard when doing the transformation in ast.c to be sure that the
intent of the changes was honored.  On the other hand, it seemed to
have extensive tests and they all pass.

Jeremy
Re: [Python-Dev] Pre-PEP: Task-local variables
At 10:40 PM 10/20/2005 +1000, Nick Coghlan wrote:
>Phillip J. Eby wrote:
> > This is still rather rough, but I figured it's easier to let
> > everybody fill in the remaining gaps by arguments than it is for me
> > to pick a position I like and try to convince everybody else that
> > it's right. :)  Your feedback is requested and welcome.
>
>I think you're actually highlighting a bigger issue with the behaviour
>of "yield" inside a "with" block, and working around it rather than
>fixing the fundamental problem.
>
>The issue with "yield" causing changes to leak to outer scopes isn't
>limited to coroutine style usage - it can happen with
>generator-iterators, too.
>
>What's missing is a general way of saying "suspend this context
>temporarily, and resume it when done".

Actually, it's fairly simple to write a generator decorator using
context.swap() that saves and restores the current execution state
around next()/send()/throw() calls, if you prefer it to be the
generator's responsibility to maintain such context.
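context.swap() here is the proto-PEP's API, which never made it into the stdlib, but the shape of such a wrapper is easy to sketch. In this illustration a plain module-level dict stands in for the ambient (task-local) state, and all the names are mine, not the pre-PEP's:

```python
AMBIENT = {"prec": 28}  # stand-in for the ambient context being managed

class ContextPreservingGen:
    """Wrap a generator so the ambient state is swapped in around each
    resumption, keeping the generator's context changes private."""

    def __init__(self, gen):
        self.gen = gen
        self.state = dict(AMBIENT)  # the generator's private snapshot

    def _resume(self, step):
        global AMBIENT
        outer = AMBIENT
        AMBIENT = self.state        # install the generator's context
        try:
            return step()
        finally:
            self.state = AMBIENT    # capture any changes it made
            AMBIENT = outer         # restore the caller's context

    def __iter__(self):
        return self

    def __next__(self):
        return self._resume(lambda: next(self.gen))

    def send(self, value):
        return self._resume(lambda: self.gen.send(value))

    def throw(self, exc):
        return self._resume(lambda: self.gen.throw(exc))

def gen():
    global AMBIENT
    AMBIENT = {"prec": 4}   # visible only while this generator runs
    yield AMBIENT["prec"]
    yield AMBIENT["prec"]

g = ContextPreservingGen(gen())
assert next(g) == 4
assert AMBIENT == {"prec": 28}  # the caller's context is untouched
assert next(g) == 4
```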
Re: [Python-Dev] enumerate with a start index
On 10/19/05, Martin Blais <[EMAIL PROTECTED]> wrote:
> Just wondering, would anyone think of it as a good idea if the
> enumerate() builtin could accept a "start" argument?

And why not an additional "step" argument?  Anyway, perhaps all this
can be done with a 'xrange' object...

--
Lisandro Dalcín
---
Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
PTLC - Güemes 3450, (3000) Santa Fe, Argentina
Tel/Fax: +54-(0)342-451.1594
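enumerate() did eventually grow a start argument (Python 2.6); a step argument never did, but pairing with a range gives the same effect:

```python
# The start argument that was being discussed (added in Python 2.6):
assert list(enumerate("abc", start=1)) == [(1, 'a'), (2, 'b'), (3, 'c')]

# No "step" argument exists; zip with a range covers that case:
assert list(zip(range(10, 16, 2), "abc")) == [(10, 'a'), (12, 'b'), (14, 'c')]
```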
Re: [Python-Dev] Early PEP draft (For Python 3000?)
Jim Jewett <[EMAIL PROTECTED]> wrote:
> I'll try to be more explicit; if Josiah and I are talking past each
> other, then the explanation was clearly not yet mature.
>
> (In http://mail.python.org/pipermail/python-dev/2005-October/057251.html)
> Eyal Lotem suggested:
> > Name: Attribute access for all namespaces ...
> > global x ; x = 1
> > Replaced by:
> > module.x = 1
>
> I responded:
> > Attribute access as an option would be nice, but might be slower.
> > Also note that one common use for a __dict__ is that you don't
> > know what keys are available; meeting this use case with
> > attribute access would require some extra machinery, such as
> > an iterator over attributes.
>
> Josiah Carlson responded
> (http://mail.python.org/pipermail/python-dev/2005-October/057451.html)
> > This particular use case is easily handled.  Put the following
> > once at the top of the module...
> > module = __import__(__name__)
> > Then one can access (though perhaps not quickly) the module-level
> > variables for that module.  To access attributes, it is a quick scan
> > through module.__dict__, dir(), or vars().
>
> My understanding of the request was that all namespaces --
> including those returned by globals() and locals() -- should
> be used with attribute access *instead* of __dict__ access.

Yeah, I missed the transition from arbitrary stack frame access to
strictly global and local scope attribute access.

> module.x is certainly nicer than module.__dict__['x']
>
> Even with globals() and locals(), I usually *wish* I could
> use attribute access, to avoid creating a string when what
> I really want is a name.

Indeed.

> The catch is that sometimes I don't know the names in
> advance, and have to iterate over the dict -- as you
> suggested.  That works fine today; my question is what
> to do instead if __dict__ is unavailable.
>
> Note that vars(obj) itself conceptually returns a NameSpace
> rather than a dict, so that isn't the answer.

>>> help(vars)
vars(...)
    vars([object]) -> dictionary

    Without arguments, equivalent to locals().
    With an argument, equivalent to object.__dict__.

When an object lacks a dictionary, dir() works just fine.

>>> help(dir)
Help on built-in function dir:

dir(...)
    dir([object]) -> list of strings

    Return an alphabetized list of names comprising (some of) the
    attributes of the given object, and of attributes reachable from it:

    No argument:  the names in the current scope.
    Module object:  the module attributes.
    Type or class object:  its attributes, and recursively the
        attributes of its bases.
    Otherwise:  its attributes, its class's attributes, and recursively
        the attributes of its class's base classes.

> My inclination is to add an __iterattr__ that returns
> (attribute name, attribute value) pairs, and to make this the
> default iterator for NameSpace objects.

def __iterattr__(obj):
    for i in dir(obj):
        yield i, getattr(obj, i)

> Whether the good of
> (1) not needing to mess with __dict__, and
> (2) not needing to pretend that strings are names
> is enough to justify an extra magic method ... I'm not as sure.

I don't know, but leaning towards no; dir() works pretty well.  Yeah,
you have to use getattr(), but there are worse things.

 - Josiah
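Josiah's dir()/getattr() spelling works as-is on any object, module objects included; packaged as a small helper (the names here are illustrative):

```python
import types

# a throwaway module object to iterate over
mod = types.ModuleType("example")
mod.x = 1
mod.y = 2

def iterattrs(obj):
    # the proposed default-iterator behaviour, via dir() and getattr(),
    # skipping dunder attributes for clarity
    for name in dir(obj):
        if not name.startswith("__"):
            yield name, getattr(obj, name)

assert dict(iterattrs(mod)) == {"x": 1, "y": 2}
```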
Re: [Python-Dev] Pre-PEP: Task-local variables
Whoa, folks!  Can I ask the gentlemen to curb their enthusiasm?

PEP 343 is still (back) on the drawing table, PEP 342 has barely been
implemented (did it survive the AST-branch merge?), and already you
are talking about adding more stuff.  Please put on the brakes!

If there's anything this discussion shows me, it's that implicit
contexts are a dangerous concept, and should be treated with much
skepticism.  I would recommend that if you find yourself needing
context data while programming an asynchronous application using
generator trampolines simulating coroutines, you ought to refactor the
app so that the context is explicitly passed along rather than grabbed
implicitly.  Zope doesn't *require* you to get the context from a
thread-local, and I presume that SQLObject also has a way to
explicitly use a specific connection (I'm assuming cursors and similar
data structures have an explicit reference to the connection used to
create them).  Heck, even Decimal allows you to invoke every operation
as a method on a decimal.Context object!

I'd rather not tie implicit contexts to the with statement,
conceptually.  Most uses of the with-statement are purely local (e.g.
"with open(fn) as f"), or don't apply to coroutines (e.g. "with
my_lock").  I'd say that "with redirect_stdout(f)" also doesn't apply
-- we already know it doesn't work in threaded applications, and that
restriction is easily and logically extended to coroutines.  If you're
writing a trampoline for an app that needs to modify decimal contexts,
the decimal module already provides the APIs for explicitly saving and
restoring contexts.

I know that somewhere in the proto-PEP Phillip argues that the context
API needs to be made a part of the standard library so that his
trampoline can efficiently swap implicit contexts required by
arbitrary standard and third-party library code.
My response to that is that library code (whether standard or third-party) should not depend on implicit context unless it can assume complete control over the application. (That rules out pretty much everything except Zope, which is fine with me. :-)

Also, Nick wants the name 'context' for PEP-343 style context managers. I think it's overloading too much to use the same word for per-thread or per-coroutine context.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
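[The explicit style Guido points to is still available in today's decimal module: every arithmetic operation exists as a method on decimal.Context, so no thread-local lookup is needed. A small runnable sketch of that style:]

```python
from decimal import Context, Decimal

# Two independent contexts; neither touches the thread-local default.
lo = Context(prec=4)
hi = Context(prec=20)

x, y = Decimal(1), Decimal(7)

# Every operation is invoked explicitly on a chosen context.
print(lo.divide(x, y))  # 0.1429
print(hi.divide(x, y))  # 0.14285714285714285714
```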
[Python-Dev] Early PEP draft (For Python 3000?)
I'll try to be more explicit; if Josiah and I are talking past each other, then the explanation was clearly not yet mature.

(In http://mail.python.org/pipermail/python-dev/2005-October/057251.html) Eyal Lotem suggested:

> Name: Attribute access for all namespaces
> ...
>     global x ; x = 1
> Replaced by:
>     module.x = 1

I responded:

> Attribute access as an option would be nice, but might be slower.
> Also note that one common use for a __dict__ is that you don't
> know what keys are available; meeting this use case with
> attribute access would require some extra machinery, such as
> an iterator over attributes.

Josiah Carlson responded (http://mail.python.org/pipermail/python-dev/2005-October/057451.html):

> This particular use case is easily handled. Put the following
> once at the top of the module...
>     module = __import__(__name__)
> Then one can access (though perhaps not quickly) the module-level
> variables for that module. To access attributes, it is a quick scan
> through module.__dict__, dir(), or vars().

My understanding of the request was that all namespaces -- including those returned by globals() and locals() -- should be used with attribute access *instead* of __dict__ access. module.x is certainly nicer than module.__dict__['x']. Even with globals() and locals(), I usually *wish* I could use attribute access, to avoid creating a string when what I really want is a name.

The catch is that sometimes I don't know the names in advance, and have to iterate over the dict -- as you suggested. That works fine today; my question is what to do instead if __dict__ is unavailable. Note that vars(obj) itself conceptually returns a NameSpace rather than a dict, so that isn't the answer.

My inclination is to add an __iterattr__ that returns (attribute name, attribute value) pairs, and to make this the default iterator for NameSpace objects.
Whether the good of

(1) not needing to mess with __dict__, and
(2) not needing to pretend that strings are names

is enough to justify an extra magic method ... I'm not as sure.

-jJ
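[For the record, the module-handle trick quoted above can also be spelled with sys.modules; this sketch is illustrative, and the name this_module is not from the thread:]

```python
import sys

x = 1

# Get a handle on the current module so that module-level names can be
# read and written as attributes, as in Josiah's __import__ trick.
this_module = sys.modules[__name__]

this_module.x = 2          # equivalent to: global x; x = 2
print(this_module.x == x)  # True
```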
Re: [Python-Dev] Pre-PEP: Task-local variables
Nick Coghlan wrote:
> P.S. Here's a different generator wrapper that could be used to create a
> generator-based "suspendable context" that can be invoked multiple times
> through use of the "without" keyword. If applied to the PEP 343
> decimal.Context() __with__ method example, it would automatically restore the
> original context for the duration of the "without" block.

I realised this isn't actually true for the version I posted, and the __with__ method example in the PEP - changes made to the decimal context in the "without" block would be visible after the "with" block. Consider the following:

    def iter_sin(iterable):
        # Point A
        with decimal.getcontext() as ctx:
            ctx.prec += 10
            for r in iterable:
                y = sin(r)     # Very high precision during calculation
                without:
                    yield +y   # Interim results have normal precision
        # Point B

What I posted would essentially work for this example, but there isn't a guarantee that the context at Point A is the same as the context at Point B - the reason is that the thread-local context may be changed within the without block (i.e., external to this iterator), and that changed context would get saved when the decimal.Context context manager was resumed.

To fix that, the arguments to StopIteration in __suspend__ would need to be used as arguments when the generator is recreated in __resume__.
That is, the context manager would look like:

    @suspendable
    def __with__(self, oldctx=None):     # Accept argument in __resume__
        newctx = self.copy()
        if oldctx is None:
            oldctx = decimal.getcontext()
        decimal.setcontext(newctx)
        try:
            yield newctx
        finally:
            decimal.setcontext(oldctx)
            raise StopIteration(oldctx)  # Return result in __suspend__

(This might look cleaner if "return arg" in a generator was equivalent to "raise StopIteration(arg)" as previously discussed)

And (including reversion to 'one-use-only' status) the wrapper class would look like:

    class SuspendableGeneratorContext(object):

        def __init__(self, func, args, kwds):
            self.gen = func(*args, **kwds)
            self.func = func
            self.args = None

        def __with__(self):
            return self

        def __enter__(self):
            try:
                return self.gen.next()
            except StopIteration:
                raise RuntimeError("generator didn't yield")

        def __suspend__(self):
            try:
                self.gen.next()
            except StopIteration, ex:
                # Use the return value as the arguments for resumption
                self.args = ex.args
                return
            else:
                raise RuntimeError("generator didn't stop")

        def __resume__(self):
            if self.args is None:
                raise RuntimeError("context not suspended")
            self.gen = self.func(*self.args)
            try:
                self.gen.next()
            except StopIteration:
                raise RuntimeError("generator didn't yield")

        def __exit__(self, type, value, traceback):
            if type is None:
                try:
                    self.gen.next()
                except StopIteration:
                    return
                else:
                    raise RuntimeError("generator didn't stop")
            else:
                try:
                    self.gen.throw(type, value, traceback)
                except (type, StopIteration):
                    return
                else:
                    raise RuntimeError("generator caught exception")

    def suspendable_context(func):
        def helper(*args, **kwds):
            return SuspendableGeneratorContext(func, args, kwds)
        return helper

Cheers,
Nick.

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
---
http://boredomandlaziness.blogspot.com
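[Since Python 2.5, the explicit save/restore dance for decimal contexts discussed here has been packaged as decimal.localcontext(), which does in a library function what the generators above do by hand. A runnable sketch -- the function name is illustrative:]

```python
from decimal import Decimal, getcontext, localcontext

def high_precision_sqrt(value):
    # Temporarily raise precision; the original thread-local context is
    # restored when the with block exits, even if an error occurs.
    with localcontext() as ctx:
        ctx.prec += 10
        result = Decimal(value).sqrt()
    return +result  # unary plus re-rounds to the caller's precision

getcontext().prec = 6
print(high_precision_sqrt(2))  # 1.41421
```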
Re: [Python-Dev] Pre-PEP: Task-local variables
Phillip J. Eby wrote:
> This is still rather rough, but I figured it's easier to let everybody fill
> in the remaining gaps by arguments than it is for me to pick a position I
> like and try to convince everybody else that it's right. :) Your feedback
> is requested and welcome.

I think you're actually highlighting a bigger issue with the behaviour of "yield" inside a "with" block, and working around it rather than fixing the fundamental problem.

The issue with "yield" causing changes to leak to outer scopes isn't limited to coroutine style usage - it can happen with generator-iterators, too. What's missing is a general way of saying "suspend this context temporarily, and resume it when done". An example use-case not involving 'yield' at all is the "asynchronise" functionality. A generator-iterator that works in a high precision decimal.Context(), but wants to return values from inside a loop using normal precision, is another example not involving coroutines.

The basic idea would be to provide syntax that allows a with statement to be "suspended", along the lines of:

    with EXPR as VAR:
        for VAR2 in EXPR2:
            without:
                BLOCK

To mean:

    abc = (EXPR).__with__()
    exc = (None, None, None)
    VAR = abc.__enter__()
    try:
        for VAR2 in EXPR2:
            try:
                abc.__suspend__()
                try:
                    BLOCK
                finally:
                    abc.__resume__()
            except:
                exc = sys.exc_info()
                raise
    finally:
        abc.__exit__(*exc)

To keep things simple, just as 'break' and 'continue' work only on the innermost loop, 'without' would only apply to the innermost 'with' statement.

Locks, for example, could support this via:

    class Lock(object):
        def __with__(self):
            return self
        def __enter__(self):
            self.acquire()
            return self
        def __resume__(self):
            self.acquire()
        def __suspend__(self):
            self.release()
        def __exit__(self):
            self.release()

(Note that there's a potential problem if the call to acquire() in __resume__ fails, but that's no different than if this same dance is done manually).

Cheers,
Nick.

P.S.
Here's a different generator wrapper that could be used to create a generator-based "suspendable context" that can be invoked multiple times through use of the "without" keyword. If applied to the PEP 343 decimal.Context() __with__ method example, it would automatically restore the original context for the duration of the "without" block:

    class SuspendableGeneratorContext(object):

        def __init__(self, func, args, kwds):
            self.gen = None
            self.func = func
            self.args = args
            self.kwds = kwds

        def __with__(self):
            return self

        def __enter__(self):
            if self.gen is not None:
                raise RuntimeError("context already in use")
            gen = self.func(*self.args, **self.kwds)
            try:
                result = gen.next()
            except StopIteration:
                raise RuntimeError("generator didn't yield")
            self.gen = gen
            return result

        def __resume__(self):
            if self.gen is None:
                raise RuntimeError("context not suspended")
            gen = self.func(*self.args, **self.kwds)
            try:
                gen.next()
            except StopIteration:
                raise RuntimeError("generator didn't yield")
            self.gen = gen

        def __suspend__(self):
            try:
                self.gen.next()
            except StopIteration:
                return
            else:
                raise RuntimeError("generator didn't stop")

        def __exit__(self, type, value, traceback):
            gen = self.gen
            self.gen = None
            if type is None:
                try:
                    gen.next()
                except StopIteration:
                    return
                else:
                    raise RuntimeError("generator didn't stop")
            else:
                try:
                    gen.throw(type, value, traceback)
                except (type, StopIteration):
                    return
                else:
                    raise RuntimeError("generator caught exception")

    def suspendable_context(func):
        def helper(*args, **kwds):
            return SuspendableGeneratorContext(func, args, kwds)
        return helper

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
---
http://boredomandlaziness.blogspot.com
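[The 'without' keyword was never adopted, but the suspend/resume protocol from the Lock example can be exercised by hand today. In this sketch the __suspend__/__resume__ names follow the hypothetical protocol from the email and are called explicitly where a 'without' block would call them:]

```python
import threading

class SuspendableLock:
    """A lock that can be temporarily released ('suspended') and
    re-acquired ('resumed'), mirroring Nick's Lock example."""
    def __init__(self):
        self._lock = threading.Lock()
    def __enter__(self):
        self._lock.acquire()
        return self
    def __exit__(self, *exc):
        self._lock.release()
    def __suspend__(self):
        self._lock.release()
    def __resume__(self):
        self._lock.acquire()

lock = SuspendableLock()
with lock:
    lock.__suspend__()          # what entering a 'without' block would do
    print(lock._lock.locked())  # False: lock released mid-block
    lock.__resume__()           # what leaving the 'without' block would do
print(lock._lock.locked())      # False: released again by __exit__
```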
Re: [Python-Dev] Definining properties - a use case for class decorators?
As others explained, the syntax would not work for functions (and it is not intended to). A possible use case I had in mind is to define inlined modules to be used as bunches of attributes. For instance, I could define a module as

    module m():
        a = 1
        b = 2

where 'module' would be the following function:

    def module(name, args, dic):
        mod = types.ModuleType(name, dic.get('__doc__'))
        for k in dic:
            setattr(mod, k, dic[k])
        return mod
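[A runnable version of the sketch above, with the unused args parameter dropped; the attribute dict passed in is illustrative:]

```python
import types

def module(name, dic):
    """Build a module object from a dict of attributes, as in the
    'inlined module' example above (the args parameter is omitted
    here because the sketch never used it)."""
    mod = types.ModuleType(name, dic.get('__doc__'))
    for k in dic:
        setattr(mod, k, dic[k])
    return mod

m = module('m', {'a': 1, 'b': 2, '__doc__': 'a bunch of attributes'})
print(m.a + m.b)  # 3
print(m.__doc__)  # a bunch of attributes
```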