Re: [Python-Dev] Proper place to put extra args for building
Brett C. wrote:
> OK, EXTRA_CFLAGS support has been checked into Makefile.pre.in and
> distutils.sysconfig. Martin, please double-check I tweaked sysconfig
> the way you wanted.

It is the way I wanted it, but it doesn't work. Just try and use it for
some extension modules to see for yourself; I tried with a harmless GCC
option (-fgcse).

The problem is that distutils only looks at the Makefile, not at the
environment variables. So I changed parse_makefile to do what make does:
fall back to the environment when no makefile variable is set.

This was still not sufficient, since distutils never looks at CFLAGS. So
I changed setup.py and sysconfig.py to fetch CFLAGS, and not bother with
BASECFLAGS and EXTRA_CFLAGS.

setup.py 1.218
NEWS 1.1289
sysconfig.py 1.65

Regards,
Martin

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Re: anonymous blocks
Guido van Rossum wrote:
> At the same time, having to use it as follows:
>
>     for f in with_file(filename):
>         for line in f:
>             print process(line)
>
> is really ugly, so we need new syntax, which also helps with keeping
> 'for' semantically backwards compatible. So let's use 'with', and then
> the using code becomes again this:
>
>     with f = with_file(filename):
>         for line in f:
>             print process(line)

or

    with with_file(filename) as f:
        ...

? (assignment inside block-opening constructs isn't used in Python
today, as far as I can tell...)

> Finally, I think it would be cool if the generator could trap
> occurrences of break, continue and return occurring in BODY. We could
> introduce a new class of exceptions for these, named ControlFlow, and
> (only in the body of a with statement), break would raise BreakFlow,
> continue would raise ContinueFlow, and return EXPR would raise
> ReturnFlow(EXPR) (EXPR defaulting to None of course).
>
> So a block could return a value to the generator using a return
> statement; the generator can catch this by catching ReturnFlow.
> (Syntactic sugar could be VAR = yield ... like in Ruby.)

slightly weird, but useful enough to be cool. (maybe return value is
enough, though. the others may be slightly too weird... or should that
return perhaps be a continue value? you're going back to the top of the
loop, after all).

/F
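The generator-based wrapper being discussed here is essentially what PEP 343 later standardized. As a hedged sketch of where the thread ended up, the with_file generator can be written today with contextlib.contextmanager; with_file and process are the thread's names, while the decorator itself is the later, real API, not something available when this was written:

```python
# A sketch of the generator-based resource wrapper discussed above,
# spelled with the contextlib.contextmanager decorator that shipped
# later as part of the PEP 343 design.
from contextlib import contextmanager

@contextmanager
def with_file(filename):
    f = open(filename)
    try:
        yield f          # the block body runs here, bound to f
    finally:
        f.close()        # runs even if the body raises

def process(line):
    return line.upper()

# usage, mirroring the "with with_file(filename) as f" spelling:
import os, tempfile
path = os.path.join(tempfile.mkdtemp(), "demo.txt")
with open(path, "w") as out:
    out.write("hello\n")
with with_file(path) as f:
    for line in f:
        print(process(line), end="")   # prints HELLO
os.remove(path)
```

The try/finally around the yield is what guarantees cleanup: the block body runs at the yield, and the finally clause closes the file whether the body finishes normally or raises.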
Re: [Python-Dev] Re: switch statement
Shannon -jj Behrens wrote:
> On 4/20/05, M.-A. Lemburg [EMAIL PROTECTED] wrote:
>> Fredrik Lundh wrote:
>>> PS. a side effect of the for-in pattern is that I'm beginning to
>>> feel that Python might need a nice switch statement based on
>>> dictionary lookups, so I can replace multiple callbacks with a
>>> single loop body, without writing too many if/elif clauses.
>>
>> PEP 275 anyone ? (http://www.python.org/peps/pep-0275.html)
>>
>> My use case for switch is that of a parser switching on tokens.
>> mxTextTools applications would greatly benefit from being able to
>> branch on tokens quickly. Currently, there's only callbacks,
>> dict-to-method branching or long if-elif-elif-...-elif-else.
>
> I think match from Ocaml would be a much nicer addition to Python
> than switch from C.

PEP 275 is about branching based on dictionary lookups, which is
somewhat different from pattern matching - for which we already have
lots and lots of different tools.

The motivation behind the switch statement idea is that of interpreting
the multi-state outcome of some analysis that you perform on data. The
main benefit is avoiding Python function calls, which are very slow
compared to branching to inlined Python code.

Having a simple switch statement would enable writing very fast parsers
in Python - you'd let one of the existing tokenizers such as
mxTextTools, re or one of the xml libs create the token input data and
then work on the result using a switch statement. Instead of having one
function call per token, you'd only have a single dict lookup.

BTW, has anyone in this thread actually read the PEP 275 ?

--
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Apr 25 2005)
Python/Zope Consulting and Support ...    http://www.egenix.com/
mxODBC.Zope.Database.Adapter ...          http://zope.egenix.com/
mxODBC, mxDateTime, mxTextTools ...       http://python.egenix.com/

::: Try mxODBC.Zope.DA for Windows, Linux, Solaris, FreeBSD for free!
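MAL's "single dict lookup instead of one function call per token" idea can be sketched with plain dictionary dispatch; the token names and handler bodies below are invented for illustration:

```python
# Dict-based token dispatch, the pattern PEP 275 aims to speed up.
# Token names and handler bodies are invented for illustration.

def handle_number(tok):
    return ("number", int(tok[1]))

def handle_name(tok):
    return ("name", tok[1])

def handle_op(tok):
    return ("op", tok[1])

dispatch = {
    "NUMBER": handle_number,
    "NAME": handle_name,
    "OP": handle_op,
}

def process(tokens):
    out = []
    for tok in tokens:
        # one hash lookup replaces an if/elif chain over tok[0]
        handler = dispatch.get(tok[0])
        if handler is None:
            raise ValueError("unknown token type: %r" % (tok[0],))
        out.append(handler(tok))
    return out

print(process([("NUMBER", "42"), ("OP", "+"), ("NAME", "x")]))
# [('number', 42), ('op', '+'), ('name', 'x')]
```

Note that this form still pays one Python function call per token, which is exactly the overhead a real switch statement would eliminate by branching to inlined code.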
Re: [Python-Dev] Re: anonymous blocks
[ Simon Percivall ]:
> [ Terry Reedy ]:
>>     with target as value:
>> would parallel the for-statement header and read smoother to me.
>>     for target as value:
>> would not need a new keyword, but would require close reading to
>> distinguish 'as' from 'in'. But it also moves the value to the
>> right, removing focus.
>
> Wouldn't from be a good keyword to overload here (in/with/for)?
>
>     value from target:
>         BODY

I do not have strong feelings about this issue, but for completeness'
sake... Mixing both suggestions:

    from target as value:
        BODY

That resembles an import statement, which some may consider good
(syntax/keyword reuse) or very bad (confusion?, value focus).

cheers,
Senra

--
Rodrigo Senra -- MSc Computer Engineer    rodsenra(at)gpr.com.br
GPr Sistemas Ltda                         http://www.gpr.com.br/
Personal Blog                             http://rodsenra.blogspot.com/
[Python-Dev] Re: Re: anonymous blocks
Skip Montanaro [EMAIL PROTECTED] wrote in message news:[EMAIL PROTECTED]

> Guido>     with VAR = EXPR:
> Guido>         BODY
>
> What about a multi-variable case? Will you have to introduce a new
> level of indentation for each 'with' var?

I would expect to see the same structure unpacking as with assignment,
for loops, and function calls:

    with a, b, c = x, y, z

and so on.

Terry J. Reedy
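For reference, the unpacking behavior Terry appeals to already exists in assignment and for-loop headers; a quick illustration (values invented):

```python
# The structure unpacking Terry refers to, as it already works in
# assignment and for-loop headers:
x, y, z = 1, 2, 3
a, b, c = x, y, z
print(a, b, c)            # 1 2 3

pairs = [(1, "one"), (2, "two")]
for number, name in pairs:
    print(number, name)
```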
[Python-Dev] Re: Re: anonymous blocks
Brett C. [EMAIL PROTECTED] wrote in message news:[EMAIL PROTECTED]

> And before anyone decries the fact that this might confuse a newbie
> (which seems to happen with every advanced feature ever dreamed up),
> remember this will not be meant for a newbie but for someone who has
> experience in Python and iterators at the minimum, and hopefully with
> generators. Not exactly meant for someone for whom raw_input() still
> holds a wow factor. =)

I have accepted the fact that Python has become a two-level language:
basic Python for expressing algorithms + advanced features
(metaclasses, decorators, CPython-specific introspection and hacks, and
now possibly 'with' or whatever) for solving software engineering
issues. Perhaps there should correspondingly be two tutorials.

Terry J. Reedy
RE: [Python-Dev] defmacro (was: Anonymous blocks)
Jim Jewett writes:
> As best I can tell, the anonymous blocks are used to take care of
> boilerplate code without changing the scope -- exactly what macros
> are used for.

Folks, I think that Jim is onto something here. I've been following
this conversation, and it sounds to me as if we are stumbling about in
the dark, trying to feel our way toward something very useful and
powerful. I think Jim is right: what we're feeling our way toward is
macros.

The problem, of course, is that Guido (and others!) are on record as
being opposed to adding macros to Python. (Even good macros... think
lisp, not cpp.) I am not quite sure that I am convinced by the
argument, but let me see if I can present it: allowing macros in Python
would enable individual programmers or groups to easily invent their
own syntax. Eventually, there would develop a large number of different
Python "dialects" (as some claim has happened in the Lisp community),
each dependent on macros the others lack. The most important casualty
would be Python's great *readability*. (If this is a strawman argument,
i.e. if you know of a better reason for keeping macros OUT of Python,
please speak up. Like I said, I've never been completely convinced of
it myself.)

I think it would be useful if we approached it like this: either what
we want is the full power of macros (in which case the syntax we choose
should be guided by that choice), or we want LESS than the full power
of macros. If we want less, then HOW less? In other words, rather than
hearing what we'd like to be able to DO with blocks, I'd like to hear
what we want to PROHIBIT DOING with blocks. I think this might be a
fruitful way of thinking about the problem which might make it easier
to evaluate syntax suggestions. And if the answer is that we want to
prohibit nothing, then the right solution is macros.

--
Michael Chermside
Re: [Python-Dev] Re: Caching objects in memory
On 4/22/05, Fredrik Lundh [EMAIL PROTECTED] wrote:
>> Is there a document that details which objects are cached in memory
>> (to not create the same object multiple times, for performance)?
>
> why do you think you need to know?

I was in my second class of the Python workshop I'm giving here at an
Argentine university, and I was explaining how to think using
name/object and not variable/value. Using id() for being pedagogic
about the objects, the kids saw that id(3) was always the same, but
id([]) was not. I explained to them that Python, in some circumstances,
caches the object, and I kept them happy enough. But I really don't
know what objects and in which circumstances.

. Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/
Re: [Python-Dev] Re: Caching objects in memory
> I was in my second class of the Python workshop I'm giving here at an
> Argentine university, and I was explaining how to think using
> name/object and not variable/value. Using id() for being pedagogic
> about the objects, the kids saw that id(3) was always the same, but
> id([]) was not. I explained to them that Python, in some
> circumstances, caches the object, and I kept them happy enough. But I
> really don't know what objects and in which circumstances.

Aargh! Bad explanation. Or at least you're missing something: *mutable*
objects (like lists) can *never* be cached, because they have explicit
object semantics. For example, each time the expression [] is evaluated
it *must* produce a fresh list object (though it may be recycled from a
GC'ed list object -- or any other GC'ed object, for that matter).

But for *immutable* objects (like numbers, strings and tuples) the
implementation is free to use caching. In practice, I believe ints
between -5 and 100 are cached, and 1-character strings are often cached
(but not always).

Hope this helps! I would think this is in the docs somewhere but
probably not in a place where one would ever think to look...

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
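Guido's distinction is easy to demonstrate with `is`. The small-int result below is a CPython implementation detail (and the exact cached range has varied between versions), so treat it as illustrative rather than guaranteed:

```python
# Demonstrating the mutable/immutable caching distinction with 'is'.
# The small-int result is a CPython implementation detail; the list
# result is guaranteed by the language.
a = 7
b = 7
print(a is b)     # True in CPython: small ints are cached

# mutable objects can never be cached: each [] is a fresh list
print([] is [])   # False
x = []
y = []
x.append(1)
print(y)          # []  (distinct objects, as required)
```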
Re: [Python-Dev] defmacro (was: Anonymous blocks)
Michael Chermside wrote:
> Jim Jewett writes:
>> As best I can tell, the anonymous blocks are used to take care of
>> boilerplate code without changing the scope -- exactly what macros
>> are used for.
>
> Folks, I think that Jim is onto something here. I've been following
> this conversation, and it sounds to me as if we are stumbling about
> in the dark, trying to feel our way toward something very useful and
> powerful. I think Jim is right, what we're feeling our way toward is
> macros.

I am very excited about the discussion of blocks. I think they can
potentially address two things that are sticky to express in Python
right now. The first is to compress the common try/finally use cases
around resource usage, as with files and database commits. The second
is language extensibility, which makes us think of what macros did for
Lisp.

Language extensibility has two motivations. First and foremost is to
allow the programmer to express his or her *intent*. The second
motivation is to reuse code and thereby increase productivity. Since
methods already allow us to reuse code, our motivation is to increase
expressivity. What blocks offer is to make Python's suites something a
programmer can work with - much like a metaclass puts control of class
details into the programmer's hands, or decorators allow us to modify
method semantics. If the uses of decorators tell us anything, I'm
pretty sure there are more potential uses of blocks than we could
shake many sticks at. ;)

So, the question comes back to: what are blocks in the language
extensibility case? To me, they would be something very like a code
object returned from the compile method. To this we would need to
attach the globals and locals where the block was from. Then we could
use the normal exec statement to invoke the block whenever needed.
Perhaps we could add a new mode 'block' to allow the ControlFlow
exceptions mentioned elsewhere in the thread.

We still need to find a way to pass arguments to the block so we are
not tempted to insert them in locals and have them magically appear in
the namespace. ;) Personally, I'm rather attached to "as (x, y):"
introducing the block.

To conclude, I mocked up some potential examples for your
entertainment. ;) Thanks for your time and consideration!

-Shane Holloway

Interfaces::

    def interface(interfaceName, *bases, ***aBlockSuite):
        blockGlobals = aBlockSuite.globals().copy()
        blockGlobals.update(aBlockSuite.locals())
        blockLocals = {}
        exec aBlock in blockGlobals, blockLocals
        return interfaceType(interfaceName, bases, blockLocals)

    IFoo = interface('IFoo'):
        def isFoo(self): pass

    IBar = interface('IBar'):
        def isBar(self): pass

    IBaz = interface('IBaz', IFoo, IBar):
        def isBaz(self): pass

Event Suites::

    def eventSinksFor(events, ***aBlockSuite):
        blockGlobals = aBlockSuite.globals().copy()
        blockGlobals.update(aBlockSuite.locals())
        blockLocals = {}
        exec aBlock in blockGlobals, blockLocals
        for name, value in blockLocals.iteritems():
            if aBlockSuite.locals().get(name) is value:
                continue
            if callable(value):
                events.addEventFor(name, value)

    def debugScene(scene):
        eventSinksFor(scene.events):
            def onMove(pos):
                print "pos:", pos
            def onButton(which, state):
                print "button:", which, state
            def onKey(which, state):
                print "key:", which, state
Re: [Python-Dev] defmacro (was: Anonymous blocks)
On Mon, Apr 25, 2005, Shane Holloway (IEEE) wrote:
> Interfaces::
>
>     def interface(interfaceName, *bases, ***aBlockSuite):
>         blockGlobals = aBlockSuite.globals().copy()
>         blockGlobals.update(aBlockSuite.locals())
>         blockLocals = {}
>         exec aBlock in blockGlobals, blockLocals
>         return interfaceType(interfaceName, bases, blockLocals)
>
>     IFoo = interface('IFoo'):
>         def isFoo(self): pass

Where does ``aBlock`` come from?

--
Aahz ([EMAIL PROTECTED])  *  http://www.pythoncraft.com/

"It's 106 miles to Chicago. We have a full tank of gas, a half-pack of
cigarettes, it's dark, and we're wearing sunglasses. Hit it."
[Python-Dev] defmacro (was: Anonymous blocks)
Guido:
> My problem with macros is actually more practical: Python's compiler
> is too dumb. I am assuming that we want to be able to import macros
> from other modules, and I am assuming that macros are expanded by the
> compiler, not at run time; but the compiler doesn't follow imports ...

Expanding at run-time is less efficient, but it works at least as well
semantically. If today's alternative is manual cut-n-paste, I would
still rather have the computer do it for me, to avoid accidental forks.

It could also be done (though not as cleanly) by making macros act as
import hooks:

    import defmacro  # Stop processing until defmacro is loaded.
                     # All future lines will be preprocessed by the
                     # hook collection.
    ...
    from defmacro import foo  # installs a foo hook, good for the
                              # rest of the file

Michael Chermside:
> I think it would be useful if we approached it like this: either what
> we want is the full power of macros (in which case the syntax we
> choose should be guided by that choice), or we want LESS than the
> full power of macros. If we want less, then HOW less? In other words,
> rather than hearing what we'd like to be able to DO with blocks, I'd
> like to hear what we want to PROHIBIT DOING with blocks. I think this
> might be a fruitful way of thinking about the problem which might
> make it easier to evaluate syntax suggestions. And if the answer is
> that we want to prohibit nothing, then the right solution is macros.

Guido:
> I'm personally at a loss understanding your question here. Perhaps
> you could try answering it for yourself?

Why not just introduce macros? If the answer is "We should, it is just
hard to code", then use a good syntax for macros. If the answer is "We
don't want xx sss (S\! 2k3 ] to ever be meaningful", then we need to
figure out exactly what to prohibit.

Lisp macros are (generally, excluding read macros) limited to taking
and generating complete S-expressions. If that isn't enough to enforce
readability, then limiting blocks to expressions (or even statements)
probably isn't enough in Python.

Do we want to limit the changing part (the anonymous block) to only a
single suite? That does work well with the yield syntax, but it seems
like an arbitrary restriction unless *all* we want are resource
wrappers.

Or do we really just want a way to say that a function should share its
local namespace with its caller or callee? In that case, maybe the
answer is a lexical or same_namespace keyword. Or maybe just a recipe
to make exec or eval do the right thing:

    def myresource(rcname, callback, *args):
        rc = open(rcname)
        same_namespace callback(*args)
        rc.close()

    def process(*args):
        ...

    if __name__ == '__main__':
        myresource("file1", process, arg1, arg2)
RE: [Python-Dev] defmacro (was: Anonymous blocks)
Guido writes:
> My problem with macros is actually more practical: Python's compiler
> is too dumb. I am assuming that we want to be able to import macros
> from other modules, and I am assuming that macros are expanded by the
> compiler, not at run time; but the compiler doesn't follow imports
> (that happens at run time) so there's no mechanism to tell the
> compiler about the new syntax. And macros that don't introduce new
> syntax don't seem very interesting (compared to what we can do
> already).

That's good to hear. It expresses fairly clearly what the challenges
are in implementing macros for Python, and expressing the challenges
makes it easier to attack the problem. My interest comes because some
recent syntax changes (generators, generator expressions) have seemed
to me like true language changes, but others (decorators, anonymous
blocks) just cry out "this would be easy as a macro!"

I wrote:
> I think it would be useful if we approached it like this: either what
> we want is the full power of macros (in which case the syntax we
> choose should be guided by that choice), or we want LESS than the
> full power of macros. If we want less, then HOW less? In other words,
> rather than hearing what we'd like to be able to DO with blocks, I'd
> like to hear what we want to PROHIBIT DOING with blocks. I think this
> might be a fruitful way of thinking about the problem which might
> make it easier to evaluate syntax suggestions. And if the answer is
> that we want to prohibit nothing, then the right solution is macros.

Guido replied:
> I'm personally at a loss understanding your question here. Perhaps
> you could try answering it for yourself?

You guys just think too fast for me. When I started this email, I
replied "Fair enough. One possibility is..." But while I was trying to
condense my thoughts down from 1.5 pages to something short and
coherent (it takes time to write it short), everything I was thinking
became obsolete as both Paul Moore and Jim Jewett did exactly the kind
of thinking I was hoping to inspire:

Paul:
> What I feel is the key concept here is that of injecting code into a
> template form (try...finally, or try..except..else, or whatever)
> [...] Specifically, cases where functions aren't enough. If I try to
> characterise precisely what those cases are, all I can come up with
> is "when the code being injected needs to run in the current scope,
> not in the scope of a template function". Is that right?

Jim:
> Why not just introduce macros? If the answer is "We should, it is
> just hard to code", then use a good syntax for macros. If the answer
> is "We don't want xx sss (S\! 2k3 ] to ever be meaningful", then we
> need to figure out exactly what to prohibit. [...] Do we want to
> limit the changing part (the anonymous block) to only a single suite?
> That does work well with the yield syntax, but it seems like an
> arbitrary restriction unless *all* we want are resource wrappers. Or
> do we really just want a way to say that a function should share its
> local namespace with its caller or callee? In that case, maybe the
> answer is a lexical or same_namespace keyword.

My own opinion is that we DO want macros. I prefer a language to have a
few powerful constructs rather than lots of specialized ones. (Yet I
still believe that doing different things should look different...
which is why I prefer Python to Lisp.) I think that macros could solve
a LOT of problems.

There are lots of things one might want to replace within macros, from
identifiers to punctuation, but I'd be willing to live with just two of
them: expressions, and series-of-statements (that's almost the same as
a block). There are only two places I'd want to be able to USE a macro:
where an expression is called for, and where a series-of-statements is
called for. In both cases, I'd be happy with a function-call-like
syntax for including the macro.

Well, that's a lot of wanting... now all I need to do is invent a
clever syntax that allows these in an elegant fashion while also
solving Guido's point about imports (hint: the answer is that it ALL
happens at runtime). I'll go think some while you guys zoom past me
again. <wink>

--
Michael Chermside
[Python-Dev] Re: Re: Caching objects in memory
Guido:
> But for *immutable* objects (like numbers, strings and tuples) the
> implementation is free to use caching. In practice, I believe ints
> between -5 and 100 are cached, and 1-character strings are often
> cached (but not always). Hope this helps! I would think this is in
> the docs somewhere but probably not in a place where one would ever
> think to look...

I am sure that the fact that immutables *may* be cached is in the ref
manual, but I have been under the impression that the private,
*mutable* specifics for CPython are intentionally omitted so that
people will not think of them as either fixed or as part of the
language/library. I have previously suggested that there be a separate
doc for CPython implementation details like this that some people want
but which are not part of the language or library definition.

Terry J. Reedy
Re: [Python-Dev] defmacro (was: Anonymous blocks)
Robert Brewer wrote:
> So currently, all subclasses just override __set__, which leads to a
> *lot* of duplication of code. If I could write the base class'
> __set__ to call macros like this:
>
>     def __set__(self, unit, value):
>         self.begin()
>         if self.coerce:
>             value = self.coerce(unit, value)
>         oldvalue = unit._properties[self.key]
>         if oldvalue != value:
>             self.pre()
>             unit._properties[self.key] = value
>             self.post()
>         self.end()
>
>     defmacro begin:
>         pass
>     defmacro pre:
>         pass
>     defmacro post:
>         pass
>     defmacro end:
>         pass

Here is a way to write that using anonymous blocks:

    def __set__(self, unit, value):
        with self.setting(unit, value):
            if self.coerce:
                value = self.coerce(unit, value)
            oldvalue = unit._properties[self.key]
            if oldvalue != value:
                with self.changing(oldvalue, value):
                    unit._properties[self.key] = value

    def setting(self, unit, value):
        # begin code goes here
        yield None
        # end code goes here

    def changing(self, oldvalue, newvalue):
        # pre code goes here
        yield None
        # post code goes here

> ...(which would require macro-blocks which were decidedly *not*
> anonymous) then I could more cleanly write a subclass with additional
> macro methods:
>
>     defmacro pre:
>         old_children = self.children()
>     defmacro post:
>         for child in self.children:
>             if child not in old_children:
>                 notify_somebody("New child %s" % child)

    def changing(self, oldvalue, newvalue):
        old_children = self.children()
        yield None
        for child in self.children:
            if child not in old_children:
                notify_somebody("New child %s" % child)

Which do you prefer? I like fewer methods. ;-)

Shane
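Shane's setting/changing generators map directly onto the contextlib.contextmanager decorator from the design that was eventually adopted (PEP 343). The sketch below is a runnable stand-in: the Property/Unit classes and the log list are invented for illustration, with the begin/pre/post/end hooks reduced to log entries; only the shape of __set__ mirrors the thread's example.

```python
# A runnable sketch of the setting/changing pattern, using the later
# contextlib.contextmanager API. Property/Unit and the log list are
# invented for illustration.
from contextlib import contextmanager

class Property(object):
    def __init__(self, key, coerce=None):
        self.key = key
        self.coerce = coerce
        self.log = []

    @contextmanager
    def setting(self, unit, value):
        self.log.append("begin")   # 'begin' code goes here
        yield
        self.log.append("end")     # 'end' code goes here

    @contextmanager
    def changing(self, oldvalue, newvalue):
        self.log.append("pre")     # 'pre' code goes here
        yield
        self.log.append("post")    # 'post' code goes here

    def __set__(self, unit, value):
        with self.setting(unit, value):
            if self.coerce:
                value = self.coerce(unit, value)
            oldvalue = unit._properties[self.key]
            if oldvalue != value:
                with self.changing(oldvalue, value):
                    unit._properties[self.key] = value

class Unit(object):
    _properties = {"name": None}
    name = Property("name")

u = Unit()
u.name = "spam"               # triggers Property.__set__
print(Unit.name.log)          # ['begin', 'pre', 'post', 'end']
print(u._properties["name"])  # spam
```

A subclass can still override just setting or just changing, which is the "fewer methods" trade-off Shane points out.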
RE: [Python-Dev] defmacro (was: Anonymous blocks)
Shane Hathaway wrote:
> Here is a way to write that using anonymous blocks:
>
>     def __set__(self, unit, value):
>         with self.setting(unit, value):
>             if self.coerce:
>                 value = self.coerce(unit, value)
>             oldvalue = unit._properties[self.key]
>             if oldvalue != value:
>                 with self.changing(oldvalue, value):
>                     unit._properties[self.key] = value
>
>     def setting(self, unit, value):
>         # begin code goes here
>         yield None
>         # end code goes here
>
>     def changing(self, oldvalue, newvalue):
>         # pre code goes here
>         yield None
>         # post code goes here
>
> [...]
>
> Which do you prefer? I like fewer methods. ;-)

I still prefer more methods, because my actual use-cases are more
complicated. Your solution would work for the specific case I gave, but
try factoring in:

* A subclass which needs to share locals between begin and post,
  instead of pre and post.

or

* A set of 10 subclasses which need the same begin() but different
  end() code.

Yielding seems both too restrictive and too inside-out to be readable,
IMO.

Robert Brewer
MIS
Amor Ministries
[EMAIL PROTECTED]
Re: [Python-Dev] defmacro (was: Anonymous blocks)
On 4/25/05, Guido van Rossum [EMAIL PROTECTED] wrote:
>> It could also be done (though not as cleanly) by making macros act
>> as import hooks.
>
> Brrr. What about imports that aren't at the top level (e.g. inside a
> function)?

Bad style already. :D If you want to use the macro, you have to ensure
it was already imported. That said, I did say it wasn't as clean; think
of it like pre-caching which dictionary resolved an attribute lookup.
Don't start with the complexity, but consider not making the
optimization impossible.

>> Why not just introduce macros?
>
> Because I've been using Python for 15 years without needing them?

And also without anonymous blocks or generator finalizers or resource
managers.

> Sorry, but "why not add feature X" is exactly what we're trying to
> AVOID here.

If anything is added, it might be better to add a single generalized
tool instead of several special cases -- unless the tool is so general
as to be hazardous. Unlimited macros are that hazardous.

>> If the answer is "We don't want xx sss (S\! 2k3 ] to ever be
>> meaningful", then we need to figure out exactly what to prohibit.
>
> I don't understand what the point is of using an example like
> xx sss (S\! 2k3 ].

The simplest way to implement macros is to add an import hook that can
modify (replace) the string containing the source code. Unfortunately,
that would allow rules like "replace any line starting with 'xx' with
the number 7". Outside of obfuscators, almost no one would do something
quite so painful as that ... but some people would start using regex
substitutions or monkey-patching. I would hate to debug code that fails
because a standard library module is secretly changed (on load, not on
disk where I can grep for it) by another module, which doesn't even
mention that library by name... As Michael said, we have to think about
what transformations we do not want happening out of sight. I would
have said "just use it responsibly" if I hadn't considered pathological
cases like that one.

> [yield works great for a single anonymous block, but not so great for
> several blocks per macro/template.]
>
> Perhaps you've missed some context here? Nobody seems to be able to
> come up with other [than resource wrappers] use cases, that's why
> yield is so attractive.

Sorry; to me it seemed obvious that you would occasionally want to
interleave the macro/template and the variable portion. Robert Brewer
has since provided good examples at

    http://mail.python.org/pipermail/python-dev/2005-April/052923.html
    http://mail.python.org/pipermail/python-dev/2005-April/052924.html

>> Or do we really just want a way to say that a function should share
>> its local namespace with its caller or callee? In that case, maybe
>> the answer is a lexical or same_namespace keyword. Or maybe just a
>> recipe to make exec or eval do the right thing.
>
> But should the same_namespace modifier be part of the call site or
> part of the callee? IMHO, it should be part of the calling site,
> because it is the calling site that could be surprised to find its
> own locals modified. The callee presumably runs through a complete
> call before it has a chance to be surprised.

I did leave the decision open because I'm not certain that
mention-in-caller wouldn't end up contorting a common code style. (It
effectively forces the macro to be in control, and the meaningful code
to be callbacks.)

-jJ
Re: [Python-Dev] Re: anonymous blocks
Paul Moore wrote:
> Hmm, it took me a while to get this, but what you're saying is that
> if you modify Guido's "what I really want" solution to use
>
>     VAR = next(it, exc)
>
> then this builtin next makes API v2 stuff using __next__ work while
> remaining backward compatible with old-style API v1 stuff using 0-arg
> next() (as long as old-style stuff isn't used in a context where an
> exception gets passed back in).

Yes, but it could also be used (almost) anywhere an explicit obj.next()
is used:

    it = iter(seq)
    while True:
        print next(it)

for loops would also change to use builtin next() rather than calling
it.next() directly.

> I'd suggest that the new builtin have a magic name (__next__ being
> the obvious one :-)) to make it clear that it's an internal
> implementation detail.

There aren't many builtins that have magic names, and I don't think
this should be one of them - it has obvious uses other than as an
implementation detail.

> PS The first person to replace builtin __next__ in order to implement
> a "next hook" of some sort, gets shot :-)

Damn! There goes the use case ;)

Tim Delaney
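For the record, a builtin next() did eventually land (Python 2.6 and 3.x), including a two-argument form that returns a default instead of raising StopIteration, close in spirit to the next(it, exc) spelling above:

```python
# The builtin next() as it eventually shipped: one-argument form
# raises StopIteration at the end, two-argument form returns the
# given default instead.
it = iter([1, 2])
print(next(it))           # 1
print(next(it))           # 2
print(next(it, "done"))   # done  (default instead of StopIteration)
```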
Re: [Python-Dev] defmacro (was: Anonymous blocks)
It seems that what you call macros is really an unlimited preprocessor. I'm even less interested in that topic than in macros, and I haven't seen anything here to change my mind. -- --Guido van Rossum (home page: http://www.python.org/~guido/)
[Python-Dev] Re: switch statement
M.-A. Lemburg wrote: Having a simple switch statement would enable writing very fast parsers in Python - ... Instead of having one function call per token, you'd only have a single dict lookup. BTW, has anyone in this thread actually read PEP 275? I haven't actually seen any use cases outside of parsers branching on a constant token. When I see stacked elif clauses, the condition almost always includes some computation (perhaps only .startswith, or in, or a regex match), and there are often cases which look at a second variable. If speed for a limited number of cases is the only advantage, then I would say it belongs in (at most) the implementation, rather than the language spec. -jJ
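The dict-lookup dispatch being alluded to can be sketched as follows; the token kinds and handler functions are made up for illustration:

```python
def handle_name(text):
    return ('name', text)

def handle_number(text):
    return ('number', int(text))

def handle_unknown(text):
    return ('unknown', text)

# One dict lookup per token replaces a stack of elif comparisons.
dispatch = {
    'NAME': handle_name,
    'NUMBER': handle_number,
}

def handle(kind, text):
    return dispatch.get(kind, handle_unknown)(text)
```

This is the pattern that already works today, which is part of why stacked elifs on a constant token are rare in practice.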
Re: [Python-Dev] defmacro (was: Anonymous blocks)
Robert Brewer wrote: I still prefer more methods, because my actual use-cases are more complicated. Your solution would work for the specific case I gave, but try factoring in: * A subclass which needs to share locals between begin and post, instead of pre and post. or * A set of 10 subclasses which need the same begin() but different end() code. Yielding seems both too restrictive and too inside-out to be readable, IMO. Ok, that makes sense. However, one of your examples seemingly pulls a name, 'old_children', out of nowhere. That's hard to fix. One of the greatest features of Python is the simple name scoping; we can't lose that. Shane
Re: [Python-Dev] defmacro (was: Anonymous blocks)
Robert Brewer wrote: Shane Hathaway wrote: Robert Brewer wrote: So currently, all subclasses just override __set__, which leads to a *lot* of duplication of code. If I could write the base class' __set__ to call macros like this:

def __set__(self, unit, value):
    self.begin()
    if self.coerce:
        value = self.coerce(unit, value)
    oldvalue = unit._properties[self.key]
    if oldvalue != value:
        self.pre()
        unit._properties[self.key] = value
        self.post()
    self.end()

defmacro begin:
    pass

defmacro pre:
    pass

defmacro post:
    pass

defmacro end:
    pass

Here is a way to write that using anonymous blocks:

def __set__(self, unit, value):
    with self.setting(unit, value):
        if self.coerce:
            value = self.coerce(unit, value)
        oldvalue = unit._properties[self.key]
        if oldvalue != value:
            with self.changing(oldvalue, value):
                unit._properties[self.key] = value

def setting(self, unit, value):
    # begin code goes here
    yield None
    # end code goes here

def changing(self, oldvalue, newvalue):
    # pre code goes here
    yield None
    # post code goes here

... Which do you prefer? I like fewer methods. ;-) I still prefer more methods, because my actual use-cases are more complicated. Your solution would work for the specific case I gave, but try factoring in: * A subclass which needs to share locals between begin and post, instead of pre and post. or * A set of 10 subclasses which need the same begin() but different end() code. Yielding seems both too restrictive and too inside-out to be readable, IMO.
it seems what you are asking for are functions that are evaluated in the namespace of the caller:
- this seems fragile; the only safe way to implement 'begin' etc. is to know exactly what goes on in __set__ and what names are used there
- even without considering deferred evaluation for exprs or suites passed in as arguments, it seems pretty horrid implementation-wise; if you throw that in too, it gets worse
Notice that even in Common Lisp you cannot really do this; you could define a macro that produces a definition for __set__ and takes fragments corresponding to begin ... etc.
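For what it's worth, the generator-based setting/changing version quoted above maps directly onto what later shipped as contextlib.contextmanager. A runnable sketch, with a made-up Property class that just logs when its pre/post code runs:

```python
from contextlib import contextmanager

class Property:
    # Hypothetical stand-in for the descriptor being discussed.
    def __init__(self):
        self.log = []

    @contextmanager
    def changing(self, oldvalue, newvalue):
        self.log.append(('pre', oldvalue))    # "pre" code runs on entry
        yield
        self.log.append(('post', newvalue))   # "post" code runs on exit

prop = Property()
properties = {}
with prop.changing(None, 42):
    properties['key'] = 42
```

The block body runs between the generator's two halves, which is exactly the inside-out structure Robert objects to.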
Re: [Python-Dev] Re: switch statement
On Mon, 2005-04-25 at 18:20 -0400, Jim Jewett wrote: [...] If speed for a limited number of cases is the only advantage, then I would say it belongs in (at most) the implementation, rather than the language spec. Agreed. I don't find any switch syntaxes better than if/elif/else. Speed benefits belong in implementation optimisations, not new bad syntax. -- Donovan Baarda [EMAIL PROTECTED] http://minkirri.apana.org.au/~abo/
Re: [Python-Dev] Re: switch statement
On Mon, 2005-04-25 at 21:21 -0400, Brian Beck wrote: Donovan Baarda wrote: Agreed. I don't find any switch syntaxes better than if/elif/else. Speed benefits belong in implementation optimisations, not new bad syntax. I posted this 'switch' recipe to the Cookbook this morning; it saves some typing over the if/elif/else construction, and people seemed to like it. Take a look: http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/410692 Very clever... you have shown that current Python syntax is capable of almost exactly replicating a C case statement. My only problem is that C case statements are ugly. A simple if/elif/else is much more understandable to me. The main benefit of case statements in C is that the compiler can optimise them. This copy of a C case statement will be slower than an if/elif/else, and just as ugly :-) -- Donovan Baarda [EMAIL PROTECTED] http://minkirri.apana.org.au/~abo/
Re: [Python-Dev] Re: anonymous blocks
Tim Delaney wrote: There aren't many builtins that have magic names, and I don't think this should be one of them - it has obvious uses other than as an implementation detail. I think there's some confusion here. As I understood the suggestion, __next__ would be the Python name of the method corresponding to the tp_next typeslot, analogously with __len__, __iter__, etc. There would be a builtin function next(obj) which would invoke obj.__next__(), for use by Python code. For loops wouldn't use it, though; they would continue to call the tp_next typeslot directly. Paul Moore wrote: PS The first person to replace builtin __next__ in order to implement a next hook of some sort, gets shot :-) I think he meant next(), not __next__. And it wouldn't work anyway, since as I mentioned above, C code would bypass next() and call the typeslot directly. I'm +1 on moving towards __next__, BTW. IMO, that's the WISHBDITFP. :-) -- Greg Ewing, Computer Science Dept, +--+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | [EMAIL PROTECTED] +--+
Re: [Python-Dev] Re: anonymous blocks
Guido van Rossum wrote:

with VAR = EXPR:
    BODY

This would translate to the following code:

it = EXPR
err = None
while True:
    try:
        if err is None:
            VAR = it.next()
        else:
            VAR = it.next_ex(err)
    except StopIteration:
        break
    try:
        err = None
        BODY
    except Exception, err:  # Pretend "except Exception:" == "except:"
        if not hasattr(it, 'next_ex'):
            raise

I like the general shape of this, but I have one or two reservations about the details. 1) We're going to have to think carefully about the naming of functions designed for use with this statement. If 'with' is going to be in there as a keyword, then it really shouldn't be part of the function name as well. Instead of

with f = with_file(pathname):
    ...

I would rather see something like

with f = opened(pathname):
    ...

This sort of convention (using a past participle as a function name) would work for some other cases as well:

with some_data.locked():
    ...

with some_resource.allocated():
    ...

On the negative side, not having anything like 'with' in the function name means that the fact the function is designed for use in a with-statement could be somewhat non-obvious. Since there's not going to be much other use for such a function, this is a bad thing. It could also lead people into subtle usage traps such as

with f = open(pathname):
    ...

which would fail in a somewhat obscure way. So maybe the 'with' keyword should be dropped (again!) in favour of

with_opened(pathname) as f:
    ...

2) I'm not sure about the '='. It makes it look rather deceptively like an ordinary assignment, and I'm sure many people are going to wonder what the difference is between

with f = opened(pathname):
    do_stuff_to(f)

and simply

f = opened(pathname)
do_stuff_to(f)

or even just unconsciously read the first as the second without noticing that anything special is going on. Especially if they're coming from a language like Pascal, which has a much less magical form of with-statement. So maybe it would be better to make it look more different:

with opened(pathname) as f:
    ...
* It seems to me that this same exception-handling mechanism would be just as useful in a regular for-loop, and that, once it becomes possible to put 'yield' in a try-statement, people are going to *expect* it to work in for-loops as well. Guido has expressed concern about imposing extra overhead on all for-loops. But would the extra overhead really be all that noticeable? For-loops already put a block on the block stack, so the necessary processing could be incorporated into the code for unwinding a for-block during an exception, and little if anything would need to change in the absence of an exception. However, if for-loops also gain this functionality, we end up with the rather embarrassing situation that there is *no difference* in semantics between a for-loop and a with-statement! This could be fixed by making the with-statement not loop, as has been suggested. That was my initial thought as well, but having thought more deeply, I'm starting to think that Guido was right in the first place, and that a with-statement should be capable of looping. I'll elaborate in another post. So a block could return a value to the generator using a return statement; the generator can catch this by catching ReturnFlow. (Syntactic sugar could be VAR = yield ... like in Ruby.) This is a very elegant idea, but I'm seriously worried by the possibility that a return statement could do something other than return from the function it's written in, especially if for-loops also gain this functionality. Intercepting break and continue isn't so bad, since they're already associated with the loop they're in, but return has always been an unconditional get-me-out-of-this-function. I'd feel uncomfortable if this were no longer true. -- Greg Ewing, Computer Science Dept, +--+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. 
| [EMAIL PROTECTED] +--+
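Greg's opened() convention can be sketched with the machinery that eventually shipped as contextlib (the file contents here are made up for the demo):

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def opened(pathname):
    # The wrapper opens the file, hands it to the block, and guarantees
    # it is closed afterwards, even if the block raises.
    f = open(pathname)
    try:
        yield f
    finally:
        f.close()

# Demo with a throwaway file:
fd, path = tempfile.mkstemp()
os.write(fd, b'line one\n')
os.close(fd)
with opened(path) as f:
    first = f.readline()
os.remove(path)
```

After the with-block, the file object is guaranteed closed, which is what distinguishes opened() from a bare open() in this position.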
[Python-Dev] Weekly Python Patch/Bug Summary
Patch / Bug Summary
___

Patches :  316 open ( +2) / 2831 closed ( +7) / 3147 total ( +9)
Bugs    :  908 open (+10) / 4941 closed (+20) / 5849 total (+30)
RFE     :  178 open ( +1) /  153 closed ( +2) /  331 total ( +3)

New / Reopened Patches
__

package_data chops off first char of default package (2005-04-15) http://python.org/sf/1183712 opened by Wummel
[ast] fix for 1183468: return/yield in class (2005-04-16) http://python.org/sf/1184418 opened by logistix
urllib2 dloads failing through HTTP proxy w/ auth (2005-04-18) http://python.org/sf/1185444 opened by Mike Fleetwood
binascii.b2a_qp does not handle binary data correctly (2005-04-18) http://python.org/sf/1185447 opened by Eric Huss
Automatically build fpectl module from setup.py (2005-04-18) http://python.org/sf/1185529 opened by Jeff Epler
Typo in Curses-Function doc (2005-04-20) http://python.org/sf/1186781 opened by grzankam
subprocess: optional auto-reaping fixing os.wait() lossage (2005-04-21) http://python.org/sf/1187312 opened by Mattias Engdegård
Add const specifier to PySpam_System prototype (2005-04-21) http://python.org/sf/1187396 opened by Luis Bruno
Don't assume all exceptions are SyntaxError's (2005-04-25) http://python.org/sf/1189210 opened by John Ehresman

Patches Closed
__

fix typos in Library Reference (2005-04-10) http://python.org/sf/1180062 closed by doerwalter
[AST] Fix for core in test_grammar.py (2005-04-08) http://python.org/sf/1179513 closed by nascheme
Implemented new 'class foo():pass' syntax (2005-04-03) http://python.org/sf/1176019 closed by nascheme
range() in for loops, again (2005-04-12) http://python.org/sf/1181334 closed by arigo
Info Associated with Merge to AST (2005-01-07) http://python.org/sf/1097671 closed by kbk

New / Reopened Bugs
___

Minor error in tutorial (2005-04-14) CLOSED http://python.org/sf/1183274 opened by Konrads Smelkovs
check for return/yield outside function is wrong (2005-04-15) http://python.org/sf/1183468 opened by Neil Schemenauer
try to open /dev/null as directory (2005-04-15) http://python.org/sf/1183585 opened by Roberto A. Foglietta
PyDict_Copy() can return non-NULL value on error (2005-04-15) CLOSED http://python.org/sf/1183742 opened by Phil Thompson
Popen4 wait() fails sporadically with threads (2005-04-15) http://python.org/sf/1183780 opened by Taale Skogan
return val in __init__ doesn't raise TypeError in new-style (2005-04-15) CLOSED http://python.org/sf/1183959 opened by Adal Chiriliuc
dest parameter in optparse (2005-04-15) http://python.org/sf/1183972 opened by ahmado
Missing trailing newline with comment raises SyntaxError (2005-04-15) http://python.org/sf/1184112 opened by Eric Huss
example broken in section 1.12 of Extending Embedding (2005-04-16) http://python.org/sf/1184380 opened by bamoore
Read-only property attributes raise wrong exception (2005-04-16) CLOSED http://python.org/sf/1184449 opened by Barry A. Warsaw
itertools.imerge: merge sequences (2005-04-18) CLOSED http://python.org/sf/1185121 opened by Jurjen N.E. Bos
pydoc doesn't find all module doc strings (2005-04-18) http://python.org/sf/1185124 opened by Kent Johnson
PyObject_Realloc bug in obmalloc.c (2005-04-19) http://python.org/sf/1185883 opened by Kristján Valur
python socketmodule dies on ^c (2005-04-19) CLOSED http://python.org/sf/1185931 opened by nodata
tempnam doc doesn't include link to tmpfile (2005-04-19) http://python.org/sf/1186072 opened by Ian Bicking
[AST] genexps get scoping wrong (2005-04-19) http://python.org/sf/1186195 opened by Brett Cannon
[AST] assert failure on ``eval(u'\Ufffe')`` (2005-04-19) http://python.org/sf/1186345 opened by Brett Cannon
[AST] automatic unpacking of arguments broken (2005-04-19) http://python.org/sf/1186353 opened by Brett Cannon
Python Programming FAQ should be updated for Python 2.4 (2005-02-09) CLOSED http://python.org/sf/1119439 reopened by montanaro
nntplib shouldn't raise generic EOFError (2005-04-20) http://python.org/sf/1186900 opened by Matt Roper
TypeError message on bad iteration is misleading (2005-04-21) http://python.org/sf/1187437 opened by Roy Smith
Pickle with HIGHEST_PROTOCOL ord() expected... (2005-04-22) CLOSED http://python.org/sf/1188175 opened by Heiko Selber
Rebuilding from source on RH9 fails (_tkinter.so missing) (2005-04-22) http://python.org/sf/1188231 opened by Marty Heyman
Python 2.4 Not Recognized by Any Programs (2005-04-23) http://python.org/sf/1188637 opened by Yoshi Nagasaki
zipfile module and 2G boundary (2005-04-24) http://python.org/sf/1189216 opened by Bob Ippolito
Seg Fault when
Re: [Python-Dev] Re: anonymous blocks
Brett C. wrote: And before anyone decries the fact that this might confuse a newbie (which seems to happen with every advanced feature ever dreamed up), remember this will not be meant for a newbie but for someone who has experience in Python and iterators at the minimum, and hopefully with generators. This is dangerously close to the you don't need to know about it if you're not going to use it argument, which is widely recognised as false. Newbies might not need to know all the details of the implementation, but they will need to know enough about the semantics of with-statements to understand what they're doing when they come across them in other people's code. Which leads me to another concern. How are we going to explain the externally visible semantics of a with-statement in a way that's easy to grok, without mentioning any details of the implementation? You can explain a for-loop pretty well by saying something like It executes the body once for each item from the sequence, without having to mention anything about iterators, generators, next() methods, etc. etc. How the items are produced is completely irrelevant to the concept of the for-loop. But what is the equivalent level of description of the with-statement going to say? It executes the body with... ??? And a related question: What are we going to call the functions designed for with-statements, and the objects they return? Calling them generators and iterators (even though they are) doesn't seem right, because they're being used for a purpose very different from generating and iterating. -- Greg Ewing, Computer Science Dept, +--+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | [EMAIL PROTECTED] +--+
Re: [Python-Dev] Re: Re: anonymous blocks
Terry Reedy wrote: Not supporting iterables makes it harder to write a class which is inherently usable in a with block, though. The natural way to make iterable classes is to use 'yield' in the definition of __iter__ - if iter() is not called, then that trick can't be used. If you're defining it by means of a generator, you don't need a class at all -- just make the whole thing a generator function. -- Greg Ewing, Computer Science Dept, +--+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | [EMAIL PROTECTED] +--+
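Greg's point can be illustrated with two equivalent spellings; the squares example is made up:

```python
class Squares:
    # Iterable class whose __iter__ is written with 'yield'.
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        for i in range(self.n):
            yield i * i

def squares(n):
    # If __iter__ is all the class would contain, a plain generator
    # function does the same job with no class at all.
    for i in range(n):
        yield i * i
```

Both produce the same sequence; the class form only earns its keep when there is other state or behaviour to carry.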
Re: [Python-Dev] Re: switch statement
Donovan Baarda wrote: Agreed. I don't find any switch syntaxes better than if/elif/else. Speed benefits belong in implementation optimisations, not new bad syntax. Two things are mildly annoying about if-elif chains as a substitute for a switch statement: 1) Repeating the name of the thing being switched on all the time, and the operator being used for comparison. 2) The first case is syntactically different from subsequent ones, even though semantically all the cases are equivalent. -- Greg Ewing, Computer Science Dept, +--+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | [EMAIL PROTECTED] +--+
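A small made-up example showing both annoyances at once: the switched-on name and the '==' are repeated in every branch, and the first branch is spelled differently from the rest:

```python
def describe(command):
    # 'command ==' is repeated in each test, and 'if' vs 'elif'
    # makes the first (semantically identical) case look special.
    if command == 'start':
        return 'starting'
    elif command == 'stop':
        return 'stopping'
    elif command == 'pause':
        return 'pausing'
    else:
        return 'unknown'
```

A switch statement would name 'command' once and treat all cases uniformly, which is the syntactic (not speed) argument for it.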