[Python-ideas] name for new Enum decorator
Greetings! The Flag type in the enum module has had some improvements, but I find it necessary to move one of those improvements into a decorator instead, and I'm having a hard time thinking up a name. What is the behavior? Well, a name in a flag type can be either canonical (it represents one thing), or aliased (it represents two or more things). To use Color as an example: class Color(Flag): RED = 1# 0001 GREEN = 2 # 0010 BLUE = 4 # 0100 PURPLE = RED | BLUE# 0101 WHITE = RED | GREEN | BLUE # 0111 The flags RED, GREEN, and BLUE are all canonical, while PURPLE and WHITE are aliases for certain flag combinations. But what if we have something like: class Color(Flag): RED = 1# 0001 BLUE = 4 # 0100 WHITE = 7 # 0111 As you see, WHITE is an "alias" for a value that does not exist in the Flag (0010, or 2). That seems like it's probably an error. But what about this? class FlagWithMasks(IntFlag): DEFAULT = 0x0 FIRST_MASK = 0xF FIRST_ROUND = 0x0 FIRST_CEIL = 0x1 FIRST_TRUNC = 0x2 SECOND_MASK = 0xF0 SECOND_RECALC = 0x00 SECOND_NO_RECALC = 0x10 THIRD_MASK = 0xF00 THIRD_DISCARD = 0x000 THIRD_KEEP = 0x100 Here we have three flags (FIRST_MASK, SECOND_MASK, THIRD_MASK) that are aliasing values that don't exist, but it seems intentional and not an error. So, like the enum.unique decorator that can be used when duplicate names should be an error, I'm adding a new decorator to verify that a Flag has no missing aliased values that can be used when the programmer thinks it's appropriate... but I have no idea what to call it. Any nominations? -- ~Ethan~ ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/C4JRARG6RZKRJCD7745JIK6ZMFRYM76M/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 7:24 PM Chris Angelico wrote: > But `len_ = len` does work. However, that doesn't change the > > calculus at all for me. My point wasn't about using the exact same > > variable name. It's that ANY ability to create a local variable that is > > a fast-lookup shortcut for a global one is enough. > > The change of variable name is significant, though. It means that this > is no longer a minor change to the function's header; in order to use > this optimization, you have to replace every use of the name "len" > with "len_" (or "_len"). sure. but this is only worth doing if you are using that name in a tight loop somewhere -- so you only need the special name in one or two places, and I've always put that hack right next to that loop anyway. if you are calling, e.g. len() in hundreds of separate places in one function -- you've got a much larger problem. -CHB -- Christopher Barker, PhD (Chris) Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/UHBNGLJIK5WZLZXGG3TRJKINAUYGXCWX/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
> > My concern about thread safety is about how easy it would be to make it > > thread unsafe accidentally. > > I'm intrigued what gives you the impression that Python functions and > classes are, by default, thread safe. Well, there is thread safe, and there is thread dangerous. I have an enormous amount of code that is not strictly thread safe, but works fine when run under a multi-threaded web server because there are no cases where the same instances of objects are running in different threads. Well, not true, there are many shared function objects (and class objects). I very much assume that those function objects are not changing at run time. But having static variables would totally break that assumption. That would be like mutating class attributes, which, of course, is perfectly possible, but a little harder to do without thinking about it. or don't use shared mutable data. Exactly— and functions are always shared if you are using threads at all. But not mutable if used in the usual way. (Now that I think about it, the suggestions on this thread about putting things in the function namespace makes them mutable — but I at least have never done that. Function local static variables would be no worse in this regard than > existing features: globals, mutable default values, Mutable default values are a notable “gotcha”. classes with > attributes, etc. Yes, I think this is most like class attributes, which are probably less well known as a “gotcha”. But also not that widely used by accident. I don’t think I’ve even seen a student use them. And I sure have seen mutable default values mistakenly used in students' code. Another point -- whether there is a static variable in the function becomes a not obvious part of its API. That's probably the biggest issue -- some library author used static, and all its users may not know that, and then use the code in a multi-threaded application, or frankly in a single threaded application that isn't expecting functions to be mutated. Anyway, agreed — it is very easy to write not-thread safe code in Python now. So maybe this wouldn’t provide significantly more likelihood of it happening accidentally. One of the more disheartening things about the culture of this mailing > list is the way new proposals are held to significantly higher standards > that existing language and stdlib features do not meet. A lot of people (I thought you included) hold the view that new features SHOULD be held to a higher standard. > This is a perfect example: it's not like regular Python functions and > classes are thread-safe by default and this is introducing a new problem > that is almost unique to static variables. no -- but as above, functions themselves are immutable by default in normal usage --that would change. And since the entire point of classes is to hold state and the functions that work with that state in one place, it's expected behaviour that that state changes. Which makes me realize why I never wanted a function static variable -- if I wanted changable state associated with a function, I used a class. And most commonly there was more than one function associated with that state, so a class was the rigth solution anyway. I know that Jack Diederich says: "if a class has only two functions, and one of them is __init__ -- it's not a class", and I totally agree with him, but if you do have a case where you have some state and only one function associated with it, maybe a class IS the right solution. -CHB ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/OX6IYZWC3SE2UEE7KXBPTI7FLHZ7DKW3/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On 2021-05-28 at 12:22:22 +1000, Chris Angelico wrote: > [...] calculate something once and reuse the value, because you know > that it won't change (or don't care if it changes) [...] > (Some day I'll learn how to do this in real life. Why can't I buy just > one egg, and then reuse the same egg for every meal?) Those are mutable eggs. Try immutable eggs instead. Or obtain a hen, aka an egg factory (and a rooster, too, but that's off topic, even for Python Ideas). ObPython: >>> egg = Egg() >>> egg.scramble() >>> egg.fry() Traceback (most recent call last): File "", line 1, in EggStateError: cannot fry a scrambled egg Mutability is the root of all evil. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/7TTGNMQIHZKENGIYASYFBZI3KNJWY6CB/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 12:13 PM Brendan Barnwell wrote: > > On 2021-05-27 13:15, Chris Angelico wrote: > > H let's see. > > > def merge_shortest(things): > > ... len=len > > ... ... > > ... > merge_shortest([]) > > Traceback (most recent call last): > >File "", line 1, in > >File "", line 2, in merge_shortest > > UnboundLocalError: local variable 'len' referenced before assignment > > Okay, yeah, mea culpa. As several people pointed out that doesn't > work. But `len_ = len` does work. However, that doesn't change the > calculus at all for me. My point wasn't about using the exact same > variable name. It's that ANY ability to create a local variable that is > a fast-lookup shortcut for a global one is enough. My point is that > manually creating fast-lookup local-variable shortcuts is inherently a > performance hack and there's no real use in making it slightly > nicer-looking. > The change of variable name is significant, though. It means that this is no longer a minor change to the function's header; in order to use this optimization, you have to replace every use of the name "len" with "len_" (or "_len"). That isn't necessarily a deal-breaker; I've seen code that optimizes method lookups away by retaining the callable (eg "ap = some_list.append"), so there's uses for that kind of rename; but refactoring becomes harder if you have to be aware of whether you're using the optimized version or not. Why should a performance-improving hoist look ugly? It's a perfectly normal thing to do - calculate something once and reuse the value, because you know that it won't change (or don't care if it changes). It's not a "hack" - it's a legit method of saving effort. (Some day I'll learn how to do this in real life. Why can't I buy just one egg, and then reuse the same egg for every meal?) ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/IF5WGNJNQCKYCYMOC7E4NTVJF5WCHJVV/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On 2021-05-27 13:15, Chris Angelico wrote: H let's see. def merge_shortest(things): ... len=len ... ... ... merge_shortest([]) Traceback (most recent call last): File "", line 1, in File "", line 2, in merge_shortest UnboundLocalError: local variable 'len' referenced before assignment Okay, yeah, mea culpa. As several people pointed out that doesn't work. But `len_ = len` does work. However, that doesn't change the calculus at all for me. My point wasn't about using the exact same variable name. It's that ANY ability to create a local variable that is a fast-lookup shortcut for a global one is enough. My point is that manually creating fast-lookup local-variable shortcuts is inherently a performance hack and there's no real use in making it slightly nicer-looking. -- Brendan Barnwell "Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail." --author unknown ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/DPVT5ORXCDRHJXTLUVSYAFXJ3Y7A6VVM/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 02:02:11PM -0700, Christopher Barker wrote: > My concern about thread safety is about how easy it would be to make it > thread unsafe accidentally. I'm intrigued what gives you the impression that Python functions and classes are, by default, thread safe. The FAQ is a little bit misleading: https://docs.python.org/3/faq/library.html#what-kinds-of-global-value-mutation-are-thread-safe While it is true that builtin operations like list.append are thread safe, as soon as you have two of them, the compound operation is no longer thread safe unless guarded with a lock. L.append(x) # thread-safe L.append(y) # L may have already been modified by another thread assert L[-2:] == (x, y) # may fail And if L is a subclass of list or duck-typed, all bets are off. Even L.append may not be safe, if L is something other than an actual list. Of course, there is a simple solution to that (apart from locks): don't use threads, or don't use shared mutable data. It is only the combination of concurrency with shared mutable state that is problematic. Remove either of those, and you're cool. Function local static variables would be no worse in this regard than existing features: globals, mutable default values, classes with attributes, etc. I'm not sure about closures and thread-safety, but closures come with their own pitfalls: https://docs.python-guide.org/writing/gotchas/#late-binding-closures > Sure, global is not thread safe, but it is well known that use of global > is, to newbies, “bad”, and to more experienced programmers, “to be used > with caution, understanding the risks”. Is it well-known that writing classes is "bad", "to be used with caution, understanding the risks"? https://stackoverflow.com/questions/8309902/are-python-instance-variables-thread-safe One of the more disheartening things about the culture of this mailing list is the way new proposals are held to significantly higher standards that existing language and stdlib features do not meet. This is a perfect example: it's not like regular Python functions and classes are thread-safe by default and this is introducing a new problem that is almost unique to static variables. Regular Python functions and classes are almost never thread-safe unless carefully written to be, which most people don't bother to do unless they specifically care about thread safety, which most people don't. Any time you have a function with state, then it requires care to make it thread-safe. Doesn't matter whether that storage is a global, mutable defaults, instance or class attributes, or these hypothetical static variables. Pure functions with no state or side-effects are thread-safe, but beyond that, every non-builtin, and some builtins, should be assumed to be unsafe unless carefully designed for concurrent use. It's not always obvious either: print(x) Not thread-safe. Two threads can write to stdout simultaneously, interleaving their output. -- Steve ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/DNDSNPOF57TOWACQDG72HRZHVN3OIBT4/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On 2021-05-27 21:02, Brendan Barnwell wrote: On 2021-05-27 12:33, Chris Angelico wrote: With statics, you could write it like this: def merge_shortest(things): static len=len ... Simple. Easy. Reliable. (And this usage would work with pretty much any of the defined semantics.) There's no more confusion. You can already do that: def merge_shortest(things): len=len ... No, that raises an UnboundLocalError exception. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/CRVOQZMZJFLLCGUHRJ3LXIJIIV4UIYCM/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 7:04 AM Christopher Barker wrote: > > My concern about thread safety is about how easy it would be to make it > thread unsafe accidentally. > > Sure, global is not thread safe, but it is well known that use of global is, > to newbies, “bad”, and to more experienced programmers, “to be used with > caution, understanding the risks”. > > But particularly if static provides a performance boost, people will be very > tempted to use it without considering the implications. > > If people want a high performance local constant— that sounds something like > the constant proposal the OP brought up earlier. > Variable statics are no less thread safe than globals are (nor any more thread safe). They behave virtually identically. Constant statics are completely thread safe. If you're doing it just for the performance improvement, there's no way that threading can possibly affect it. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/TXCHUD5WAGZBYVPACDY2TFHZSHL6DWZJ/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 01:02:15PM -0700, Brendan Barnwell wrote: > You can already do that: > > def merge_shortest(things): > len=len > ... > > Yes, it does require a single global lookup on each function call, > but if that's really a bottleneck for you I don't think there's much > hope. > :-) It's not so much the single global lookup on each function call as the fact that you can't do that at all :-( UnboundLocalError: local variable 'len' referenced before assignment [...] > Sorry, I was a bit vague there. What I was envisioning is that you > would specify len as a constant at the GLOBAL level, meaning that all > functions in the module could always assume it referred to the same > thing. (It's true this might require something different from what was > proposed in the other thread about constants.) Getting actual constants that the interpreter can trust will not change is likely to be a much bigger language change than taking advantage of existing mechanisms that already exist in functions, with perhaps a little extra work, to get per function local static storage. But even if we did have actual constants, how does that help get static *variables*, you know, things that aren't constant but can vary? -- Steve ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/IDDOIZTBSW3LSMGOLXCUQCKIQ2R6J42H/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
> > I am confused why you are okay with > @decorator var: bool > > but not > > @decorator var > > Yes, a bare name is currently an error while just a name and a type hint > is valid, > but the latter doesn't bind anything to the name, and using that > identifier is still > a NameError. So a decorator presumably can't return a value for either (or > it > could, but it would always be dropped). What could it do with a name and a > type hint that is so much better than just a name? > To reiterate my point from earlier in the thread, I am quite firmly opposed to having the decorator on the same line as the statement being decorated. I'll just copy-paste my reasoning for it: I like this: > > @typing.TypeVar > T = str, bytes > > > about a million times better than: > > @typing.TypeVar T = str, bytes > > > Because the latter feels (to me) too similar for comfort to this: > > int foo = 3 > > > Which is in my mind not very pythonic. Also, my brain just has an easier > time parsing the multiline version than the single-line one (though I > concede that a combination of familiarity and syntax highlighting would > solve that issue eventually). > > It also represents an asymmetry between the syntax of the proposed > assignment decorators and the syntax for function and class decorators. > > And finally, it doesn't cleanly accommodate use-cases like those proposed > by Stéfane in the previous thread: > > @acl(READ, WRITE) > @constraint(10 < _ < 100) > @not_null > @indexed > @depends_on(whatever) > @inject > first_name: str > > > Whereas the multiline variant does. > And regarding your proposal to relax the newline requirement on function/class decorators: I got the sense that people both liked reading my examples of same-line > decorators > and pushed back against not making them appear just like > function decorators. One > way to have my cake and you eat it too would be to relax the current > decorator grammar > to not require a NEWLINE. AFAICT there would be no ambiguity since after a > decorator > there must either be another "@" or a "def". Then both function and > assignment > decorating can be both. These would all be possible and not change the > status quo. > > @cache def factorial(n): > pass > This would admittedly resolve the asymmetry between the proposed single-line variable decorator syntax and current decorators, but personally I just *really* don't like it. The only benefit is saving a line, but it comes at (in my opinion, your mileage may vary) a huge cost to legibility, and it goes against the zen: There should be one-- and preferably only one --obvious way to do it. And it's just something that has (almost) no precedent in python (aside from `async def`). It has the feel of something like: public static final native abstract void SomeMethod() {...} I'm imagining a future of reading python code like: @too @many("!") @decorators @on("a", "single", "line") def foo(...): And it makes me unhappy :( So going back to your original point of why I'm okay with decorating a bare type-hint, like: @decorate foo: int But not @decorate foo The reason is simply that if you take the decorator away the first one is legal, and the second one will raise `NameError`. I'm happy to decorate a statement that is valid on its own, but I'm against the idea of special-casing decorator syntax so that it can decorate otherwise-invalid statements. And I do think that there are legitimate uses for decorating a bare type-hint, since it does actually contain useful information the decorator might want to capture (for example, in one of its own instance attributes if the decorator is an object rather than a function, which it would have to be in order to implement __decoration_call__). You're right in that it wouldn't be able to return anything since no assignment is taking place, but there are still potential use-cases for it. I'll concede that there are also use-cases for decorating a completely naked name, but none of the ones I've seen so far seem to me compelling enough to break the rules like this. On Thu, May 27, 2021 at 8:43 PM Ricky Teachey wrote: > Whoops, replying all this time. > > On Thu, May 27, 2021 at 2:32 PM micro codery wrote: > >> >> >> On Thu, May 27, 2021 at 10:40 AM Matt del Valle >> wrote: >> >> I am still very confused as to the scope of this counter proposal re >> variable >> decorating. I have only seen two such examples here >> >> @decorator variable >> # variable = decorator.__decoration_call__(None, "variable") >> >> @decorator variable = "spam" >> # variable = decorator.__decoration_call__(variable, "variable") >> >> But what is actually valid to follow a decorator in this proposal? >> Any simple expression, any expression? Is it limited to assignment >> espressions? >> > > At this point, I have in mind any expression that appears to the right, > which I believe is what is allowed today: > > @1/2 "lala" and money > def func(): ... > > >> Here are some interesting uses that
[Python-ideas] Re: Add static variable storage in functions
My concern about thread safety is about how easy it would be to make it thread unsafe accidentally. Sure, global is not thread safe, but it is well known that use of global is, to newbies, “bad”, and to more experienced programmers, “to be used with caution, understanding the risks”. But particularly if static provides a performance boost, people will be very tempted to use it without considering the implications. If people want a high performance local constant— that sounds something like the constant proposal the OP brought up earlier. -CHB On Thu, May 27, 2021 at 12:18 PM Brendan Barnwell wrote: > On 2021-05-27 05:18, Steven D'Aprano wrote: > > On Thu, May 27, 2021 at 07:56:16AM -, Shreyan Avigyan wrote: > > > >> Lot of programming languages have something known as static variable > >> storage in *functions* not *classes*. Static variable storage means a > >> variable limited to a function yet the data it points to persists > >> until the end of the program. > > > > +1 on this idea. > > > > One common use for function defaults is to optimize function lookups to > > local variables instead of global or builtins: > > > > def func(arg, len=len): > > # now len is a fast local lookup instead of a slow name lookup > > > > Benchmarking shows that this actually does make a significant difference > > to performance, but it's a technique under-used because of the > > horribleness of a len=len parameter. > > > > (Raymond Hettinger is, I think, a proponent of this optimization trick. > > At least I learned it from his code.) > > I don't see this as a great motivation for this feature. If the > goal > is to make things faster I think that would be better served by making > the interpreter smarter or adding other global-level optimizations. As > it is, you're just trading one "manual" optimization (len=len) for > another (static len). > > Yes, the new one is perhaps slightly less ugly, but it still puts > the > onus on the user to manually "declare" variables as local, not because > they are semantically local in any way, but just because we want a > faster lookup. I see that as still a hack. A non-hack would be some > kind of JIT or optimizing interpreter that actually reasons about how > the variables are used so that the programmer doesn't have to waste time > worrying about hand-tuning optimizations like this. So basically for me > anything that involves the programmer saying "Please make this part > faster" is a hack. :-) We all want everything to be as fast as possible > all the time, and in sofar as we're concerned about speed we should > focus on making the entire interpreter smarter so everything is faster, > rather than adding new ways for the programmer do extra work to make > just a few things faster. > > Even something like a way of specifying constants (which has been > proposed in another thread) would be better to my eye. That would let > certain variables be marked as "safe" so that they could always be > looked up fast because we'd be sure they're never going to change. > > As to the original proposal, I'm not in favor of it. It's fairly > uncommon for me to want to do this, and in the cases where I do, Python > classes are simple enough that I can just make a class with a method (or > a __call__ if I want to be really cool) that stores the data in a way > that's more transparent and more clearly connected to the normal ways of > storing state in Python. It just isn't worth adding yet another > complexity to the language for this minor use case. > > -- > Brendan Barnwell > "Do not follow where the path may lead. Go, instead, where there is no > path, and leave a trail." > --author unknown > ___ > Python-ideas mailing list -- python-ideas@python.org > To unsubscribe send an email to python-ideas-le...@python.org > https://mail.python.org/mailman3/lists/python-ideas.python.org/ > Message archived at > https://mail.python.org/archives/list/python-ideas@python.org/message/AT5HA2Y7ADBCXSYRQ5NWZZM4VJLS4TVE/ > Code of Conduct: http://python.org/psf/codeofconduct/ > -- Christopher Barker, PhD (Chris) Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/Q2ELTC6ZFZSNUT4VVBFKE5TLANNMMYBH/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 6:04 AM Brendan Barnwell wrote: > > On 2021-05-27 12:33, Chris Angelico wrote: > > With statics, you could write it like this: > > > > def merge_shortest(things): > > static len=len > > ... > > > > Simple. Easy. Reliable. (And this usage would work with pretty much > > any of the defined semantics.) There's no more confusion. > > You can already do that: > > def merge_shortest(things): > len=len > ... > > Yes, it does require a single global lookup on each function call, but > if that's really a bottleneck for you I don't think there's much hope. :-) H let's see. >>> def merge_shortest(things): ... len=len ... ... ... >>> merge_shortest([]) Traceback (most recent call last): File "", line 1, in File "", line 2, in merge_shortest UnboundLocalError: local variable 'len' referenced before assignment There are languages in which you're allowed to do this (using a name in an initializer to fetch from a parent scope), but Python isn't one of them. At best, you could write "_len=len", but then you have to rewrite the function body to use _len, leaving the question of "why is this _len and not len?" for every future maintainer. Since a static declaration is evaluated at function definition time (just like a default argument is), this problem doesn't come up, because the local name "len" won't exist at that point. > >> Even something like a way of specifying constants (which has been > >> proposed in another thread) would be better to my eye. That would let > >> certain variables be marked as "safe" so that they could always be > >> looked up fast because we'd be sure they're never going to change. > > > > Question: When does this constant get looked up? > > > > def merge_shortest(things): > > constant len=len > > ... > > > > Is it looked up as the function begins execution, or when the function > > is defined? How much are you going to assume that it won't change? > > Sorry, I was a bit vague there. What I was envisioning is that you > would specify len as a constant at the GLOBAL level, meaning that all > functions in the module could always assume it referred to the same > thing. (It's true this might require something different from what was > proposed in the other thread about constants.) > Gotcha, gotcha. I think module-level constants could *also* be useful, but they're orthogonal to this proposal. Unless it's a compile-time constant (so, as the module gets imported, all references to "len" become LOAD_CONST of whatever object was in the builtins at that point), I doubt it would have the same performance benefits, and it obviously couldn't handle the mutable statics use-case. I think there are very good use-cases for module-level constants, but the trouble is, there are so many variants of the idea and so many not-quite-overlapping purposes that they can be put to :) ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/I6L52IIK7LUWSN26WRU3653KXHGZ4YCL/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On 2021-05-27 12:33, Chris Angelico wrote: With statics, you could write it like this: def merge_shortest(things): static len=len ... Simple. Easy. Reliable. (And this usage would work with pretty much any of the defined semantics.) There's no more confusion. You can already do that: def merge_shortest(things): len=len ... Yes, it does require a single global lookup on each function call, but if that's really a bottleneck for you I don't think there's much hope. :-) Even something like a way of specifying constants (which has been proposed in another thread) would be better to my eye. That would let certain variables be marked as "safe" so that they could always be looked up fast because we'd be sure they're never going to change. Question: When does this constant get looked up? def merge_shortest(things): constant len=len ... Is it looked up as the function begins execution, or when the function is defined? How much are you going to assume that it won't change? Sorry, I was a bit vague there. What I was envisioning is that you would specify len as a constant at the GLOBAL level, meaning that all functions in the module could always assume it referred to the same thing. (It's true this might require something different from what was proposed in the other thread about constants.) -- Brendan Barnwell "Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail." --author unknown ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/KFCOPJYUUDTOZ22VVC23ETJJBIW265VI/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
Reply to Chris: Also it's rarely the case where it can become thread unsafe suddenly. 1 / 10*something chances. Because I've repeatedly run a thread-unsafe code and have not encountered thread unsafe state yet. GIL executes the code to a very good extent. And is it hypothetically even possible to have thread unsafe state that can affect functions? Because locals of the different functions are different. Since the variable will be loaded by LOAD_FAST it will look into locals for the static variable and both locals will differ. The only dangerous code is op=. Because this depends on the current value of static variable that can make the function go to an undefined state. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/T5234WEZ2HJZVLJNYAV565IGRSVYQLIC/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
Whoops, replying all this time. On Thu, May 27, 2021 at 2:32 PM micro codery wrote: > > > On Thu, May 27, 2021 at 10:40 AM Matt del Valle > wrote: > > I am still very confused as to the scope of this counter proposal re > variable > decorating. I have only seen two such examples here > > @decorator variable > # variable = decorator.__decoration_call__(None, "variable") > > @decorator variable = "spam" > # variable = decorator.__decoration_call__(variable, "variable") > > But what is actually valid to follow a decorator in this proposal? > Any simple expression, any expression? Is it limited to assignment > espressions? > At this point, I have in mind any expression that appears to the right, which I believe is what is allowed today: @1/2 "lala" and money def func(): ... > Here are some interesting uses that were brought up > in the other thread and I would like to know how they would work. > > @decorator > spam = eggs = cheese = "tomatoes" > > @decorator > spam, eggs, cheese = "tomatoes" > > @decorator > spam, eggs = cheese = "tomatoes" > > @decorator > spam = (eggs := "cheese) > > @decorator > locals()[find_it() or "default"] = spam() > > > Regards, > ~Jeremiah > I would also like to know how all of these work :D I am not sure about most of them but open to suggestions. The only one that I feel confident about is: @decorator spam = (eggs := "cheese) ...which, I think, should be: decorator.__decoration_call__(spam, "spam") Unfortunately for the proposal most people don't seem too thrilled with it. So I don't plan to spend a lot of time thinking through these examples and suggesting behavior. Anyone is welcome to do that though, this isn't MINE in the sense I am jealously guarding ownership of the details. :) On Thu, May 27, 2021 at 3:03 PM Brendan Barnwell wrote: > On 2021-05-26 09:43, Ricky Teachey wrote: > > These two ideas of a decorator syntax result are not the same: > > > > RESULT A: function decorator > > # func = decorator("spam")(func) > > > > RESULT B: variable decorator > > # name = decorator("spam")("name") > > > > ...because func is passed as an object, but "name" a string representing > > the name of the object. Two very different things. > > > > For this reason I think I would agree even more so that the differences > > in the decorator behavior would be an extremely significant point of > > confusion. > > > > This got me to thinking: what if access to the variable name were > > provided by another means, and ONLY when the decorator syntax is > employed? > > This seems contradictory to me. It looks like you're saying, "We > shouldn't use decorator syntax to represent two different things (object > vs name), but instead decorator syntax should give us access to two > different things (object vs name)." I realize based on your proposal > there is a distinction here but I think it's quite a narrow one and > doesn't resolve the basic problem, which is that currently decorators > operate on objects and these new proposals are about making them operate > on names. > > I think there may be value in having some feature that lets us get > access to the name side of an assignment. But I wouldn't call such a > thing a "decorator", nor would I want to use the same @ syntax that is > used for decorators. To me that would be confusing, because the > behavior is totally different. Even with your __decorator_call__ > proposal, there's still a jarring shift from, in some cases, using just > the object, and in other cases stuffing a new parameter (the name) into > the parameter list. That seems awkward to me. > > > -- > Brendan Barnwell > Yes, and for this reason I really liked Steve's googly eyes proposal in the other thread. But I wonder if there value in specifically giving decorators access to the name side? It seems to me that it would open up a lot of possibilities, just as when descriptors learned their names. class Desc: def __set_name__(self, owner, name): self.name = name def __get__(self, instance, owner): if instance is None: pass print(f"I am { owner.__name__}.{self.name}") class C: v = Desc() >>> C().v I am C.v We could make the __decoration_call__ method even more powerful. We could give it access not just to the name side, but to the type info, and even the code object/expression side (i.e., RHS). @decorator x: Fraction = 1/2 # decorator.__decoration_call__(x, "x", "1/2", Fraction) So, for example, a Math library can create the feature: @Math x: Fraction = 1/2 And x is: Fraction(1, 2) Yes, you can do: x = Fraction("1/2") ...today. I get that. But it's not as if this Math example is the ONLY thing it allows you to do. You also can easily avoid repeating things and making mistakes: @namedtuple_factory Point = "x y" I'm sure there are many other things I haven't thought of. --- Ricky. "I've never met a Kentucky man who wasn't either thinking about going home or actually going home." -
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 5:25 AM Shreyan Avigyan wrote: > > Chris wrote: > > This is thread-safe: > > > > from threading import Lock > > > > lock = Lock() > > counter = 0 > > def get_next(): > >with lock: > >global counter > >counter += 1 > >my_counter = counter > > This is a great workaround. I can try to improve this. But first of all > should we depend on the user to do this locking? I don't think so. So is it > possible to implement this in the background without affecting current > performance? > No, you can't, because it's impossible to know when you're done mutating. However, if the mutation is inherently atomic - or if subsequent lookups don't require atomicity - then the lock becomes unnecessary, and your code will be thread-safe already. (If you use async functions or recursion but no threads, then every yield point becomes explicit in the code, and you effectively have a lock that governs every block of code between those points. But that has many many other implications. Point is, statics should be compatible with ALL use-cases, and that shouldn't be difficult.) ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/UDCI23RMBFGW27U3VUAUAEDYAI2XRH7A/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 5:19 AM Brendan Barnwell wrote: > > On 2021-05-27 05:18, Steven D'Aprano wrote: > > On Thu, May 27, 2021 at 07:56:16AM -, Shreyan Avigyan wrote: > > > >> Lot of programming languages have something known as static variable > >> storage in *functions* not *classes*. Static variable storage means a > >> variable limited to a function yet the data it points to persists > >> until the end of the program. > > > > +1 on this idea. > > > > One common use for function defaults is to optimize function lookups to > > local variables instead of global or builtins: > > > > def func(arg, len=len): > > # now len is a fast local lookup instead of a slow name lookup > > > > Benchmarking shows that this actually does make a significant difference > > to performance, but it's a technique under-used because of the > > horribleness of a len=len parameter. > > > > (Raymond Hettinger is, I think, a proponent of this optimization trick. > > At least I learned it from his code.) > > I don't see this as a great motivation for this feature. If the goal > is to make things faster I think that would be better served by making > the interpreter smarter or adding other global-level optimizations. As > it is, you're just trading one "manual" optimization (len=len) for > another (static len). > > Yes, the new one is perhaps slightly less ugly, but it still puts the > onus on the user to manually "declare" variables as local, not because > they are semantically local in any way, but just because we want a > faster lookup. I see that as still a hack. A non-hack would be some > kind of JIT or optimizing interpreter that actually reasons about how > the variables are used so that the programmer doesn't have to waste time > worrying about hand-tuning optimizations like this. If you're doing a lot of length checks, the standard semantics of Python demand that the name 'len' be looked up every time it's called. That's expensive - first you check the module globals, then you check the builtins. With some sort of local reference, the semantics change: now the name 'len' is looked up once, and the result is cached. That means that creating globals()["len"] in the middle of the function will no longer affect its behaviour. An optimizing compiler that did this would be a nightmare. Explicitly choosing which names to retain means that the programmer is in control. The biggest problem with the default argument trick is that it makes it look as if those arguments are part of the function's API, where they're really just an optimization. Consider: def shuffle(things, *, randrange=random.randrange): ... def merge_shortest(things, *, len=len): ... Is it reasonable to pass a different randrange function to shuffle()? Absolutely! You might have a dedicated random.Random instance (maybe a seeded PRNG for reproducible results). Is it reasonable to pass a different len function to merge_shortest()? Probably not - it looks like it's probably an optimization. Yes, you could say "_len=len", but now your optimization infects the entire body of the function, instead of being a simple change in the function header. With statics, you could write it like this: def merge_shortest(things): static len=len ... Simple. Easy. Reliable. (And this usage would work with pretty much any of the defined semantics.) There's no more confusion. > So basically for me > anything that involves the programmer saying "Please make this part > faster" is a hack. :-) We all want everything to be as fast as possible > all the time, and in sofar as we're concerned about speed we should > focus on making the entire interpreter smarter so everything is faster, > rather than adding new ways for the programmer do extra work to make > just a few things faster. It's never a bad thing to make the interpreter smarter and faster, if it can be done without semantic changes. (Mark Shannon has some current plans that, I believe, fit that description.) This is different, though - the behaviour WILL change, so it MUST be under programmer control. > Even something like a way of specifying constants (which has been > proposed in another thread) would be better to my eye. That would let > certain variables be marked as "safe" so that they could always be > looked up fast because we'd be sure they're never going to change. Question: When does this constant get looked up? def merge_shortest(things): constant len=len ... Is it looked up as the function begins execution, or when the function is defined? How much are you going to assume that it won't change? > As to the original proposal, I'm not in favor of it. It's fairly > uncommon for me to want to do this, and in the cases where I do, Python > classes are simple enough that I can just make a class with a method (or > a __call__ if I want to be really cool) that stores the data in a way > that's more transparent and more clearly connected
[Python-ideas] Re: Add static variable storage in functions
Chris wrote: > This is thread-safe: > > from threading import Lock > > lock = Lock() > counter = 0 > def get_next(): >with lock: >global counter >counter += 1 >my_counter = counter This is a great workaround. I can try to improve this. But first of all should we depend on the user to do this locking? I don't think so. So is it possible to implement this in the background without affecting current performance? ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/MMC4MFCMWH2VZXKKAKVC2NFQJENOMI74/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On 2021-05-27 05:18, Steven D'Aprano wrote: On Thu, May 27, 2021 at 07:56:16AM -, Shreyan Avigyan wrote: Lot of programming languages have something known as static variable storage in *functions* not *classes*. Static variable storage means a variable limited to a function yet the data it points to persists until the end of the program. +1 on this idea. One common use for function defaults is to optimize function lookups to local variables instead of global or builtins: def func(arg, len=len): # now len is a fast local lookup instead of a slow name lookup Benchmarking shows that this actually does make a significant difference to performance, but it's a technique under-used because of the horribleness of a len=len parameter. (Raymond Hettinger is, I think, a proponent of this optimization trick. At least I learned it from his code.) I don't see this as a great motivation for this feature. If the goal is to make things faster I think that would be better served by making the interpreter smarter or adding other global-level optimizations. As it is, you're just trading one "manual" optimization (len=len) for another (static len). Yes, the new one is perhaps slightly less ugly, but it still puts the onus on the user to manually "declare" variables as local, not because they are semantically local in any way, but just because we want a faster lookup. I see that as still a hack. A non-hack would be some kind of JIT or optimizing interpreter that actually reasons about how the variables are used so that the programmer doesn't have to waste time worrying about hand-tuning optimizations like this. So basically for me anything that involves the programmer saying "Please make this part faster" is a hack. :-) We all want everything to be as fast as possible all the time, and in sofar as we're concerned about speed we should focus on making the entire interpreter smarter so everything is faster, rather than adding new ways for the programmer do extra work to make just a few things faster. Even something like a way of specifying constants (which has been proposed in another thread) would be better to my eye. That would let certain variables be marked as "safe" so that they could always be looked up fast because we'd be sure they're never going to change. As to the original proposal, I'm not in favor of it. It's fairly uncommon for me to want to do this, and in the cases where I do, Python classes are simple enough that I can just make a class with a method (or a __call__ if I want to be really cool) that stores the data in a way that's more transparent and more clearly connected to the normal ways of storing state in Python. It just isn't worth adding yet another complexity to the language for this minor use case. -- Brendan Barnwell "Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail." --author unknown ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/AT5HA2Y7ADBCXSYRQ5NWZZM4VJLS4TVE/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
On 2021-05-26 09:43, Ricky Teachey wrote: These two ideas of a decorator syntax result are not the same: RESULT A: function decorator # func = decorator("spam")(func) RESULT B: variable decorator # name = decorator("spam")("name") ...because func is passed as an object, but "name" a string representing the name of the object. Two very different things. For this reason I think I would agree even more so that the differences in the decorator behavior would be an extremely significant point of confusion. This got me to thinking: what if access to the variable name were provided by another means, and ONLY when the decorator syntax is employed? This seems contradictory to me. It looks like you're saying, "We shouldn't use decorator syntax to represent two different things (object vs name), but instead decorator syntax should give us access to two different things (object vs name)." I realize based on your proposal there is a distinction here but I think it's quite a narrow one and doesn't resolve the basic problem, which is that currently decorators operate on objects and these new proposals are about making them operate on names. I think there may be value in having some feature that lets us get access to the name side of an assignment. But I wouldn't call such a thing a "decorator", nor would I want to use the same @ syntax that is used for decorators. To me that would be confusing, because the behavior is totally different. Even with your __decorator_call__ proposal, there's still a jarring shift from, in some cases, using just the object, and in other cases stuffing a new parameter (the name) into the parameter list. That seems awkward to me. -- Brendan Barnwell "Do not follow where the path may lead. Go, instead, where there is no path, and leave a trail." --author unknown ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/JNJNBYMNH43TX575WWB6R726C6V2CRXW/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 4:49 AM Shreyan Avigyan wrote: > > Reply to Chris: > > The only problem is that with that approach that we can't understand if > that's the last yield statement. To achieve that we need to keep going until > we encounter a StopIteration. And the value of x would 3. Because we're not > iterating over a particular generator. We're creating multiple instances > which actually would increase x. > > And also is there another way we can make it thread safe? Steven's idea is > actually the only solution we've encountered till now. I'd be really happy if > someone could come up with even a better idea. > Also - Steven's idea is NOT a solution. It worsens the problem. I don't see how it is at all a solution. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/WWK7YMX36TSJFUURARNIVKMO3GO3X3XI/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 4:49 AM Shreyan Avigyan wrote: > > Reply to Chris: > > The only problem is that with that approach that we can't understand if > that's the last yield statement. To achieve that we need to keep going until > we encounter a StopIteration. And the value of x would 3. Because we're not > iterating over a particular generator. We're creating multiple instances > which actually would increase x. > > And also is there another way we can make it thread safe? Steven's idea is > actually the only solution we've encountered till now. I'd be really happy if > someone could come up with even a better idea. > This is thread-safe: from threading import Lock lock = Lock() counter = 0 def get_next(): with lock: global counter counter += 1 my_counter = counter ... ... ... The equivalent with a static counter and a static Lock object would also be thread-safe under my proposed semantics. This is guaranteed, because ALL mutation happens while the lock is held, and then there's a stack-local variable for the value you want to use. The language promises that the assignment back to the global happens immediately, not at some later point, after the lock has been released. This is, in fact, the normal expectation of locks and assignment, and it works whether you're using a global, a closure (and a nonlocal assignment), a mutable function default argument, an attribute on the function object, or in fact, *any other assignment in the language*. They all happen immediately. You would have to ensure that you don't have a yield inside the locking context, but anywhere else would be fine. The biggest downside of this sort of system is the overhead of the locking. A high-performance thread-aware static system should be able to avoid some or even most of that overhead. But mainly, the semantics have to be such that locks behave sanely, and can be a solution to other problems. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/4FZD2HBRHB7XJPQBZRFOEXT5PMS4PYY4/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
Reply to Chris: The only problem is that with that approach that we can't understand if that's the last yield statement. To achieve that we need to keep going until we encounter a StopIteration. And the value of x would 3. Because we're not iterating over a particular generator. We're creating multiple instances which actually would increase x. And also is there another way we can make it thread safe? Steven's idea is actually the only solution we've encountered till now. I'd be really happy if someone could come up with even a better idea. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/63VEAJCXRA7MODB57JOCIPNGXXCIAML3/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
On Thu, May 27, 2021 at 10:40 AM Matt del Valle wrote: > Bikesheddable, but I don't know why having these two be equivalent: >> >> @decorator var >> @decorator var = None >> >> ..would be a problem. Having an implied default of None for var above >> makes sense to my brain. Do you have an example in mind where you think it >> would create a problem? >> > > I don't have anything immediately in mind, no, but I think that given the > semantics of those two statements are very different, it is at least > worthwhile to allow the decorator to know which of them is actually > happening (is `None` being assigned to the name or is it maybe just a type > hint for that name?). Keep in mind that if no assignment is happening then > there is no need to return anything from the decorator, since it will just > be lost anyway. > > And I'm pretty much 100% positive that even if I can't think of a use-case > off the top of my head, there will eventually be a library author (if this > proposal is accepted) who will have some cool idea that will require > distinguishing these two scenarios. Like for example if this were used for > a CLI-building library (think something like Typer) how the assignment to > `None` could signify that it is an option with a default value of `None`, > whereas the bare name would signify that it is a mandatory argument. > I agree that if we allow both then they should be distinguished somehow. They are different not just in value but in existence since on the next line down "var" will either be just fine or raise a NameError. If the decorator is expecting for some reason to reach into its parent scope and modify this variable, those are two very different situations. > Oh, and I think I've just discovered another thing that I'm not 100% sure > I like. Even putting aside that I'm not a fan of decorators on the same > line as the statement they are decorating (as I mentioned in an earlier > response), you've got examples of variable decorators where no assignment > is happening such as: > > @decorator var > > To me this breaks the symmetry between function decorators, which always > decorate a function definition (an implicit form of assignment), and the > proposed variable decorators. > > They are also confusing in the sense that the decorator is de-facto > turning an otherwise invalid python statement legal. If you remove the > decorator from the above example you will presumably get a `NameError`. > > I imagine this would then have to be special-cased somehow in the language > spec so that an undefined name is not evaluated, but only when preceded by > a decorator? I don't know, it seems messy to me. > > Also, I just can't quite see the value in them if I'm honest, whereas the > version that is applied to an assignment statement: > > @decorator var: bool = True > > And even a bare type-hint version: > > @decorator var: bool > > seem to me to be far more self-evidently useful. > I am confused why you are okay with @decorator var: bool but not @decorator var Yes, a bare name is currently an error while just a name and a type hint is valid, but the latter doesn't bind anything to the name, and using that identifier is still a NameError. So a decorator presumably can't return a value for either (or it could, but it would always be dropped). What could it do with a name and a type hint that is so much better than just a name? I am still very confused as to the scope of this counter proposal re variable decorating. I have only seen two such examples here @decorator variable # variable = decorator.__decoration_call__(None, "variable") @decorator variable = "spam" # variable = decorator.__decoration_call__(variable, "variable") But what is actually valid to follow a decorator in this proposal? Any simple expression, any expression? Is it limited to assignment espressions? Here are some interesting uses that were brought up in the other thread and I would like to know how they would work. @decorator spam = eggs = cheese = "tomatoes" @decorator spam, eggs, cheese = "tomatoes" @decorator spam, eggs = cheese = "tomatoes" @decorator spam = (eggs := "cheese) @decorator locals()[find_it() or "default"] = spam() Regards, ~Jeremiah ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/FWXYYRQLI3EVIHIL2DJFFATYF2YQ5SNC/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
On Thu, May 27, 2021 at 2:00 PM micro codery wrote: > ... > By providing the name as the first argument, all > of my examples of callables currently in the standard library will work as > you > say out of the box. If it were to be passed in last, this new syntax would > not be > usable by any standard library callable (or even third party? Does anyone > create factory functions that need the name and take it last?) and lots of > new > functions would have to be added. > > > Regards, > ~Jeremiah > Yes that's true. I originally wrote the proposal with the arguments switched around. But I it seemed like it could be important to have the decorated_object argument appear first in the signatures for both __call__ and __decoration_call__, or it could cause confusion... However, I am open to the idea I was right the first time. If so, the default implementation at the top of the object food chain would be something like (using Matt's SENTINEL idea): def __decoration_call__(self, by_name, obj=SENTINEL): if func is SENTINEL: return self(by_name) return self(obj) On the one hand, this might be too expensive; an identity check and potentially two function calls occur for all @ decorations. And I don't see how to optimize this away with opcodes... On the other hand, how often are people invoking millions of @ decorations in a loop...? --- Ricky. "I've never met a Kentucky man who wasn't either thinking about going home or actually going home." - Happy Chandler ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/O7TW667B7PYBOQ4M52WEXXYNQ6NZN7VM/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 4:14 AM Shreyan Avigyan wrote: > > > A context switch can happen between any two of those instructions. > > That means one thread could load the global, then another thread could > > load the same value, resulting in both of them writing back the same > > incremented value. Or, between opcodes 6 and 8 (between the lines of > > Python code), you could store the value, then fetch back a different > > value. > > I see now. Then we can go with Steven's idea. Let's keep the changes in > locals temporarily and when it yields or returns then modify the __statics__ > member. And even if it's a generator it will stop iteration some time and if > it doesn't then the member wasn't meant to be modified. > So you're saying that... def f(): static x = 0 x += 1 yield x next(f()) next(f()) next(f()) will yield 1 every time? According to the "write-back later" semantics, this is only going to actually update the static once it gets nexted the second time. This would be utterly unique in all of Python: that assignment doesn't happen when you say it should happen, it instead happens arbitrarily later. I would have to call that behaviour "buggy". If you use literally any other form of persistent state, ANY other, it would increment. Even worse: I don't see how your conclusion relates to my explanation of threading, which you describe. Instead of having a minor risk of a context switch in the middle of a statement (which can be controlled with a lock), you have a major risk of a context switch ANYWHERE in the function - and now there's no way you can use a lock to protect it, because the lock would (by definition) be released before the function epilogue. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/RJSWSPNCT35RJLGUM4L7LN3ETF5IHYJU/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
> A context switch can happen between any two of those instructions. > That means one thread could load the global, then another thread could > load the same value, resulting in both of them writing back the same > incremented value. Or, between opcodes 6 and 8 (between the lines of > Python code), you could store the value, then fetch back a different > value. I see now. Then we can go with Steven's idea. Let's keep the changes in locals temporarily and when it yields or returns then modify the __statics__ member. And even if it's a generator it will stop iteration some time and if it doesn't then the member wasn't meant to be modified. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/FCDWRSCIF6KZQ73N2AKOGWIEY6W4FGK3/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
On Thu, May 27, 2021 at 9:34 AM Ricky Teachey wrote: > On Thu, May 27, 2021 at 10:25 AM Steven D'Aprano > wrote: > > No, I understood the OP's proposal perfectly. I was agreeing with you > implicitly when you previously said the inconsistency between the OP's > proposal and current decorator is a problem: > > : > > On the other hand, it would be terribly confusing if the same syntax: >> @decorate >> had radically different meaning depending on whether it was followed by >> a class/function or a bare name > > > And furthermore, it is even a bigger problem than you explicitly made it > out to be because of passing and object vs. passing a string representing > the name of an object. > I got the sense that people both liked reading my examples of same-line decorators and pushed back against not making them appear just like function decorators. One way to have my cake and you eat it too would be to relax the current decorator grammar to not require a NEWLINE. AFAICT there would be no ambiguity since after a decorator there must either be another "@" or a "def". Then both function and assignment decorating can be both. These would all be possible and not change the status quo. @cache def factorial(n): pass # This is your example @namedtuple Point = "x y z" # OP was @namedtuple("x y z") Point @not_null @indexed first_name: str > So I was not illustrating the OP's proposal, but showing (but not > proposing) a modified version of it that acts more like decorators today, > so that we would have this: > > # NOT THE OP's PROPOSAL -- more consistent with decorators today, but not > as quickly useful on its own > @decorator("spam this") var > # decorator("spam this")("var") > > ..rather than this: > > # OP's PROPOSAL > @decorator("spam this") var > # decorator("spam this", "var") > Actually the original proposal was @decorator("spam this") var # var = decorator("var", "spam this") The implied assignment that is not written was a big part of the proposal, as most of my examples were defining the name for the first time, and also the original proposal disallowed assignment (or any statement) after the decorator; it is only valid to have an identifier after. Also the "var" coming first is a subtle but important distinction. By providing the name as the first argument, all of my examples of callables currently in the standard library will work as you say out of the box. If it were to be passed in last, this new syntax would not be usable by any standard library callable (or even third party? Does anyone create factory functions that need the name and take it last?) and lots of new functions would have to be added. Regards, ~Jeremiah ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/ANVAFR7OMMZN6UBOV2PVVP255GS77PCB/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
On Thu, May 27, 2021 at 1:40 PM Matt del Valle wrote: > ... Oh, and I think I've just discovered another thing that I'm not 100% sure I > like. Even putting aside that I'm not a fan of decorators on the same line > as the statement they are decorating (as I mentioned in an earlier > response), you've got examples of variable decorators where no assignment > is happening such as: > > @decorator var > > To me this breaks the symmetry between function decorators, which always > decorate a function definition (an implicit form of assignment), and the > proposed variable decorators. > > They are also confusing in the sense that the decorator is de-facto > turning an otherwise invalid python statement legal. If you remove the > decorator from the above example you will presumably get a `NameError`. > > I imagine this would then have to be special-cased somehow in the language > spec so that an undefined name is not evaluated, but only when preceded by > a decorator? I don't know, it seems messy to me. > > Also, I just can't quite see the value in them if I'm honest, whereas the > version that is applied to an assignment statement: > > @decorator var: bool = True > > And even a bare type-hint version: > > @decorator var: bool > > seem to me to be far more self-evidently useful. > Ok, agreed on all points. I think an eventual full-fledged proposal could easily put naked decorations like: @decorator var ...to the wayside, to be added later if people have a really good reason for it. --- Ricky. "I've never met a Kentucky man who wasn't either thinking about going home or actually going home." - Happy Chandler ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/WYE3N7XCWRCY4HJZ7P644BXAXEUBZL2M/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
> > Bikesheddable, but I don't know why having these two be equivalent: > > @decorator var > @decorator var = None > > ..would be a problem. Having an implied default of None for var above > makes sense to my brain. Do you have an example in mind where you think it > would create a problem? > I don't have anything immediately in mind, no, but I think that given the semantics of those two statements are very different, it is at least worthwhile to allow the decorator to know which of them is actually happening (is `None` being assigned to the name or is it maybe just a type hint for that name?). Keep in mind that if no assignment is happening then there is no need to return anything from the decorator, since it will just be lost anyway. And I'm pretty much 100% positive that even if I can't think of a use-case off the top of my head, there will eventually be a library author (if this proposal is accepted) who will have some cool idea that will require distinguishing these two scenarios. Like for example if this were used for a CLI-building library (think something like Typer) how the assignment to `None` could signify that it is an option with a default value of `None`, whereas the bare name would signify that it is a mandatory argument. Oh, and I think I've just discovered another thing that I'm not 100% sure I like. Even putting aside that I'm not a fan of decorators on the same line as the statement they are decorating (as I mentioned in an earlier response), you've got examples of variable decorators where no assignment is happening such as: @decorator var To me this breaks the symmetry between function decorators, which always decorate a function definition (an implicit form of assignment), and the proposed variable decorators. They are also confusing in the sense that the decorator is de-facto turning an otherwise invalid python statement legal. If you remove the decorator from the above example you will presumably get a `NameError`. I imagine this would then have to be special-cased somehow in the language spec so that an undefined name is not evaluated, but only when preceded by a decorator? I don't know, it seems messy to me. Also, I just can't quite see the value in them if I'm honest, whereas the version that is applied to an assignment statement: @decorator var: bool = True And even a bare type-hint version: @decorator var: bool seem to me to be far more self-evidently useful. On Thu, May 27, 2021 at 5:45 PM Ricky Teachey wrote: > On Thu, May 27, 2021 at 11:09 AM Matt del Valle > wrote: > >> >>> I'm not the OP, but the way I understand the proposal >> __decoration_call__ is only invoked when you actually *use an object to >> decorate something*. That means that a decorator factory will just >> invoke __call__ as normal, because it's nothing but a convenient way to >> generate a decorator. It is not itself a decorator, nor is it used to >> actually decorate anything. To illustrate this point we can separate it out >> across several lines: >> >> @factory("foo") >> def bar(): >> pass >> >> >> Can be rewritten as: >> >> decorator = factory("foo") >> >> @decorator >> def bar(): >> pass >> >> >> So __decorator_call__ will only be invoked on the object that gets >> returned from `factory("foo")`, not on `factory`. >> > > Correct. > > >> It seems to me that this proposal means that we can't even tell which of >>> the two protocols (classic decoration, or new `__decoration_call__` >>> style decoration) without digging into the implementation of the >>> decorator. >>> >>> To be precise, the problem here as reader isn't so much the fact that I >>> don't know whether the object is called using the `__call__` protocol or >>> the new-style `__decorator_call__` protocol, but the fact that I can't >>> tell whether the calls will involve the name being passed or not. >>> >> >> >> The OP mentioned a default implementation for __decoration_call__ of: >> >> def __decoration_call__(self, func, by_name): >> if func is None: >> return self(by_name) >> return self(func) >> >> >> Such that you can assume that the decorator will *always* receive the >> name, but may choose to discard it and not make use of it if it doesn't >> implement the __decoration_call__ interface and instead opts to use default >> implementation which falls back on __call__. >> > > Yes but I am on the fence as to whether this default implementation (I > suppose it would live on the object class?) should be considered or not. It > would certainly provide a lot of functionality "out-of-the-box". > > For decorated functions the name can always be pulled out of the function >> object as normal even when using __call__, but to make use of the name in a >> decorated assignment statement the decorator would have to override >> __decoration_call__. >> >> >> At this point I will say that I may be putting words into OPs mouth, and >> would be happy to be corrected if I've misunderstood. >> > > Nah you got it. > >
[Python-ideas] Re: Add static variable storage in functions
On Thu, 27 May 2021 at 10:39, Paul Moore wrote: [...] > the performance aspect, function > attributes provide this functionality, but there's a significant > problem with using them because you can't access them other than by > referencing the *name* of the function being defined. > [...] > > It would be nice to have a better way to reference function attributes > from within a function. (This would also help write recursive > functions that could be safely renamed, but I'm not sure many people > would necessarily think that's a good thing ;-)) > Now, yes, being able to reference a function from inside itself is a feature I had missed over the years. > > Paul > ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/QVECOJRFJ4DVIN4CV3RH5T2AAPRWBF4P/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
You can just use nonlocal variables: def stator(): static_var_1 = 0 def myfunc(n): nonlocal static_var_1 static_var_1 += n return static_var_1 return myfunc myfunc = stator() del stator Or you can attach any variable to the function itself: def myfunc(n): if not hasattr(myfunc, "static_var_1"): myfunc.static_var_1 = 0 myfunc.static_var_1 += n return myfunc.static_var_1 There are several options to get the effect that static variables would have, not to mention that states of this type are better held as class attributes anyway. BTW: # callable singleton - AKA "function": class myfunc(metaclass=lambda *args: type(*args)()): def __init__(self): self.static_var_1 = 0 def __call__(self, n): self.static_var_1 += n return self.static_var_1 Or, as you put it in the first e-mail, the static var could be built into a data structure in the default arguments of the function. (There are also contextvars, threading.local, etc...) I can't see a separate "static" declaration being of any use. Beginners needing the functionality should just resort to either globals or plain classes to keep state. As you get the way Python works, there are plenty of ways to keep the state, without making the language more complicated. On Thu, 27 May 2021 at 14:06, Chris Angelico wrote: > On Fri, May 28, 2021 at 2:44 AM Shreyan Avigyan > wrote: > > > > My proposal is somewhat the sum of all of your ideas. Well I propose > there should a STORE_STATIC_FAST opcode that stores a static variable. > Static variable will be declared only once and will be initialized to None > (statement syntax will be similar to that of global). It will be > initialized in MAKE_FUNCTION. Now it will be set by STORE_STATIC_FAST. > Where will the variables be stored? It will have references in locals and > __statics__. Therefore LOAD_FAST can find it. So I don't hope there will be > performance decrease but performance increase is also not guaranteed. :-) > > > > The duplicated store fixes half the problem, but it still fails on the > recursion example that I posted in reply to Steve. It would be a nice > optimization, but it may or may not be sufficient. > > > And if these are thread unsafe then is __defaults__ also thread unsafe? > > > > Thread safety isn't a problem with constants. Python guarantees that > internal details (like CPython's reference counts) aren't going to be > trampled on, and inside your code, nothing is going to change > __defaults__ (unless you're doing something bizarre, in which case it > isn't about __defaults__ any more). Thread safety only becomes an > issue when you have something like this: > > counter = 0 > def get_next(): > global counter > counter += 1 > return counter > > This disassembles to: > > 6 0 LOAD_GLOBAL 0 (counter) > 2 LOAD_CONST 1 (1) > 4 INPLACE_ADD > 6 STORE_GLOBAL 0 (counter) > > 7 8 LOAD_GLOBAL 0 (counter) > 10 RETURN_VALUE > > A context switch can happen between any two of those instructions. > That means one thread could load the global, then another thread could > load the same value, resulting in both of them writing back the same > incremented value. Or, between opcodes 6 and 8 (between the lines of > Python code), you could store the value, then fetch back a different > value. > > None of this is a problem if you're using constants. The only reason > to use statics instead of global constants is performance - the > "len=len" trick is specific to this performance advantage - but you > don't have to worry about thread safety. > > ChrisA > ___ > Python-ideas mailing list -- python-ideas@python.org > To unsubscribe send an email to python-ideas-le...@python.org > https://mail.python.org/mailman3/lists/python-ideas.python.org/ > Message archived at > https://mail.python.org/archives/list/python-ideas@python.org/message/DD62RX2ZFOOKQFIHXRFH2LABOFYE4HSZ/ > Code of Conduct: http://python.org/psf/codeofconduct/ > ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/3BWG6CPWAVS4POG6WICZOHPVHNWWJTUT/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On 2021-05-27 17:44, Shreyan Avigyan wrote: My proposal is somewhat the sum of all of your ideas. Well I propose there should a STORE_STATIC_FAST opcode that stores a static variable. Static variable will be declared only once and will be initialized to None (statement syntax will be similar to that of global). It will be initialized in MAKE_FUNCTION. Now it will be set by STORE_STATIC_FAST. Where will the variables be stored? It will have references in locals and __statics__. Therefore LOAD_FAST can find it. So I don't hope there will be performance decrease but performance increase is also not guaranteed. :-) And if these are thread unsafe then is __defaults__ also thread unsafe? Why initialise them to None? Other variables don't behave like that. My preference would be to have something like this: def foo(): static x = 0 that would bind 'x' the first time it met that statement, and if it tried to use 'x' before that has met that statement for the same time it would raise something like UnboundLocalError (perhaps UnboundStaticError in this case?) like happens currently for local variables. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/TFTYZVYNRRX5FS2PFXRGEL6RTMCCRCRH/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 2:44 AM Shreyan Avigyan wrote: > > My proposal is somewhat the sum of all of your ideas. Well I propose there > should a STORE_STATIC_FAST opcode that stores a static variable. Static > variable will be declared only once and will be initialized to None > (statement syntax will be similar to that of global). It will be initialized > in MAKE_FUNCTION. Now it will be set by STORE_STATIC_FAST. Where will the > variables be stored? It will have references in locals and __statics__. > Therefore LOAD_FAST can find it. So I don't hope there will be performance > decrease but performance increase is also not guaranteed. :-) > The duplicated store fixes half the problem, but it still fails on the recursion example that I posted in reply to Steve. It would be a nice optimization, but it may or may not be sufficient. > And if these are thread unsafe then is __defaults__ also thread unsafe? > Thread safety isn't a problem with constants. Python guarantees that internal details (like CPython's reference counts) aren't going to be trampled on, and inside your code, nothing is going to change __defaults__ (unless you're doing something bizarre, in which case it isn't about __defaults__ any more). Thread safety only becomes an issue when you have something like this: counter = 0 def get_next(): global counter counter += 1 return counter This disassembles to: 6 0 LOAD_GLOBAL 0 (counter) 2 LOAD_CONST 1 (1) 4 INPLACE_ADD 6 STORE_GLOBAL 0 (counter) 7 8 LOAD_GLOBAL 0 (counter) 10 RETURN_VALUE A context switch can happen between any two of those instructions. That means one thread could load the global, then another thread could load the same value, resulting in both of them writing back the same incremented value. Or, between opcodes 6 and 8 (between the lines of Python code), you could store the value, then fetch back a different value. None of this is a problem if you're using constants. The only reason to use statics instead of global constants is performance - the "len=len" trick is specific to this performance advantage - but you don't have to worry about thread safety. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/DD62RX2ZFOOKQFIHXRFH2LABOFYE4HSZ/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
On Thu, May 27, 2021 at 11:09 AM Matt del Valle wrote: > >> I'm not the OP, but the way I understand the proposal __decoration_call__ > is only invoked when you actually *use an object to decorate something*. > That means that a decorator factory will just invoke __call__ as normal, > because it's nothing but a convenient way to generate a decorator. It is > not itself a decorator, nor is it used to actually decorate anything. To > illustrate this point we can separate it out across several lines: > > @factory("foo") > def bar(): > pass > > > Can be rewritten as: > > decorator = factory("foo") > > @decorator > def bar(): > pass > > > So __decorator_call__ will only be invoked on the object that gets > returned from `factory("foo")`, not on `factory`. > Correct. > It seems to me that this proposal means that we can't even tell which of >> the two protocols (classic decoration, or new `__decoration_call__` >> style decoration) without digging into the implementation of the >> decorator. >> >> To be precise, the problem here as reader isn't so much the fact that I >> don't know whether the object is called using the `__call__` protocol or >> the new-style `__decorator_call__` protocol, but the fact that I can't >> tell whether the calls will involve the name being passed or not. >> > > > The OP mentioned a default implementation for __decoration_call__ of: > > def __decoration_call__(self, func, by_name): > if func is None: > return self(by_name) > return self(func) > > > Such that you can assume that the decorator will *always* receive the > name, but may choose to discard it and not make use of it if it doesn't > implement the __decoration_call__ interface and instead opts to use default > implementation which falls back on __call__. > Yes but I am on the fence as to whether this default implementation (I suppose it would live on the object class?) should be considered or not. It would certainly provide a lot of functionality "out-of-the-box". For decorated functions the name can always be pulled out of the function > object as normal even when using __call__, but to make use of the name in a > decorated assignment statement the decorator would have to override > __decoration_call__. > > > At this point I will say that I may be putting words into OPs mouth, and > would be happy to be corrected if I've misunderstood. > Nah you got it. > One final point I've just thought of is that Ricky suggested that when no > value is assigned to a name that the object reference be `None`. But I > don't think that works, because it becomes indistinguishable from when > `None` is explicitly assigned. We would need some sentinel value instead of > `None` to remove ambiguity in this situation: > > > from somewhere import NOTSET > > > @decorate > foo: int > > def __decoration_call__(self, obj, names, annotation): > print(obj is None)# False > print(obj is NOTSET) # True > > @decorate > foo: int = None > > > def __decoration_call__(self, obj, names, annotation): > print(obj is None)# True > print(obj is NOTSET) # False > > > Bikesheddable, but I don't know why having these two be equivalent: @decorator var @decorator var = None ..would be a problem. Having an implied default of None for var above makes sense to my brain. Do you have an example in mind where you think it would create a problem? --- Ricky. "I've never met a Kentucky man who wasn't either thinking about going home or actually going home." - Happy Chandler ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/V5PENKCPAGWOFVQKGFGYK666RSD6VLDU/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
My proposal is somewhat the sum of all of your ideas. Well I propose there should a STORE_STATIC_FAST opcode that stores a static variable. Static variable will be declared only once and will be initialized to None (statement syntax will be similar to that of global). It will be initialized in MAKE_FUNCTION. Now it will be set by STORE_STATIC_FAST. Where will the variables be stored? It will have references in locals and __statics__. Therefore LOAD_FAST can find it. So I don't hope there will be performance decrease but performance increase is also not guaranteed. :-) And if these are thread unsafe then is __defaults__ also thread unsafe? ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/6FSYNBSC5HVQYMVCDGKENGT7G2MS4OTI/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
On Thu, May 27, 2021 at 10:25 AM Steven D'Aprano wrote: > On Wed, May 26, 2021 at 12:43:48PM -0400, Ricky Teachey wrote: > > [...] > > These two ideas of a decorator syntax result are not the same: > > > > RESULT A: function decorator > > # func = decorator("spam")(func) > > > > RESULT B: variable decorator > > # name = decorator("spam")("name") > > > > ...because func is passed as an object, but "name" a string representing > > the name of the object. Two very different things. > > Ricky, it's not clear to me whether you are proposing the above RESULT A > and RESULT B as an *alternative* to the "variable decorator" proposal, > or if you have just misunderstood it. No, I understood the OP's proposal perfectly. I was agreeing with you implicitly when you previously said the inconsistency between the OP's proposal and current decorator is a problem: : On the other hand, it would be terribly confusing if the same syntax: > @decorate > had radically different meaning depending on whether it was followed by > a class/function or a bare name And furthermore, it is even a bigger problem than you explicitly made it out to be because of passing and object vs. passing a string representing the name of an object. So I was not illustrating the OP's proposal, but showing (but not proposing) a modified version of it that acts more like decorators today, so that we would have this: # NOT THE OP's PROPOSAL -- more consistent with decorators today, but not as quickly useful on its own @decorator("spam this") var # decorator("spam this")("var") ..rather than this: # OP's PROPOSAL @decorator("spam this") var # decorator("spam this", "var") > The current variable decorator > proposal on the table is for this: > > @decorator(spam) name > # --> name = decorator("name", spam) > > rather than what you wrote: > > # name = decorator("spam")("name") > > So I can't tell whether the difference between your version and the OPs > is a bug or a feature :-) > Yes, I am not proposing this behavior or saying it was the OP's, just illustrating to agree with you that it is so different from current decorator behavior, I don't think it should be considered (even though I sort of like how it looks). > > For this reason I think I would agree even more so that the differences > in > > the decorator behavior would be an extremely significant point of > confusion. > [...] > > Maybe employment of decorator syntax could OPTIONALLY trigger a new > dunder > > method-- here I'll just call it __decoration_call__-- with the signature: > > > > def __decoration_call__(self, obj: Any, by_name: str) -> Any: ... > > To be clear here, I think that your proposal is that this method is to > be looked up on the *decorator*, not the thing being decorated. Is that > correct? > Yes, it is looked up on the decorator, i.e., the result of the expression immediately to the right of the @ symbol. So it is only used when using *decorator syntax* (i.e., the @ symbol). > In other words: > > @decorator > class X: ... # or a function, or something else > > it is *decorator*, not X, that is checked for a `__decoration_call__` > method. > > Correct? > Yes. > > My idea is to optionally allow any callable object to write a > > __decoration_call__ method that gets called in lieu of the __call__ > method > > when the callable object is employed using decorator syntax. When this > > happens, the decorated named is supplied- not counting self- as the first > > argument (e.g., by_name), which contains the str value of the name the > > decorator was applied to. > > In current Python, the only objects which can be decorated with the @ > syntax are callable functions and classes. So it is ambiguous to talk > about "any callable object" without stating whether it is the decorator > or the thing being decorated. > I am sorry, I thought I made this clear early on in the proposal. yes, the decorator is the thing. More precisely, the result of the expression to the right of the @ symbol. By "any callable", I had in mind how any callable can be used as a decorator (so long as it isn't expecting more than 1 positional argument and no required kwd arguments). But really, the result any expression can be a decorator now (as of 3.10 I believe?), thought you might get an error: @1/2 def func(): ... # TypeError: 'float' object is not callable > > In actuality, unless I'm wrong (I might be; not an expert) current > > decorator syntax is really sugar for: > > > > def func(): ... > > func = decorator.__call__("spam this").__call__(func) > > Roughly speaking, that would correspond to > > @decorator("spam this") > def func(): > ... > Yes, I had written exactly that above the quoted part: So, for any existing callable with the name decorator, this: > @decorator("spam this") > def func(): ... > ...continues to mean this, just as it does today: > def func(): ... > func = decorator("spam this")(func) Continuing on. > If we have a
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 2:03 AM Steven D'Aprano wrote: > I'll admit I'm not an expert on the various LOAD_* bytecodes, but I'm > pretty sure LOAD_FAST is used for local variables. Am I wrong? You're correct, but I dispute that that's the best way to do things. > Right. In principle we could just shove the static values in the > __defaults__ tuple, but it's probably better to use a distinct > __statics__ dunder. Right, agreed. > If you don't store the values away somewhere on function exit, how do > you expect them to persist from one call to the next? Remember that they > are fundamentally variables -- they should be able to vary from one call > to the next. > > > > and it could cause extremely confusing results with > > threading. > > Don't we have the same issues with globals, function attributes, and > instance attributes? Your proposal requires that every static involve a load at function start and a store-back at function end. My proposal requires that they get loaded directly from their one true storage location (a dict, or some kind of cell system, or whatever) attached to the function, and stored directly back there when assigned to. Huge difference. With your proposal, two threads that are *anywhere inside the function* can trample over each other's statics. Consider: def f(): static n = 0 n += 1 ... ... ... ... # end of function Your proposal requires that the "real" value of n not be updated until the function exits. What if that takes a long time to happen - should the static value remain at its old value until then? What if it never exits at all - if it's a generator function and never gets fully pumped? Mutating a static needs to happen immediately. > I'm okay with saying that if you use static *variables* (i.e. they > change their value from one call to the next) they won't be thread-safe. > As opposed to static "constants" that are only read, never written. They'll never be fully thread-safe, but it should be possible to have a short-lived lock around the mutation site itself, followed by an actual stack-local. By your proposal, the *entire function* becomes non-thread-safe, *no matter what you do with locks*. By my proposal, this kind of code becomes entirely sane: def f(): static lock = Lock() static count = 0 with lock: my_count = count + 1 count = my_count ... ... ... print(my_count) There's a language guarantee with every other form of assignment that it will happen immediately in its actual target location. There's no write-back caching anywhere else in the language. Why have it here? > But if you have a suggestion for a thread-safe way for functions to > keep variables alive from one call to the next, that doesn't suffer a > big performance hit, I'm all ears :-) The exact way that I described: a function attribute and a dedicated opcode pair :) > > Agreed, I'd use it too. But I'd define the semantics slightly differently: > > > > * If there's an expression given, evaluate that when the 'def' > > statement is executed, same as default args > > That's what I said, except I called it function definition time :-) Yep, that part we agree on. > > * Otherwise it'll be uninitialized, or None, bikeshedding opportunity, have > > fun > > I decided on initialising them to None because it is more convenient to > write: > > if my_static_var is None: > # First time, expensive computation. > ... > > than the alternative with catching an exception. YMMV. Not hugely important either way, I'd be happy with either. > > * Usage of this name uses a dedicated LOAD_STATIC or STORE_STATIC bytecode > > * The values of the statics are stored in some sort of > > high-performance cell collection, indexed numerically > > Isn't that what LOAD_FAST already does? This would be a separate cell collection. The LOAD_FAST cells are in the stack frame, the LOAD_STATIC cells are on the function object. But yes, the code could be pretty much identical. > > It would be acceptable to store statics in a dict, too, but I suspect > > that that would negate some or all of the performance advantages. > > Whichever way, though, it should ideally be introspectable via a > > dunder attribute on the function. > > Maybe I'm misunderstanding you, or you me. Let me explain further > what I think can happen. > > When the function is being executed, I think that static variables > should be treated as local variables. Why build a whole second > implementation for fast cell-based variables, with a separate set of > bytecodes, to do exactly what locals and LOAD_FAST does? We should reuse > the existing fast local variable machinery, not duplicate it. Because locals are local to the invocation, not the function. They're TOO local. > The only part that is different is that those static locals have to be > automatically initialised on function entry (like parameters are), and > then on function exit their values have to be
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 04:53:17PM +0200, Ronald Oussoren via Python-ideas wrote: > Statics are still hidden global state How are they *global* state when they are specific to an individual function? We can already get the basic behaviour of statics right now, only with an ugly hack that pollutes the function parameter list and is inconvenient to use. static = [0] def spam(arg, static=[0]): static[0] += arg return static[0] def eggs(arg, static=[0]): static[0] -= arg return static[0] Is it your argument that all three `static` variables are using shared global state? If not, then I have no idea what you mean by insisting that statics are "hidden global state". They are hidden state, but not global. Just like closures and instance attributes. > and those can be problematic regardless of being function local or > module global. Having global state like this affects testability and > can affect threading as well. It sounds like your problem is with *mutable state* in general. Global variables, instance attributes, class attributes, they all have exactly the same issues. So don't use mutable state. Nobody is forcing you to use this feature if you prefer to write in a functional style with no mutable state. > The factory function doesn’t need to be part of the public API of a > module, I’ve used a pattern like this to create APIs with some hidden > state: > > ``` > def make_api(): > state = ... > > def api1(…): … > def ap2(…): … > > return api1, api2 > api1, api2 = make_api() > ``` Congratulations, you've just used static local variables. You just used closures for the implementation. > I’m not saying that this is a particularly good way to structure code, > in general just using a private module global is better (assuming the > design calls for some kind of global state). You: "Global state is bad! Don't use global state!" Also you: "Don't use local state (closures)! Use global state, it's better!" *wink* -- Steve ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/2X7UR6OF3BDISQJ7DXJKZKWDN42CP7LS/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 11:21:26PM +1000, Chris Angelico wrote: > On Thu, May 27, 2021 at 10:20 PM Steven D'Aprano wrote: > > Here is a sketch of how this could work, given a function like this: > > > > def func(arg): > > static spam, eggs > > static cheese = expression > > ... > > > > > > At function declaration time, the two static statements tell the > > compiler to: > > > > * treat spam, eggs and cheese as local variables (use LOAD_FAST instead > > of LOAD_GLOBAL for lookups); > > I don't think LOAD_FAST would be suitable here - isn't it always going > to look in the stack frame? Where else should it look? I'll admit I'm not an expert on the various LOAD_* bytecodes, but I'm pretty sure LOAD_FAST is used for local variables. Am I wrong? My idea is that the static variable should be a local variable that gets saved on function exit and restored on function entry. Is there another concept of static variables that I should know about? > > * allocate static storage for them using the same (or similar) mechanism > > used for function default values; > > Default values are attached to the function object (in either the > __defaults__ tuple or the __kwdefaults__ dict). Right. In principle we could just shove the static values in the __defaults__ tuple, but it's probably better to use a distinct __statics__ dunder. > > * spam and eggs get initialised as None; > > > > * cheese gets initialised to the value of `expression`, evaluated > > at function declaration time just as default arguments are. > > > > > > When the function is called: > > > > * the interpreter automatically initialises the static variables > > with the stored values; > > > > * when the function exits (whether by return or by raising an > > exception) the static storage will be updated with the current > > values of the variables. > > Hmm, I see what you mean. Not sure that this is really necessary > though - If you don't store the values away somewhere on function exit, how do you expect them to persist from one call to the next? Remember that they are fundamentally variables -- they should be able to vary from one call to the next. > and it could cause extremely confusing results with > threading. Don't we have the same issues with globals, function attributes, and instance attributes? I'm okay with saying that if you use static *variables* (i.e. they change their value from one call to the next) they won't be thread-safe. As opposed to static "constants" that are only read, never written. But if you have a suggestion for a thread-safe way for functions to keep variables alive from one call to the next, that doesn't suffer a big performance hit, I'm all ears :-) > Agreed, I'd use it too. But I'd define the semantics slightly differently: > > * If there's an expression given, evaluate that when the 'def' > statement is executed, same as default args That's what I said, except I called it function definition time :-) > * Otherwise it'll be uninitialized, or None, bikeshedding opportunity, have > fun I decided on initialising them to None because it is more convenient to write: if my_static_var is None: # First time, expensive computation. ... than the alternative with catching an exception. YMMV. > * Usage of this name uses a dedicated LOAD_STATIC or STORE_STATIC bytecode > * The values of the statics are stored in some sort of > high-performance cell collection, indexed numerically Isn't that what LOAD_FAST already does? > It would be acceptable to store statics in a dict, too, but I suspect > that that would negate some or all of the performance advantages. > Whichever way, though, it should ideally be introspectable via a > dunder attribute on the function. Maybe I'm misunderstanding you, or you me. Let me explain further what I think can happen. When the function is being executed, I think that static variables should be treated as local variables. Why build a whole second implementation for fast cell-based variables, with a separate set of bytecodes, to do exactly what locals and LOAD_FAST does? We should reuse the existing fast local variable machinery, not duplicate it. The only part that is different is that those static locals have to be automatically initialised on function entry (like parameters are), and then on function exit their values have to be stored away in a dunder so they aren't lost (as plain local variables are lost when the function exists). That bit is new. We already have a mechanism to initialise locals: it's used for default values. The persistent data is retrieved from the appropriate dunder on the function and bound to the local variable (parameter). We can do the same thing. We will probably use a different dunder. That doesn't mean that every single access to the local static variable needs to be retrieved from the dunder, that would likely be slow. Only on function entry. The
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 1:17 AM Christopher Barker wrote: >> >> >> Statics are still hidden global state, and those can be problematic >> regardless of being function local or module global. Having global state >> like this affects testability and can affect threading as well. > > > I think this is a very good point. I'm no expert, but I know a HUGE amount of > old C code isn't thread-safe -- and static has something to do with that? > > Not that people shouldn't be allowed to write non thread-safe code in Python, > but it shouldn't be encouraged. An awful lot of code is written with no idea > that it will be run in multi-threaded code later on. > > Personally, I can't think of any times when I would have used this -- maybe > because it wasn't there, so I didn't think about it. > Maybe, but it's worth noting that Python code has an inherent level of thread-safety as guaranteed by the language itself; you can't, for instance, have threads trample over each other's reference counts by trying to incref or decref the same object at the same time. Fundamentally, ANY form of mutable state will bring with it considerations for multithreading. Generally, the solution is to keep the state mutation to the tightest part possible, preferably some atomic action (if necessary, by guarding it with a lock), and then you have less to worry about. Hence my preference for direct manipulation of the state, rather than snapshotting it at function start and reapplying it at function exit. Multithreading isn't very hard in Python, and even when you get something wrong, it's not too difficult to figure out what happened. It's not nearly as hard as in C, where you have more possible things to go wrong. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/VV3FGLWLJ2JEAQ3FOV73ZZ7PYKCQU5Y4/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, 27 May 2021 at 15:49, Chris Angelico wrote: > > On Fri, May 28, 2021 at 12:25 AM Paul Moore wrote: > > > > On Thu, 27 May 2021 at 15:04, Chris Angelico wrote: > > def static(**statics): > > def deco(func): > > for name, value in statics.items(): > > setattr(func, name, value) > > func.__globals__["__me__"] = func > > return func > > return deco > > > > @static(a=1) > > def f(): > > print(__me__.a) > > > > Can't use globals like that, since there's only one globals() dict per > module. It'd require some compiler magic to make __me__ work the way > you want. But on the plus side, this doesn't require a run-time > trampoline - all the work is done on the original function object. Rats, you're right. Hacking globals felt like a code smell. I considered trying to get really abusive by injecting some sort of local but that's not going to work because the code's already compiled by the time the decorator runs: >>> def f(): ... print(__me__.a) ... >>> dis.dis(f) 2 0 LOAD_GLOBAL 0 (print) 2 LOAD_GLOBAL 1 (__me__) 4 LOAD_ATTR2 (a) 6 CALL_FUNCTION1 8 POP_TOP 10 LOAD_CONST 0 (None) 12 RETURN_VALUE So yes, without compiler support you can only go so far (but you could always use the _getframe approach instead). > So, yeah, add it to the pile. Yep. It's an interesting exercise, is all, and honestly, I don't think I'd use static much anyway, so something "good enough" that works now is probably more than enough for me personally. I do think that having a compiler supported way of referring efficiently to the current function (without relying on looking it up by name) would be an interesting alternative proposal, if we *are* looking at actual language changes - it allows for something functionally equivalent to statics, without the performance advantage but in compensation it has additional uses (recursion, and more general access to function attributes). I'm not going to push for it though, as I say, I don't have enough use for it to want to invest the time in it. Paul ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/TFAU3JX3RH2RAQ3E2KMVD43HZ34454DH/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 04:37:10PM +0200, Ronald Oussoren wrote: > > One common use for function defaults is to optimize function lookups to > > local variables instead of global or builtins: > > > >def func(arg, len=len): > ># now len is a fast local lookup instead of a slow name lookup > > That’s a CPython performance hack, No it isn't. I think we can assume than any non-crappy implementation will have faster access to locals than globals and builtins, or at least no worse. So it is a fair expectation that any time you can turn a global lookup into a local lookup, you should have some performance benefit. Function parameters are guaranteed to be local variables. Default values are guaranteed to be evaluated once, at function definition time. These are language guarantees not CPython implementation details. The precise performance benefit will, of course, vary from VM to VM, but we should expect that any serious implementation should give some performance benefit. (All the usual optimization caveats still apply: measure, don't guess, etc. Optimizations that work in theory may not always work in practice, yadda yadda yadda.) > and “static” would just introduce a different performance hack. IIRC > there has been work in recent versions of CPython to reduce the need > for that hack by caching values in the VM. This trick has worked all the way back to Python 1.5, maybe even longer, so I think it's pretty stable against changes in the interpreter. But for the sake of the argument I'll accept that some future performance improvements that reduces the need for the `len=len` trick to negligible amounts. Static storage in functions will still be useful. Any time you need data to persist from one function call to another, you can either: - expose your data in a global variable; - or as a function attribute; - use a mutable function default; - rewrite your function as a callable instance; - obfuscate your function by wrapping it in a closure; all of which are merely work-arounds for the lack of proper static locals. -- Steve ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/PPRYVBMW6MOMNMZ7SHAW5GAHS3FPIZXO/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
I'll try to implement the idea roughly and I'll try to observe how much performance improvements (or the opposite) will occur. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/FT2OTHOXJ6DRY4PFJNCFIH5IUDEMAAC6/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
> > > Statics are still hidden global state, and those can be problematic > regardless of being function local or module global. Having global state > like this affects testability and can affect threading as well. > I think this is a very good point. I'm no expert, but I know a HUGE amount of old C code isn't thread-safe -- and static has something to do with that? Not that people shouldn't be allowed to write non thread-safe code in Python, but it shouldn't be encouraged. An awful lot of code is written with no idea that it will be run in multi-threaded code later on. Personally, I can't think of any times when I would have used this -- maybe because it wasn't there, so I didn't think about it. -CHB -- Christopher Barker, PhD (Chris) Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/SPJW2VTP33IFKGXSKQFTP4WJLPUX6Z6F/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
> > What happens if the decorator factory has `__decoration_call__` and the > object it returns only has `__call__`? I presume you get this: > > func = decorator.__decoration_call__("spam this", > "func").__call__(func) > > And let's not forget the other two combinations: > > func = decorator.__decoration_call__("spam this", > "func").__decoration_call__(func, "func") > func = decorator.__call__("spam this").__call__(func) > > The last one is, of course, the current behaviour for a decorator > factory. > > The bottom line here is that is you have plain, unadorned decorator: > > @decorator > > there are two possible behaviours and no obvious way to tell which one > is used, short of digging into the implementation. But if you have a > decorarator factory: > > @factory(*args, **kwargs) > > there are now four possible behaviours. And anyone brave enough to use a > double-barrelled factory-factory > > @factory(*args, **kwargs)(*more_args) > > will be faced with eight possible combinations. > I'm not the OP, but the way I understand the proposal __decoration_call__ is only invoked when you actually *use an object to decorate something*. That means that a decorator factory will just invoke __call__ as normal, because it's nothing but a convenient way to generate a decorator. It is not itself a decorator, nor is it used to actually decorate anything. To illustrate this point we can separate it out across several lines: @factory("foo") def bar(): pass Can be rewritten as: decorator = factory("foo") @decorator def bar(): pass So __decorator_call__ will only be invoked on the object that gets returned from `factory("foo")`, not on `factory`. It seems to me that this proposal means that we can't even tell which of > the two protocols (classic decoration, or new `__decoration_call__` > style decoration) without digging into the implementation of the > decorator. > > To be precise, the problem here as reader isn't so much the fact that I > don't know whether the object is called using the `__call__` protocol or > the new-style `__decorator_call__` protocol, but the fact that I can't > tell whether the calls will involve the name being passed or not. > The OP mentioned a default implementation for __decoration_call__ of: def __decoration_call__(self, func, by_name): if func is None: return self(by_name) return self(func) Such that you can assume that the decorator will *always* receive the name, but may choose to discard it and not make use of it if it doesn't implement the __decoration_call__ interface and instead opts to use default implementation which falls back on __call__. For decorated functions the name can always be pulled out of the function object as normal even when using __call__, but to make use of the name in a decorated assignment statement the decorator would have to override __decoration_call__. At this point I will say that I may be putting words into OPs mouth, and would be happy to be corrected if I've misunderstood. One final point I've just thought of is that Ricky suggested that when no value is assigned to a name that the object reference be `None`. But I don't think that works, because it becomes indistinguishable from when `None` is explicitly assigned. We would need some sentinel value instead of `None` to remove ambiguity in this situation: from somewhere import NOTSET @decorate foo: int def __decoration_call__(self, obj, names, annotation): print(obj is None)# False print(obj is NOTSET) # True @decorate foo: int = None def __decoration_call__(self, obj, names, annotation): print(obj is None)# True print(obj is NOTSET) # False On Thu, May 27, 2021 at 3:25 PM Steven D'Aprano wrote: > On Wed, May 26, 2021 at 12:43:48PM -0400, Ricky Teachey wrote: > > [...] > > These two ideas of a decorator syntax result are not the same: > > > > RESULT A: function decorator > > # func = decorator("spam")(func) > > > > RESULT B: variable decorator > > # name = decorator("spam")("name") > > > > ...because func is passed as an object, but "name" a string representing > > the name of the object. Two very different things. > > Ricky, it's not clear to me whether you are proposing the above RESULT A > and RESULT B as an *alternative* to the "variable decorator" proposal, > or if you have just misunderstood it. The current variable decorator > proposal on the table is for this: > > @decorator(spam) name > # --> name = decorator("name", spam) > > rather than what you wrote: > > # name = decorator("spam")("name") > > So I can't tell whether the difference between your version and the OPs > is a bug or a feature :-) > > > > For this reason I think I would agree even more so that the differences > in > > the decorator behavior would be an extremely significant point of > confusion. > [...] > > Maybe employment of decorator syntax could OPTIONALLY trigger a new > dunder > > method-- here I'll just call
[Python-ideas] Re: Add static variable storage in functions
> On 27 May 2021, at 11:42, Chris Angelico wrote: > > On Thu, May 27, 2021 at 7:20 PM Ronald Oussoren via Python-ideas > wrote: >> On 27 May 2021, at 09:56, Shreyan Avigyan wrote: >> >>> Static should behave much like Python's for loop variables. > > I have no idea what this means. >> >> For this particular question/proposal: “static” variables in functions in C >> like languages are basically hidden global variables, and global variables >> are generally a bad idea. >> > > Hmm, I'd distinguish a couple of things here. Global constants are > most certainly not a problem, and objects that last the entire run of > the program are not a problem either. The usual problem with "global > variables" is that they become hidden state, action-at-a-distance. You > can change the global in one function and it notably affects some > other function. Part of the point of statics is that they are *not* > global; they might live for the entire run of the program, but they > can't be changed by any other function, so there's no AaaD that will > mess with your expectations. Statics are still hidden global state, and those can be problematic regardless of being function local or module global. Having global state like this affects testability and can affect threading as well. > >> In Python you can get the same result with a global variable and the use of >> the “global” keyword in a function (or cooperating set of functions) when >> you want to update the global variable from that function. >> > > That's globals, with all the risks thereof. > >> Closures or instances of classes with an ``__call__`` method can be used as >> well and can hide state (with the “consulting adults” caveat, the state is >> hidden, not inaccessible). >> > > This would be the easiest way to manage it. But both of them provide a > way to have multiple independent, yet equivalent, states. If that's > what you want, great! But it can also be an unnecessary level of > confusion ("why would I ever make a second one of these? Why is there > a factory function for something that I'll only ever need one of?"), > where static variables wouldn't do that. The factory function doesn’t need to be part of the public API of a module, I’ve used a pattern like this to create APIs with some hidden state: ``` def make_api(): state = ... def api1(…): … def ap2(…): … return api1, api2 api1, api2 = make_api() ``` I’m not saying that this is a particularly good way to structure code, in general just using a private module global is better (assuming the design calls for some kind of global state). > > There is one namespace that would very aptly handle this kind of > thing: the function object itself. > def count(): > ... count.cur += 1 > ... return count.cur > ... count.cur = 0 count() > 1 count() > 2 count() > 3 count() > 4 > > As long as you can reference your own function reliably, this will > work. There may be room for a keyword like this_function, but for the > most part, it's not necessary, and you can happily work with the > function by its name. It's a little clunkier than being able to say > "static cur = 0;" to initialize it (the initializer has to go *after* > the function, which feels backwards), but the functionality is all > there. I generally dislike functions with internal state like this. Ronald — Twitter / micro.blog: @ronaldoussoren Blog: https://blog.ronaldoussoren.net/ ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/6BBOX4WGC6HIPKY6MMACWMDD2YSYSYZW/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 08:46:03AM -0400, Ricky Teachey wrote: > Couldn't you already get pretty close to this by attaching your static > values to the function __dict__? Sure, we can use function attributes as a form of static storage, and I have done that. It is sometimes quite handy. But it's not really the same as proper static variables. (1) It will be much less convenient. We have to write `func.variable` when we want `variable`, and if the function gets rebound to a new name, the call will fail. (2) The performance will be worse. The name look-up to get `func` is a relatively slow LOAD_GLOBAL instead of LOAD_FAST, and then on top of that we need to do a second name look-up, using attribute access. And if the func happens to be a method rather than a top level function, then we end up with three lookups: `self.method.variable`. And that could involve a deep inheritence chain. (3) And the semantics are completely different: - Function attributes are intended to be visible to the caller; they should be part of the function's API, just like any other attribute. - Static variables are intended to be internal to the function, they are not part of the function's API. They are conceptually private to the function. (At least as "private" as Python allows, given it's dynamic nature.) [...] > But could there be a decorator that links the function __dict__ to > locals(), so they are intertwined? I doubt that this would be possible, without large changes to the internal workings of functions. > The locals dict in the function body would look something like this: > > ChainMap(locals(), {'a':1}) Are you aware that the locals() dictionary inside functions is a snapshot of the actual, true, hidden locals? Outside of a function, locals() returns the true global dict which is used to hold variables, so this works fine: x = 1 locals()['x'] = 99 # locals() here returns globals() print(x) # 99 But inside a function, well, try it for yourself :-) def test(): x = 1 locals()['x'] = 99 print(x) I'm not an expert on the internals of functions, but I think that my earlier proposal could be added to the existing implementation, while your linking values proposal would require a full re-implementation of how functions operate. -- Steve ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/ZILE4Y2FNRIITZKWEQNAELFDLVRSCQEQ/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Introduce constants in Python (constant name binding)
oops, forgot to include the list. i really hate lists that don't have the the list as the default reply setting :-( (Yes, I know that goes against the conventional wisdom) On Wed, May 26, 2021 at 9:14 AM Shreyan Avigyan > wrote: > >> Well. How can I go beyond why constant was invented in the first place? > > > But constant was invented for other, far more static languages. And many > in this thread have said that even then it often needs to be worked around. > > > The main reason constant was invented was to provide an additional >> support to programmers so that they don't make a program unstable. > > > The same could be said for static typing. > > And Python’s solution to that is type annotations and type checkers. Which > supports Final. > > In short, yes, the concept of constant is useful, but it is not a good fit > for Python. > > However: > > When you spoke of “like a literal” that brought to mind something else: a > C preprocccesor like pre-complier time substitution. Which might have some > use, but it would only work within a single module, so not very useful. > > -CHB > > > -- > Christopher Barker, PhD (Chris) > > Python Language Consulting > - Teaching > - Scientific Software Development > - Desktop GUI and Web Development > - wxPython, numpy, scipy, Cython > -- Christopher Barker, PhD (Chris) Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/IKO74PNGTEO4WUDRIQDKZ7B77DM64YTF/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Fri, May 28, 2021 at 12:25 AM Paul Moore wrote: > > On Thu, 27 May 2021 at 15:04, Chris Angelico wrote: > > Hmm. > > > > def static(**kw): > > def deco(func): > > statics = types.SimpleNamespace(**kw) > > @functools.wraps(func) > > def f(*a, **kw): > > func(*a, **kw, _statics=statics) > > return f > > return deco > > > > @statics(n=0) > > def count(*, _statics): > > _statics.n += 1 > > return _statics.n > > > > Add it to the pile of clunky options, but it's semantically viable. > > Unfortunately, it's as introspectable as a closure (that is: not at > > all). > > Still arguably clunky, still doesn't have any performance benefits, > but possibly a better interface to function attributes than just using > them in their raw form. > > def static(**statics): > def deco(func): > for name, value in statics.items(): > setattr(func, name, value) > func.__globals__["__me__"] = func > return func > return deco > > @static(a=1) > def f(): > print(__me__.a) > Can't use globals like that, since there's only one globals() dict per module. It'd require some compiler magic to make __me__ work the way you want. But on the plus side, this doesn't require a run-time trampoline - all the work is done on the original function object. So, yeah, add it to the pile. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/7CBRVLP7VZSGJPGHLD43KBFT6LRZINYP/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
> On 27 May 2021, at 14:18, Steven D'Aprano wrote: > > On Thu, May 27, 2021 at 07:56:16AM -, Shreyan Avigyan wrote: > >> Lot of programming languages have something known as static variable >> storage in *functions* not *classes*. Static variable storage means a >> variable limited to a function yet the data it points to persists >> until the end of the program. > > +1 on this idea. > > One common use for function defaults is to optimize function lookups to > local variables instead of global or builtins: > >def func(arg, len=len): ># now len is a fast local lookup instead of a slow name lookup That’s a CPython performance hack, and “static” would just introduce a different performance hack. IIRC there has been work in recent versions of CPython to reduce the need for that hack by caching values in the VM. Ronald — Twitter / micro.blog: @ronaldoussoren Blog: https://blog.ronaldoussoren.net/ ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/7R6MCO2UKF5DZXVGSEZEC55RCOAHF2ZW/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, 27 May 2021 at 15:04, Chris Angelico wrote: > Hmm. > > def static(**kw): > def deco(func): > statics = types.SimpleNamespace(**kw) > @functools.wraps(func) > def f(*a, **kw): > func(*a, **kw, _statics=statics) > return f > return deco > > @statics(n=0) > def count(*, _statics): > _statics.n += 1 > return _statics.n > > Add it to the pile of clunky options, but it's semantically viable. > Unfortunately, it's as introspectable as a closure (that is: not at > all). Still arguably clunky, still doesn't have any performance benefits, but possibly a better interface to function attributes than just using them in their raw form. def static(**statics): def deco(func): for name, value in statics.items(): setattr(func, name, value) func.__globals__["__me__"] = func return func return deco @static(a=1) def f(): print(__me__.a) Paul ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/SILEUKPYQSJN6ISHRQA2GBSZ2JDR7WEH/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: A __decoration_call__ method for Callable objects (WAS: Decorators on variables)
On Wed, May 26, 2021 at 12:43:48PM -0400, Ricky Teachey wrote: [...] > These two ideas of a decorator syntax result are not the same: > > RESULT A: function decorator > # func = decorator("spam")(func) > > RESULT B: variable decorator > # name = decorator("spam")("name") > > ...because func is passed as an object, but "name" a string representing > the name of the object. Two very different things. Ricky, it's not clear to me whether you are proposing the above RESULT A and RESULT B as an *alternative* to the "variable decorator" proposal, or if you have just misunderstood it. The current variable decorator proposal on the table is for this: @decorator(spam) name # --> name = decorator("name", spam) rather than what you wrote: # name = decorator("spam")("name") So I can't tell whether the difference between your version and the OPs is a bug or a feature :-) > For this reason I think I would agree even more so that the differences in > the decorator behavior would be an extremely significant point of confusion. [...] > Maybe employment of decorator syntax could OPTIONALLY trigger a new dunder > method-- here I'll just call it __decoration_call__-- with the signature: > > def __decoration_call__(self, obj: Any, by_name: str) -> Any: ... To be clear here, I think that your proposal is that this method is to be looked up on the *decorator*, not the thing being decorated. Is that correct? In other words: @decorator class X: ... # or a function, or something else it is *decorator*, not X, that is checked for a `__decoration_call__` method. Correct? > My idea is to optionally allow any callable object to write a > __decoration_call__ method that gets called in lieu of the __call__ method > when the callable object is employed using decorator syntax. When this > happens, the decorated named is supplied- not counting self- as the first > argument (e.g., by_name), which contains the str value of the name the > decorator was applied to. In current Python, the only objects which can be decorated with the @ syntax are callable functions and classes. So it is ambiguous to talk about "any callable object" without stating whether it is the decorator or the thing being decorated. > In actuality, unless I'm wrong (I might be; not an expert) current > decorator syntax is really sugar for: > > def func(): ... > func = decorator.__call__("spam this").__call__(func) Roughly speaking, that would correspond to @decorator("spam this") def func(): ... If we have a bare decorator, we have this: @decorator def func(): ... # --> func = decorator.__call__(func) > My proposal is to make it such that: > > @decorator > def func(): ... > > ...*can result* in this: > > def func(): ... > func = decorator.__decoration_call__( func, "func") Okay. Without reading the source code, does this code snippet use the old `__call__` protocol or the new `__decoration_call__` protocol? @flambé class Banana_Surprise: pass It seems to me that this proposal means that we can't even tell which of the two protocols (classic decoration, or new `__decoration_call__` style decoration) without digging into the implementation of the decorator. To be precise, the problem here as reader isn't so much the fact that I don't know whether the object is called using the `__call__` protocol or the new-style `__decorator_call__` protocol, but the fact that I can't tell whether the calls will involve the name being passed or not. This is because the name is being *implicitly* passed, in a way that is unclear whether or not it will be passed. I just don't know whether or not the decorator `flambé` receives the name or not. > And also so that this: > > @decorator("spam this") > def func(): ... > > ...*can result* in this: > > def func(): ... > func = decorator.__call__("spam this").__decoration_call__(func, "func") What happens if the decorator factory has `__decoration_call__` and the object it returns only has `__call__`? I presume you get this: func = decorator.__decoration_call__("spam this", "func").__call__(func) And let's not forget the other two combinations: func = decorator.__decoration_call__("spam this", "func").__decoration_call__(func, "func") func = decorator.__call__("spam this").__call__(func) The last one is, of course, the current behaviour for a decorator factory. The bottom line here is that is you have plain, unadorned decorator: @decorator there are two possible behaviours and no obvious way to tell which one is used, short of digging into the implementation. But if you have a decorarator factory: @factory(*args, **kwargs) there are now four possible behaviours. And anyone brave enough to use a double-barrelled factory-factory @factory(*args, **kwargs)(*more_args) will be faced with eight possible combinations. And at this point, I'm afraid I have run out of steam to
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 11:38 PM Paul Moore wrote: > > This reminds me, if we ignore the performance aspect, function > attributes provide this functionality, but there's a significant > problem with using them because you can't access them other than by > referencing the *name* of the function being defined. Yeah, I mentioned that earlier, but just as one of the wide variety of variously-clunky ways to achieve the same thing. I think it's semantically the closest, but defining the initial value *after* the function is pretty unexciting. > It would be nice to have a better way to reference function attributes > from within a function. (This would also help write recursive > functions that could be safely renamed, but I'm not sure many people > would necessarily think that's a good thing ;-)) The interaction with renaming isn't particularly significant, but the interaction with decoration is notable. Inside the execution of a function, you'd have a reference to the innermost function, NOT the one that would be identified externally. Whether that's a good thing or a bad thing remains to be seen... Hmm. def static(**kw): def deco(func): statics = types.SimpleNamespace(**kw) @functools.wraps(func) def f(*a, **kw): func(*a, **kw, _statics=statics) return f return deco @statics(n=0) def count(*, _statics): _statics.n += 1 return _statics.n Add it to the pile of clunky options, but it's semantically viable. Unfortunately, it's as introspectable as a closure (that is: not at all). ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/JRDLOROGRHBSRMY3DF2E3VHB3ANIDKRT/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, 27 May 2021 at 14:22, Chris Angelico wrote: > Note that the statics *must* be defined on the function, NOT on the > code object. Just like function defaults, they need to be associated > with individual instances of a function. > > >>> f = [] > >>> for n in range(10): > ... def spam(n=n): > ... # static n=n # Same semantics > ... print(n) > ... f.append(spam) > ... > > Each spam() should print out its particular number, even though they > all share the same code object. This reminds me, if we ignore the performance aspect, function attributes provide this functionality, but there's a significant problem with using them because you can't access them other than by referencing the *name* of the function being defined. >>> def f(): ... print(f.i) ... >>> f.i = 1 >>> g = f >>> del f >>> g() Traceback (most recent call last): File "", line 1, in File "", line 1, in f NameError: name 'f' is not defined OK, you can, if you're willing to mess around with sys._getframe and make some dodgy assumptions: >>> def me(): ... parent = sys._getframe(1) ... for obj in parent.f_globals.values(): ... if getattr(obj, "__code__", None) == parent.f_code: ... return obj ... >>> def f(): ... print(me().i) ... >>> f.i = 1 >>> g = f >>> del f >>> g() 1 It would be nice to have a better way to reference function attributes from within a function. (This would also help write recursive functions that could be safely renamed, but I'm not sure many people would necessarily think that's a good thing ;-)) Paul ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/I4I2UEWB3CYPD3HNNTNX5A4GJMIVBTKC/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 10:20 PM Steven D'Aprano wrote: > Here is a sketch of how this could work, given a function like this: > > def func(arg): > static spam, eggs > static cheese = expression > ... > > > At function declaration time, the two static statements tell the > compiler to: > > * treat spam, eggs and cheese as local variables (use LOAD_FAST instead > of LOAD_GLOBAL for lookups); I don't think LOAD_FAST would be suitable here - isn't it always going to look in the stack frame? > * allocate static storage for them using the same (or similar) mechanism > used for function default values; Default values are attached to the function object (in either the __defaults__ tuple or the __kwdefaults__ dict). > * spam and eggs get initialised as None; > > * cheese gets initialised to the value of `expression`, evaluated > at function declaration time just as default arguments are. > > > When the function is called: > > * the interpreter automatically initialises the static variables > with the stored values; > > * when the function exits (whether by return or by raising an > exception) the static storage will be updated with the current > values of the variables. Hmm, I see what you mean. Not sure that this is really necessary though - and it could cause extremely confusing results with threading. > As a sketch of one possible implementation, the body of the function > represented by ellipsis `...` might be transformed to this: > > # initialise statics > spam = LOAD_STATIC(0) > eggs = LOAD_STATIC(1) > cheese = LOAD_STATIC(2) > try: > # body of the function > ... > finally: >STORE_STATIC(spam, 0) >STORE_STATIC(eggs, 1) >STORE_STATIC(cheese, 2) > > > One subtlety: what if the body of the function executes `del spam`? No > problem: the spam variable will become undefined on the next function > call, which means that subsequent attempts to get its value will raise > UnboundLocalError: > > > try: > x = spam + 1 > except UnboundLocalError: > spam = 0 > x = 1 > > > I would use this static feature if it existed. +1 > Agreed, I'd use it too. But I'd define the semantics slightly differently: * If there's an expression given, evaluate that when the 'def' statement is executed, same as default args * Otherwise it'll be uninitialized, or None, bikeshedding opportunity, have fun * Usage of this name uses a dedicated LOAD_STATIC or STORE_STATIC bytecode * The values of the statics are stored in some sort of high-performance cell collection, indexed numerically It would be acceptable to store statics in a dict, too, but I suspect that that would negate some or all of the performance advantages. Whichever way, though, it should ideally be introspectable via a dunder attribute on the function. Semantically, this would be very similar to writing code like this: def count(): THIS_FUNCTION.__statics__["n"] += 1 return THIS_FUNCTION.__statics__["n"] count.__statics__ = {"n": 1} except that it'd be more optimized (and wouldn't require magic to get a function self-reference). Note that the statics *must* be defined on the function, NOT on the code object. Just like function defaults, they need to be associated with individual instances of a function. >>> f = [] >>> for n in range(10): ... def spam(n=n): ... # static n=n # Same semantics ... print(n) ... f.append(spam) ... Each spam() should print out its particular number, even though they all share the same code object. This has been proposed a few times, never really got a lot of support though. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/LT3A3RDH6DFSGODLQN5V7JVQS75QWODF/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
For the implementation I had the same idea as Steven. And I don't think static variables should stored in __dict__ or __defaults__. Instead to increase efficiency (more importantly not to decrease current efficiency) it should be stored as a dict in __static__ or some other dunder member. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/SLG54ROIMOROTGI7QLFB6WHCW4HL2QT5/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 8:19 AM Steven D'Aprano wrote: > On Thu, May 27, 2021 at 07:56:16AM -, Shreyan Avigyan wrote: > > > This idea proposes to add a keyword > > (static, maybe?) that can create static variables that can persist > > throughout the program yet only accessible through the function they > > are declared and initialized in. > > > Here is a sketch of how this could work, given a function like this: > > def func(arg): > static spam, eggs > static cheese = expression > ... > > > At function declaration time, the two static statements tell the > compiler to: > > * treat spam, eggs and cheese as local variables (use LOAD_FAST instead > of LOAD_GLOBAL for lookups); > > * allocate static storage for them using the same (or similar) mechanism > used for function default values; > > * spam and eggs get initialised as None; > > * cheese gets initialised to the value of `expression`, evaluated > at function declaration time just as default arguments are. > > > When the function is called: > > * the interpreter automatically initialises the static variables > with the stored values; > > * when the function exits (whether by return or by raising an > exception) the static storage will be updated with the current > values of the variables. > > As a sketch of one possible implementation, the body of the function > represented by ellipsis `...` might be transformed to this: > > # initialise statics > spam = LOAD_STATIC(0) > eggs = LOAD_STATIC(1) > cheese = LOAD_STATIC(2) > try: > # body of the function > ... > finally: >STORE_STATIC(spam, 0) >STORE_STATIC(eggs, 1) >STORE_STATIC(cheese, 2) > > > Couldn't you already get pretty close to this by attaching your static values to the function __dict__? Example: def func(): print(func.a) func.a = 1 Usage: >>> func() 1 Of course that is slower because there is an attribute lookup. But could there be a decorator that links the function __dict__ to locals(), so they are intertwined? @staticify({ 'a':1}) def func(): print(a) print(b) func.b = 2 Usage: >>> func() 1 2 >>> func.a = 3 # dynamic update of func.__dict__ >>> func() 3 2 The locals dict in the function body would look something like this: ChainMap(locals(), {'a':1}) --- Ricky. "I've never met a Kentucky man who wasn't either thinking about going home or actually going home." - Happy Chandler > One subtlety: what if the body of the function executes `del spam`? No > problem: the spam variable will become undefined on the next function > call, which means that subsequent attempts to get its value will raise > UnboundLocalError: > > > try: > x = spam + 1 > except UnboundLocalError: > spam = 0 > x = 1 > > > I would use this static feature if it existed. +1 > > > > -- > Steve > Same thing would happen with my idea: del a would delete a from the func.__dict__ (just like with ChainMap). But if you add it back again later, it would not be static anymore. Example: @staticify('a': 1) def func(): print(a) # fast static lookup del a # static is deleted a = 2 # this is local now func.b = 3 # but this is a static ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/SB4PZLYCTKRWHNXWVFWIXQC3NEZQRKQL/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On 2021-05-27 at 22:33:25 +1000, Steven D'Aprano wrote: > Aside from globals, which we agree are Considered Harmful, you've > suggested two alternative implementations: > > - something with closures; > > - hidden state in an object with a `__call__` method. > > Closures are cool, but the hidden state really is inaccessible from > outside the function. (At least I've never worked out how to get to it.) > So the callable object is better for introspection and debugging. Then fix your debugger. ;-) As I recall, you're a proponent of fixing what's broken rather than creating workarounds. (I also recall many discussions regarding failed sandboxes because of Python's nearly infinite capacity for introspection. Maybe closures are the path to a true sandbox.) Globals, persistent locals, closures, statics, class variables, instance variables, etc. are all just different ways to hide state. IMO, we don't need another one. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/LOB2FKM5XV3Y3TGQB532G7F2RF4JOHLN/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 11:17:18AM +0200, Ronald Oussoren via Python-ideas wrote: > For this particular question/proposal: “static” variables in functions > in C like languages are basically hidden global variables, and global > variables are generally a bad idea. Python is not required to use the same design mistakes as C :-) Shreyan already said that static variables in a function should be local to that function. The semantic difference compared to regular locals is that they should persist from one call to the next. Aside from globals, which we agree are Considered Harmful, you've suggested two alternative implementations: - something with closures; - hidden state in an object with a `__call__` method. Closures are cool, but the hidden state really is inaccessible from outside the function. (At least I've never worked out how to get to it.) So the callable object is better for introspection and debugging. Functions are objects with a `__call__` method, and they already have persistent state! >>> def spam(arg="Hello world"): ... pass ... >>> spam.__defaults__ ('Hello world',) We could, I think, leverage this functionality to get this. -- Steve ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/QZSWMMPQPOZJIENMKQLKGEWZUHKZT5LX/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 07:56:16AM -, Shreyan Avigyan wrote: > Lot of programming languages have something known as static variable > storage in *functions* not *classes*. Static variable storage means a > variable limited to a function yet the data it points to persists > until the end of the program. +1 on this idea. One common use for function defaults is to optimize function lookups to local variables instead of global or builtins: def func(arg, len=len): # now len is a fast local lookup instead of a slow name lookup Benchmarking shows that this actually does make a significant difference to performance, but it's a technique under-used because of the horribleness of a len=len parameter. (Raymond Hettinger is, I think, a proponent of this optimization trick. At least I learned it from his code.) If functions had static storage that didn't need to be declared in the function parameter list, then we could use that for this trick. As you correctly point out: > Well Python also kind of has that > functionality. Python's default values provide the same type of > functionality Indeed: def func(static_storage=[0]): static_storage[0] += 1 print(static_storage[0]) But that's kinda yucky for the same reason as above: we have to expose our static storage in the parameter list to the caller, and if the value we care about is immutable, we have to stuff it inside a mutable container. > but it's a *hack* I disagree that it is a hack. At least, the implementation is not a hack. The fact that the only way we can take advantage of this static storage is to declare a parameter and give it a default value is hacky. So I guess I agree with you :-) > and also *problematic* because only > mutable types that are mutated persists. Static should behave much > like Python's for loop variables. I have no idea what that comment about loop variables means. > This idea proposes to add a keyword > (static, maybe?) that can create static variables that can persist > throughout the program yet only accessible through the function they > are declared and initialized in. Here is a sketch of how this could work, given a function like this: def func(arg): static spam, eggs static cheese = expression ... At function declaration time, the two static statements tell the compiler to: * treat spam, eggs and cheese as local variables (use LOAD_FAST instead of LOAD_GLOBAL for lookups); * allocate static storage for them using the same (or similar) mechanism used for function default values; * spam and eggs get initialised as None; * cheese gets initialised to the value of `expression`, evaluated at function declaration time just as default arguments are. When the function is called: * the interpreter automatically initialises the static variables with the stored values; * when the function exits (whether by return or by raising an exception) the static storage will be updated with the current values of the variables. As a sketch of one possible implementation, the body of the function represented by ellipsis `...` might be transformed to this: # initialise statics spam = LOAD_STATIC(0) eggs = LOAD_STATIC(1) cheese = LOAD_STATIC(2) try: # body of the function ... finally: STORE_STATIC(spam, 0) STORE_STATIC(eggs, 1) STORE_STATIC(cheese, 2) One subtlety: what if the body of the function executes `del spam`? No problem: the spam variable will become undefined on the next function call, which means that subsequent attempts to get its value will raise UnboundLocalError: try: x = spam + 1 except UnboundLocalError: spam = 0 x = 1 I would use this static feature if it existed. +1 -- Steve ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/4BASARX57WKFOZKHCRRG5RYGZ7XF6ABW/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
Reply to Chris: I'm proposing a way to do this officially in Python. For example I know another hack, def count(cur={"cur":0}): cur["cur"] += 1 return cur >> Static should behave much like Python's for loop variables. > I have no idea what this means. That's a bad example. I was just trying to make it clear. But you have got the idea. So don't worry about that. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/VA37MHB7BH5UI52IMUFHID7QWTRC2VBJ/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On 2021-05-27 10:39, Shreyan Avigyan wrote: Well sometimes we don't want to pollute the module namespace. Two functions can have two variables with the same name but with different values that we want to be static. And this functionality already exists in Python but as a *hack*. This idea proposes to add a new dunder member and a keyword that allows us to use global variables but are limited to local scope. But since it's Python anyone can access it using the dunder member. This was discussed some years ago. IIRC, one of the questions was about how and when such variables would be initialised. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/O6YJOZYELDXGRX2ALWD3X3IUBEEBVOUA/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
On Thu, May 27, 2021 at 7:20 PM Ronald Oussoren via Python-ideas wrote: > On 27 May 2021, at 09:56, Shreyan Avigyan wrote: > > > Static should behave much like Python's for loop variables. I have no idea what this means. > > For this particular question/proposal: “static” variables in functions in C > like languages are basically hidden global variables, and global variables > are generally a bad idea. > Hmm, I'd distinguish a couple of things here. Global constants are most certainly not a problem, and objects that last the entire run of the program are not a problem either. The usual problem with "global variables" is that they become hidden state, action-at-a-distance. You can change the global in one function and it notably affects some other function. Part of the point of statics is that they are *not* global; they might live for the entire run of the program, but they can't be changed by any other function, so there's no AaaD that will mess with your expectations. > In Python you can get the same result with a global variable and the use of > the “global” keyword in a function (or cooperating set of functions) when you > want to update the global variable from that function. > That's globals, with all the risks thereof. > Closures or instances of classes with an ``__call__`` method can be used as > well and can hide state (with the “consulting adults” caveat, the state is > hidden, not inaccessible). > This would be the easiest way to manage it. But both of them provide a way to have multiple independent, yet equivalent, states. If that's what you want, great! But it can also be an unnecessary level of confusion ("why would I ever make a second one of these? Why is there a factory function for something that I'll only ever need one of?"), where static variables wouldn't do that. There is one namespace that would very aptly handle this kind of thing: the function object itself. >>> def count(): ... count.cur += 1 ... return count.cur ... >>> count.cur = 0 >>> count() 1 >>> count() 2 >>> count() 3 >>> count() 4 As long as you can reference your own function reliably, this will work. There may be room for a keyword like this_function, but for the most part, it's not necessary, and you can happily work with the function by its name. It's a little clunkier than being able to say "static cur = 0;" to initialize it (the initializer has to go *after* the function, which feels backwards), but the functionality is all there. ChrisA ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/2WA37J37YVUWEOKKCHOP7OT6BKNCRTVG/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
Well sometimes we don't want to pollute the module namespace. Two functions can have two variables with the same name but with different values that we want to be static. And this functionality already exists in Python but as a *hack*. This idea proposes to add a new dunder member and a keyword that allows us to use global variables but are limited to local scope. But since it's Python anyone can access it using the dunder member. ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/B5IXIQGWXVEELTHF7ZOCRPJLUJX7MBZE/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: division of integers should result in fractions not floats
> I should probably explain (again) why I am not a fan of such a change. We have read your blog, Guido:-) Yet, this "feature" is one of top Python's misfeatures, ex. for Fernando Perez. I share his opinion too. The numbers module borrowed from the Scheme numbers tower, yet it doesn't use the concept of "exactness". (Perhaps, one of the reasons, why the numbers module is not very useful, outside of the stdlib, see https://bugs.python.org/issue43602.) The conversion exact(known algebraic structure)->inexact (like floating point numbers) must be explicit, not as / does now. Of course, I realize that changing / (again!) - will be painful. Yet it's possible: the Python - is a language, that can fix design flaws. If this is too costly - what do you think about a special literal for Floats, e.g. suggested above 1.2F==Fraction(12, 10)? R suffix might be an alternative. > After some debugging, the cause would be that > internally the program was using rational numbers with thousands of digits > of precision to represent values that would be truncated to two or three > digits of precision upon printing. This seems to be an error from the programmer, not from the language designers. Use correct data types, etc. Now we have even the Decimal class in the stdlib... ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/67B2PY54KP6LG45EVJOI7NA2LYN6TC3I/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Introduce constants in Python (constant name binding)
On 26/05/2021 08:25, Shreyan Avigyan wrote: Reply to Chris: There are two things I want to say about constants :- 1) Global-Local Constants - The ALL_CAPS convention variables should become constant. Good luck enforcing that on every Python programmer. Should it apply to variable names like A, B, ID, BA (as in "Bachelor of Arts"), UN, US, HCF, WHO, NASA, UNICEF? Rob Cliffe ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/ROGFYD5RGJYOLBR2U6KDTBSIYG4F7SR6/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: Add static variable storage in functions
> On 27 May 2021, at 09:56, Shreyan Avigyan wrote: > > Lot of programming languages have something known as static variable storage > in *functions* not *classes*. Static variable storage means a variable > limited to a function yet the data it points to persists until the end of the > program. Well Python also kind of has that functionality. Python's default > values provide the same type of functionality but it's a *hack* and also > *problematic* because only mutable types that are mutated persists. Static > should behave much like Python's for loop variables. This idea proposes to > add a keyword (static, maybe?) that can create static variables that can > persist throughout the program yet only accessible through the function they > are declared and initialized in. How experienced are your with Python? On first glance your recent proposals appear to be for feature in languages you know about and can’t find in Python, without necessarily a good understanding of Python. For this particular question/proposal: “static” variables in functions in C like languages are basically hidden global variables, and global variables are generally a bad idea. In Python you can get the same result with a global variable and the use of the “global” keyword in a function (or cooperating set of functions) when you want to update the global variable from that function. Closures or instances of classes with an ``__call__`` method can be used as well and can hide state (with the “consulting adults” caveat, the state is hidden, not inaccessible). Ronald — Twitter / micro.blog: @ronaldoussoren Blog: https://blog.ronaldoussoren.net/ ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/JLCCATPIOY3SKF4PYVMLEPYJH4WZZ5XJ/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Add static variable storage in functions
Lot of programming languages have something known as static variable storage in *functions* not *classes*. Static variable storage means a variable limited to a function yet the data it points to persists until the end of the program. Well Python also kind of has that functionality. Python's default values provide the same type of functionality but it's a *hack* and also *problematic* because only mutable types that are mutated persists. Static should behave much like Python's for loop variables. This idea proposes to add a keyword (static, maybe?) that can create static variables that can persist throughout the program yet only accessible through the function they are declared and initialized in. Thanking you, With Regards ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/X3GQVZL3P7OLLK42AGPJBBSLKIX4ZJDK/ Code of Conduct: http://python.org/psf/codeofconduct/
[Python-ideas] Re: division of integers should result in fractions not floats
> Yes, but having a faster fraction type would be great. SymPy doesn't > actually use the fractions module because it's too slow. Instead SymPy > has its own pure Python implementation Oscar, I think now (3.10) the stdlib implementation arithmetics is optimized like the SymPy's pure Python fallback. Let me know if I miss something. (There is some slowdown for fractions with small components and a PR to address this.) ___ Python-ideas mailing list -- python-ideas@python.org To unsubscribe send an email to python-ideas-le...@python.org https://mail.python.org/mailman3/lists/python-ideas.python.org/ Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/A4R76H36Q7ZL75PBVZYJFVOYBHQF4SLZ/ Code of Conduct: http://python.org/psf/codeofconduct/