On Tue, Nov 24, 2015 at 3:40 AM, Ian Kelly <[email protected]> wrote:
> On Mon, Nov 23, 2015 at 1:23 AM, Chris Angelico <[email protected]> wrote:
>> def latearg(f):
>>     tot_args = f.__code__.co_argcount
>>     min_args = tot_args - len(f.__defaults__)
>>     defs = f.__defaults__
>>     # With compiler help, we could get the original text as well as something
>>     # executable that works in the correct scope. Without compiler help, we
>>     # either use a lambda function, or an exec/eval monstrosity that can't use
>>     # the scope of its notional definition (since its *actual* definition will
>>     # be inside this decorator). Instead, just show a fixed bit of text.
>
> You should be able to get the correct globals from f.__globals__.
>
> For locals, the decorator might capture the locals of the previous
> stack frame at the moment the decorator was called, but that's
> potentially a pretty heavy thing to be retaining for this purpose; the
> definition of a @latearg function would indefinitely keep a reference
> to every single object that was bound to a variable in that scope, not
> just the things it needs. For better specificity you could parse the
> expression and then just grab the names that it uses. Even so, this
> would still act somewhat like early binding in that it would reference
> the local variables at the time of definition rather than evaluation.
>
> Nonlocals? Just forget about it.
And since nonlocals are fundamentally unsolvable, I took the simpler
option and just used an actual lambda function.
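Roughly like this, for the plain positional case (a quick sketch, not the
exact code from my earlier message; any callable default just gets called
afresh on each invocation):

import functools

def latearg(f):
    code = f.__code__
    defs = f.__defaults__ or ()
    min_args = code.co_argcount - len(defs)
    # Positional parameters that have defaults, in declaration order
    defaulted = code.co_varnames[:code.co_argcount][min_args:]

    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        # Any defaulted parameter the caller didn't supply gets its
        # default re-evaluated, if that default is callable.
        for offset, (name, default) in enumerate(zip(defaulted, defs)):
            if min_args + offset >= len(args) and name not in kwargs:
                kwargs[name] = default() if callable(default) else default
        return f(*args, **kwargs)
    return wrapper

@latearg
def x(y=lambda: []):
    y.append(1)
    return y    # a fresh list every call, not a shared one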
>> This does implement late binding, but:
>> 1) The adornment is the rather verbose "lambda:", where I'd much
>> rather have something shorter
>> 2) Since there's no way to recognize "the ones that were adorned", the
>> decorator checks for "anything callable"
>
> A parameter annotation could be used in conjunction with the decorator.
>
> @latearg
> def x(y: latearg = lambda: []):
>     ...
>
> But that's even more verbose. In the simple case where all the
> defaults should be late, one could have something like:
>
> @latearg('*')
> def x(y=lambda: []):
>     ...
>
> The argument could be generalized to pass a set of parameter names as
> an alternative to the annotation.
Yeah, that might help. With real compiler support, both of these could
be solved:
def x(y=>[]):
    ...
The displayed default could be "=>[]", exactly the way it's seen in
the source, and the run-time would know exactly which args were
flagged this way. Plus, it could potentially use a single closure to
evaluate all the arguments.
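Hand-desugared, the compiled form might come out roughly like this
(hypothetical, obviously; the sentinel name is made up):

_no_default = object()   # unique sentinel the compiler could hide away

def x(y=_no_default):
    if y is _no_default:
        y = []           # evaluated in x's own scope, on every call
    ...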
>> 3) Keyword args aren't handled - they're passed through as-is (and
>> keyword-only arg defaults aren't rendered)
>
> I would expect that Python 3 Signature objects would make this a lot
> simpler to handle.
Maybe. It's still going to be pretty complicated. I could easily
handle keyword-only arguments, but recognizing that something in **kw
is replacing something in *a is a bit harder. Expansion invited.
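For what it's worth, a Signature-based version might come out something
like this (untested sketch; it still treats anything callable as a late
default, and leans on bind() to work out which parameters were actually
supplied, positionally or by keyword):

import functools, inspect

def latearg(f):
    sig = inspect.signature(f)
    late = {name: p.default for name, p in sig.parameters.items()
            if p.default is not inspect.Parameter.empty
            and callable(p.default)}

    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        # Anything the caller didn't bind gets its default re-evaluated,
        # whether it arrived positionally, by keyword, or is keyword-only.
        for name, default in late.items():
            if name not in bound.arguments:
                bound.arguments[name] = default()
        return f(*bound.args, **bound.kwargs)
    return wrapper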
This is reminiscent of the manual "yield from" implementation in PEP
380. It looks simple enough, until you start writing in all the corner
cases.
ChrisA