Paul Sokolovsky <pfal...@users.sourceforge.net> added the comment:

> the idea was proposed purely to "close a gap"

That pinpoints it well. I was just writing a tutorial on implementing custom 
import hooks, with the idea of showing people how easy it is to do in Python. 
As a first step, I explained that it's a bad idea to do any transformations on 
the surface (string) representation of a program; at the very least, it should 
be converted to a token stream first. But then I found that I also need to 
explain that we have to convert it back to a string, which sounds pretty weird 
and undermines the idea:

    import tokenize
    import types

    def xform(token_stream):
        # Rename the NAME token "function" to "lambda"; pass everything
        # else through unchanged.
        for t in token_stream:
            if t[0] == tokenize.NAME and t[1] == "function":
                yield (tokenize.NAME, "lambda") + t[2:]
            else:
                yield t

    # The import hook proper: load "filename", applying the transform above.
    def hook(filename):
        with open(filename, "rb") as f:
            # Frankly speaking, tokenizing just to convert back to string form
            # isn't too efficient, but CPython doesn't offer us a way to parse
            # a token stream so far, so we have no choice.
            source = tokenize.untokenize(xform(tokenize.tokenize(f.readline)))
        # Create a fresh, empty module object and execute the transformed
        # source in its namespace.
        mod = types.ModuleType("")
        exec(source, vars(mod))
        return mod
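
(Just to show the transform on its own, with a made-up sample line - the 
"function" spelling of lambda here is purely illustrative:)

    import io
    import tokenize

    src = b"add = function x, y: x + y\n"
    out = tokenize.untokenize(xform(tokenize.tokenize(io.BytesIO(src).readline)))
    # "out" is equivalent to b"add = lambda x, y: x + y\n"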

Having written that comment in the code above, I thought I could just as well 
take one more step and monkey-patch a parse_tokens() function into "ast" - I'd 
need to explain that too, but the explanation probably wouldn't sound any worse 
than the one above. And then it was just one more step to actually submit a 
patch for ast.parse_tokens(), and that's how this ticket was created!
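
For comparison, here is roughly what the loading part could look like with 
such a function. To be clear, ast.parse_tokens() is the API proposed in this 
ticket, not something CPython has today, and the exact signature (a token 
iterable plus an optional filename) is just my assumption:

    import ast
    import tokenize
    import types

    def hook(filename):
        with open(filename, "rb") as f:
            # Hypothetical: build the AST straight from the transformed token
            # stream, skipping the untokenize()/re-parse round trip.
            tree = ast.parse_tokens(xform(tokenize.tokenize(f.readline)),
                                    filename=filename)
        mod = types.ModuleType("")
        exec(compile(tree, filename, "exec"), vars(mod))
        return mod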

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue42729>
_______________________________________