> First, it only handles functions/methods. Python FIT needs
> metadata on properties and assignable/readable attributes
> of all kinds. So in no sense is it a replacement. Parenthetically,
> neither is the decorator facility, and for exactly the same reason.
I can't argue against docstrings and maybe annotations on attributes; I'd like them myself. That should be a separate PEP, because the scope of this one is Function Annotations. The syntax for function annotations has been much more thoroughly discussed than the syntax for annotations on attributes. See Guido's blog and the other references in the PEP.

> Second, it has the potential to make reading the function
> header difficult. In the languages I'm familiar with, static type
> declarations are a very few, somewhat well chosen words.
> In this proposal, it can be a general expression. In Python
> FIT, that could well turn into a full blown dictionary with
> multiple keys.

Any code can be hard to read. Having the annotation be a general expression lets authors use normal code factoring to make the function header more readable. For instance, one can change this:

    def f(x: some_really_long_expression):
        ...

to this:

    t_X = some_really_long_expression

    def f(x: t_X):
        ...

> Third, it's half of a proposal. Type checking isn't the only use
> for metadata about functions/methods, classes, properties
> and other objects, and the notion that there are only going to
> be a small number of non-intersecting libraries out there is
> an abdication of responsibility to think this thing through.

That comes from this paragraph in the PEP:

    There is no worry that these libraries will assign semantics at
    random, or that a variety of libraries will appear, each with
    varying semantics and interpretations of what, say, a tuple of
    strings means. The difficulty inherent in writing annotation
    interpreting libraries will keep their number low and their
    authorship in the hands of people who, frankly, know what
    they're doing.

Perhaps you are right and intersecting libraries will become an issue. Designing a solution in advance of the problems being evident seems risky to me. What if a solution invented in a vacuum really is more of a hindrance?

There is a clear intersection between documentation packages and type-checking or type-coercing libraries. Documentation libraries can just use repr(annotation), possibly with a little special casing to represent classes and types better (see the sketch at the end of this message).

I'm not sure there will be an important use case for overlapping type-checking or type-coercing libraries within the same module. What else could intersect, and why can't the intersecting pieces develop a solution when the need arises?

More feedback from the community on this point (whether the PEP needs to take responsibility for interoperability) would be nice.

Thanks for the feedback from everyone so far,
-Tony
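
P.S. Here is a rough sketch of the repr(annotation) intersection I mean, in plain Python 3. Only the __annotations__ attribute comes from the PEP; the helper names (describe, check_args) and the sample function are invented for illustration, not taken from any existing library.

    def scale(x: int, factor: float = 2.0) -> float:
        return x * factor

    def describe(func):
        # What a documentation tool could do: repr() each annotation,
        # with no knowledge of what the annotations actually mean.
        notes = func.__annotations__
        args = ", ".join("%s: %r" % (name, value)
                         for name, value in notes.items()
                         if name != "return")
        ret = " -> %r" % notes["return"] if "return" in notes else ""
        return "%s(%s)%s" % (func.__name__, args, ret)

    def check_args(func, *args):
        # What a type-checking library could do: treat annotations that
        # happen to be types as constraints and ignore everything else.
        names = func.__code__.co_varnames[:func.__code__.co_argcount]
        for name, value in zip(names, args):
            expected = func.__annotations__.get(name)
            if isinstance(expected, type) and not isinstance(value, expected):
                raise TypeError("%s should be %s, got %r"
                                % (name, expected.__name__, value))

    print(describe(scale))            # scale(x: <class 'int'>, factor: <class 'float'>) -> <class 'float'>
    check_args(scale, 3, 1.5)         # passes silently
    check_args(scale, "three", 1.5)   # raises TypeError

The documentation side never has to interpret the annotations, so it doesn't conflict with a library that does.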