On 2005 Jan 10, at 18:43, Phillip J. Eby wrote: ...
At 03:42 PM 1/10/05 +0100, Alex Martelli wrote:
    The fourth case above is subtle.  A break of substitutability can
    occur when a subclass changes a method's signature, or restricts
    the domains accepted for a method's arguments ("co-variance" on
    argument types), or extends the co-domain to include return
    values which the base class may never produce ("contra-variance"
    on return types).  While compliance based on class inheritance
    _should_ be automatic, this proposal allows an object to signal
    that it is not compliant with a base class protocol.

-1 if this introduces a performance penalty to a wide range of adaptations (i.e. those using abstract base classes), just to support people who want to create deliberate Liskov violations. I personally don't think that we should pander to Liskov violators, especially since Guido seems to be saying that there will be some kind of interface objects available in future Pythons.

If interfaces can ensure against Liskov violations in instances of their subclasses, then they can follow the "case (a)" fast path, sure. Inheriting from an interface (in Guido's current proposal, as per his Artima blog) is a serious commitment on the inheritor's part; inheriting from an ordinary type, in real-world current practice, need not be -- too many cases of assumed covariance, for example, are out in the wild for us to leave NO recourse in such cases and just assume compliance.



    Just like any other special method in today's Python, __conform__
    is meant to be taken from the object's class, not from the object
    itself (for all objects, except instances of "classic classes" as
    long as we must still support the latter).  This enables a
    possible 'tp_conform' slot to be added to Python's type objects in
    the future, if desired.

One note here: Zope and PEAK sometimes use interfaces that a function or module may implement. PyProtocols' implementation does this by adding a __conform__ object to the function's dictionary so that the function can conform to a particular signature. If and when __conform__ becomes tp_conform, this may not be necessary any more, at least for functions, because there will probably be some way for an interface to tell if the function at least conforms to the appropriate signature. But for modules this will still be an issue.


I am not saying we shouldn't have a tp_conform; just suggesting that it may be appropriate for functions and modules (as well as classic classes) to have their tp_conform delegate back to self.__dict__['__conform__'] instead of a null implementation.

I have not considered conformance of such objects as functions or modules; if that is important, I need to add it to the reference implementation in the PEP. I'm reluctant to just get __conform__ from the object, though; it leads to all sorts of issues with a *class* conforming vs its *instances*, etc. Maybe Guido can Pronounce a little on this sub-issue...
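
For concreteness, here's a rough sketch (illustrative names only, not part of the PEP's reference implementation) of what I understand the suggested lookup to be: take __conform__ from the object's type as usual, but fall back to the instance __dict__ for functions and modules, which is where PyProtocols puts it today:

import types

def _find_conform(obj):
    # Normal rule: a special method is looked up on the type, not the instance.
    conform = getattr(type(obj), '__conform__', None)
    if conform is not None:
        # Normalize to a one-argument callable taking the protocol.
        return lambda protocol: conform(obj, protocol)
    # Fallback (the suggestion above) for functions and modules only: look in
    # the instance __dict__, assumed to hold a callable taking the protocol.
    if isinstance(obj, (types.FunctionType, types.ModuleType)):
        return obj.__dict__.get('__conform__')
    return None

The class-vs-instance issues I mention would of course show up in that last branch.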



I don't see the benefit of LiskovViolation, or of doing the exact type check vs. the loose check. What is the use case for these? Is it to allow subclasses to say, "Hey I'm not my superclass?" It's also a bit confusing to say that if the routines "raise any other exceptions" they're propagated. Are you saying that LiskovViolation is *not* propagated?

Indeed I am -- I thought that was very clearly expressed! LiskovViolation means to skip the loose isinstance check, but it STILL allows explicitly registered adapter factories a chance (if somebody registers such an adapter factory, presumably they've coded a suitable adapter object type to deal with some deuced Liskov violation, see...). On the other hand, if some random exception occurs in __conform__ or __adapt__, that's a bug somewhere, so the exception propagates in order to help debugging. The previous version treated TypeError specially, but I think (on the basis of just playing around a bit, admittedly) that offers no real added value and sometimes will hide bugs.
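
To spell out the control flow I mean, here's a deliberately simplified sketch -- it omits __adapt__, the exact-type fast path, and the error handling for a missing default, so don't read it as the reference implementation:

class LiskovViolation(Exception):
    """Raised by __conform__ or __adapt__ to refuse inheritance-based conformance."""

def adapt_sketch(obj, protocol, registry, alternate=None):
    skip_inheritance = False
    conform = getattr(type(obj), '__conform__', None)
    if conform is not None:
        try:
            result = conform(obj, protocol)
            if result is not None:
                return result
        except LiskovViolation:
            # Explicit refusal: don't treat isinstance() as good enough.
            # Any OTHER exception is deliberately NOT caught here, so it
            # propagates as the bug it most likely is.
            skip_inheritance = True
    if not skip_inheritance and isinstance(obj, protocol):
        return obj
    # Explicitly registered adapter factories still get their chance.
    factory = registry.get((type(obj), protocol))
    if factory is not None:
        return factory(obj, protocol, alternate)
    return alternate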



    If none of the first four mechanisms worked, as a last-ditch
    attempt, 'adapt' falls back to checking a registry of adapter
    factories, indexed by the protocol and the type of `obj', to meet
    the fifth case.  Adapter factories may be dynamically registered
    and removed from that registry to provide "third party adaptation"
    of objects and protocols that have no knowledge of each other, in
    a way that is not invasive to either the object or the protocols.

This should either be fleshed out to a concrete proposal, or dropped. There are many details that would need to be answered, such as whether "type" includes subtypes and whether it really means type or __class__. (Note that isinstance() now uses __class__, allowing proxy objects to lie about their class; the adaptation system should support this too, and both the Zope and PyProtocols interface systems and PyProtocols' generic functions support it.)

I disagree: I think the strawman-level proposal as fleshed out in the pep's reference implementation is far better than nothing. I mention the issue of subtypes explicitly later, including why the pep does NOT do anything special with them -- the reference implementation deals with specific types. And I use type(X) consistently, explicitly mentioning in the reference implementation that old-style classes are not covered.
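
Concretely, the strawman registry amounts to little more than a dict keyed on the specific type and the protocol; something along these lines (the names are mine, for illustration, not necessarily the PEP's final spelling):

# Strawman third-party-adaptation registry: keyed by the *specific* type
# of the object (no subtype magic) and by the protocol.
_adapter_factory_registry = {}

def register_adapter_factory(objtype, protocol, factory):
    _adapter_factory_registry[objtype, protocol] = factory

def unregister_adapter_factory(objtype, protocol):
    del _adapter_factory_registry[objtype, protocol]

def _lookup_adapter_factory(obj, protocol):
    # type(obj), not obj.__class__: old-style classes, and proxies that lie
    # about their class, are deliberately not covered by this strawman.
    return _adapter_factory_registry.get((type(obj), protocol))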


I didn't know about the "let the object lie" quirk in isinstance. If that quirk is indeed an intended design feature, rather than an implementation 'oops', it might perhaps be worth documenting it more clearly; I do not find that clearly spelled out in the place I'd expect it to be, namely <http://docs.python.org/lib/built-in-funcs.html> under 'isinstance'. If the "let the object lie" quirk is indeed a designed-in feature, then, I agree, using x.__class__ rather than type(x) is mandatory in the PEP and its reference implementation; however, I'll wait for confirmation of design intent before I change the PEP accordingly.
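
For the record, here's what I understand the quirk to be -- a tiny example of the kind of proxy involved; whether this behavior is a designed-in feature is exactly what I'd like confirmed:

class Proxy(object):
    # A wrapper that lies about its class, the way proxy objects often do.
    def __init__(self, wrapped):
        self._wrapped = wrapped
    def _get_class(self):
        return self._wrapped.__class__
    __class__ = property(_get_class)
    def __getattr__(self, name):
        return getattr(self._wrapped, name)

p = Proxy([1, 2, 3])
assert type(p) is Proxy       # type() is not fooled...
assert isinstance(p, list)    # ...but isinstance() trusts p.__class__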

One other issue: it's not possible to have standalone interoperable PEP 246 implementations using a registry, unless there's a standardized place to put it, and a specification for how it gets there. Otherwise, if someone is using both say Zope and PEAK in the same application, they would have to take care to register adaptations in both places. This is actually a pretty minor issue since in practice both frameworks' interfaces handle adaptation, so there is no *need* for this extra registry in such cases.

I'm not sure I understand this issue, so I'm sure glad it's "pretty minor".


    Adaptation is NOT "casting".  When object X itself does not
    conform to protocol Y, adapting X to Y means using some kind of
    wrapper object Z, which holds a reference to X, and implements
    whatever operation Y requires, mostly by delegating to X in
    appropriate ways.  For example, if X is a string and Y is 'file',
    the proper way to adapt X to Y is to make a StringIO(X), *NOT* to
    call file(X) [which would try to open a file named by X].

    Numeric types and protocols may need to be an exception to this
    "adaptation is not casting" mantra, however.

The issue isn't that adaptation isn't casting; why would casting a string to a file mean that you should open that filename?

Because, in most contexts, "casting" object X to type Y means calling Y(X).
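
I.e., the contrast I have in mind, spelled out (Python 2 spelling of StringIO):

from StringIO import StringIO    # Python 2.x spelling

x = "some text"

# "Casting" x to file means calling the target type on the object:
#     file(x)
# ...which tries to OPEN a file named "some text" -- almost certainly
# not what a caller of adapt(x, file) meant.

# Adapting x to a file-like protocol means wrapping it, not converting it:
y = StringIO(x)
assert y.read() == "some text"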


I don't think that "adaptation isn't casting" is enough to explain appropriate use of adaptation. For example, I think it's quite valid to adapt a filename to a *factory* for opening files, or a string to a "file designator". However, it doesn't make any sense (to me at least) to adapt from a file designator to a file, which IMO is the reason it's wrong to adapt from a string to a file in the way you suggest. However, casting doesn't come into it anywhere that I can see.

Maybe we're using different definitions of "casting"?

If I were going to say anything about that case, I'd say that adaptation should not be "lossy"; adapting from a designator to a file loses information like what mode the file should be opened in. (Similarly, I don't see adapting from float to int; if you want a cast to int, cast it.) Or to put it another way, adaptability should imply substitutability: a string may be used as a filename, a filename may be used to designate a file. But a filename cannot be used as a file; that makes no sense.

I don't understand this "other way" -- nor, to be honest, what you "would say" earlier, either. I think it's pretty normal for adaptation to be "lossy" -- to rely on some but not all of the information in the original object: that's the "facade" design pattern, after all. It doesn't mean that some info in the original object is lost forever, since the original object need not be altered; it just means that not ALL of the info that's in the original object is used by the adapter -- and, what's wrong with that?!


For example, say that I have some immutable "record" types. One, type Person, defined in some framework X, has a whole lot of immutable data fields, including firstName, middleName, lastName, and many, many others. Another, type Employee, defined in some separate framework Y (which has no knowledge of X, and vice versa), has fewer data fields, and in particular one called 'fullName' which is supposed to be a string such as 'Firstname M. Lastname'. I would like to register an adapter factory from type Person to protocol Employee. Since we said Person has many more data fields, adaptation will be "lossy" -- it will look upon Employee essentially as a "facade" (a simplified interface) for Person.

Given the immutability, we MIGHT as well 'cast' here...:

def adapt_Person_to_Employee(person, protocol, alternate):
    assert issubclass(protocol, Y.Employee)
    return protocol(fullName='%s %s. %s' % (
        person.firstName, person.middleName[0], person.lastName),
        # ... plus whatever other Employee fields need filling in
        )

although the canonical approach would be to make a wrapper:

class adapt_Person_to_Employee(object):
    def __init__(self, person, protocol, alternate):
        assert issubclass(protocol, Y.Employee)
        self.p = person
    def getFullName(self):
        return '%s %s. %s' % (
            self.p.firstName, self.p.middleName[0], self.p.lastName)
    fullName = property(getFullName)

which would be more general (work fine even for a mutable Person).
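
and, just to complete the picture, usage would then be roughly as follows -- assuming the strawman register_adapter_factory spelling sketched earlier, and the purely hypothetical X and Y frameworks of this example:

# Hypothetical usage: X, Y, adapt and register_adapter_factory are the
# strawman/illustrative names from this discussion, not a finalized API.
register_adapter_factory(X.Person, Y.Employee, adapt_Person_to_Employee)

p = X.Person(firstName='Jane', middleName='Quinn', lastName='Doe')  # etc.
e = adapt(p, Y.Employee)
assert e.fullName == 'Jane Q. Doe'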

So, can you please explain your objections to what I said about adapting vs casting in terms of this example? Do you think the example, or some variation thereof, should go in the PEP?


Reference Implementation and Test Cases

    The following reference implementation does not deal with classic
    classes: it considers only new-style classes.  If classic classes
    need to be supported, the additions should be pretty clear, though
    a bit messy (x.__class__ vs type(x), getting bound methods directly
    from the object rather than from the type, and so on).

Please base a reference implementation off of either Zope or PyProtocols' field-tested implementations which deal correctly with __class__ vs. type(), and can detect whether they're calling a __conform__ or __adapt__ at the wrong metaclass level, etc. Then, if there is a reasonable use case for LiskovViolation and the new type checking rules that justifies adding them, let's do so.

I think that if a PEP includes a reference implementation, it should be self-contained rather than require some other huge package. If you can critique specific problems in the reference implementation, I'll be very grateful and eager to correct them.


    Transitivity of adaptation is in fact somewhat controversial, as
    is the relationship (if any) between adaptation and inheritance.

The issue is simply this: what is substitutability? If you say that interface B is substitutable for A, and C is substitutable for B, then C *must* be substitutable for A, or we have inadequately defined "substitutability".

Not necessarily, depending on the pragmatics involved.

If adaptation is intended to denote substitutability, then there can be absolutely no question that it is transitive, or else it is not possible to have any meaning for interface inheritance!

If interface inheritance is intended to express ensured substitutability (finessing pragmatics), fine. I'm not willing to commit to that meaning in the PEP.


Dinnertime -- I'd better send this already-long answer, and deal with the highly controversial remaining issues later.


Thanks, BTW, for your highly detailed feedback.


Alex
