On 4/7/06, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> Mostly just the API docs:
> http://peak.telecommunity.com/protocol_ref/module-protocols.html

I just finally found this too. PJE's website is really a confusing
maze; the only *source* for PyProtocols I've found so far is a CVS
repository that seems to be telling me not to use it because it's been
converted to SVN. But the SVN repository doesn't seem to have
PyProtocols, only PEAK. :-(

Anyway, the quote at the top of that page also explains why I'm still
grappling for use cases: I don't think of my own use of Python as that
of "an integrator, someone who is connecting components from various
vendors." Or at least, that's not what it feels like. I write plenty
of code that uses multiple 3rd party libraries; for example my current
Google project blends Django templates, wsgiref, a home-grown Perforce
wrapper, a Google-internal database-ish API, and some other minor
Google-specific APIs. But it doesn't *feel* to me like there's a need
to adapt one package's API to that of another; instead, my application
is in control and makes calls into each package as needed. I wonder if
it would feel different to you, Phillip and Alex, and I'm just not
seeing the opportunities for adaptation, or if it really does depend
on the typical application.

> This was the specific page about using a concrete API on the protocols 
> themselves:
> http://peak.telecommunity.com/protocol_ref/protocols-calling.html

Aha! That's probably the earliest description of "adaptation by
calling the protocol/interface object" and exactly the lightbulb idea
in my blog.
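To make that lightbulb concrete: the idea is that the protocol object itself is callable, and calling it *is* the adaptation. The sketch below is invented for illustration — the `Protocol` class and `register_adapter` method are hypothetical stand-ins, not PyProtocols' actual API.

```python
# Hypothetical sketch of "adaptation by calling the protocol object".
# Protocol and register_adapter are invented names, not PyProtocols' API.

class Protocol:
    """An interface object that is itself callable: IFoo(obj) adapts obj."""
    def __init__(self, name):
        self.name = name
        self._adapters = {}  # maps source type -> adapter callable

    def register_adapter(self, source_type, adapter):
        self._adapters[source_type] = adapter

    def __call__(self, obj):
        # Walk the MRO so subclasses pick up adapters registered for bases.
        for cls in type(obj).__mro__:
            if cls in self._adapters:
                return self._adapters[cls](obj)
        raise TypeError("can't adapt %r to %s" % (obj, self.name))

IStringLike = Protocol("IStringLike")
IStringLike.register_adapter(bytes, lambda b: b.decode("ascii"))

assert IStringLike(b"hi") == "hi"   # adaptation by *calling* the protocol
```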

[...]
> Another way of looking at it:
>
>    1. Suppose function A and function B do the roughly the same thing

That's not a very clear way of saying it, but I get it.

>    2. Suppose function A can be overloaded/extended
>
>    3. It would then be straightforward to add additional extensions to
>    function A that delegated certain call signatures to function B (possibly
>    with a wrapper to deal with slight differences in the order of arguments or
>    the type of the return value).
>
> Turning that into *transitivity* is a matter of asking the question, "well,
> what if B can be overloaded as well?". The idea then is that you would pass B
> a callback to be invoked whenever a new signature was registered with B. The
> callback would take the new signature, rearrange it if necessary, and register
> the new version with function A.

Hm, couldn't you do this in A's default method?
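That is, instead of a registration callback, A's fallback could simply try B. Here's a minimal sketch of what that might look like; the tiny `@overloadable` dispatch machinery below is invented for illustration and is not PJE's actual implementation.

```python
# Hypothetical sketch: delegation in A's *default method* rather than
# via a registration callback. The @overloadable machinery is invented.

def overloadable(default):
    registry = {}
    def dispatch(arg):
        # Look up an extension by argument type; fall back to the default.
        impl = registry.get(type(arg), default)
        return impl(arg)
    def register(tp):
        def deco(fn):
            registry[tp] = fn
            return fn
        return deco
    dispatch.register = register
    return dispatch

@overloadable
def function_B(arg):
    raise NotImplementedError(arg)

@function_B.register(int)
def _(arg):
    return arg * 2

@overloadable
def function_A(arg):
    # A's default method: anything A doesn't handle is delegated to B,
    # so later extensions of B are reachable through A with no callback.
    return function_B(arg)

assert function_A(21) == 42
```

The trade-off versus the callback approach is that this delegation happens at call time on every miss, rather than eagerly copying B's registrations into A's table.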

>    @function_B.call_for_new_registrations
>    def extending_B_extends_A_too(signature, extension):
>        # The decorator means this function would be called whenever a new
>        # extension was registered with function B
>        # This particular callback implements the rule "any extension of B
>        # is automatically an extension of A as well"
>        function_A.register(signature)(extension)
>
> (using PJE's terminology here, because I'm not sure of the right word for the
> extending functions when the concept is called "overloading")

I've been using "method" instead of "extension" or "specialization".
Most people already know that a method is a callable, and I've heard a
few others call it that too. OTOH it's quite different from methods
elsewhere in Python so I'm not too keen on it either.

[...]
> There's certain attractive properties to doing it though - if we can get it
> down to the point where the runtime hit is minimal for non-extended functions,
> then it avoids the scenario "if only framework X had made function Y
> extensible, the fixing this would be easy!".

That's not the feeling that occurred to me when I was writing my
Google project (see above). I can think of a few things that were
inflexible. For example, the version of Django I'm using doesn't let
you override how it searches for templates, and I had to monkey-patch
the implementation to allow requesting the template source from an
abstract API that gets it from a zip file, rather than going to the
filesystem directly. But I don't see how adaptation or extensible
functions would have helped. I suppose you could *reframe* it to
benefit from adaptation, by proposing that there should be a "template
source" object, and then all you have to do is (a) replace the
template source with a zip file, and (b) write an adapter from zip
files to the required "template source" protocol. But that's a much
more elaborate refactoring than the simple load-template hook that
would have solved my problem. You could even say that the
load-template hook existed -- my monkey-patching required replacing an
existing two-line function with one I wrote (and arguably the hook
*did* exist, in the form of this single function that acted as a choke
point for the entire template loading system).
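For what it's worth, that monkey-patch amounts to something like the following sketch. The module and function names here are invented stand-ins (loosely modeled on a loader-style hook), not the actual Django internals involved.

```python
# Hypothetical sketch of the monkey-patch described above: replace a
# module-level choke-point function so templates come from a zip file
# instead of the filesystem. All names here are invented.

import zipfile

class template_loader:          # stand-in for the framework's loader module
    @staticmethod
    def load_template_source(name):
        # Original two-line implementation: goes straight to the filesystem.
        with open("/templates/" + name) as f:
            return f.read()

def make_zip_loader(zip_path):
    def load_template_source(name):
        # Replacement hook: read the template out of a zip archive instead.
        with zipfile.ZipFile(zip_path) as zf:
            return zf.read("templates/" + name).decode("utf-8")
    return load_template_source

# The monkey-patch: swap the zip-backed hook in at the choke point.
template_loader.load_template_source = staticmethod(make_zip_loader("site.zip"))
```

The point being that this is one assignment to an existing choke point, versus the "template source" protocol plus adapter that the adaptation framing would require.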

> OTOH, if there's an unavoidable call time cost, then a trivial @extensible or
> @overloadable decorator would do the trick.

Before we worry about the implementation overhead, we need to make
sure that the resulting paradigm shift isn't going to destroy Python
as we know it. We already have well-established and well-understood
machinery for method lookup in a class hierarchy. Are we sure that
replacing all that with overloadable/extensible functions is really a
good idea? While classes are technically mutable, in practice few
classes are modified at run-time. The overloadable functions approach
seems to encourage tinkering with globally defined (and used)
functions. For example, what if one library overloads bool(str) so
that bool("False") == False, while another library assumes the
traditional meaning (non-zero string => True)?
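To spell out that failure mode with a toy example (using an invented overloadable `truth()` function in place of a genuinely overloadable `bool()`):

```python
# Sketch of the conflict described above. The registry-based truth()
# function is a toy stand-in for an overloadable bool().

registry = {}

def truth(x):
    impl = registry.get(type(x))
    return impl(x) if impl else bool(x)

# Library 1 registers a "helpful" rule for strings...
registry[str] = lambda s: s not in ("", "False", "false", "0")

# ...while library 2 still assumes the traditional meaning.
assert truth("False") is False   # library 1's rule now wins globally
assert bool("False") is True     # what library 2 was written to expect
# One registration silently changed behavior for every caller in the process.
```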

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
_______________________________________________
Python-3000 mailing list
Python-3000@python.org
http://mail.python.org/mailman/listinfo/python-3000