yeah.

at least in my experience, prototype systems also tend to lead to much simpler implementations than class/instance systems (I have implemented both kinds of system).

originally, and for a fairly long time, I had been using prototype systems (with a design essentially based on the Self spec, rather than on the JavaScript variant, which seemed to me a degraded form).

then, at one point, I ended up implementing an object system with its API largely derived from the JNI spec (at the time I was attempting a Java implementation, so it made sense to emulate JNI, JVMTI, and similar). the internals of this thing were much larger and much more complex, although the external API is the most complex part: a lot of it is a big, complex external API wrapping somewhat simpler internals, with most of the API calls simply forwarding to other functions.

actually, the C/I system is also capable of handling P-OO, but it is more awkward and has higher overhead than the other system (and uses the C/I system's object type, rather than the P-OO system's object type).

I guess some of this is my punishment for emulating Sun's API design, with its tendency to special-case pretty much every imaginable API call, grr... (to see this special-casing in action, one can note Java's APIs, which have around 30 classes dedicated to file IO alone). (the .NET framework gets by well enough with only a fraction as many classes for most of this stuff).


technically, both systems share some amount of underlying machinery, but are not strictly compatible (the P-OO objects don't work with the C/I API, though the C/I objects can be used from the P-OO API). there is, however, a partial split of the C/I API which can also use the P-OO system's objects (while retaining more of the "look and feel" of the C/I API).

both systems also use a big hash table to resolve requests, although with C/I objects this is not used in the simple single-inheritance case; hash lookups are used for interfaces and to emulate multiple inheritance (although the system can't fully emulate C++ MI semantics, nor those of other MI languages).


I have had some thoughts before about the possibility of "unifying" the systems (having a single "object" type which works fairly well for both, with a consistent API and semantics), but have not done so as of yet. this would likely also imply some amount of API simplification, and maybe dropping some features which were never really used (such as MI and "structs"). the combined API should address both statically and dynamically typed usage (the C/I system favors static types, and the P-OO system uses dynamic types).

however, I don't know if/when I would do any of this (not a terribly high priority at the moment...).


they also compete with another type of "object" system (also in use in my case): using plain C structs, and then using reflection facilities to work with them (although this strategy doesn't have a clean API as of yet).

this strategy is mostly used when working with, or interfacing with, C code.


but, yeah, it is all a bit of a mess...


----- Original Message ----- From: "Steve Dekorte" <[email protected]>
To: "Fundamentals of New Computing" <[email protected]>
Sent: Saturday, July 10, 2010 3:22 AM
Subject: Re: [fonc] goals



On 2010-07-10, at 12:25 AM, Hans-Martin Mosner wrote:
For quite some time I've been pondering the duality of the class/instance and method/context relations. In some sense, a context is an object created by instantiating its method, much like a normal object is instantiated from its class...


Self does just that:

http://labs.oracle.com/self/language.html

Io (following Self's example) does as well. In this recent video:

http://www.infoq.com/interviews/johnson-armstrong-oop

Ralph Johnson talks about how long it takes for computing culture to absorb new ideas (in his example, things like OO, garbage collection and dynamic message passing) despite them being obvious next steps in retrospect. I think prototypes could also be an example of this.

It seems as if each computing culture fails to establish a measure for its own goals, which leaves it with no means of critically analyzing its assumptions, resulting in the technical equivalent of religious dogma. From this perspective, new technical cultures are more like religious reform movements than new scientific theories, which are measured by agreement with experiment. e.g. had the Smalltalk community said "it's better if it can reduce the overall code >X without a performance cost >Y", perhaps prototypes would have been adopted long ago.

- Steve
_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc


