In discussing with my wife some of the ideas of my previous post (on "Self" as the source of many ideas in Squeak), a greater clarity has emerged for me on a programming paradigm issue, in large part due to her input, drawing on her own experiences with some people resisting prototype-style approaches in application design and others embracing them.
Our best Physics and Psychology (taken together, as interpreted by me :-) right now suggests the universe is a sea of subatomic particles in relation to each other in complex ways (determining how those relations change in the future). "Objects" (and Platonic "classes" for objects) are fictions our minds use to try to make sense of all that complexity. So too, the notions of "class" and "object" are fictions we programmers invent to get a handle on the raw sea of bits in computer memory as they change over time under the computational rules applied to them. But they are also fictions we try to apply to the real world when we write, say, accounting systems. And they often break down, as illustrated in depth in the late William Kent's book _Data and Reality_. http://www.bkent.net/

We often use the term "Abstraction" to point at this problem (or even to cover it up), but that doesn't capture the whole notion, since that word is often used with the intent that there is one abstraction, as opposed to the notion that the abstraction is just one possibility we chose because it was useful at the moment for specific human goals.

So, as far as computer language paradigms go (and teaching people about them), there are two sets of issues. One relates to modeling the "real world" (because computers are often used to do that) and one relates to getting a handle on "telling the computer what to do" (because you can't do the first without the second). These are somewhat related issues (after all, we are in some sense building models of the real world even when they are about programs running on VMs), though they are still distinct issues in my mind.

In terms of increasing levels of computing machinery, we have something like:

  Machine -> Assembly -> Functions -> Classes -> Prototypes -> Relations

In terms of world modeling done by a human mind about the world, we have somewhat the reverse:

  Classes -> Objects (Prototypes) -> Relations -> Reality

(Kent outlines an idea for a relational programming system called ROSE or STAR, and I have worked towards something similar in my Pointrel ideas, but the common metaphor is an "entity-relational" model of data.) http://en.wikipedia.org/wiki/Entity-relationship_model

There of course may be other levels, or even a branching tree, and one might organize these differently if, say, you build a Smalltalk on top of Self, so this is all a big simplification to get one started thinking about this issue. Simplifying greatly even further, it has been the history of computing to move away from the machine to what are often called "higher levels" of abstraction, while it has been the history of (western) Physics to drill ever deeper into the notion of perceptions of reality, starting from the perception of Platonic ideals (or classes). Integrating perceptions across these levels is not that easy. Forth does it somewhat for computing, though, IMHO, being a tiny language that can easily be extended to support various paradigms.

And here is the cultural divide: when a person is at one level of understanding of all this, it is very hard to convince them of the value or even feasibility of the other levels they may have no experience with. This notion is inspired somewhat by my reading "Future Shock Levels" by Eliezer S. Yudkowsky: http://yudkowsky.net/sing/shocklevels.html ("The use of this measure is that it's hard to introduce anyone to an idea more than one Shock Level above - and Shock Levels measure what you accept calmly, not what you know about.")
But clearly, thinking about levels of understanding is a common teacher-type curriculum design thing to do. And these other levels may (at least initially) impose discouragingly large cognitive and computational burdens to try out.

So, it is just very hard to convince someone used to thinking hierarchically about classes of birds that such distinctions are somewhat arbitrary if you truly understand taxonomic issues. Or further, that the very notion of a specific individual bird is a convenient fiction the mind invented for surviving in and making sense of the universe (instead of modeling the bird and its environment at a horrendously detailed quantum level, plus modeling the mind's goals, if such a thing were even possible).

Or, looking at this issue from a programming point of view, one can implement classes in C, but they seem slower for the standard problems C is used to solve. Or one can implement Morphic and prototypes in Smalltalk, but they feel like a kludgey add-on. Or one can implement relations with prototypes, but they will certainly at first seem inelegant and slow.

[As an aside, I have prototypes of Smalltalk-like VMs in C and Python that use an entity-relational model of triads of data for their cores; inefficient, yes, but elegant IMHO, and it turns out to be easy to reach into them remotely, or to have one simple unified browser to study system state, or to be able to imagine the system running backwards for debugging. A toy sketch of the triad idea appears a few paragraphs below.]

There may also even be personality effects related to feelings about authoritarianism and control, or where one falls on the "meshwork vs. hierarchy" preference landscape. http://www.t0.or.at/delanda/meshwork.htm

For a personal example, having some Dutch roots myself, yet being an outsider to Holland, I can see in that culture a paradox of both pride and success in being a permissive society, while at the same time pride and success in being a society influenced by some fairly strict religious ideals (including compassion) as well as pressures towards conformist behaviors. (This is true of all societies to some extent, but walking around Amsterdam is a very graphic example of this. :-) Or as is said in the intro to "The Portable Henry James" edited by John Auchard (my wife is reading it; I just glanced at it. :-):
http://www.amazon.com/gp/product/0142437670/103-1147318-5179869?v=glance&n=283155
"The divide between a culture of individual freedom and a culture of tradition -- H.G. Wells identified communities of will, and communities of faith and obedience -- is a great divide, and it is a persistent and deep one in Henry James, as it should be, even though many Americans in the novels scarcely seem to notice. In the tales and novels it often is the exact place where things tear asunder, a place of broadly political crisis that stubbornly refuses reconciliation -- at least without the virtual annihilation of one side or the other".

Beyond how this currently affects Dutch politics relating to immigration and integration issues, I think that factor is at play somewhat in Python's evolution too -- as the Python community is, IMHO, one that, with Python's success in gaining broad interest, is currently struggling to reconcile both supporting a desire for prototypes and a desire to move in a strict static typing direction. :-)
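To make that triad idea a bit more concrete, here is a toy sketch in Python of what a store of such three-pointer items, and a couple of operations on it, might look like. This is purely illustrative (the class and method names are invented for this post, not the actual Pointrel code), but it shows how an append-only list of triads can carry both class-style and prototype-style descriptions:

class TriadStore:
    """A toy entity-relational store: each fact is one (P1, P2, P3) triad."""

    def __init__(self):
        self.triads = []  # append-only list, so history is preserved

    def add(self, p1, p2, p3):
        self.triads.append((p1, p2, p3))

    def find(self, p1=None, p2=None, p3=None):
        """Return all triads matching whichever fields are given."""
        return [t for t in self.triads
                if (p1 is None or t[0] == p1)
                and (p2 is None or t[1] == p2)
                and (p3 is None or t[2] == p3)]

    def last_value(self, p1, p2):
        """The most recently asserted P3 for (P1, P2), or None."""
        matches = self.find(p1, p2)
        return matches[-1][2] if matches else None

# Class-like and prototype-like descriptions are both just triads:
store = TriadStore()
store.add("bird1", "is-a", "Bird")       # class-style assertion
store.add("bird1", "wingspan", 0.25)     # a slot on one individual
store.add("bird2", "clone-of", "bird1")  # prototype-style derivation
print(store.last_value("bird1", "wingspan"))   # 0.25

Since nothing is ever overwritten in such a list, one simple browser (or a debugger imagined running backwards) only has to walk that one list.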
It's hard to do all these ideas justice in a few words, and I've probably lost many people trying to follow this by now who firmly believe objects really do exist the way they see them, but suffice it to say, Python right now sits somewhat straddling the boundary between modeling data as classes and as prototypes. Squeak is also at that boundary, less so as a language, but more so as a library and environment, when it tries to patch Morphic prototype widgets on top of Smalltalk classes.

So, in terms of moving PySqueak forward, the question is whether to try to push a little, like Squeak does, or to push even further, on to full prototype systems or even relational systems. Python has better support for prototypes than conventional Smalltalk, since Python already uses a dictionary for instance variables, as opposed to Smalltalk, which uses an array of slots where the slot<->name correspondence is managed by a class. So if Python were to gain something like Morphic widgets, they would probably fit much more nicely into Python than they ever would into Smalltalk. (Ignoring the issue, mentioned previously, of the dominant paradigm being determined by library code.) Still, Python doesn't try to do an entity-relationship model directly.

So, one can ask: is it worth trying to do as Squeak specifically does (Morphic), or to see the higher ideal of Squeak pushing up another level (Classes -> Prototypes), and try to go one better (Prototypes -> Relations)? The Self video shows that once you do have a working prototype-based system, making a Smalltalk with classes is fairly easy, and what is more, it actually runs faster than a typical Smalltalk. So too, perhaps a successful entity-relational language might be able to do prototypes easily and more quickly? In the triad-based systems I work on (essentially a long list of items, each of which is three pointers, P1 P2 P3, plus a few operations on that list),
http://www.kurtz-fernhout.com/pointrel/
prototypes and classes merge naturally as useful approaches to many problems.

But obviously, such a push risks drifting much further from any unifying vision driving interest in something like PySqueak; yet at the same time, it is less "reinventing the wheel" and more "burn the disk packs" and go beyond it, full steam ahead. At least in my own explorations of computation, trying to make systems that are more entity-relational is what I have felt driven towards for two decades, informed by a certain conception of Physics and Psychology.

But the new insight for me is to see these two spectrums (again):

  Machine -> Assembly -> Functions -> Classes -> Prototypes -> Relations

and:

  Classes -> Objects -> Relations -> Reality

and how human personality and cultural and economic pressures help determine individual choices about where to program and think on these spectrums.

It's not an easy thing to wrestle with. Yet, as I learned from one smart person who had studied the patent literature, innovative progress comes from a successful attempt to reconcile seemingly opposing needs at the same time (e.g. cheap and fast), rather than just compromise between them (e.g. cheaper but slower -- compromise being just engineering, or politics, as usual). As far as I am concerned, Squeak was a step backwards from Self in many ways (while forward in others, especially marketing- and adoption-wise in an educational setting), as Self elegantly reconciles the needs for prototypes and classes where Squeak Smalltalk merely compromises between them. Will PySqueak be a compromise or an innovation (or both)? It remains to be seen.
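To make the earlier point about dictionaries and prototypes more concrete, here is a rough sketch of prototype-style objects in Python, leaning on the fact that instance state already lives in a plain dictionary (obj.__dict__). This is just an illustration made up for this post (it is not Morphic, and not a proposal for PySqueak); the idea is that cloning is copying a dictionary, and a "class" is nothing more than a shared parent you delegate to:

class Proto:
    """A toy prototype object: slots live in __dict__, lookups fall back to a parent."""

    def __init__(self, parent=None, **slots):
        self.parent = parent
        self.__dict__.update(slots)

    def __getattr__(self, name):
        # Called only when normal lookup fails; delegate to the parent,
        # roughly the way Self follows parent slots.
        parent = self.__dict__.get("parent")
        if parent is not None:
            return getattr(parent, name)
        raise AttributeError(name)

    def clone(self, **overrides):
        """Copy this object's slots, then override a few of them."""
        child = Proto()
        child.__dict__.update(self.__dict__)
        child.__dict__.update(overrides)
        return child

# A shared parent plays the role a class would play:
bird_traits = Proto(flies=True, legs=2)
tweety = Proto(parent=bird_traits, name="Tweety")
pingu = tweety.clone(name="Pingu", flies=False)   # a one-off exception
print(pingu.legs, pingu.flies)                    # 2 False

In a real system one would have to worry about method slots (functions stored this way are not bound to their receiver), identity, and so on; the point is only that Python's per-instance dictionaries already take you most of the way there.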
Trying to keep this short, but failing. :-)

--Paul Fernhout

_______________________________________________
Edu-sig mailing list
[email protected]
http://mail.python.org/mailman/listinfo/edu-sig
