On 10/9/10 8:44 PM, John Zabroski wrote:
From experience, most people don't want to
discuss this because they're happy with Good Enough and scared of testing
something better.  They are always male, probably 40'ish, probably have a
wife and two kids.  We're on two different planets, so I understand
different priorities.

Well, by coincidence, that probably comes pretty close to describing me. :-)

But, as much as I agree with the general thrust of your arguments about design issues, including that the network is the computer that needs debugging (and better design), I think there is another aspect of this that relates to Manuel De Landa's point on design and meshworks, hierarchies, and interfaces.

See:
  "Meshwork, Hierarchy, and Interfaces"
  http://www.t0.or.at/delanda/meshwork.htm
"To make things worse, the solution to this is not simply to begin adding meshwork components to the mix. Indeed, one must resist the temptation to make hierarchies into villains and meshworks into heroes, not only because, as I said, they are constantly turning into one another, but because in real life we find only mixtures and hybrids, and the properties of these cannot be established through theory alone but demand concrete experimentation. Certain standardizations [of interfaces], say, of electric outlet designs or of data-structures traveling through the Internet, may actually turn out to promote heterogenization at another level, in terms of the appliances that may be designed around the standard outlet, or of the services that a common data-structure may make possible. On the other hand, the mere presence of increased heterogeneity is no guarantee that a better state for society has been achieved. After all, the territory occupied by former Yugoslavia is more heterogeneous now than it was ten years ago, but the lack of uniformity at one level simply hides an increase of homogeneity at the level of the warring ethnic communities. But even if we managed to promote not only heterogeneity, but diversity articulated into a meshwork, that still would not be a perfect solution. After all, meshworks grow by drift and they may drift to places where we do not want to go. The goal-directedness of hierarchies is the kind of property that we may desire to keep at least for certain institutions. Hence, demonizing centralization and glorifying decentralization as the solution to all our problems would be wrong. An open and experimental attitude towards the question of different hybrids and mixtures is what the complexity of reality itself seems to call for. To paraphrase Deleuze and Guattari, never believe that a meshwork will suffice to save us. {11}"

So, that's where the thrust of your point gets parried, :-) even if I may agree with your technical points about what is going to make good software. And that is even granting, as suggested in my previous note, that a bug in our socio-economic paradigm has led to the adoption of software that is buggy and sub-optimal in all sorts of ways. (I feel that having more people learn about the implications of cheap computing is part of fixing that socio-economic bug related to scarcity thinking in an abundant world.)

Building on Manuel De Landa's point, JavaScript is the ubiquitous standard local VM in our lives, one that potentially allows a diversity of things to be built in a layer above it (like millions of educational web pages about a variety of things). JavaScript/ECMAScript may be a language with various warts, it may be 100X slower than it needs to be as a VM, it may not be message-oriented, it may have all sorts of other issues including a standardization project that seems bent on removing all the reflection that allows people to write great tools for it, and so on, but it is what we have now as a sort of "social hierarchy defined" standard for content that people can interact with using a web browser (or an email system that is extended in it, like Thunderbird). And so people are building a lot of goodness on top of JavaScript, like with Firebug and its extensions:
  http://getfirebug.com/wiki/index.php/Firebug_Extensions
One may rightly say JavaScript has all sorts of issues (as a sort of hastily cobbled together "hierarchy" of common functionality), but all those extensions show what people can do with it in practice (through the power of a social meshwork).

And that squares with a deeper implication of your point that, in Sun's language, the network is the computer, and thus we need to adjust our paradigms (and tools) to deal with that. But whereas you are talking more about the technical side of that, there is a social side as well. Both the technical side and the social side (and especially their interaction in the context of competition and commercial pressures and haste) can lead to unfortunate compromises in various ways. Still, it is what we have right now, and we can think about where that momentum is going and how best to work with it and build on it.

For example, Firebug plus JavaScript/HTML5/CSS is redefining a major niche where Squeak wanted to be (educational computing, especially simulations), and that is not so terrible a thing, even if it could all be a lot better (in theory).
  "HTML5 + JS: The Future of Open Education"
  http://www.olpcnews.com/content/education/html5_js_future_open_education.html
"There are number of good tools today for creating interactive, educational content. These tools are very powerful but have a steep learning curve. Back in January of this year, I postulated a framework called "Karma" that would make it easy as possible for software developers to start creating educational software. Eight months later, I am proud to announce the release of version 0.1 of Karma, codenamed "Osito". ... While I believe that HTML5+JavaScript is the future of interactive, educational content I am not suggesting that Karma will be the best HTML5+JS solution. I certainly hope that others who recognize its shortcomings either join us to make it better or create their own solutions independently."

As Manuel De Landa implies, we need an open and experimental attitude towards meshwork/hierarchy hybrids when talking about the fundamentals of new computing. :-)

I think Manuel De Landa's point helps show how we can all be living on the same planet together in a healthy way. :-) We need to accept that we live in the interplay of meshworks and hierarchies that define interfaces (for message passing) and which keep turning into each other. :-)

Of course, that does not mean we should stop hoping for something more, or for better ways to do things -- better meshworks, better hierarchies, better interfaces, better transitions (where we don't all agree on what "better" means). And it does not mean that we should not try to use appropriate cleverness to simplify messy things that don't have to be messy (like using OMeta or whatever to help us translate things to new abstractions, or having better debuggers that let us debug across the boundaries of different abstractions across a network, and so on).

Anyway, I think Manuel De Landa's point has been fundamental to helping me reconcile myself to the world as it is without losing hope, so I tend to quote it a lot. :-)

Here is another thing about hope, which one can also think of in terms of new computing: :-)
  "The Optimism of Uncertainty" by Howard Zinn
  http://www.commondreams.org/views04/1108-21.htm
"In this awful world where the efforts of caring people [trying to make better computer abstractions and implementations] often pale in comparison to what is done by those who have power, how do I manage to stay involved and seemingly happy? [Or when others have the luck of being in the right place at the right time when economic power is used, like when JavaScript got into Netscape Navigator, or when IBM got behind Java instead of Smalltalk, or when IBM picked Microsoft's DOS instead of an in-house Forth or the third-party amazing QNX for the IBM PC's OS, or when we got the IBM PC instead of the IBM CS-9000 with a real-time unixy OS as the personal computer because various IBM divisions were fighting each other, or how the VM concept and hypervisor concepts at IBM since the 1970s or so was not brought sooner into the personal computing world or programming systems, and when VisualWorks run-time fees led to Java, and so on for probably an endless parade of worse-is-better.] I am totally confident not that the world will get better [through FONC/OMeta/COLA/etc.], but that we should not give up the game before all the cards have been played. The metaphor is deliberate; life is a gamble. Not to play is to foreclose any chance of winning. To play, to act, is to create at least a possibility of changing the world [so that our information systems suck less, like through a dynamic semantic web built on open standards like ECMAScript and semantic triples and the HTML5 canvas]. There is a tendency to think that what we see in the present moment [like IE6] will continue. We forget how often we have been astonished by the sudden crumbling of institutions, by extraordinary changes in people's thoughts [and message passings], by unexpected eruptions of rebellion against tyrannies [like a standard flexible prototype-oriented language ruling the web browser that can support Lively Kernel instead of something proprietary like Microsoft Windows running complied source-inaccessible code], by the quick collapse of systems of power that seemed invincible [like the proprietary JVM going open source]. What leaps out from the history of the past hundred years is its utter unpredictability [unless you are inventive like Alan Kay or Abraham Lincoln :-) http://www.quoteworld.org/quotes/10258 ]. This confounds us, because we are talking about exactly the period when human beings became so ingenious technologically that they could plan and predict [using a cool and hip 1960s computer with spiffy punched cards] the exact time of someone landing on the moon, or walk down the street talking to someone halfway around the earth [on their SmartPhone or OLPC running JavaScript and/or Squeak and/or something FONC-ish]."

Anyway, so I do have hope that we may be able to develop platforms that let us work at a higher level of abstraction, like for programming or semantic web knowledge representation, but we should still accept that (de facto social) standards like JavaScript and so on have a role to play in all that, and we need to craft our tools and abstractions with such things in mind (even if we might in some cases just use them as a VM until we have something better). That has always been the power of, say, the Lisp paradigm, even as Smalltalk's message passing paradigm has a lot going for it as a unifying and probably more scalable abstraction. What can be frustrating is when our "bosses" say "write in Fortran" instead of saying "write on top of Fortran", same as if they said "write in assembler" instead of "write on top of assembler or JavaScript or whatever". I think the work VPRI is doing through COLA to get the best of both worlds there is a wonderful aspiration, kind of like trying to understand the particle/wave duality mystery in physics (which, it turns out, is potentially explainable by a many-worlds hypothesis, btw). But it might help to have better tools for that -- tools that link somehow with the semantic web and social computing and so on.

But even with better paradigms and better tools built with and for them, the question is, what do we point them at? I think it is terrific, say, that VPRI has worked on "JOHN":
  "JOHN - A Knowledge Representation Language"
  http://tinlizzie.org/~hesam/pmwiki/pmwiki.php
but I think it makes a lot of sense to continue to build in that direction (as far as aspirations to have FONC stuff working with knowledge representation) and to integrate those ideas more with the Semantic Web somehow. JOHN, in the sense there of "... a programming language to build and reason about microworlds ...", is perhaps the beginning of a microscope/telescope to use with the Semantic Web as well as many other sorts of systems (even as it probably lacks a social component at the moment). Although I still might ask whether we really need another programming language (why not just use Smalltalk syntax to define microworlds)?
  http://www.cs.ucla.edu/~hesam/john/samples/map.jmc
Maybe instead we can use the languages we have (including things like RDF-like triples) in new ways -- where, say, an abstraction that represents JOHN might "compile" itself to, say, JavaScript or whatever it needed/wanted/was-asked-to at the moment. It might not even have a preferred text format but just exist as an ocean of messages (to maybe borrow a poetic metaphor from Michael's Ocean project?) where it writes parts of itself out in various formats as needed.
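
To make that a bit more concrete, here is a minimal sketch in plain JavaScript (all the names here are invented for illustration, and are not from JOHN, any RDF library, or Ocean) of a microworld held as a bag of triples with no preferred text format, writing itself out in whatever representation is requested at the moment:

  // Hypothetical sketch: a microworld held as subject/predicate/object
  // triples, serializing itself on demand instead of having one
  // canonical text format.
  function Microworld() {
    this.triples = []; // each entry: [subject, predicate, object]
  }
  Microworld.prototype.add = function (s, p, o) {
    this.triples.push([s, p, o]);
  };
  // "Compile" the same knowledge to whatever format is asked for right now.
  Microworld.prototype.writeAs = function (format) {
    if (format === "ntriples") {
      return this.triples.map(function (t) {
        return "<" + t[0] + "> <" + t[1] + "> \"" + t[2] + "\" .";
      }).join("\n");
    }
    if (format === "javascript") {
      var world = {};
      this.triples.forEach(function (t) {
        world[t[0]] = world[t[0]] || {};
        world[t[0]][t[1]] = t[2];
      });
      return "var world = " + JSON.stringify(world) + ";";
    }
    throw new Error("unknown format: " + format);
  };

  var pond = new Microworld();
  pond.add("pond", "contains", "fish");
  pond.add("fish", "eats", "algae");
  console.log(pond.writeAs("ntriples"));   // one view of the same triples
  console.log(pond.writeAs("javascript")); // another view, made on demand

The point is just that the triples would be the system of record, and any particular textual representation would be a view generated on request.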

Anyway, I'm just trying to break through some crufty conceptual barriers here that are keeping us from moving down a path to really good systems from what we have. The very notion of "programming language" may itself be one of the barriers. In one paradigm, we have messages (and services called objects that process messages, with function calls being a subset of messages), but how that is represented as text is fairly arbitrary.
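
As a toy illustration of that last point (a sketch in JavaScript, with all the names made up here), an ordinary function call can be seen as just the special case of a more general message send that happens to be handled locally and synchronously:

  // Toy sketch: objects as services that process messages. A plain
  // function call is just the special case where the message is
  // understood and handled locally and synchronously.
  function send(receiver, selector, args) {
    var handler = receiver[selector];
    if (typeof handler === "function") {
      return handler.apply(receiver, args); // function call as a subset
    }
    // The same message could instead be queued, forwarded over a network,
    // logged, or handled by a doesNotUnderstand-style fallback.
    throw new Error("message not understood: " + selector);
  }

  var counter = {
    count: 0,
    increment: function () { this.count += 1; return this.count; }
  };
  console.log(send(counter, "increment", [])); // same as counter.increment()

Whether that send is written in Smalltalk keyword syntax, JavaScript dot notation, or as triples on a wire then becomes a surface question rather than a paradigm question.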

The Smalltalk system (especially the image) potentially moved beyond just being text decades ago, but that was not appreciated much then. And even then, a lot of stuff got conflated all together, even as we now see the things Smalltalk-80 had reprised individually in other contexts and combined in different ways. Look at all the things the early Smalltalks had:
* an image of objects
* a virtual machine
* the notion of message passing
* the notion of parallel processing and distributed systems
* the notion of hierarchical classes implemented a certain way (although also, now, prototypes and new implementations)
* Smalltalk the textual language (a great language, but out-of-step with what was taught in schools with infix math operations)
* A class library, especially with streams, collections, and a great math tower
* Tools like the browser and inspector and debugger
* History
* Innovative applications, including "The Analyst" and painting programs
* Interfaces to system services
* GUIs
* Networking
* A certain culture, including "open source" and "educational computing"
* Probably other neat stuff

But, how do we reconcile the value of all that with the fact that JavaScript is now the de facto virtual machine of the world (mostly reached through a web browser, as right as you are that that is a broken paradigm)? How do we avoid losing the goodness we could preserve from all that, given what we have now?

Well, hopefully, we either figure out a way to live with JavaScript's warts (Lively Kernel?), or we work at a higher level of abstraction (OMeta?), or we try to replace JavaScript with something better (COLA?). All those things will no doubt be tried (as they have been, by VPRI and many others, for many years), and eventually there will be some success in terms of reconciling meshwork/hierarchy/interface issues experimentally. :-)

I tried that with PataPata, as far as trying to use Python as a backend, and failed. :-) Though it was an interesting experiment nonetheless that I learned from. I can only wonder what could have been if I had picked JavaScript as a backend then instead of Python (but I dismissed JavaScript as just another slow, inefficient C-syntax scripting system). I missed a big thing there -- PataPata could have been Firebug (and something much more), run in every web browser, and gained a community to extend it.

I learned a lot by seeing Lively Kernel in action and thinking about it, as it shows the power of building one abstraction on top of another (even with the lower-level JavaScript abstraction being slow). I do think Lively Kernel misses something, though, with its continued emphasis on self-hosting. It misses what PataPata aspired to do and what Firebug really does, which is to be able to reach from one world to another and work on the remote world using tools hosted in the local world. That may go against the paradigm of "personal computing" -- to be stuck with the limitations of the remote world (like the DOM or XUL or whatever) and having to work within them. So, that's a paradigm shift that can be hard to get, even for people so good at inventing new paradigms. :-) It's also a frustrating paradigm shift, because you have to accept that "the other" is not going to be transformed to be identical to you, even as you can send messages to it. And it's a paradigm shift that may be hard to make when people spend so much time talking about "personal computing" when really, more and more, what we are experiencing is "social computing" of some sort.

It is a paradigm shift that is really in keeping with the message-sending roots of Smalltalk, because it involves tools that send messages to other programming worlds, and so is a sort of message-oriented programming writ large across the social network. And people would be right to point out there are various implementations, including in Smalltalk, of remote debugging (especially in the embedded systems world) -- I'm just trying to get at a paradigm shift that may be lagging behind what the tools make possible. Anyway, as amazing as Lively Kernel is, I somehow feel it has not made that paradigm shift, and that connects back with your points about the network as the computer and being stuck in programming assuming "single-tier IBM computers from the 1970s/80s". So, maybe I'm proposing that one could have a Lively Kernel that was more social? :-)
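
To gesture at what "more social" might look like in code (a hypothetical sketch; the transport and message names are invented here, not anything from Lively Kernel or Firebug), a tool hosted in the local world could treat a remote world purely as something it sends messages to:

  // Hypothetical sketch: a tool hosted in the local world that works on
  // a remote world by sending it messages, rather than requiring the
  // remote world to host the tool itself. The transport function is
  // assumed to exist (say, a wrapper around XMLHttpRequest or a socket)
  // and is not shown here.
  function RemoteWorld(transport) {
    this.transport = transport; // transport(message, onReply) is assumed
  }
  RemoteWorld.prototype.send = function (selector, args, onReply) {
    // The remote world stays "other": we can message it, not absorb it.
    this.transport({ selector: selector, args: args }, onReply);
  };

  // A Firebug-like local inspector using that remote world:
  var page = new RemoteWorld(someTransport); // someTransport is assumed
  page.send("inspect", ["document.title"], function (reply) {
    console.log("the remote world says:", reply);
  });

The local tools could then be as rich as we like, while the remote world would only have to understand messages.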

Still, social computing does not mean the end of personal computing, in the same way we can be both individual people and part of a bigger community or even a global social movement. So, in practice, we see a balance of the personal computing hierarchy and the social meshwork of networks of existing code and running servers, as well as the personal object meshwork and the socially-imposed hierarchical standards we may adopt as individuals for convenience or necessity (as all real systems are both meshworks and hierarchies that keep turning into each other, as Manuel De Landa suggests).

Anyway, even if only for myself, I'm trying in a rambling way to bring all these fuzzy things into better focus as I rethink whether to do anything more on the PataPata project (like in JavaScript) or to just use some other existing tools. :-)

--Paul Fernhout
http://www.pdfernhout.net/
====
The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those thinking in terms of scarcity.
