On 10/8/10 1:51 PM, Waldemar Kornewald wrote:
On Fri, Oct 8, 2010 at 5:20 PM, Paul D. Fernhout
<[email protected]>  wrote:
The PataPata project (by me) attempted to bring some ideas for Squeak and
Self to Python about five years ago. A post mortem critique on it from four
years ago:
  "PataPata critique: the good, the bad, the ugly"
  http://patapata.sourceforge.net/critique.html

In that critique you basically say that prototypes *maybe* aren't
better than classes, after all. On the other hand, it seems like most
problems with prototypes weren't related to prototypes per se, but the
(ugly?) implementation in Jython which isn't a real prototype-based
language. So, did you have a fundamental problem with prototypes or
was it more about your particular implementation?

Waldemar-

Thanks for the comments.

A main practical concern was managing complexity and documenting intent in a prototype system, especially in a rough-edged environment that mixed a couple of layers (leading to confusing error messages).

Still, to some extent, saying "complexity is a problem" is kind of like blaming disease on "bad vapors". We need to be specific about things to do, like suggesting that you usually have to name things before you can share them and talking about naming systems or ways of reconciling naming conflicts, which is the equivalent of understanding that there is some relation between many diseases and bacteria, even if that does not tell you exactly what you should be doing about bacteria (given that people could not survive without them).

I think, despite what I said, the success of JavaScript does show that prototypes can do the job -- even though, as Alan Kay said, what is most important about the magic of Smalltalk is message passing, not objects (or classes, or for that matter prototypes). I think how prototype languages work in practice, without message passing, and with the confusion of communicating with them through either slots or functions, is problematical; so I'd still rather be working in Smalltalk, which I think ultimately has a paradigm that can scale better than either imperative or functional languages.

But, the big picture issue I wanted to raise isn't about prototypes. It is about more general issues -- like how do we have general tools that let us look at all sorts of computing abstractions?

In biology, while it's true there are now several different types of microscopes (optical, electron, STM, etc.), in general we don't have a special microscope developed for every different type of organism we want to look at -- which is the case now with, say, debuggers and debugging processes.

So, what would the general tools look like that would let us debug anything? And I'd suggest that would not be "gdb", as useful as that might be.

I can usefully point the same microscope at a feather, a rock, a leaf, and pond water. So, why can't I point the same debugger at a Smalltalk image, a web page with JavaScript served by a Python CGI script, a VirtualBox emulated Debian installation, and a semantic web trying to understand a "jobless recovery"?

I know that may sound ludicrous, but that's my point. :-)

But when you think about it, there might be a lot of similarities at some level in thinking about those four things in terms of displaying information, moving between conceptual levels, maintaining to-do lists, doing experiments, recording results, communicating progress, looking at dependencies, reasoning about complex topics, and so on. But right now, I can't point one debugger at all those things, and even suggesting that we could sounds absurd. Of course, most things that sound absurd really are absurd, but still: "If at first the idea is not absurd, then there is no hope for it" (Albert Einstein).

In March, John Zabroski wrote: "I am going to take a break from the previous thread of discussion. Instead, it seems like most people need a tutorial in how to think BIG."

And that's what I'm trying to do here. There are billions of computers out there running JavaScript, HTML, and CSS (and some other stuff, powered by CGI stuff). How can we think big about that overall global message passing system? Billions of computers connected closely to humans, supported with millions of customized tiny applications (each a web page) interacting as a global dynamic semantic space, are just going to be more interesting than a few thousand computers running some fancy new kernel with some fancy new programming language. But shouldn't "new computing" take this reality into account somehow?

I've also got eight cores on my desktop, most of them idle most of the time, and I have electric heat so most of the year it does not cost me anything to run them. So, raw performance is not so important as it used to be. What is important are the conceptual abstractions as well as the practical connection with what people are willing to easily try.

Now, people (like Jim Spohrer) made the same argument to me thirteen years ago about Java, and I would not believe it. :-) I think I was right then to be skeptical and to say VisualWorks etc. was better, even if I probably would have been better off jumping on that bandwagon back then and helping it grow better. I'm sort of saying, don't make the same mistake I made, but this time about JavaScript and the dynamic semantic web.

(Of course, I also have my own twist on that -- thinking in terms of exchanging transactions of triples, but one could just get really far with mainstream technologies.)
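To give a rough, hypothetical flavor of what I mean by exchanging transactions of triples (all the names here are made up purely for illustration, not from any actual system):

```python
# Hypothetical sketch: a store of (subject, relation, object) triples
# that is only ever changed by applying whole transactions, so two
# systems can stay in sync by exchanging their transaction logs.

class TripleStore:
    def __init__(self):
        self.triples = set()
        self.log = []  # every transaction ever applied, in order

    def apply(self, transaction):
        """A transaction is a list of ('add'|'remove', triple) pairs."""
        for op, triple in transaction:
            if op == 'add':
                self.triples.add(triple)
            elif op == 'remove':
                self.triples.discard(triple)
        self.log.append(transaction)

    def sync_from(self, other, start=0):
        """Replay another store's transactions we have not seen yet."""
        for transaction in other.log[start:]:
            self.apply(transaction)

a = TripleStore()
a.apply([('add', ('alice', 'knows', 'bob')),
         ('add', ('bob', 'worksOn', 'fonc'))])

b = TripleStore()
b.sync_from(a)
print(('bob', 'worksOn', 'fonc') in b.triples)  # True
```

The point is just the shape of the idea: the unit of exchange is the transaction, not the individual fact.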

For example, I think it is terrific that OMeta has a JavaScript version:
  http://tinlizzie.org/ometa/
And also useable from the web:
  http://www.tinlizzie.org/ometa-js/
That is a great way to make the ideas accessible to lots of people.

Again though, these issues are deeper than just JavaScript. JavaScript is maybe an example of a point I posted to the edusig list -- that what is of great interest now is interoperation with existing ecosystems of systems and layers already out there, even if the system closest to you and displaying your GUI is dynamic and very self-hosting.

I am wondering if there is some value in reviving the idea for JavaScript?

Firebug shows what is possible as a sort of computing microscope for
JavaScript and HTML and CSS. Sencha Ext Designer shows what is possible as
far as interactive GUI design.

What exactly does JavaScript give you that you don't get with Python?

If you want to have prototypes then JavaScript is probably the worst
language you can pick. You can't specify multiple delegates and you
can't change the delegates at runtime (unless your browser supports
__proto__, but even then you can only have one delegate). Also, as a
language JavaScript is just not as powerful as Python. If all you want
is a prototypes implementation that doesn't require modifications to
the interpreter then you can get that with Python, too (*without*
JavaScript's delegation limitations).

Well, neither Python nor JavaScript focuses on message passing, so we could do better than either IMHO (and have, with, say, Smalltalk decades ago). And as I wrote four years ago about PataPata, adding a layer of machinery to do more things to an existing language that is from a very different paradigm is problematical. You probably end up having to build a whole new system if you want great error messages. And user communities tend to be self-selecting, both for some ideas they do like and some ideas they don't like. People generally haven't been drawn to Python in the past for doing experimental stuff with languages the way Lisp hackers might be. And Smalltalkers can do stuff with the language (defining new conditionals) without even thinking much about it, just to show the power of that set of ideas. :-)
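As a rough illustration of the point quoted above -- that prototypes with multiple, runtime-changeable delegates can be had in plain Python without modifying the interpreter -- here is a minimal sketch (all names are mine), with a doesNotUnderstand-style hook thrown in for a bit of the message-passing flavor:

```python
# Rough sketch (my own naming): prototype objects in plain Python,
# with a list of delegates that can be changed at runtime, plus a
# doesNotUnderstand-style hook so sends go through one bottleneck.

class Prototype:
    def __init__(self, *delegates, **slots):
        self.delegates = list(delegates)  # multiple; changeable at runtime
        self.slots = dict(slots)

    def lookup(self, name):
        if name in self.slots:
            return self.slots[name]
        for delegate in self.delegates:   # search delegates in order
            try:
                return delegate.lookup(name)
            except KeyError:
                pass
        raise KeyError(name)

    def send(self, selector, *args):
        try:
            method = self.lookup(selector)
        except KeyError:
            return self.does_not_understand(selector, args)
        return method(self, *args)

    def does_not_understand(self, selector, args):
        raise AttributeError("message not understood: %s" % selector)

animal = Prototype(describe=lambda self: "some kind of " + self.lookup('kind'))
pet = Prototype(named=lambda self: "a pet")
dog = Prototype(animal, pet, kind="dog")
print(dog.send('describe'))  # some kind of dog

dog.delegates.remove(pet)    # delegates really can change at runtime
```

No claim this is pretty -- just that the delegation limitations are not inherent, which is the point being made above.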

Still, culturally, JavaScript seems to be winning a certain battle for ubiquity and mindshare (maybe unfairly, but that's the way it is right now). And it is poised to do even more as the web momentum just grows and grows. Even the fact that JavaScript has hardly any standard libraries is working to its advantage in some places:
  "Javascript meet Gnome, Gnome meet Javascript"
  http://www.grillbar.org/wordpress/?p=307
Whereas anyone who uses Python expects the Python standard libraries, which as likely as not conflict with whatever Python is embedded in.

Still, ideally, one wants tools that can work at a higher layer of abstraction. JavaScript is now being used as the universal ubiquitous virtual machine. It is a really stupid Virtual Machine design (compared to, say, VisualWorks's VM, Strongtalk's, Forth's, or even now Java's JVM), but JavaScript as a VM is free and everywhere. Go to the Google home page today and you can see an animation for what would be John Lennon's 70th birthday, and it is in JavaScript:
  http://www.google.com/
  "John Lennon Google Doodle"
  http://www.youtube.com/watch?v=TYHCeUfoAnw

It's totally stupid to use JavaScript as a VM for "world peace" since it would be a lot better if every web page ran in its own well-designed VM and you could create content that just compiled to the VM, and the VMs had some sensible and secure way to talk to each other and respect each other's security zones in an intrinsically and mutually secure way. :-)
  "Stating the 'bleeding' obvious (security is cultural)"
  http://groups.google.com/group/diaspora-dev/msg/17cf35b6ca8aeb00

But, short of the major browser developers and all the other web developers out there agreeing to use a common VM, people end up using things like the Google Web Toolkit to generate JavaScript which is a bit like Smalltalk compiling methods to VM bytecodes (but much less efficient). And then they build junky privacy disrespecting stuff on top of that, too. :-) And Microsoft and Adobe are going to sabotage that (and essentially have in the past), because economic competition is often really harmful in regards to standards as people try to make money by locking people into their own proprietary stuff. See my section here on competition:
  http://knol.google.com/k/paul-d-fernhout/beyond-a-jobless-recovery

So, essentially I agree with your point, and even John Zabroski's point in reply about "Why are we stuck with such poor architecture?". Yes, JavaScript in a web browser is terrible. The problem is, everything else is worse in many real world situations.

As someone whose first real computer was a KIM-1 with 1K of memory, I find the entire notion of using JavaScript as a VM, on top of endless layers of other crud, very unaesthetic and wasteful. But at what point do I stop fighting a stupid reality and work with it and transform it into at least the best that it could be? :-) That's what Dan Ingalls did with the Lively Kernel (even if my points here are a little different). And I could also hope that, maybe, someday, we could throw away all the intermediate layers if they are not doing much of interest anymore, so we could do more with less (especially when we reach the point as a society where people no longer find an economic incentive to be middlemen and to make it hard to get everyone to agree, which keeps things stuck in inefficient paradigms). And it's great to have ideas like those in OMeta that might support that eventual collapse in a good way, by just shifting the output target to a different VM (although even then there are semantic issues).

Consider that even FONC-related compiler stuff written in JavaScript (or compiled to JavaScript) will have an immediate audience, where a web browser just manages downloading all dependencies for someone, whereas (practically) no one is going to install a compiler and dependencies, run a desktop application, track new versions of something, all the time running security risks from trojan code, etc. Now, by "practically no one", sure, there may be thousands of people (including many on this mailing list) who are interested in that, but overall, the social momentum is the web, and now the emerging semantic web, and JavaScript. Again, I'm not saying it is fair. I fought against Java for years because I knew ObjectWorks was just so much better -- but ten years later, the JVM is mostly as good as ObjectWorks was around 1996, when I had to spend $9K or so on VisualWorks+Envy+EnvyServer, but now the JVM is free (as in freedom) and so are DVCSes (not as good as ENVY in some ways).

That's the beauty of OMeta in that sense, or of anything else that takes that approach of having an accessible web application that can run in the browser, and even further, stuff that may be retargetable (although you still need to deal with a potential mismatch of abstractions).

Still, one can ask -- how can that all be taken even further? OMeta working as an Ajax application using Dojo or Google Closure or whatever for the GUI? I'm glad to see that JavaScript together with Cola has been worked on. So, I'm not saying things are going wrong here. I think all that stuff is great, and very impressive, and wonderful progress. I'm learning from the examples (like Lively Kernel). I can just wonder about the next level -- the issue of what is this all for? And to me, that is the issue of a dynamic social semantic web (as well as discussions about simulations). So, this is, if anything, perhaps kind of like a plea to point FONC at the Semantic Web? :-) "Semantic" is only mentioned in the 2009 NSF report in a programming context, with no mention of the Semantic Web.
  http://www.vpri.org/pdf/tr2009016_steps09.pdf
(I liked the "telescope" mention in there, btw, on page 14, in a diagram about DBjr that is beginning to get at some of these points. :-) But I won't claim to be up on all VPRI is doing. FONC may be going more in those directions than I know.

Sure, we could argue about low level details, whether at the level of Python/JavaScript or below. For example, if I were implementing a VM, I'd like to, say, try implementing (in C) a prototype-ish message passing system reminiscent of Forth: every object would have, as the first value at its address, an index into a lookup table (where many objects might share the same table as a sort of common "class"), and all messages with the same selector name would map to the same lookup index (even if there were a lot of blank items mapped to "doesNotUnderstand", maybe with some defaulting on that). That would make for a really fast system -- no method cache or whatever, just:
  "jump (*thePrototype)[selector]"
Granted, this would require a bunch of memory for lookup (since, if there were, say, 10,000 selectors needed for the current VM instance and one hundred classes, that would be a million lookup slots, or 4MB on a 32-bit system). And for that matter, I'd get Intel or someone else to build hardware acceleration for this, if I am dreaming about runtime efficiency, as well as hardware for processing lists of virtual instructions (or do it on an FPGA). But, that's not what we got, and even if I implemented it, probably nobody is going to use it, and Intel probably isn't going to support it anyway because it would mean simpler processors could do more (so, on the face of it, fewer upgrades, even if it might empower entire new classes of applications). (Such a system might have the equivalent of class hierarchies or even multiple inheritance, but that would probably be managed by tools more than infrastructure.)
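Just to make the shape of that dispatch scheme concrete, here it is modeled in Python -- purely a sketch with made-up names; in the real thing the tables would be C arrays of code addresses and the send would be that single indexed jump:

```python
# Model of the scheme sketched above: every selector name gets one
# fixed global index, each "class" is one lookup table (unused slots
# default to doesNotUnderstand), and an object's first word says which
# table it uses -- so a send is a single indexed fetch and call.

SELECTORS = {}  # selector name -> global index, assigned on first use

def selector_index(name):
    return SELECTORS.setdefault(name, len(SELECTORS))

def does_not_understand(obj, *args):
    raise RuntimeError("doesNotUnderstand")

def make_table(methods, size=16):
    """Build one lookup table shared by all objects of this 'class'."""
    table = [does_not_understand] * size
    for name, fn in methods.items():
        table[selector_index(name)] = fn
    return table

POINT_TABLE = make_table({
    'x':   lambda obj: obj[1],
    'y':   lambda obj: obj[2],
    'sum': lambda obj: obj[1] + obj[2],
})

def send(obj, selector, *args):
    table = obj[0]  # first word of the object: its lookup table
    return table[selector_index(selector)](obj, *args)  # the one "jump"

p = [POINT_TABLE, 3, 4]  # object layout: table first, then fields
print(send(p, 'sum'))    # 7
```

The memory cost mentioned above falls out directly: the table length is the global selector count, repeated once per "class".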

In any case, whatever I or others can dream for greater code speed or smaller memory footprint, we're left on a practical basis with JavaScript as the ubiquitous VM (made barely acceptable speedwise by thirty years of Moore's Law, giving us what ObjectWorks could do when I first got it around 1988). Still, it could be worse -- the VM could be the old version of BASIC Dan Ingalls used to write an early Smalltalk implementation on in the 1970s. :-) So, we should be thankful for our blessings. :-)

Again though, the implementation details, while important, distract from the big picture of a common abstraction for looking at computer processes and structured data (including the semantic web). I'm not fully sure what I mean by that, but that's why it's research about the future of computing. :-) Anyway, I'm just suggesting VPRI and anyone else interested in FONC could look more into the Semantic Web as an application domain at which to point all the great tools.

Of course, I suggested the same to the Diaspora (Facebook-clone) people, advice that will probably be ignored there: :-)
  "Raising the bar to supporting a Social Semantic Desktop"

http://groups.google.com/group/diaspora-dev/browse_thread/thread/4cd369bdf16a346f

So, I'm just repeating the same thing here, perhaps. :-)

The key point being:
http://sourceforge.net/mailarchive/forum.php?thread_name=4CAF294B.8010101%40kurtz-fernhout.com&forum_name=patapata-discuss
"PataPata's powerful ideas go beyond Smalltalk's in part from adding
collective support. These include:
* The most important components in a computing system are *both* the
individual and the group of human users.
* Programming should be a natural extension of *both* thinking and
communicating.
* Programming should be a dynamic, evolutionary process consistent with the
model of human learning activity for *both* individuals and groups.
* A computing environment is both languages and productivity enhancing
interfaces of programmer/user *and* group "power tools" -- utilities to
express yourself in those languages and to organize and flexibly use both
procedural and factual knowledge created by yourself or others."

So, I'm also getting at the limits of a "Personal Computing" paradigm in that point (which was expanded from something about an early Smalltalk implementation to include social aspects). Dynabook as a concept saw the local, but what is clearer now is the social (including semantic community) aspect of computing. We need both. Either by itself is problematical, as we get either isolated fancy computers or social networks that run on the lowest common denominator. We need some hybrid -- something reminiscent of the blend of meshworks (social networks in this case) and hierarchies (top-down tight personal control over a system in this case), kind of like what Manuel De Landa talks about:
  "Meshworks, Hierarchies, and Interfaces"
  http://www.t0.or.at/delanda/meshwork.htm
"Indeed, one must resist the temptation to make hierarchies into villains and meshworks into heroes, not only because, as I said, they are constantly turning into one another, but because in real life we find only mixtures and hybrids, and the properties of these cannot be established through theory alone but demand concrete experimentation."

There is a bit of irony there in that personal computing about freedom is really about tight control over a system, whereas giving into social computing demands means taking part in a somewhat democratic but anarchistic network. But that's one of the good kinds of irony in this world. :-)

So, maybe we need a "Semantic Smalltalk"? :-) And implemented on top of FONC-type systems? On top of JavaScript and web browsers as a first cut? :-)

Not to say mixing Smalltalk and the semantic web is a new idea; here is something related suggested by someone else from 2005:
  http://morenews.blogspot.com/2005/12/smalltalk-meets-semantic-web.html
"Smalltalk:::OWL-Project: OWL has emerged from the AI/semantic community and tends to be in the open-source community which appears to be a direction for Smalltalk (e.g. Smalltalk Solutions at Linux World). Much of the work to date has been implemented in Python and Ruby which, from a language perspective, is very close to Smalltalk. However, those languages become less appealing if you have ever worked in the IDE's supporting those languages. OWL can provide the Smalltalk community with a "market" that is a good fit for the features of the ST language and supporting IDE's."

And Doug Engelbart talked about this and more decades ago:
  "Augmenting Society's Collective IQ"
  http://www.dougengelbart.org/about/vision-highlights.html

One reason I do like JavaScript though is the notion of making systems that are easy to change by the end user, and more and more people know some JavaScript, and more and more people's time spent with computers is spent with a web browser. So, it seems to me that if we are to pursue Doug Engelbart's dream, having some way it can (at times) rest on top of web browsers and JavaScript seems like a way to make something accessible and changeable to the point where a lot of people could get involved with it.

--Paul Fernhout
http://www.pdfernhout.net/
====
The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those thinking in terms of scarcity.

_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc
