Good points, and I agree with most if not all of them.

Cheers,

Alan

________________________________
From: John Zabroski <johnzabro...@gmail.com>
To: Fundamentals of New Computing <fonc@vpri.org>
Sent: Mon, July 25, 2011 5:17:02 PM
Subject: Re: [fonc] HotDraw's Tool State Machine Editor

On Mon, Jul 25, 2011 at 4:08 AM, Alan Kay <alan.n...@yahoo.com> wrote:

> So when people started talking in the 60s about "POL"s in research (Problem
> Oriented Languages -- what are called DSLs today) this seemed like a very good
> idea to most people (provided that you could get them to be efficient enough).
> This led partly to Ted Steele's idea of an "UNCOL" (Universal Computer Oriented
> Language) which was a relatively low-level target for higher level languages
> whose back-end could be optimized just once for each cpu. Historically, C wound
> up filling this role about 10 years later for people who wanted a universal
> target with an optimizer attached.
>
> Overall, I would say that the biggest difficulties -- in general -- are still
> the result of not knowing how to design each and every level of software well
> enough.


Yet it is well known that source-to-source compilation techniques are not
really good for optimization, as documented in Kennedy and Allen's text on
dependence-based optimizing compilers.  They summarize the big mistakes of
their source-to-source experiments by noting that the semantic information
thrown away from stage to stage, such as structural information, could have
been kept and reused to implement optimizations like explicit vectorization. [1]
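
To make the vectorization point concrete, here is a minimal sketch of my own
(the function and names are illustrative, not taken from the book): a loop
whose structure makes the absence of loop-carried dependences obvious, and
therefore easy to vectorize.  If an earlier source-to-source stage has already
lowered this into gotos and raw pointer arithmetic, a later pass has to
rediscover that structure before it can emit vector code.

    /* Illustrative only: a dependence-free loop a vectorizer can handle.
     * No iteration reads a y[j] written by another iteration, so all n
     * iterations can be issued as wide vector operations.
     */
    void saxpy(int n, float a, const float *x, float *y)
    {
        for (int i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

Nothing here is specific to any one source language; the point is only that
the loop structure is where the dependence information lives.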

Such an obvious idea as the importance of preserving structural information
across compiler passes has been downplayed by "worse is better" software and
monolithic compilers like gcc.  We repeat history with C-as-UNCOL by promoting
"JavaScript as Web assembly".  Most JavaScript VMs have to dynamically
determine invariants for JavaScript code as it executes, JITing some code and
un-JITing it when an invariant breaks.  What I am trying to say is that the
difference between C and UNCOL is that UNCOL was itself meant to be a tool for
defining POLs/DSLs/whatever-you-want-to-call-them; C and JavaScript are just
two examples of languages pressed into that role after the fact.  F#, C# and
VB.NET running on the so-called "Common Language Runtime" demonstrate another.
The joke inside Microsoft is that the CLR has become "the C# runtime", since
the CLR engine team does not bother optimizing code that would be difficult
for C# specifically to generate, and so functional programming compiler
techniques like closure conversion currently have very poor support on the CLR.
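
For readers who have not run into closure conversion, here is a rough sketch
of what the transformation produces, written in C for concreteness (the names
and representation are mine, not anything the CLR actually does): a nested
function that captures a variable becomes a flat function plus an explicit
environment record passed as an extra argument.

    #include <stdio.h>

    /* A converted closure: a flat code pointer plus its captured environment. */
    struct closure {
        int (*code)(struct closure *self, int arg);
        int step;                     /* the captured variable */
    };

    /* Lifted body of the nested function "fun x -> x + step". */
    static int add_step(struct closure *self, int x)
    {
        return x + self->step;
    }

    /* make_adder(step) builds the environment and pairs it with the code. */
    static struct closure make_adder(int step)
    {
        struct closure c = { add_step, step };
        return c;
    }

    int main(void)
    {
        struct closure add3 = make_adder(3);
        printf("%d\n", add3.code(&add3, 4));   /* prints 7 */
        return 0;
    }

Calling through this kind of indirect, environment-carrying representation is
exactly the sort of pattern a runtime tuned for direct C#-style method calls
handles poorly, which is the complaint above.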

Beyond UNCOL, there were very few noteworthy ideas about pluggable compiler
architecture until a few years ago (CoSy is one exception I found [2]).  More
recently, LLVM arrived as a replacement for C in this role, but it is probably
not as general as it could be, even though many are happy with the LLVM IR and
want it to be "the last IR you'll ever need".

I think this perspective augments your own, since I don't see value in
distinguishing JavaScript from "low level" languages that could perform
"faster".  It is not low-level languages that you need; it is higher and
higher level languages that better describe the semantics of your problem
domain, together with a clean mapping to hardware abstractions -- this is Ivan
Sutherland's key software architecture contribution, Alan.  He wrote a paper
that many have nicknamed "The Great Wheel of Reincarnation" to describe the
phenomenon he saw in computer graphics hardware architecture in the '60s and
'70s, where designs would evolve from "specific" to "general". [3]  For some
reason, in other areas of computer science, the wheel does not reincarnate as
quickly as it does for graphics hardware and graphics languages.

The final challenge is preventing things like denial of service by rogue
remote code hogging a graphics card or whatever.  While you mention 2.5D
windowing, most people have a practical wish for 3D abstractions: 99% of the
chip space on GPUs these days is devoted to 3D, and the chip space for 2D
primitives has shrunk exponentially in the last 15 years.

Thoughts?

[1] For those who don't know much about vectorization, it is why MATLAB is a
fairly efficient language for its problem domain, despite being an interpreted
language.
[2] http://www.ace.nl/compiler/cosy.html
[3] http://cva.stanford.edu/classes/cs99s/papers/myer-sutherland-design-of-display-processors.pdf
_______________________________________________
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc
