Re: [fonc] Error trying to compile COLA

2012-03-14 Thread David Barbour
On Mon, Mar 12, 2012 at 10:24 AM, Martin Baldan martino...@gmail.com wrote:

 And that's how you get a huge software stack. Redundancy can be
 avoided in centralized systems, but in distributed systems with
 competing standards that's the normal state. It's not that programmers
 are dumb, it's that they can't agree on pretty much anything, and they
 can't even keep track of each other's ideas because the community is
 so huge.



I've been interested in how to make systems that work together despite
these challenges. A major part of my answer is seeking data model
independence:
http://awelonblue.wordpress.com/2011/06/15/data-model-independence/
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Block-Strings / Heredocs (Re: Magic Ink and Killing Math)

2012-03-14 Thread Loup Vaillant

BGB wrote:

   On 3/13/2012 4:37 PM, Julian Leviston wrote:

I'll take Dave's point that penetration matters, and at the same time,
most new ideas have old idea constituents, so you can easily find
some matter for people stuck in the old methodologies and thinking to
relate to when building your new stuff ;-)



well, it is like using alternate syntax designs (say, not a C-style
curly brace syntax).

one can do so, but is it worth it?
in such a case, the syntax is no longer what most programmers are
familiar or comfortable with, and it is more effort to convert code
to/from the language, ...


Alternate syntaxes are not always as awkward as you seem to think they
are, especially the specialized ones.  The trick is to ask yourself how
you would have written such and such a piece of program if there were no
pesky parser to satisfy.  Or how you would have written a complete spec
in the comments.  Then you write the parser which accepts such input.

My point is, new syntaxes don't always have to be unfamiliar.
For instance:

+---+---+---+---+---+---+---+---+
| 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
+---+---+---+---+---+---+---+---+
|      foo      |      bar      |
+---+---+---+---+---+---+---+---+
|              baz              |
+---+---+---+---+---+---+---+---+

It should be obvious to anyone who has read an RFC (or a STEPS progress
report) that it describes a bit field (16 bits large, with 3 fields).
And those who didn't should have learned this syntax by now.

Now the only question left is, is it worth the trouble _implementing_
the syntax?  Considering that code is more often read than written,
I'd say it often is.  Even if the code that parses the syntax isn't
crystal clear, what the syntax should mean is.
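
To give a feel for how little machinery this takes, here is a rough
sketch in JavaScript (my own illustrative code, not from any project;
it assumes the diagram shape above -- one row of bit numbers followed
by rows of named fields):

  // Parse an RFC-style bit-field diagram into {name, width} field specs.
  // Cell rows start with '|'; the first such row numbers the bits.
  function parseBitField(diagram) {
    var rows = diagram.split('\n')
      .filter(function (line) { return line.charAt(0) === '|'; })
      .map(function (line) { return line.split('|').slice(1, -1); });
    var bits = rows[0];                  // e.g. [' 0 ', ' 1 ', ...]
    var cellWidth = bits[0].length + 1;  // one bit cell plus its '|'
    var fields = [];
    rows.slice(1).forEach(function (row) {
      row.forEach(function (cell) {
        fields.push({ name: cell.trim(),
                      width: Math.round((cell.length + 1) / cellWidth) });
      });
    });
    return fields;
  }

Feeding it the diagram above would yield foo/bar/baz with widths 4, 4,
and 8 -- the 16 bits the prose promises.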

You could also play the human compiler: use the better syntax in the
comments, and implement a translation of it in code just below.  But
then you have to manually make sure they are synchronized.  Comments
are good.  Needing them is bad.

Loup.
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Error trying to compile COLA

2012-03-14 Thread Robin Barooah

On Mar 14, 2012, at 2:22 AM, Max Orhai wrote:

 But, that's exactly the cause for concern! Aside from the fact of Smalltalk's 
 obsolescence (which isn't really the point), the Squeak plugin could never be 
 approved by a 'responsible' sysadmin, because it can run arbitrary user code! 
 Squeak's not in the app store for exactly that reason. You'll notice how 
 crippled the allowed 'programming apps' are. This is simple strong-arm bully 
tactics on the part of Apple; technical problems "solved" by heavy-handed 
 legal means. Make no mistake, the iPad is the anti-Dynabook.

To my mind writing Apple's solution off as 'strong-arm bully tactics' obscures 
very real issues. Code expresses human intentions. Not all humans have good 
intentions, and so not all code is well intentioned.  Work at HP labs in the 
90's showed that it's impossible, even if you have full control of the virtual 
machine and can freeze and inspect memory, to mechanically prove with certainty 
that a random software agent is benign. So when code is exchanged publicly, 
provenance becomes important.

Apple's solution is as much technical as it is legal.  They use code signing to 
control the provenance of code that is allowed to execute, and yes, they have a 
quasi-legal apparatus for determining what code gets signed.  As it stands, 
they have established themselves as the sole arbiter of provenance.
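
To make "arbiter of provenance" concrete in code: the mechanical core is
just a signature check before anything is allowed to run. A minimal
sketch (nothing like Apple's real pipeline; the names are invented, and
it assumes Node.js's built-in crypto module and an RSA key pair):

  var crypto = require('crypto');

  // Run `code` only if `signature` proves it was blessed by the arbiter
  // whose public key we trust.  Otherwise refuse.
  function runIfSigned(code, signature, arbiterPublicKey) {
    var check = crypto.createVerify('SHA256');
    check.update(code);
    if (!check.verify(arbiterPublicKey, signature, 'base64')) {
      throw new Error('no acceptable provenance; refusing to run');
    }
    return new Function(code)();   // provenance established -- execute
  }

The policy question -- who holds the signing key -- is exactly the part
Apple has reserved for itself.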

I think one can easily argue that, as the first mover, they have set things up 
to greatly advantage themselves as a commercial entity (as they do in other 
areas, like the supply chain), and that it would be generally better if there 
were freedom about whom to trust as the arbiter of provenance. However, I don't 
see a future in which non-trivial unsigned code is generally exchanged.  This 
is the beginning of a necessary trend.  I'd love to hear how I'm wrong about 
this.

My suspicion is that, for the most part, Apple's current setup is as locked 
down as it's ever going to be, and that over time the signing system will be 
extended to allow more fine-grained human relationships to be expressed.  

For example at the moment, as an iOS developer, I can allow different apps that 
I write to access the same shared data via iCloud.  That makes sense because I 
am solely responsible for making sure that the apps share a common 
understanding of the meaning of the data, and Apple's APIs permit multiple 
independent processes to coordinate access to the same file.  

I am curious to see how Apple plans to make it possible for different 
developers to share data.  Will this be done by a network of cryptographic 
permissions between apps?

 
 -- Max
 
 On Tue, Mar 13, 2012 at 9:28 AM, Mack m...@mackenzieresearch.com wrote:
 For better or worse, both Apple and Microsoft (via Windows 8) are attempting 
 to rectify this via the Terms and Conditions route.
 
 It's been announced that both Windows 8 and OSX Mountain Lion will require 
 applications to be installed via download through their respective App Stores 
 in order to obtain certification required for the OS to allow them access to 
 features (like an installed camera, or the network) that are outside the 
 default application sandbox.  
 
 The acceptance of the App Store model for the iPhone/iPad has persuaded them 
 that this will be (commercially) viable as a model for general public 
 distribution of trustable software.
 
 In that world, the Squeak plugin could be certified as safe to download in a 
 way that System Admins might believe.
 
 
 On Feb 29, 2012, at 3:09 PM, Alan Kay wrote:
 
 Windows (especially) is so porous that SysAdmins (especially in school 
 districts) will not allow teachers to download .exe files. This wipes out 
 the Squeak plugin that provides all the functionality.
 
 But there is still the browser and Javascript. But Javascript isn't fast 
 enough to do the particle system. But why can't we just download the 
 particle system and run it in a safe address space? The browser people don't 
 yet understand that this is what they should have allowed in the first 
 place. So right now there is only one route for this (and a few years ago 
 there were none) -- and that is Native Client on Google Chrome. 
 
  But Google Chrome is only 13% penetrated, and the other browser fiefdoms 
 don't like NaCl. Google Chrome is an .exe file so teachers can't 
 download it (and if they could, they could download the Etoys plugin).
 
 
 
 ___
 fonc mailing list
 fonc@vpri.org
 http://vpri.org/mailman/listinfo/fonc
 
 
 ___
 fonc mailing list
 fonc@vpri.org
 http://vpri.org/mailman/listinfo/fonc

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Error trying to compile COLA

2012-03-14 Thread Alan Kay
As I've mentioned a few times on this list and in the long ago past, I think 
that the way to go is to make a hardware & software system that assumes no piece 
of code is completely benign. This was the strategy of the B5000 long ago, and 
of several later OS designs (and of the Internet itself). Many of these ideas 
were heavily influenced by the work of Butler Lampson over the years.


The issue becomes: you can get perfect safety by perfect confinement, but how 
do you still get things to work together and make progress? For example, the 
Internet TCP/IP mechanism only gets packets from one place to another -- this 
mechanism cannot command a receiver computer to do anything. (Stupid software 
done by someone else inside a computer could decide to do what someone on the 
outside says -- but the whole object of design here is to retain confinement 
and avoid the idea of commands at every level.)

In theory real objects are confined virtual computers that cannot be 
commanded. But most use of objects today is as extensions of data ideas. Once 
you make a setter message, you have converted to a data structure that is now 
vulnerable to imperative mischief.
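
The difference is easy to see in miniature. A sketch (my own invented
example, in JavaScript for familiarity):

  // Setter style: the object is really a data structure; any outside
  // code can command its state to be anything at all.
  var account1 = {
    balance: 0,
    setBalance: function (v) { this.balance = v; }
  };

  // Confined style: the state is unreachable from outside; the object
  // receives requests and decides for itself what to do about them.
  function makeAccount() {
    var balance = 0;
    return {
      deposit: function (amount) { if (amount > 0) balance += amount; },
      statement: function () { return 'balance: ' + balance; }
    };
  }

account1 can be put into any state by anyone; an object made by
makeAccount() cannot be commanded, only asked.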

In between we have hardware based processes that are supposed to be HW 
protected virtual computers. These have gotten confused with storage swapping 
mechanisms, and the results are that most CPUs cannot set up enough of them 
(and for different reasons, the interprocess communication is too slow for many 
purposes).

A hardware vendor with huge volumes (like Apple) should be able to get a CPU 
vendor to make HW that offers real protection, and at a granularity that makes 
more systems sense.

In the present case (where they haven't done the right thing), they still do 
have ways to confine potentially non-benign software in the existing gross 
process mechanisms. Apple et al. already do this for running the web browser 
that can download Javascript programs that have not been vetted by the Apple 
systems people. NaCl in the Chrome browser extends this to allow the 
downloading of machine code that is run safely in its own sandbox. 


It should be crystal clear that Apple's restrictions have no substance in the 
large -- e.g. they could just run non-vetted systems as in the browser and 
NaCl. If you want more and Apple doesn't want to fix their OS, then maybe 
allowing them to vet makes some sense if you are in business and want to use 
their platform. 


But the main point here is that there are no technical reasons why a child 
should be restricted from making an Etoys or Scratch project and sharing it 
with another child on an iPad.

No matter what Apple says, the reasons clearly stem from strategies and tactics 
of economic exclusion.

So I agree with Max that the iPad at present is really the anti-Dynabook.


Cheers,

Alan





 From: Robin Barooah ro...@sublime.org
To: Fundamentals of New Computing fonc@vpri.org 
Sent: Wednesday, March 14, 2012 3:38 AM
Subject: Re: [fonc] Error trying to compile COLA
 



On Mar 14, 2012, at 2:22 AM, Max Orhai wrote:

But, that's exactly the cause for concern! Aside from the fact of 
Smalltalk's obsolescence (which isn't really the point), the Squeak plugin 
could never be approved by a 'responsible' sysadmin, because it can run 
arbitrary user code! Squeak's not in the app store for exactly that reason. 
You'll notice how crippled the allowed 'programming apps' are. This is simple 
strong-arm bully tactics on the part of Apple; technical problems "solved" by 
heavy-handed legal means. Make no mistake, the iPad is the anti-Dynabook.



To my mind writing Apple's solution off as 'strong-arm bully tactics' obscures 
very real issues. Code expresses human intentions. Not all humans have good 
intentions, and so not all code is well intentioned.  Work at HP labs in the 
90's showed that it's impossible, even if you have full control of the virtual 
machine and can freeze and inspect memory, to mechanically prove with 
certainty that a random software agent is benign. So when code is exchanged 
publicly, provenance becomes important.


Apple's solution is as much technical as it is legal.  They use code signing 
to control the provenance of code that is allowed to execute, and yes, they 
have a quasi-legal apparatus for determining what code gets signed.  As it 
stands, they have established themselves as the sole arbiter of provenance.


I think one can easily argue that, as the first mover, they have set things up 
to greatly advantage themselves as a commercial entity (as they do in other 
areas, like the supply chain), and that it would be generally better if there 
were freedom about whom to trust as the arbiter of provenance. However, I don't 
see a future in which non-trivial unsigned code is generally exchanged.  This 
is the beginning of a necessary trend.  I'd love to hear how I'm wrong about 
this.


My suspicion is that, for the most part, Apple's current setup is as locked 
down as 

Re: [fonc] Block-Strings / Heredocs (Re: Magic Ink and Killing Math)

2012-03-14 Thread Loup Vaillant

Michael FIG wrote:

Loup Vaillant l...@loup-vaillant.fr writes:


You could also play the human compiler: use the better syntax in the
comments, and implement a translation of it in code just below.  But
then you have to manually make sure they are synchronized.  Comments
are good.  Needing them is bad.


Or use a preprocessor that substitutes the translation inline
automatically.


Which is a way of implementing the syntax… How is this different from
my "Then you write the parser"?  Sure, you can use a preprocessor, but
you still have to write the macros for your new syntax.

Loup.
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] [IAEP] Barbarians at the gate! (Project Nell)

2012-03-14 Thread Alan Kay
Hi Scott

This seems like a plan that should be done and tried and carefully evaluated. I 
think the approach is good. It might not be quite enough to work, but it 
should give rise to a lot of useful information for further passes at this.


1. Psychologist O.K. Moore in the early 60s at Yale and elsewhere pioneered the 
idea of a talking typewriter to help children learn how to read via learning 
to write. This was first a grad student in a closet with a microphone 
simulating a smart machine -- but later the Edison division of McGraw-Hill made 
a technology that did some of these things. 


The significance of Moore's work is that he really thought things through, both 
with respect to what such a curriculum might be and to the nature of the 
whole environment made for the child. 


He first defined a *responsive environment* as one that:
a.   permits learners to explore freely
b.   informs learners immediately about the consequences of their actions
c.   is self-pacing, i.e. events happen within the environment at a rate 
determined by the learner
d.  permits the learners to make full use of their capacities to discover 
relations of various kinds
e.   has a structure such that learners are likely to make a series of 
interconnected discoveries about the physical, cultural or social world


He called a responsive environment: “*autotelic*, if engaging in it is done for 
its own sake rather than for obtaining rewards or avoiding punishments that 
have no inherent connection with the activity itself”. By “discovery” he meant 
“gently guided discovery” in the sense of Montessori, Vygotsky, Bruner and 
Papert (i.e. recognizing that it is very difficult for human beings to come up 
with good ideas from scratch—hence the need for forms of guidance—but that 
things are learned best if the learner puts in the effort to make the final 
connections themselves—hence the need for forms of discovery).

The many papers from this work greatly influenced the thinking about personal 
computing at Xerox PARC in the 70s. Here are a couple:

-- O. K. Moore, Autotelic Responsive Environments and Exceptional Children, 
Experience, Structure and Adaptability (ed. Harvey), Springer, 1966
-- Anderson and Moore, Autotelic Folk Models, Sociological Quarterly, 1959

2. Separating out some of the programming ideas here:

a. Simplest one is that the most important users of this system are the 
children, so it would be a better idea to make the tile scripting look as easy 
for them as possible. I don't agree with the rationalization in the paper about 
preserving the code reading skills of existing programmers.

b. Good idea to go all the way to the bottom with the children's language.

c. Figure 2 introduces another -- at least equally important -- language. In my 
opinion, this one should be made kid usable and programmable -- and I would try 
to see how it could fit with the TS language in some way. 


d. There is another language -- AIML -- introduced for recognizing things. I 
would use something much nicer, easier, more readable, etc., -- like OMeta -- 
or more likely I would go way back to the never implemented Smalltalk-71 (which 
had these and some of the above features in its design and also tried to be kid 
usable) -- and try to make a version that worked (maybe too hard to do in 
general or for the scope of this project, but you can see why it would be nice 
to have all of the mechanisms that make your system work be couched in kid 
terms and looks and feels if possible).

3. It's out of the scope of your paper and these comments to discuss getting 
kids to add other structures besides stories and narrative to think with. You 
have to start with stories, and that is enough for now. A larger scale plan 
(you may already have) would involve a kind of weaning process to get kids to 
add non-story thinking (as is done in math and science, etc.) to their skills. 
This is a whole curriculum of its own.


I make these comments because I think your project is a good idea, on the right 
track, and needs to be done.

Best wishes

Alan





 From: C. Scott Ananian csc...@laptop.org
To: IAEP SugarLabs i...@lists.sugarlabs.org 
Sent: Tuesday, March 13, 2012 4:07 PM
Subject: [IAEP] Barbarians at the gate! (Project Nell)
 

I read the following today:


A healthy [project] is, confusingly, one at odds with itself. There is a 
healthy part which is attempting to normalize and to create predictability, 
and there needs to be another part that is tasked with building something new 
that is going to disrupt and eventually destroy that normality. 
(http://www.randsinrepose.com/archives/2012/03/13/hacking_is_important.html)


So, in this vein, I'd like to encourage Sugar-folk to read the short paper 
Chris Ball, Michael Stone, and I just submitted (to IDC 2012) on Nell, our 
design for XO-3 software for the reading project:


     http://cscott.net/Publications/OLPC/idc2012.pdf


You're expected not to like 

Re: [fonc] [IAEP] Barbarians at the gate! (Project Nell)

2012-03-14 Thread C. Scott Ananian
On Wed, Mar 14, 2012 at 12:54 PM, Alan Kay alan.n...@yahoo.com wrote:

 The many papers from this work greatly influenced the thinking about
 personal computing at Xerox PARC in the 70s. Here are a couple:

 -- O. K. Moore, Autotelic Responsive Environments and Exceptional
 Children, Experience, Structure and Adaptability (ed. Harvey), Springer, 1966
 -- Anderson and Moore, Autotelic Folk Models, Sociological Quarterly, 1959


Thank you for these references.  I will chase them down and learn as much
as I can.


 2. Separating out some of the programming ideas here:

 a. Simplest one is that the most important users of this system are the
 children, so it would be a better idea to make the tile scripting look as
 easy for them as possible. I don't agree with the rationalization in the
 paper about preserving the code reading skills of existing programmers.


I probably need to clarify the reasoning in the paper for this point.

Traditional text-based programming languages have been tweaked over
decades to be easy to read -- for both small examples and large systems.
 It's somewhat of a heresy, but I thought it would be interesting to
explore a tile-based system that *didn't* throw away the traditional text
structure, and tried simply to make the structure of the traditional text
easier to visualize and manipulate.

So it's not really skills of existing programmers I'm interested in -- I
should reword that.  It's that I feel we have an existence proof that the
traditional textual form of a program is easy to read, even for very
complicated programs.  So I'm trying to scale down the thing that works,
instead of trying to invent something new which proves unwieldy at scale.

b. Good idea to go all the way to the bottom with the children's language.

 c. Figure 2 introduces another -- at least equally important language --
 in my opinion, this one should be made kid usable and programmable -- and I
 would try to see how it could fit with the TS language in some way.


This language is JSON, which is just the object-definition subset of
JavaScript.  So it can in fact be expressed with TurtleScript tiles.
 (Although I haven't yet tackled quasiquote in TurtleScript.)
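
To illustrate the subset relationship concretely (an invented rule, not
one from the paper): any JSON text is also a legal JavaScript
expression, which is what lets the very same structure be rendered as
TurtleScript tiles.

  // Read the rule as data with a JSON parser...
  var text = '{ "pattern": "HELLO *", "next": ["story-menu"] }';
  var asData = JSON.parse(text);

  // ...or write the identical shape directly as JavaScript:
  var asCode = { pattern: 'HELLO *', next: ['story-menu'] };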

d. There is another language -- AIML -- introduced for recognizing things.
 I would use something much nicer, easier, more readable, etc., -- like
 OMeta -- or more likely I would go way back to the never implemented
 Smalltalk-71 (which had these and some of the above features in its design
 and also tried to be kid usable) -- and try to make a version that worked
 (maybe too hard to do in general or for the scope of this project, but you
 can see why it would be nice to have all of the mechanisms that make your
 system work be couched in kid terms and looks and feels if possible).


This I completely agree with.  The AIML will be translated to JSON on the
device itself.  The use of AIML is a compromise: it exists and has
well-defined semantics and does 90% of what I'd like it to do.  It also has
an active community who have spent a lot of time building reasonable dialog
rules in AIML.  At some point it will have to be extended or replaced, but
I think it will get me through version 1.0 at least.

I'll probably translate the AIML example to JSON in the next revision of
the paper, and state the relationship of JSON to JavaScript and
TurtleScript more precisely.
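
As a sketch of what that translation could look like -- the AIML half
uses real AIML 1.x tags, while the JSON half is an invented rendering,
not the paper's actual format:

  // One AIML category (AIML is XML; shown here as a JavaScript string):
  var aiml = '<category>' +
             '  <pattern>WHAT IS YOUR NAME</pattern>' +
             '  <template>My name is Nell.</template>' +
             '</category>';

  // A possible JSON translation of the same dialog rule:
  var rule = { pattern: 'WHAT IS YOUR NAME',
               template: 'My name is Nell.' };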

3. It's out of the scope of your paper and these comments to discuss
 getting kids to add other structures besides stories and narrative to
 think with. You have to start with stories, and that is enough for now. A
 larger scale plan (you may already have) would involve a kind of weaning
 process to get kids to add non-story thinking (as is done in math and
 science, etc.) to their skills. This is a whole curriculum of its own.

 I make these comments because I think your project is a good idea, on the
 right track, and needs to be done


Thank you.  I'll keep your encouragement in mind during the hard work of
implementation.
  --scott

-- 
  ( http://cscott.net )
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Apple and hardware (was: Error trying to compile COLA)

2012-03-14 Thread Mack
Jay Freeman has also released his Wraith Scheme for the iPad.

On Mar 14, 2012, at 9:17 AM, Jecel Assumpcao Jr. wrote:

 Alan Kay wrote on Wed, 14 Mar 2012 05:53:21 -0700 (PDT)
 A hardware vendor with huge volumes (like Apple) should be able to get a CPU
 vendor to make HW that offers real protection, and at a granularity that 
 makes
 more systems sense.
 
 They did just that when they founded ARM Ltd (with Acorn and VTI): the
 most significant change from the ARM3 to the ARM6 was a new MMU with a
 more fine-grained protection mechanism which was designed specially for
 the Newton OS. No other system used it and though I haven't checked, I
 wouldn't be surprised if this feature was eliminated from more recent
 versions of ARM.
 
 Compared to a real capability system (like the Intel iAPX432/BiiN/960XA
 or the IBM AS/400) it was a rather awkward solution, but at least they
 did make an effort.
 
 Having been created under Sculley, this technology did not survive Jobs'
 return.
 
 But the main point here is that there are no technical reasons why a child 
 should
 be restricted from making an Etoys or Scratch project and sharing it with 
 another
 child on an iPad.
 No matter what Apple says, the reasons clearly stem from strategies and 
 tactics
 of economic exclusion.
 So I agree with Max that the iPad at present is really the anti-Dynabook
 
 They have changed their position a little. I have a Hand Basic on my
 iPhone which is compatible with the Commodore 64 Basic. I can write and
 save programs, but can't send them to another device or load new
 programs from the Internet. Except I can - there are applications for
 the iPhone that give you access to the filing system and let you
 exchange files with a PC or Mac. But that is beyond most users, which
 seems to be a good enough barrier from Apple's viewpoint.
 
 The same thing applies to this nice native development environment for
 Lua on the iPad:
 
 http://twolivesleft.com/Codea/
 
 You can program on the iPad/iPhone, but can't share.
 
 -- Jecel
 
 ___
 fonc mailing list
 fonc@vpri.org
 http://vpri.org/mailman/listinfo/fonc

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Block-Strings / Heredocs (Re: Magic Ink and Killing Math)

2012-03-14 Thread BGB

On 3/14/2012 8:57 AM, Loup Vaillant wrote:

Michael FIG wrote:

Loup Vaillant l...@loup-vaillant.fr writes:


You could also play the human compiler: use the better syntax in the
comments, and implement a translation of it in code just below.  But
then you have to manually make sure they are synchronized.  Comments
are good.  Needing them is bad.


Or use a preprocessor that substitutes the translation inline
automatically.


Which is a way of implementing the syntax… How is this different from
my "Then you write the parser"?  Sure, you can use a preprocessor, but
you still have to write the macros for your new syntax.



in my case, this can be theoretically done already (writing new 
customized parsers), and was part of why I added block-strings.


most likely route would be translating code into ASTs, and maybe using 
something like (defmacro) or similar at the AST level.


another route could be, I guess, to make use of "quote" and "unquote", 
both of which can be used as expression-building features (functionally, 
they are vaguely similar to quasiquote in Scheme, but they haven't 
enjoyed so much use thus far).
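
as a sketch of the AST-level route (plain JavaScript standing in for the
script language; the node shapes and the 'unless' rewrite are invented
for illustration):

  // A defmacro-style rewrite working on ASTs kept as plain objects:
  //   unless(test) body   =>   if (!test) body
  function expandUnless(node) {
    if (node.type !== 'unless') return node;
    return { type: 'if',
             test: { type: 'not', expr: node.test },
             body: node.body };
  }

  var ast = { type: 'unless',
              test: { type: 'id', name: 'ready' },
              body: [ { type: 'call', fn: 'wait', args: [] } ] };
  ast = expandUnless(ast);   // now an 'if' node with a negated test

quote/unquote would mainly make the right-hand side of such rewrites
less tedious to write out.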



a more practical matter though would be getting things nailed down 
enough so that larger parts of the system can be written in a language 
other than C.


yes, there is the FFI (generally seems to work fairly well), and one can 
shove script closures into C-side function pointers (provided arguments 
and return types are annotated and the types match exactly, but I don't 
entirely trust its reliability, ...).


slightly nicer would be if code could be written in various places which 
accept script objects (either via interfaces or ex-nihilo objects).


abstract example (ex-nihilo object):
var obj = { render: function() { ... } ... };
lbxModelRegisterScriptObject("models/script/somemodel", obj);

so, if some code elsewhere creates an object using the given model-name, 
then the script code is invoked to go about rendering it.


alternatively, using an interface:
public interface IRender3D { ... }  // contents omitted for brevity
public class MyObject implements IRender3D { ... }
lbxModelRegisterScriptObject("models/script/somemodel", new MyObject());

granted, there are probably better (and less likely to kill performance) 
ways to make use of script objects (as-is, using script code to write 
objects for use in the 3D renderer is not likely to turn out well 
regarding the framerate and similar, at least until if/when there is a 
good solid JIT in place, and it can compete more on equal terms with C 
regarding performance).



mostly the script language was intended for use in the game's server 
end, where typically raw performance is less critical, but as-is, there 
is still a bit of a language-border issue that would need to be worked 
on here (I originally intended to write the server end mostly in script, 
but at the time the VM was a little less solid (poorer performance, more 
prone to leak memory and trigger GC, ...), and so the server end was 
written more quick-and-dirty in plain C, using a design fairly similar 
to a mix of the Quake 1 and 2 server-ends). as-is, it is not entirely 
friendly to the script code, so a little more work is needed.


another possible use case is related to world-construction tasks 
(procedural world-building and similar).


but, yes, all of this is a bit more of a mundane ways of using a 
scripting language, but then again, everything tends to be built from 
the bottom up (and this just happens to be where I am currently at, at 
this point in time).


(maybe once I am less stuck worrying about which language is used where 
and about cross-language interfacing issues, allowing things like 
alternative syntax, ... could be more worth exploring. but, in many 
areas, both C and C++ have a bit of a gravity well...).



or such...

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Talking Typewriter [was: Barbarians at the gate! (Project Nell)]

2012-03-14 Thread Martin McClure
On 03/14/2012 09:54 AM, Alan Kay wrote:
 
 1. Psychologist O.K. Moore in the early 60s at Yale and elsewhere
 pioneered the idea of a talking typewriter to help children learn how
 to read via learning to write. This was first a grad student in a closet
 with a microphone simulating a smart machine -- but later the Edison
 division of McGraw-Hill made a technology that did some of these things.

Now that reference brings back some memories!

As an undergrad I had a student job in the Computer Assisted Instruction
lab. One day, a large pile of old parts arrived from somewhere, with no
accompanying documentation, and I was told, "Put them together." It
turned out to be two Edison talking typewriters. I got one fully
functional; the other had a couple of minor parts missing. This was in
late '77 or early '78, about the same time I was attempting
(unsuccessfully) to learn something about Smalltalk.

Regards,

-Martin
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] [IAEP] Barbarians at the gate! (Project Nell)

2012-03-14 Thread Alan Kay
Hi Scott --

1. I will see if I can get one of these scanned for you. Moore tended to 
publish in journals and there is very little of his stuff available on line.

2.a. "if (a<b) { ... }" is easier to read than "if a<b then ..."? There is no 
hint of the former being tweaked for decades to make it easier to read.

Several experiments from the past cast doubt on the rest of the idea. At Disney 
we did a variety of code display generators to see what kinds of 
transformations we could do to the underlying Smalltalk (including syntactic) 
to make it something that could be subsetted as a growable path from Etoys. 


We got some good results from this (and this is what I'd do with Javascript in 
both directions -- Alex Warth's OMeta is in Javascript and is quite complete 
and could do this).

However, the showstopper was all the parentheses that had to be rendered in 
tiles. Mike Travers at MIT had done one of the first tile based editors for a 
version of Lisp that he used, and this was even worse.

More recently, Jens Moenig (who did SNAP) also did a direct renderer and editor 
for Squeak Smalltalk (this can be tried out) and it really seemed to be much 
too cluttered.

One argument for some of this is, "well, teach the kids a subset that doesn't 
use so many parens ...". This could be a solution.

However, in the end, I don't think Javascript semantics is particularly good 
for kids. For example, one of the features of Etoys that turned out to be very 
powerful for children and other Etoy programmers is the easy/trivial parallel 
methods execution. And there are others in Etoys and yet others in Scratch 
that are non-standard in regular programming languages but are very powerful 
for children (and some of them are better than standard CS language ideas).

I'm encouraging you to do something better (that would be ideal). Or at least 
as workable. Giving kids less just because that's what an existing language for 
adults has is not a good tactic.


2.c. Ditto 2.a. above

2.d. Ditto the above.

Cheers,

Alan







 From: C. Scott Ananian csc...@laptop.org
To: Alan Kay alan.n...@yahoo.com 
Cc: IAEP SugarLabs i...@lists.sugarlabs.org; Fundamentals of New Computing 
fonc@vpri.org; Viewpoints Research a...@vpri.org 
Sent: Wednesday, March 14, 2012 10:25 AM
Subject: Re: [IAEP] [fonc] Barbarians at the gate! (Project Nell)
 

On Wed, Mar 14, 2012 at 12:54 PM, Alan Kay alan.n...@yahoo.com wrote:

The many papers from this work greatly influenced the thinking about personal 
computing at Xerox PARC in the 70s. Here are a couple:


-- O. K. Moore, Autotelic Responsive Environments and Exceptional Children, 
Experience, Structure and Adaptability (ed. Harvey), Springer, 1966
-- Anderson and Moore, Autotelic Folk Models, Sociological Quarterly, 1959



Thank you for these references.  I will chase them down and learn as much as I 
can.
 
2. Separating out some of the programming ideas here:


a. Simplest one is that the most important users of this system are the 
children, so it would be a better idea to make the tile scripting look as 
easy for them as possible. I don't agree with the rationalization in the 
paper about preserving the code reading skills of existing programmers.


I probably need to clarify the reasoning in the paper for this point.


Traditional text-based programming languages have been tweaked over decades 
to be easy to read -- for both small examples and large systems.  It's 
somewhat of a heresy, but I thought it would be interesting to explore a 
tile-based system that *didn't* throw away the traditional text structure, and 
tried simply to make the structure of the traditional text easier to visualize 
and manipulate.


So it's not really skills of existing programmers I'm interested in -- I 
should reword that.  It's that I feel we have an existence proof that the 
traditional textual form of a program is easy to read, even for very 
complicated programs.  So I'm trying to scale down the thing that works, 
instead of trying to invent something new which proves unwieldy at scale.


b. Good idea to go all the way to the bottom with the children's language.


c. Figure 2 introduces another -- at least equally important language -- in 
my opinion, this one should be made kid usable and programmable -- and I 
would try to see how it could fit with the TS language in some way. 



This language is JSON, which is just the object-definition subset of 
JavaScript.  So it can in fact be expressed with TurtleScript tiles.  
(Although I haven't yet tackled quasiquote in TurtleScript.)


d. There is another language -- AIML -- introduced for recognizing things. I 
would use something much nicer, easier, more readable, etc., -- like OMeta -- 
or more likely I would go way back to the never implemented Smalltalk-71 
(which had these and some of the above features in its design and also tried 
to be kid usable) -- and try to make a version that worked (maybe too hard to 
do in general 

Re: [fonc] Block-Strings / Heredocs (Re: Magic Ink and Killing Math)

2012-03-14 Thread Mack

On Mar 13, 2012, at 6:27 PM, BGB wrote:

SNIP
 the issue is not that I can't imagine anything different, but rather that 
 doing anything different would be a hassle with current keyboard technology:
 pretty much anyone can type ASCII characters;
 many other people have keyboards (or key-mappings) that can handle 
 region-specific characters.
 
 however, otherwise, typing unusual characters (those outside their current 
 keyboard mapping) tends to be a bit more painful, and/or introduces editor 
 dependencies, and possibly increases the learning curve (now people have to 
 figure out how these various unorthodox characters map to the keyboard, ...).
 
 more graphical representations, however, have a secondary drawback:
 they can't be manipulated nearly as quickly or as easily as text.
 
 one could be like drag and drop, but the problem is that drag and drop is 
 still a fairly slow and painful process (vs, hitting keys on the keyboard).
 
 
 yes, there are scenarios where keyboards aren't ideal:
 such as on an XBox360 or an Android tablet/phone/... or similar, but people 
 probably aren't going to be using these for programming anyways, so it is 
 likely a fairly moot point. 
 
 however, even in these cases, it is not clear there are many clearly better 
 options either (on-screen keyboard, or on-screen tile selector, either way it 
 is likely to be painful...).
 
 
 simplest answer:
 just assume that current text-editor technology is basically sufficient and 
 call it good enough.

Stipulating that having the keys on the keyboard mean what the painted symbols 
show is the simplest path with the least impedance mismatch for the user, 
there are already alternatives in common use that bear thinking on.  For 
example:

On existing keyboards, multi-stroke operations to produce new characters 
(holding down shift key to get CAPS, CTRL-ALT-TAB-whatever to get a special 
character or function, etc…) are customary and have entered average user 
experience.

Users of IDE's like EMACS, IntelliJ or Eclipse are well-acquainted with special 
keystrokes to get access to code completions and intention templates.

So it's not inconceivable to consider a similar strategy for typing 
non-character graphical elements.  One could think of say… CTRL-O, UP ARROW, UP 
ARROW, ESC to type a circle and size it, followed by CTRL-RIGHT ARROW, C to 
enter the circle and type a "c" inside it.

An argument against these strategies is the same one against command-line 
interfaces in the CLI vs. GUI discussion: namely, that without visual 
prompting, the possibilities that are available to be typed are not readily 
visible to the user.  The user has to already know what combination gives him 
what symbol.

One solution for mitigating this, presuming rich graphical typing was 
desirable, would be to take a page from the way touch type cell phones and 
tablets work, showing symbol maps on the screen in response to user input, with 
the maps being progressively refined as the user types to guide the user 
through constructing their desired input.
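
A sketch of that progressive refinement (invented symbol table; a real
input method would of course be far richer):

  // Narrow a symbol palette as the user types a name for the symbol.
  var symbols = {
    'circle': '\u25CB',  'arrow-up': '\u2191',
    'arrow-right': '\u2192',  'sum': '\u2211'
  };

  function refine(prefix) {
    return Object.keys(symbols)
      .filter(function (name) { return name.indexOf(prefix) === 0; })
      .map(function (name) { return name + ' -> ' + symbols[name]; });
  }

  refine('arrow');  // two candidates left; one more keystroke decides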

…just a thought :)


SNIP
On Mar 13, 2012, at 6:27 PM, BGB also wrote:

 
 
 I'll take Dave's point that penetration matters, and at the same time, most 
 new ideas have old idea constituents, so you can easily find some matter 
 for people stuck in the old methodologies and thinking to relate to when 
 building your new stuff ;-)
 
 
 well, it is like using alternate syntax designs (say, not a C-style curly 
 brace syntax).
 
 one can do so, but is it worth it?
 in such a case, the syntax is no longer what most programmers are familiar or 
 comfortable with, and it is more effort to convert code to/from the language, 
 …

The degenerate endpoint of this argument (which, sadly, I encounter on a daily 
basis in the larger business-technical community) is "if it isn't Java, it is 
by definition alien and too uncomfortable (and therefore too expensive) to use."

We can protest the myopia inherent in that objection, but the sad fact is that 
perception and emotional comfort are more important to the average person's 
decision-making process than coldly rational analysis.  (I refer to this as the 
Discount Shirt problem.  Despite the fact that a garment bought at a discount 
store doesn't fit well and falls apart after the first washing… not actually 
fulfilling our expectations of what a shirt should do, so ISN'T really a shirt 
from a usability perspective… because it LOOKS like a shirt and the store CALLS 
it a shirt, we still buy it, telling ourselves we've bought a shirt.  Then we 
go home and complain that shirts are a failure.)

Given this hurdle of perception, I have come to the conclusion that the only 
reasonable way to make advances is to live in the world of use case-driven 
design and measure the success of a language by how well it fits the perceived 
shape of the problem to be solved, looking for familiarity on the part of the 
user by means of keeping semantic distance between the language 

Re: [fonc] Apple and hardware (was: Error trying to compile COLA)

2012-03-14 Thread Alan Kay
Yep, I was there and trying to get the Newton project off the awful AT&T chip 
they had first chosen. Larry Tesler (who worked with us at PARC) finally wound 
up taking over this project and doing a number of much better things with it. 
Overall what happened with Newton was too bad -- it could have been much better 
-- but there were far too many different opinions and power bases involved.

If you have a good version of confinement (which is pretty simple HW-wise) you 
can use Butler Lampson's schemes for Cal-TSS to make a workable version of a 
capability system.

And, yep, I managed to get them to allow interpreters to run on the iPad, but 
was not able to get Steve to countermand the "no sharing" rule.

Cheers,

Alan





 From: Jecel Assumpcao Jr. je...@merlintec.com
To: Fundamentals of New Computing fonc@vpri.org 
Sent: Wednesday, March 14, 2012 9:17 AM
Subject: [fonc] Apple and hardware (was: Error trying to compile COLA)
 
Alan Kay wrote on Wed, 14 Mar 2012 05:53:21 -0700 (PDT)
 A hardware vendor with huge volumes (like Apple) should be able to get a CPU
 vendor to make HW that offers real protection, and at a granularity that 
 makes
 more systems sense.

They did just that when they founded ARM Ltd (with Acorn and VTI): the
most significant change from the ARM3 to the ARM6 was a new MMU with a
more fine-grained protection mechanism which was designed specially for
the Newton OS. No other system used it and though I haven't checked, I
wouldn't be surprised if this feature was eliminated from more recent
versions of ARM.

Compared to a real capability system (like the Intel iAPX432/BiiN/960XA
or the IBM AS/400) it was a rather awkward solution, but at least they
did make an effort.

Having been created under Sculley, this technology did not survive Jobs'
return.

 But the main point here is that there are no technical reasons why a child 
 should
 be restricted from making an Etoys or Scratch project and sharing it with 
 another
 child on an iPad.
 No matter what Apple says, the reasons clearly stem from strategies and 
 tactics
 of economic exclusion.
 So I agree with Max that the iPad at present is really the anti-Dynabook

They have changed their position a little. I have a Hand Basic on my
iPhone which is compatible with the Commodore 64 Basic. I can write and
save programs, but can't send them to another device or load new
programs from the Internet. Except I can - there are applications for
the iPhone that give you access to the filing system and let you
exchange files with a PC or Mac. But that is beyond most users, which
seems to be a good enough barrier from Apple's viewpoint.

The same thing applies to this nice native development environment for
Lua on the iPad:

http://twolivesleft.com/Codea/

You can program on the iPad/iPhone, but can't share.

-- Jecel

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] [IAEP] Barbarians at the gate! (Project Nell)

2012-03-14 Thread shaun gilchrist
Alan,

"I would go way back to the never implemented Smalltalk-71"

Is there a formal specification of what 71 should have been? I have only
ever read about it in passing reference in the various histories of
Smalltalk as a step on the way to 72, 76, and finally 80.

I am very intrigued as to what sets 71 apart so dramatically. -Shaun

On Wed, Mar 14, 2012 at 12:29 PM, Alan Kay alan.n...@yahoo.com wrote:

 Hi Scott --

 1. I will see if I can get one of these scanned for you. Moore tended to
 publish in journals and there is very little of his stuff available on line.

 2.a. "if (a<b) { ... }" is easier to read than "if a<b then ..."? There is
 no hint of the former being tweaked for decades to make it easier to read.

 Several experiments from the past cast doubt on the rest of the idea. At
 Disney we did a variety of code display generators to see what kinds of
 transformations we could do to the underlying Smalltalk (including
 syntactic) to make it something that could be subsetted as a growable path
 from Etoys.

 We got some good results from this (and this is what I'd do with
 Javascript in both directions -- Alex Warth's OMeta is in Javascript and is
 quite complete and could do this).

 However, the showstopper was all the parentheses that had to be rendered
 in tiles. Mike Travers at MIT had done one of the first tile based editors
 for a version of Lisp that he used, and this was even worse.

 More recently, Jens Moenig (who did SNAP) also did a direct renderer and
 editor for Squeak Smalltalk (this can be tried out) and it really seemed to
 be much too cluttered.

 One argument for some of this is, "well, teach the kids a subset that
 doesn't use so many parens ...". This could be a solution.

 However, in the end, I don't think Javascript semantics is particularly
 good for kids. For example, one of the features of Etoys that turned out to
 be very powerful for children and other Etoy programmers is the easy/trivial
 parallel methods execution. And there are others in Etoys and yet others in
 Scratch that are non-standard in regular programming languages but are
 very powerful for children (and some of them are better than standard CS
 language ideas).

 I'm encouraging you to do something better (that would be ideal). Or at
 least as workable. Giving kids less just because that's what an existing
 language for adults has is not a good tactic.

 2.c. Ditto 2.a. above

 2.d. Ditto the above.

 Cheers,

 Alan



   --
 *From:* C. Scott Ananian csc...@laptop.org
 *To:* Alan Kay alan.n...@yahoo.com
 *Cc:* IAEP SugarLabs i...@lists.sugarlabs.org; Fundamentals of New
 Computing fonc@vpri.org; Viewpoints Research a...@vpri.org
 *Sent:* Wednesday, March 14, 2012 10:25 AM
 *Subject:* Re: [IAEP] [fonc] Barbarians at the gate! (Project Nell)

 On Wed, Mar 14, 2012 at 12:54 PM, Alan Kay alan.n...@yahoo.com wrote:

 The many papers from this work greatly influenced the thinking about
 personal computing at Xerox PARC in the 70s. Here are a couple:

 -- O. K. Moore, Autotelic Responsive Environments and Exceptional
 Children, Experience, Structure and Adaptability (ed. Harvey), Springer, 1966
 -- Anderson and Moore, Autotelic Folk Models, Sociological Quarterly, 1959


 Thank you for these references.  I will chase them down and learn as much
 as I can.


 2. Separating out some of the programming ideas here:

 a. Simplest one is that the most important users of this system are the
 children, so it would be a better idea to make the tile scripting look as
 easy for them as possible. I don't agree with the rationalization in the
 paper about preserving the code reading skills of existing programmers.


 I probably need to clarify the reasoning in the paper for this point.

 Traditional text-based programming languages have been tweaked over
 decades to be easy to read -- for both small examples and large systems.
  It's somewhat of a heresy, but I thought it would be interesting to
 explore a tile-based system that *didn't* throw away the traditional text
 structure, and tried simply to make the structure of the traditional text
 easier to visualize and manipulate.

 So it's not really skills of existing programmers I'm interested in -- I
 should reword that.  It's that I feel we have an existence proof that the
 traditional textual form of a program is easy to read, even for very
 complicated programs.  So I'm trying to scale down the thing that works,
 instead of trying to invent something new which proves unwieldy at scale.

 b. Good idea to go all the way to the bottom with the children's language.

 c. Figure 2 introduces another -- at least equally important language --
 in my opinion, this one should be made kid usable and programmable -- and I
 would try to see how it could fit with the TS language in some way.


 This language is JSON, which is just the object-definition subset of
 JavaScript.  So it can in fact be expressed with TurtleScript tiles.
  (Although I haven't yet tackled 

Re: [fonc] Talking Typewriter [was: Barbarians at the gate! (Project Nell)]

2012-03-14 Thread Alan Kay
You had to have a lot of moxie in the 60s to try to make Moore's ideas into 
real technology. It was amazing what they were able to do.

I wonder where this old junk is now? Should be in the Computer History Museum!

Cheers,

Alan





 From: Martin McClure martin.mccl...@vmware.com
To: Fundamentals of New Computing fonc@vpri.org 
Cc: Viewpoints Research a...@vpri.org 
Sent: Wednesday, March 14, 2012 11:26 AM
Subject: Re: [fonc] Talking Typewriter [was: Barbarians at the gate! (Project 
Nell)]
 
On 03/14/2012 09:54 AM, Alan Kay wrote:
 
 1. Psychologist O.K. Moore in the early 60s at Yale and elsewhere
 pioneered the idea of a talking typewriter to help children learn how
 to read via learning to write. This was first a grad student in a closet
 with a microphone simulating a smart machine -- but later the Edison
 division of McGraw-Hill made a technology that did some of these things.

Now that reference brings back some memories!

As an undergrad I had a student job in the Computer Assisted Instruction
lab. One day, a large pile of old parts arrived from somewhere, with no
accompanying documentation, and I was told, "Put them together." It
turned out to be two Edison talking typewriters. I got one fully
functional; the other had a couple of minor parts missing. This was in
late '77 or early '78, about the same time I was attempting
(unsuccessfully) to learn something about Smalltalk.

Regards,

-Martin


___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] [IAEP] Barbarians at the gate! (Project Nell)

2012-03-14 Thread Brian Rice
I'm sure that if you took a pretty clean PEG grammar approach to composable
mixfix phrasing and cues from Inform 7 and Ruby's Cucumber DSL,
Smalltalk-71 would feel as real as any bytecode-native language.

On Wed, Mar 14, 2012 at 11:38 AM, shaun gilchrist shaunxc...@gmail.com wrote:

 Alan,

 "I would go way back to the never implemented Smalltalk-71"

 Is there a formal specification of what 71 should have been? I have only
 ever read about it in passing reference in the various histories of
 Smalltalk as a step on the way to 72, 76, and finally 80.

 I am very intrigued as to what sets 71 apart so dramatically. -Shaun

 On Wed, Mar 14, 2012 at 12:29 PM, Alan Kay alan.n...@yahoo.com wrote:

 Hi Scott --

 1. I will see if I can get one of these scanned for you. Moore tended to
 publish in journals and there is very little of his stuff available on line.

 2.a. "if (a<b) { ... }" is easier to read than "if a<b then ..."? There
 is no hint of the former being tweaked for decades to make it easier to
 read.

 Several experiments from the past cast doubt on the rest of the idea. At
 Disney we did a variety of code display generators to see what kinds of
 transformations we could do to the underlying Smalltalk (including
 syntactic) to make it something that could be subsetted as a growable path
 from Etoys.

 We got some good results from this (and this is what I'd do with
 Javascript in both directions -- Alex Warth's OMeta is in Javascript and is
 quite complete and could do this).

 However, the showstopper was all the parentheses that had to be rendered
 in tiles. Mike Travers at MIT had done one of the first tile based editors
 for a version of Lisp that he used, and this was even worse.

 More recently, Jens Moenig (who did SNAP) also did a direct renderer and
 editor for Squeak Smalltalk (this can be tried out) and it really seemed to
 be much too cluttered.

 One argument for some of this is, "well, teach the kids a subset that
 doesn't use so many parens ...". This could be a solution.

 However, in the end, I don't think Javascript semantics is particularly
 good for kids. For example, one of the features of Etoys that turned out to
 be very powerful for children and other Etoy programmers is the easy/trivial
 parallel methods execution. And there are others in Etoys and yet others in
 Scratch that are non-standard in regular programming languages but are
 very powerful for children (and some of them are better than standard CS
 language ideas).

 I'm encouraging you to do something better (that would be ideal). Or at
 least as workable. Giving kids less just because that's what an existing
 language for adults has is not a good tactic.

 2.c. Ditto 2.a. above

 2.d. Ditto the above.

 Cheers,

 Alan



   --
 *From:* C. Scott Ananian csc...@laptop.org
 *To:* Alan Kay alan.n...@yahoo.com
 *Cc:* IAEP SugarLabs i...@lists.sugarlabs.org; Fundamentals of New
 Computing fonc@vpri.org; Viewpoints Research a...@vpri.org
 *Sent:* Wednesday, March 14, 2012 10:25 AM
 *Subject:* Re: [IAEP] [fonc] Barbarians at the gate! (Project Nell)

 On Wed, Mar 14, 2012 at 12:54 PM, Alan Kay alan.n...@yahoo.com wrote:

 The many papers from this work greatly influenced the thinking about
 personal computing at Xerox PARC in the 70s. Here are a couple:

 -- O. K. Moore, Autotelic Responsive Environments and Exceptional
 Children, Experience, Structure and Adaptability (ed. Harvey), Springer, 1966
 -- Anderson and Moore, Autotelic Folk Models, Sociological Quarterly, 1959


 Thank you for these references.  I will chase them down and learn as much
 as I can.


 2. Separating out some of the programming ideas here:

 a. Simplest one is that the most important users of this system are the
 children, so it would be a better idea to make the tile scripting look as
 easy for them as possible. I don't agree with the rationalization in the
 paper about preserving the code reading skills of existing programmers.


 I probably need to clarify the reasoning in the paper for this point.

 Traditional text-based programming languages have been tweaked over
 decades to be easy to read -- for both small examples and large systems.
  It's somewhat of a heresy, but I thought it would be interesting to
 explore a tile-based system that *didn't* throw away the traditional text
 structure, and tried simply to make the structure of the traditional text
 easier to visualize and manipulate.

 So it's not really skills of existing programmers I'm interested in --
 I should reword that.  It's that I feel we have an existence proof that the
 traditional textual form of a program is easy to read, even for very
 complicated programs.  So I'm trying to scale down the thing that works,
 instead of trying to invent something new which proves unwieldy at scale.

 b. Good idea to go all the way to the bottom with the children's language.

 c. Figure 2 introduces another -- at least equally important language --
 in my opinion, this one 

Re: [fonc] Block-Strings / Heredocs (Re: Magic Ink and Killing Math)

2012-03-14 Thread BGB

On 3/14/2012 11:31 AM, Mack wrote:

On Mar 13, 2012, at 6:27 PM, BGB wrote:

SNIP

the issue is not that I can't imagine anything different, but rather that doing 
anything different would be a hassle with current keyboard technology:
pretty much anyone can type ASCII characters;
many other people have keyboards (or key-mappings) that can handle 
region-specific characters.

however, otherwise, typing unusual characters (those outside their current 
keyboard mapping) tends to be a bit more painful, and/or introduces editor 
dependencies, and possibly increases the learning curve (now people have to 
figure out how these various unorthodox characters map to the keyboard, ...).

more graphical representations, however, have a secondary drawback:
they can't be manipulated nearly as quickly or as easily as text.

one could be like drag and drop, but the problem is that drag and drop is 
still a fairly slow and painful process (vs, hitting keys on the keyboard).


yes, there are scenarios where keyboards aren't ideal:
such as on an XBox360 or an Android tablet/phone/... or similar, but people 
probably aren't going to be using these for programming anyways, so it is 
likely a fairly moot point.

however, even in these cases, it is not clear there are many clearly better 
options either (on-screen keyboard, or on-screen tile selector, either way it is likely 
to be painful...).


simplest answer:
just assume that current text-editor technology is basically sufficient and call it 
good enough.

Stipulating that having the keys on the keyboard mean what the painted symbols 
show is the simplest path with the least impedance mismatch for the user, there are 
already alternatives in common use that bear thinking on.  For example:

On existing keyboards, multi-stroke operations to produce new characters 
(holding down shift key to get CAPS, CTRL-ALT-TAB-whatever to get a special 
character or function, etc…) are customary and have entered average user 
experience.

Users of IDEs like EMACS, IntelliJ or Eclipse are well-acquainted with special
keystrokes to get access to code completions and intention templates.

So it's not inconceivable to consider a similar strategy for typing non-character graphical elements.  One 
could think of say… CTRL-O, UP ARROW, UP ARROW, ESC to type a circle and size it, followed by CTRL-RIGHT 
ARROW, C to enter the circle and type a c inside it.
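
As a sketch of how such a sequence might be interpreted (the key names and the
command table below are invented purely for illustration, in JavaScript):

// tiny interpreter for chorded graphical input: each "key" in the
// sequence maps to a shape-editing command applied to shared state
var bindings = {
  "C-o":       function (s) { s.shape = { kind: "circle", r: 1 }; }, // new circle
  "UP":        function (s) { if (s.shape) s.shape.r += 1; },        // grow it
  "ESC":       function (s) { s.sized = true; },                     // done sizing
  "C-RIGHT c": function (s) { s.shape.label = "c"; }   // enter circle, type "c"
};

function play(keys) {
  var state = {};
  keys.forEach(function (k) { (bindings[k] || function () {})(state); });
  return state;
}

console.log(play(["C-o", "UP", "UP", "ESC", "C-RIGHT c"]));
// { shape: { kind: 'circle', r: 3, label: 'c' }, sized: true }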

An argument against these strategies is the same one against command-line 
interfaces in the CLI vs. GUI discussion: namely, that without visual 
prompting, the possibilities that are available to be typed are not readily 
visible to the user.  The user has to already know what combination gives him 
what symbol.

One solution for mitigating this, presuming rich graphical typing was desirable, would 
be to take a page from the way touch type cell phones and tablets work, showing symbol 
maps on the screen in response to user input, with the maps being progressively refined as the user 
types to guide the user through constructing their desired input.

…just a thought :)


typing, like on phones...
I have seen 2 major ways of doing this:
hit key multiple times to indicate the desired letter, with a certain 
timeout before it moves to the next character;
type out characters, phone shows first/most-likely possibility, hit a 
key a bunch of times to cycle through the options.
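
a rough sketch of the second scheme applied to the symbol maps idea (the
symbol table and names here are made up, just to show the progressive
filtering):

// each keystroke narrows the on-screen symbol map, like predictive
// text on a phone: the candidates are whatever still matches
var symbols = {
  "circle": "o", "circledot": "⊙", "arrow-right": "→",
  "arrow-up": "↑", "integral": "∫", "sum": "Σ"
};

function refine(typedSoFar) {
  return Object.keys(symbols).filter(function (name) {
    return name.indexOf(typedSoFar) === 0;   // prefix match
  });
}

console.log(refine("arr"));    // [ 'arrow-right', 'arrow-up' ]
console.log(refine("circle")); // [ 'circle', 'circledot' ]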



another idle thought would be some sort of graphical/touch-screen 
keyboard, but it would be a matter of finding a way to make it not suck. 
using on-screen inputs in Android devices and similar kind of sucks:
pressure and sensitivity issues, comfort issues, lack of tactile 
feedback, smudges on the screen if one uses their fingers, and 
potentially scratches if one is using a stylus, ...


so, say, a touch-screen with these properties:
similar in size to (or larger than) a conventional keyboard;
resistant to smudging, fairly long lasting, and easy to clean;
soft contact surface (me thinking sort of like those gel insoles for 
shoes), so that ideally typing isn't an experience of constantly hitting 
a piece of glass with ones' fingers (ideally, both impact pressure and 
responsiveness should be similar to a conventional keyboard, or at least 
a laptop keyboard);
ideally, some sort of tactile feedback (so, one can feel whether or not 
they are actually hitting the keys);
being dynamically reprogrammable (say, any app which knows about the 
keyboard can change its layout when it gains focus, or alternatively the 
user can supply per-app keyboard layouts);
maybe, there could be tabs to change between layouts, such as a US-ASCII 
tab, ...
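
the dynamically-reprogrammable part might look something like this (a
hypothetical API in JavaScript, just to make the idea concrete):

// per-app layouts: each app registers the key map it wants, and a
// focus change swaps what the touch keyboard renders; all the names
// here are invented for the sketch
var layouts = {
  "us-ascii": { "KEY_1": "1", "KEY_2": "2" /* ... */ },
  "math":     { "KEY_1": "∀", "KEY_2": "∃" /* ... */ }
};

var registry = {};   // app name -> layout name
function registerLayout(app, layoutName) { registry[app] = layoutName; }

function render(app, layout) { console.log(app, "->", layout); }

function onFocusChange(app) {
  // fall back to plain ASCII when an app hasn't registered anything
  render(app, layouts[registry[app]] || layouts["us-ascii"]);
}

registerLayout("equation-editor", "math");
onFocusChange("equation-editor");  // math symbols appear
onFocusChange("text-editor");      // back to us-ascii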

...

with something like the above being common, I can more easily imagine 
people using non-ASCII based input methods.


say, one is typing in US-ASCII, hits a math-symbol layout where, for 
example, the numeric keypad (or maybe the whole rest of the keyboard) is 
replaced by a grid of math symbols, or maybe also have a drawing 
tablet tab, 

Re: [fonc] Apple and hardware (was: Error trying to compile COLA)

2012-03-14 Thread Jecel Assumpcao Jr.
Alan Kay wrote on Wed, 14 Mar 2012 11:36:30 -0700 (PDT)
 Yep, I was there and trying to get the Newton project off the awful AT&T chip
 they had first chosen.

Interesting - a few months ago I studied the datasheets for the Hobbit
and read all the old CRISP papers and found this chip rather cute. It is
even more C-centric than RISCs (especially the ARM) so might not be a
good choice for other languages. Another project that started out using
this and then had to switch (to the PowerPC) was the BeBox. In the link
I give below it says both projects were done by the same people (Jean-Louis
Gassée and Steve Sakoman), so in a way it was really just one
project that used the chip.

 Larry Tesler (who worked with us at PARC) finally wound up taking over this
 project and doing a number of much better things with it.

He was also responsible for giving us Hypercard, right?

 Overall what happened with Newton was too bad -- it could have been much
 better -- but there were many too many different opinions and power bases
 involved.

This looks like a reasonable history of the Newton project (though some
parts that I know aren't quite right, so I can't guess how accurate the
parts I didn't know are):

http://lowendmac.com/orchard/06/john-sculley-newton-origin.html

It doesn't mention NewtonScript or Object Soups. I have never used it
myself, only read about it and seen some demos. But my impression is
that this was the closest thing we have had to the dynabook yet.

 If you have a good version of confinement (which is pretty simple HW-wise) you
 can use Butler Lampson's schemes for Cal-TSS to make a workable version of a
 capability system.

The 286 protected mode was good enough for this, and was extended in the
386. I am not sure all modern x86 processors still implement these, and
if they do it is likely that actually using them will hurt performance
so much that it isn't an option in practice.

 And, yep, I managed to get them to allow interpreters to run on the iPad, but 
 was
 not able to get Steve to countermand the no sharing rule.

That is a pity, though at least having native languages makes these
devices a reasonable replacement for my old Radio Shack PC-4 calculator.
I noticed that neither Matlab nor Mathematica are available for the
iPad, but only simple terminal apps that allow you to access these
applications running on your PC. What a waste!

-- Jecel

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Apple and hardware (was: Error trying to compile COLA)

2012-03-14 Thread Alan Kay
Hi Jecel

The CRISP was too slow, and had other problems in its details. Sakoman liked it 
...

Bill Atkinson did Hypercard ... Larry made many other contributions at Xerox 
and Apple

To me the Dynabook has always been 95% a service model and 5% physical specs 
(there were three main physical ideas for it, only one was the tablet).

Cheers,

Alan





 From: Jecel Assumpcao Jr. je...@merlintec.com
To: Fundamentals of New Computing fonc@vpri.org 
Sent: Wednesday, March 14, 2012 3:55 PM
Subject: Re: [fonc] Apple and hardware (was: Error trying to compile COLA)
 


___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Apple and hardware

2012-03-14 Thread BGB

On 3/14/2012 3:55 PM, Jecel Assumpcao Jr. wrote:

snip, interesting, but no comment


If you have a good version of confinement (which is pretty simple HW-wise) you
can use Butler Lampson's schemes for Cal-TSS to make a workable version of a
capability system.

The 286 protected mode was good enough for this, and was extended in the
386. I am not sure all modern x86 processors still implement these, and
if they do it is likely that actually using them will hurt performance
so much that it isn't an option in practice.


the TSS?...

it is still usable on x86 in 32-bit Protected-Mode.

however, it generally wasn't used much by operating systems, and in the 
transition to x86-64, was (along with the GDT and LDT) mostly reduced to 
a semi-vestigial structure.


its role is generally limited to holding register state and the stack 
pointers when doing inter-ring switches (such as an interrupt-handler 
transferring control into the kernel, or when transferring control into 
userspace).


however, it can no longer be used to implement process switching or 
similar on modern chips.




And, yep, I managed to get them to allow interpreters to run on the iPad, but 
was
not able to get Steve to countermand the no sharing rule.

That is a pity, though at least having native languages makes these
devices a reasonable replacement for my old Radio Shack PC-4 calculator.
I noticed that neither Matlab nor Mathematica are available for the
iPad, but only simple terminal apps that allow you to access these
applications running on your PC. What a waste!


IMHO, this is at least one reason to go for Android instead...

not that Android is perfect though, as admittedly I would prefer if I 
could have a full/generic ARM version of Linux or similar, but alas.


sadly, I am not getting a whole lot out of the tablet I have,
development-wise, which is lame considering that was a major reason I bought
it (I ended up doing far more development in Linux ARMv5TEL in QEMU,
preparing to try to port stuff to Android).


more preferable would have been:
if the NDK didn't suck as badly;
if there were, say, a C API for the GUI stuff (so one could more easily 
just use C without having to deal with Java or the JNI).


would have likely been a little happier had Android been more like just
an ARM build of a more generic Linux distro or something.



or such...

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] [IAEP] Barbarians at the gate! (Project Nell)

2012-03-14 Thread C. Scott Ananian
On Wed, Mar 14, 2012 at 6:02 PM, Jameson Quinn jameson.qu...@gmail.com wrote:

 If you're going to base it on Javascript, at least make it
 Coffeescript-like. I also agree that some basic parallelism primitives
 would be great; it is probably possible to build these into a
 Coffeescript-like dialect using JS under the hood (though they'd probably
 optimize even better if you could implement them natively instead of in
 JS).


I think you are underestimating the value of using a standard
widely-deployed language.  I love languages as much as the next guy---but
our previous learning environment (Sugar) has had incredible difficulty
getting local support outside the US because it is written in *Python*.
 "Python is not a commercially viable language" (not my words) and you
can't even take university classes in Python in many countries (say,
Uruguay) because there is no company behind it and no one who will give you
a certificate for having learned it.

This is very sad, but the true state of affairs.

JavaScript is not perfect, but at heart it is a functional object-oriented
language which is pretty darn close to Good Enough.  There are huge
benefits to using a language which is supported by training materials all
over the web, university systems outside the US, etc, etc.
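
(A toy example of that functional-plus-objects flavor -- nothing here is
specific to Nell, it is just to make the claim concrete:)

// closures give the functional side, object literals the
// object-oriented side, and the two compose without ceremony
function makeCounter(start) {
  var n = start;            // captured by the closure, effectively private
  return {
    next: function () { return n += 1; },
    peek: function () { return n; }
  };
}

var c = makeCounter(10);
c.next(); c.next();
console.log(c.peek());      // 12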

I am open to *very* slight extensions to JavaScript -- OMeta/JS and
quasiquote might squeeze in -- but they have to be weighed against their
costs.  Subsets are even more problematic -- once you start subsetting,
then you are throwing away compatibility with all the wealth of JavaScript
libraries out there, in addition to confusing potential contributors who
are trying to type in examples they found in some book.
  --scott

-- 
  ( http://cscott.net )
___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] [IAEP] Barbarians at the gate! (Project Nell)

2012-03-14 Thread Alan Kay
Well, it was very much a mythical beast even on paper -- and you really have 
to implement programming languages and make a lot of things with them to be 
able to assess them 


But -- basically -- since meeting Seymour and starting to think about children 
and programming, there were eight systems that I thought were really nifty and 
cried out to be unified somehow:
  1. Joss
  2. Lisp
  3. Logo -- which was originally a unification of Joss and Lisp (but I thought
more could be done in this direction).
  4. Planner -- a big set of ideas (long before Prolog) by Carl Hewitt for
logic programming and pattern-directed inference, both forward and backwards,
with backtracking
  5. Meta II -- a super simple meta parser and compiler done by Val Schorre at
UCLA ca. 1963
  6. IMP -- perhaps the first real extensible language that worked well -- by 
Ned Irons (CACM, Jan 1970)

  7. The Lisp-70 Pattern Matching System -- by Larry Tesler, et al, with some 
design ideas by me

  8. The object and pattern directed extension stuff I'd been doing previously 
with the Flex Machine and afterwards at SAIL (that also was influenced by Meta 
II)


One of the schemes was to really make the pattern matching parts of this work 
for everything that eventually required invocations and binding. This was 
doable semantically but was a bear syntactically because of the different 
senses of what kinds of matching and binding were intended for different 
problems. This messed up readability and the desired "simple things should be
simple".
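
To give a feel for the unification, here is a deliberately crude sketch in
plain JavaScript -- this is not Smalltalk-71, just an illustration of one
matcher doing double duty: destructuring data, and selecting which method
body a message invokes, with the bindings flowing from the match into the
body:

// match() binds ?variables and checks literals; send() reuses the
// very same matcher for pattern-directed invocation
function match(pattern, value, env) {
  if (typeof pattern === "string" && pattern[0] === "?") {
    env[pattern.slice(1)] = value;          // binding variable
    return env;
  }
  if (Array.isArray(pattern) && Array.isArray(value) &&
      pattern.length === value.length) {
    for (var i = 0; i < pattern.length; i++)
      if (!match(pattern[i], value[i], env)) return null;
    return env;
  }
  return pattern === value ? env : null;    // literals must be equal
}

var methods = [
  [["factorial", 0],    function (env) { return 1; }],
  [["factorial", "?n"], function (env) {
      return env.n * send(["factorial", env.n - 1]); }]
];

function send(message) {                    // try each pattern in order
  for (var i = 0; i < methods.length; i++) {
    var env = match(methods[i][0], message, {});
    if (env) return methods[i][1](env);
  }
  throw new Error("no method matches " + JSON.stringify(message));
}

console.log(send(["factorial", 5]));        // 120

Even this toy shows the tension: the same ?-variable notation has to read
well both as data destructuring and as a method header.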

Examples I wanted to cover included simple translations of languages (English 
to Pig Latin, English to French, etc.; some of these had been done in Logo), the
Winograd robot block stacking and other examples done with Planner, the making 
of the language the child was using, message sending and receiving, extensions 
to Smalltalk-71, and so forth.

I think today the way to try to do this would be with a much more graphical UI 
than with text -- one could imagine tiles that would specify what to match, and 
the details of the match could be submerged a bit.

More recently, both OMeta and several of Ian's matchers can handle multiple 
kinds of matching with binding and do backtracking, etc., so one could imagine 
a more general language that could be based on this.

On the other hand, trying to stuff 8 kinds of language ideas into one new 
language in a graceful way could be a siren's song of a goal.

Still 

Cheers,

Alan





 From: shaun gilchrist shaunxc...@gmail.com
To: fonc@vpri.org 
Sent: Wednesday, March 14, 2012 11:38 AM
Subject: Re: [fonc] [IAEP] Barbarians at the gate! (Project Nell)
 

Alan, 

I would go way back to the never implemented Smalltalk-71

Is there a formal specification of what 71 should have been? I have only ever 
read about it in passing reference in the various histories of smalltalk as a 
step on the way to 72, 76, and finally 80. 

I am very intrigued as to what sets 71 apart so dramatically. -Shaun


On Wed, Mar 14, 2012 at 12:29 PM, Alan Kay alan.n...@yahoo.com wrote:

Hi Scott --


1. I will see if I can get one of these scanned for you. Moore tended to 
publish in journals and there is very little of his stuff available on line.


 2.a. "if (a < b) { ... }" is easier to read than "if a < b then ..."? There is
 no hint of the former being tweaked for decades to make it easier to read.


Several experiments from the past cast doubt on the rest of the idea. At 
Disney we did a variety of code display generators to see what kinds of 
transformations we could do to the underlying Smalltalk (including syntactic) 
to make it something that could be subsetted as a growable path from Etoys. 



We got some good results from this (and this is what I'd do with Javascript 
in both directions -- Alex Warth's OMeta is in Javascript and is quite 
complete and could do this).


However, the showstopper was all the parentheses that had to be rendered in 
tiles. Mike Travers at MIT had done one of the first tile based editors for a 
version of Lisp that he used, and this was even worse.


More recently, Jens Moenig (who did SNAP) also did a direct renderer and 
editor for Squeak Smalltalk (this can be tried out) and it really seemed to 
be much too cluttered.


 One argument for some of this is "well, teach the kids a subset that doesn't
 use so many parens ...".  This could be a solution.


However, in the end, I don't think Javascript semantics is particularly good 
 for kids. For example, one of the features of Etoys that turned out to be very
 powerful for children and other Etoy programmers is the easy/trivial parallel
 methods execution. And there are others in Etoys and yet others in Scratch
that are non-standard in regular programming languages but are very powerful 
for children (and some of them are better than standard CS language ideas).
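
(Roughly the effect, as a toy round-robin scheduler in JavaScript -- not
Etoys' actual implementation: every active script advances one step per tick,
so all the scripts appear to run at once and the children never see threads:)

// each "script" is just a step function; tick() runs one step of
// every script, which is what makes them look parallel
var scripts = [];
function addScript(stepFn) { scripts.push(stepFn); }

function tick() {
  scripts.forEach(function (step) { step(); });
}

var x = 0, heading = 0;
addScript(function () { x += 5; });         // "car forward by 5"
addScript(function () { heading += 10; });  // "car turn by 10"

for (var t = 0; t < 3; t++) tick();
console.log(x, heading);                    // 15 30 -- both advanced together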


I'm encouraging you to do something better (that would be ideal). Or at least 
as workable. Giving kids less