[fonc] Re: are lambda's closures?

2007-11-09 Thread Jakob Praher

Steve Folta wrote:

Aren't lambdas supposed to be closures, so that they can
actually access variables defined in lexical scope?


Unlike most other Lisps, lambdas in Jolt are *not* closures.  Jolt's
big insight is that a closureless Lisp makes a great portable
assembly language (better than C, which is often used in that role).


Sorry for my ignorance, but can you give some differences (other than 
meta-programming) that make Lisp-style S-expressions a better portable 
assembly language? Is it because of the way evaluation takes place?
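To make the closure distinction in this thread concrete, here is a small illustration in Python (used only because it has real closures and is easy to run; this is not Jolt code): a closure captures variables from its lexical scope, while a closureless lambda, as in Jolt, can only use its own parameters and therefore maps directly onto plain C-style functions.

```python
def make_counter():
    count = 0                    # free variable captured by the closure
    def increment():
        nonlocal count
        count += 1
        return count
    return increment

counter = make_counter()
print(counter())  # 1
print(counter())  # 2  -- state lives in the captured environment

# Without closures, all state must be passed explicitly, just as in C:
def increment_explicit(count):
    return count + 1
```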





Or is there anything I could do with the _closure
parameter here (what is it for, anyway...)?


That's something different.  http://piumarta.com/pepsi/objmodel.pdf
explains what it is.



___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


[fonc] State of idst and a question about the object-function dichotomy

2009-04-27 Thread Jakob Praher
Dear List,

as a lurker on this list, and generally interested in the topic at hand,
I have lately been asking myself what the next plans regarding idst are. I am
interested in seeing this project progress!

To those who have worked extensively with idst: wouldn't it be easier to
make the AST representable directly in the jolt/function/coke system,
instead of having a separate compiler object (model)? Looking at Clojure,
for instance, I could imagine that adding more powerful first-class
constructs to the Lisp/Scheme-like jolt would eliminate some of these
differences between the compiler object and the rest. For instance,
concise syntax for associative arrays/maps or annotations is, at least to
me, interesting sugar. On the other hand, I like the low-level notion of
the system, which exists so that one does not have to change to some other
language (like C) when implementing system code. This in turn raises the
question of why one has to switch languages to access the current AST
objects...

For instance the simple example:

(syntax begin
(lambda (node compiler)
  `(let () ,@[node copyFrom: '1])))

could be implemented natively via, for instance, Scheme's define-syntax or
some other form of pattern matching on the AST nodes. Here is the
standard R5RS Scheme implementation:

(define-syntax begin
  (syntax-rules ()
    ((_ first ...)
     (let () first ...))))

The above makes the node natively accessible via the ... symbol.
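A rough sketch of the rewrite that macro performs, in Python with nested lists standing in for S-expressions (illustrative only, not a real macro expander): a (begin e1 e2 ...) form is rewritten into (let () e1 e2 ...), with the body captured the way `first ...` is in the pattern.

```python
def expand_begin(form):
    """Expand a (begin ...) form into (let () ...)."""
    assert form[0] == 'begin'
    body = form[1:]              # corresponds to `first ...` in the pattern
    return ['let', []] + body

print(expand_begin(['begin', ['display', 1], ['display', 2]]))
# ['let', [], ['display', 1], ['display', 2]]
```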

My point is that this would make the system more self-contained and
hence easier to learn, no?
I am very interested in hearing from you.

Thanks in advance
Jakob





Re: [fonc] Fonc on Mac Snow Leopard?

2010-05-08 Thread Jakob Praher
Hi Alan,

just out of curiosity: I am wondering why VPRI is not aiming at a more
community-oriented style of innovation. Do you think the communication
effort is not worth the cost, since you do not gain enough or even lose
some freedom and/or speed by discussing architectural concepts more
publicly? Does this imply that, in your opinion, open (source) projects
only work (well) if there is something to be built incrementally (like a
bazaar)?

I am also asking since I am interested in innovation through open
communities. I am wondering why there are not more discussions (which I
am sure you have internally at VPRI) brought onto the lists. Maybe one
could discuss not only the implementation but also the concepts behind the
design?

Having a daytime job, I know that sometimes catching up with a lively
community is a challenge; on the other hand, seeing where things are
going and maybe joining forces at an early stage might be interesting,
no? For instance, there could be other people doing PoC work.

Thanks,
Jakob

On 08.05.10 18:03, Alan Kay wrote:
 Glad you are interested, but don't hold your breath. We've got quite a
 bit more to do this year.

 It's not an incremental project like many open source people are used
 to. We actually throw away much of our code and rewrite with new
 designs pretty often.

 Cheers,

 Alan

 
 *From:* DeNigris Sean s...@clipperadams.com
 *To:* Fundamentals of New Computing fonc@vpri.org
 *Sent:* Sat, May 8, 2010 8:55:29 AM
 *Subject:* Re: [fonc] Fonc on Mac Snow Leopard?

  "We don't plan to wind up using anyone else's GC, so I wouldn't
  worry."

 Not worried - just excited to play with this stuff!

 Sean 




Re: [fonc] Fonc on Mac Snow Leopard?

2010-05-09 Thread Jakob Praher
Personally I was very excited about Carl Hewitt's work on
ActorScript [1] lately. IMHO something like the Lively Kernel could provide
the client infrastructure for this client/cloud computing. What is your
opinion on his work? I also liked the notation he uses in the paper.

What I do not like about JS is its imperative verboseness. I also have
mixed feelings about JSON. I like the idea of being able to express
something more denotational, e.g. in ways that keep the focus on the
simplest solution, and then use something like equational reasoning to
arrive at more performant and complex systems. In the end syntax matters,
and maybe there is no syntax that works for every solution.

So I see high value in building syntax understanding into libraries
(this kind of homoiconicity is what I like about Lisp-like languages).
Also, Gilad Bracha did some work on parser combinators using a Self-like
language called Newspeak [2]. I found the IDE (I think it was called
Hopscotch) very interesting in that it mirrors the browser. Indeed, the
current version is running on top of Squeak. Maybe Lively and Newspeak
could join forces? Taking the idea of Hopscotch beyond a single language
would be great. Mozilla Labs' Bespin project focuses on a kind of Emacs
for the Web. What I would really like to see is some kind of worldwide
live development environment where projects are really living things and
people can take different views on them. With real semantic mappings
between individual notations and syntaxes.
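A minimal sketch of the parser-combinator idea mentioned above, in Python (not Newspeak's actual API; all names here are illustrative): a parser is a function from an input string and position to a (value, new position) pair or None on failure, and combinators like sequencing and alternation build larger parsers out of smaller ones.

```python
def char(c):
    """Parser that matches exactly the character c."""
    def parse(s, i):
        if i < len(s) and s[i] == c:
            return c, i + 1
        return None
    return parse

def seq(*parsers):
    """Match all parsers in order, collecting their results."""
    def parse(s, i):
        values = []
        for p in parsers:
            result = p(s, i)
            if result is None:
                return None
            value, i = result
            values.append(value)
        return values, i
    return parse

def alt(*parsers):
    """Try each parser in turn; first success wins."""
    def parse(s, i):
        for p in parsers:
            result = p(s, i)
            if result is not None:
                return result
        return None
    return parse

# Grammar: greeting = ("h" "i") | ("h" "o")
greeting = alt(seq(char('h'), char('i')), seq(char('h'), char('o')))
print(greeting("ho", 0))   # (['h', 'o'], 2)
print(greeting("ha", 0))   # None
```

The appeal, as in Newspeak, is that the grammar is ordinary code in the host language, so libraries can carry their own syntax understanding with them.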

-- Jakob

[1] - http://arxiv.org/abs/0907.3330
[2] - http://newspeaklanguage.org/

On 09.05.10 01:06, Alan Kay wrote:
 By the way, people on this list should look at Dan Ingalls' Lively
 Kernel. (http://www.lively-kernel.org/)

 Dan is also one of original authors of the NSF proposal for STEPS and
 we claim successes in the Lively Kernel as STEPS successes as well.

 That said, LK is much more like the bootstrapping of Squeak was, in
 that a known architecture was adapted to the purpose of using JS as
 the machine code for a new operating system and live environment.
 Again, there was the small dedicated team at Sun under the direction
 of the master designer/builder (Dan). And once they got a few versions
 bootstrapped they opened it up to interested open sourcers, and there
 is a lively mailing list for Lively.

 We like Lively and pay a lot of attention to it because it covers some
 of the functional ground that needs to be covered for a complete
 system. The main difference is that they are not trying for really
 small really relational models. However, the adaptation of the
 Smalltalk architecture here is very efficient for building things (as
 it was almost 40 years ago). So this is worth looking at.

 And we could imagine this as what STEPS might be like to a community,
 except that we are trying to invent new more compact more powerful
 ways to express programmatic ideas. At best, something like this is
 several years in STEPS' future.

 Cheers,

 Alan


 
 *From:* Jakob Praher j...@hapra.at
 *To:* fonc@vpri.org
 *Sent:* Sat, May 8, 2010 12:25:12 PM
 *Subject:* Re: [fonc] Fonc on Mac Snow Leopard?

 Hi Alan,

 just out of curiosity: I am wondering why VPRI is not aiming at a more
 community-oriented style of innovation. Do you think the communication
 effort is not worth the cost, since you do not gain enough or even
 lose some freedom and/or speed by discussing architectural concepts
 more publicly? Does this imply that, in your opinion, open (source)
 projects only work (well) if there is something to be built
 incrementally (like a bazaar)?

 I am also asking since I am interested in innovation through open
 communities. I am wondering why there are not more discussions (which I
 am sure you have internally at VPRI) brought onto the lists. Maybe one
 could discuss not only the implementation but also the concepts behind
 the design?

 Having a daytime job, I know that sometimes catching up with a lively
 community is a challenge; on the other hand, seeing where things are
 going and maybe joining forces at an early stage might be interesting,
 no? For instance, there could be other people doing PoC work.

 Thanks,
 Jakob

 On 08.05.10 18:03, Alan Kay wrote:
 Glad you are interested, but don't hold your breath. We've got quite
 a bit more to do this year.

 It's not an incremental project like many open source people are used
 to. We actually throw away much of our code and rewrite with new
 designs pretty often.

 Cheers,

 Alan

 
 *From:* DeNigris Sean s...@clipperadams.com
 *To:* Fundamentals of New Computing fonc@vpri.org
 *Sent:* Sat, May 8, 2010 8:55:29 AM
 *Subject:* Re: [fonc] Fonc on Mac Snow Leopard?

  "We don't plan to wind up using anyone else's GC, so I wouldn't
  worry."

 Not worried - just excited to play with this stuff!

 Sean

Re: [fonc] Fonc on Mac Snow Leopard?

2010-05-09 Thread Jakob Praher
Regarding syntax understanding: yes, I am aware of OMeta as well as the
original work on PEGs (and also of the PEG/LEG work by Ian). I see that
OMeta is also used in the Lively Kernel to understand Smalltalk syntax.
Is this just a proof of concept, or is there some attempt to be able to run
Squeak programs in the browser through Lively?

On 09.05.10 23:41, Jakob Praher wrote:
 Personally I was very excited about Carl Hewitt's work on
 ActorScript [1] lately. IMHO something like the Lively Kernel could
 provide the client infrastructure for this client/cloud computing.
 What is your opinion on his work? I also liked the notation he uses in
 the paper.

 What I do not like about JS is its imperative verboseness. I also
 have mixed feelings about JSON. I like the idea of being able to
 express something more denotational, e.g. in ways that keep the focus
 on the simplest solution, and then use something like equational
 reasoning to arrive at more performant and complex systems. In the end,
 syntax matters and maybe there is no syntax that works for every solution.

 So I see high value in building syntax understanding into libraries
 (this kind of homoiconicity is what I like about Lisp-like languages).
 Also, Gilad Bracha did some work on parser combinators using a
 Self-like language called Newspeak [2]. I found the IDE (I think it was
 called Hopscotch) very interesting in that it mirrors the browser.
 Indeed, the current version is running on top of Squeak. Maybe Lively
 and Newspeak could join forces? Taking the idea of Hopscotch beyond a
 single language would be great.  Mozilla Labs' Bespin project focuses
 on a kind of Emacs for the Web. What I would really like to see is
 some kind of worldwide live development environment where projects are
 really living things and people can take different views on them. With
 real semantic mappings between individual notations and syntaxes.

 -- Jakob

 [1] - http://arxiv.org/abs/0907.3330
 [2] - http://newspeaklanguage.org/

 On 09.05.10 01:06, Alan Kay wrote:
 By the way, people on this list should look at Dan Ingalls' Lively
 Kernel. (http://www.lively-kernel.org/)

 Dan is also one of original authors of the NSF proposal for STEPS and
 we claim successes in the Lively Kernel as STEPS successes as well.

 That said, LK is much more like the bootstrapping of Squeak was, in
 that a known architecture was adapted to the purpose of using JS as
 the machine code for a new operating system and live environment.
 Again, there was the small dedicated team at Sun under the direction
 of the master designer/builder (Dan). And once they got a few
 versions bootstrapped they opened it up to interested open sourcers,
 and there is a lively mailing list for Lively.

 We like Lively and pay a lot of attention to it because it covers
 some of the functional ground that needs to be covered for a complete
 system. The main difference is that they are not trying for really
 small really relational models. However, the adaptation of the
 Smalltalk architecture here is very efficient for building things (as
 it was almost 40 years ago). So this is worth looking at.

 And we could imagine this as what STEPS might be like to a community,
 except that we are trying to invent new more compact more powerful
 ways to express programmatic ideas. At best, something like this is
 several years in STEPS' future.

 Cheers,

 Alan


 
 *From:* Jakob Praher j...@hapra.at
 *To:* fonc@vpri.org
 *Sent:* Sat, May 8, 2010 12:25:12 PM
 *Subject:* Re: [fonc] Fonc on Mac Snow Leopard?

 Hi Alan,

 just out of curiosity: I am wondering why VPRI is not aiming at a
 more community-oriented style of innovation. Do you think the
 communication effort is not worth the cost, since you do not gain
 enough or even lose some freedom and/or speed by discussing
 architectural concepts more publicly? Does this imply that, in your
 opinion, open (source) projects only work (well) if there is something
 to be built incrementally (like a bazaar)?

 I am also asking since I am interested in innovation through open
 communities. I am wondering why there are not more discussions (which
 I am sure you have internally at VPRI) brought onto the lists. Maybe
 one could discuss not only the implementation but also the concepts
 behind the design?

 Having a daytime job, I know that sometimes catching up with a lively
 community is a challenge; on the other hand, seeing where things are
 going and maybe joining forces at an early stage might be
 interesting, no? For instance, there could be other people doing PoC work.

 Thanks,
 Jakob

 On 08.05.10 18:03, Alan Kay wrote:
 Glad you are interested, but don't hold your breath. We've got quite
 a bit more to do this year.

 It's not an incremental project like many open source people are
 used to. We actually throw away much of our code and rewrite with
 new designs pretty often.

 Cheers,

 Alan

Re: [fonc] Alan Kay talk at HPI in Potsdam

2011-07-25 Thread Jakob Praher

Dear Alan,
Dear List,

the following very recent announcement might be of interest to this 
discussion: 
http://groups.google.com/group/mozilla.dev.platform/browse_thread/thread/7668a9d46a43e482


To quote Andreas et al.:

 "Mozilla believes that the web can displace proprietary,
   single-vendor stacks for application development.  To make open web
   technologies a better basis for future applications on mobile and
   desktop alike, we need to keep pushing the envelope of the web to
   include --- and in places exceed --- the capabilities of the
   competing stacks in question."

Though there is not much there yet (just a kind of manifesto and a 
readme file on GitHub: https://github.com/andreasgal/B2G), I think this 
is an encouraging development. As the web becomes more and more a walled 
garden of giants, I think we desperately need open APIs. Strong 
open client APIs will hopefully bring more power to individuals. What do you 
think?


Cheers,
-- Jakob



On 24.07.2011 19:24, Alan Kay wrote:

Hi Marcel

I think I've already said a bit about the Web on this list -- mostly 
about the complete misunderstanding of the situation the web and 
browser designers had.


All the systems principles needed for a good design were already 
extant, but I don't think they were known to the designers, even 
though many of them were embedded in the actual computers and 
operating systems they used.


The simplest way to see what I'm talking about is to notice the 
many-many things that could be done on a personal computer/workstation 
that couldn't be done in the web browser running on the very same 
personal computer/workstation. There was never any good reason for 
these differences.


Another way to look at this is from the point of view of separation 
of concerns. A big question in any system is how much does 'Part A' 
have to know about 'Part B' (and vice versa) in order to make things 
happen? The web and browser designs fail on this really badly, and 
have forced set after set of weak conventions into larger and larger, 
but still weak browsers and, worse, onto zillions of web pages on the 
net.


Basically, one of the main parts of good systems design is to try to 
find ways to finesse safe actions without having to know much. So -- 
for example -- Squeak runs everywhere because it can carry all of its 
own resources with it, and the OS processes/address-spaces allow it to 
run safely, but do not have to know anything about Squeak to run it. 
Similarly Squeak does not have to know much to run on every machine - 
just how to get events, a display buffer, and to map its file 
conventions onto the local ones. On a bare machine, Squeak *is* the 
OS, etc. So much for old ideas from the 70s!


The main idea here is that a windowing 2.5 D UI can compose views from 
many sources into a page. The sources can be opaque because they can 
even do their own rendering if needed. Since the sources can run in 
protected address-spaces, their actions can be confined, and we, the 
mini-OS running all this, do not have to know anything about them. This 
is how apps work on personal computers, and there is no reason why 
things shouldn't work this way when the address-spaces come from other 
parts of the net. There would then be no difference between local 
and global apps.


Since parts of the address spaces can be externalized, indexing as 
rich as (and richer than) what we have now can still be done.


And so forth.

The Native Client part of Chrome finally allows what should have been 
done in the first place (we are now about 20+ years after the first 
web proposals by Berners-Lee).  However, this approach will need to be 
adopted by most of the already existing multiple browsers before it 
can really be used in a practical way in the world of personal 
computing -- and there are signs that there is not a lot of agreement 
or understanding why this would be a good thing.


The sad and odd thing is that so many people in the computer field 
were so lacking in "systems consciousness" that they couldn't see 
this, and failed to complain mightily as the web was being set up and 
a really painful genie was being let out of the bottle.


As Kurt Vonnegut used to say, "And so it goes."

Cheers,

Alan


*From:* Marcel Weiher marcel.wei...@gmail.com
*To:* Fundamentals of New Computing fonc@vpri.org
*Cc:* Alan Kay alan.n...@yahoo.com
*Sent:* Sun, July 24, 2011 5:39:26 AM
*Subject:* Re: [fonc] Alan Kay talk at HPI in Potsdam

Hi Alan,

as usual, it was inspiring talking to your colleagues and hearing you 
speak at Potsdam.  I think I finally got the Model-T image, which 
resonated with my fondness for Objective-C:  a language that a 17 year 
old with no experience with compilers or runtimes can implement and 
that manages to boil down dynamic OO/messaging to a single special 
function can't be all bad :-)


There was one question I had on the scaling issue that would 

Re: [fonc] Alan Kay talk at HPI in Potsdam

2011-07-25 Thread Jakob Praher
On 07/25/2011 09:35 PM, Bert Freudenberg wrote:
 I did ask in that thread about exposing the CPU, a la NativeClient. (It's a 
 usenet group so you can post without subscribing, nice)

 Short answer is that they don't see a need for it.
I somehow have mixed feelings about NaCl. I think that safe execution of
native code is a great achievement. Yet the current implementation
somehow still feels a bit like a safer reincarnation of the ActiveX
technology. It defines a kind of abstract toolkit (like ActiveX used the
WIN32 API) that enables you to interact with the user in a definite way
(graphics, audio, events).

I think it fails to achieve a common low-level representation of data
that can be safely used to compose powerful applications. From this
point of view I think that, e.g., message passing (in a pepsi/cola way)
with capability-based security is a much more interesting concept for
hiding powerful computation than having to rely on an IPC (the Pepper
interface in NaCl). Also, people should address introspectability and
debuggability right at the core, e.g. enforce symbols for debugging
in the applications. I think introspectability (the right to "View
Source") is one of the biggest improvements of JavaScript compared to,
e.g., Java.
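A minimal sketch of capability-style message passing in Python (illustrative only, not the pepsi/cola object model or NaCl's API; all names are hypothetical): a capability is an unforgeable reference that exposes only the messages its grantor chose to delegate, so authority is attenuated by what you hand out rather than by an IPC boundary.

```python
class FileStore:
    """A powerful object: holds both read and write authority."""
    def __init__(self):
        self._data = {}
    def read(self, key):
        return self._data.get(key)
    def write(self, key, value):
        self._data[key] = value

def restrict(obj, *selectors):
    """Grant a capability exposing only the listed messages of obj."""
    class Capability:
        pass
    cap = Capability()
    for sel in selectors:
        setattr(cap, sel, getattr(obj, sel))   # delegate just these methods
    return cap

store = FileStore()
store.write('motd', 'hello')

read_only = restrict(store, 'read')    # capability without write authority
print(read_only.read('motd'))          # hello
print(hasattr(read_only, 'write'))     # False
```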

Cheers,
Jakob



Re: [fonc] Ceres and Oberon

2011-08-30 Thread Jakob Praher
On 30.08.11 21:46, Jakob Praher wrote:
 Dear Eduardo,

 Thanks for sharing this. There is a great overlap between Alan's and
 Niklaus Wirth's sentiments.
 Very inspiring and to the point. Is anybody using Oberon currently as a
 working environment?

 @Alan: Can you remember the discussion with Niklaus from the PARC days?

 Best,
 Jakob


 On 30.08.11 20:25, Eduardo Cavazos wrote:
 Presentation from earlier this year by Niklaus Wirth on Oberon:

 http://www.multimedia.ethz.ch/conferences/2011/oberon/?doi=10.3930/ETHZ/AV-5879ee18-554a-4775-8292-3cf0293f5956&autostart=true

 Towards the end Niklaus demos an actual Ceres workstation.




Re: [fonc] Ceres and Oberon

2011-08-30 Thread Jakob Praher
On 30.08.11 22:38, Alan Kay wrote:
 Sure. He was invited to spend a year in CSL in the mid 70s and decided
 to do an Alto like machine with an Alto-like UI and that ran Alto-like
 languages (turned out to be an odd combination of Mesa and Smalltalk).
Did you exchange some ideas? He really appreciated object orientation
when he designed his drawing program. Did you explain Smalltalk to him?
The talk makes me think about the complexity of software and the ability
to understand code.

I think there are two sides:
a) no abstraction at all (assembly code): complicated, since simple
things become huge
b) over-use of abstraction: complicated, since it is hard to see where the
real stuff is going on

Maybe it also has something to do with bottom-up vs. top-down.

Cheers,
Jakob



 Cheers,

 Alan

 
 *From:* Jakob Praher j...@hapra.at
 *To:* Fundamentals of New Computing fonc@vpri.org
 *Sent:* Tuesday, August 30, 2011 1:02 PM
 *Subject:* Re: [fonc] Ceres and Oberon

 On 30.08.11 21:46, Jakob Praher wrote:
  Dear Eduardo,
 
  Thanks for sharing this. There is a great overlap between Alan's and
  Niklaus Wirth's sentiments.
  Very inspiring and to the point. Is anybody using Oberon
 currently as a
  working environment?
 
  @Alan: Can you remember the discussion with Niklaus from the
 PARC days?
 
  Best,
  Jakob
 
 
  On 30.08.11 20:25, Eduardo Cavazos wrote:
  Presentation from earlier this year by Niklaus Wirth on Oberon:
 
 
 
  http://www.multimedia.ethz.ch/conferences/2011/oberon/?doi=10.3930/ETHZ/AV-5879ee18-554a-4775-8292-3cf0293f5956&autostart=true
 
  Towards the end Niklaus demos an actual Ceres workstation.
 
 






Re: [fonc] COLAs or CLOAs? : are lambda systems fundamentally simpler than object systems?

2012-02-12 Thread Jakob Praher
We would have to define what you mean by the term "computation".
Computation is a way to transform a language syntactically by defined
rules. The lambda calculus is a fundamental way of performing such
transformations via reduction rules (the alpha, beta, and eta rules).

In the end, beta-reduction is term substitution. But abstraction and
substitution on a general-purpose von Neumann-style computer have to be
modelled accordingly: a variable in the computer is a memory location/a
register that can be updated (but it is not a 1:1 correspondence). E.g.
a function in a computer is a jump to a certain code location, having to
write to certain locations in memory/registers to get the arguments passed.
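The "beta-reduction is term substitution" point can be sketched in a few lines of Python (illustrative only; terms are encoded as tuples, and capture-avoiding alpha-renaming is deliberately omitted for brevity):

```python
# Lambda terms as tuples: ('var', name), ('lam', name, body), ('app', f, arg).

def substitute(term, name, value):
    """Replace free occurrences of `name` in `term` with `value`."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        if term[1] == name:          # `name` is rebound here; stop
            return term
        return ('lam', term[1], substitute(term[2], name, value))
    if kind == 'app':
        return ('app', substitute(term[1], name, value),
                       substitute(term[2], name, value))

def beta_reduce(term):
    """One beta step: ((lam x. body) arg)  ==>  body[x := arg]."""
    if term[0] == 'app' and term[1][0] == 'lam':
        _, (_, x, body), arg = term
        return substitute(body, x, arg)
    return term

# (lambda x. x x) applied to y reduces to (y y):
redex = ('app', ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))), ('var', 'y'))
print(beta_reduce(redex))   # ('app', ('var', 'y'), ('var', 'y'))
```

On a real machine this substitution is of course compiled away into argument-passing through registers and memory, which is exactly the mismatch described above.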

IMHO the computational model of objects and method dispatch is more of a
black-box / communication-oriented model. One does not know much about
the destination; one dispatches a message and interprets the result. In
functional languages the model is more white-box: one can always
decompose a term into subterms and interpret it. Therefore functional
languages do not extend easily to distributed programming, where
knowledge of the terms is limited.
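The black-box dispatch model, and the lookup()/apply() split discussed later in this thread, can be made concrete with a small Python sketch (illustrative only): a message send re-runs a late-bound lookup() against the receiver on every send, while an early-bound call resolves its target once and then apply() is just a plain call.

```python
def lookup(receiver, selector):
    """Late-bound: resolve the selector against the receiver's class
    at send time."""
    cls = type(receiver)
    method = getattr(cls, selector, None)
    if method is None:
        raise AttributeError(f"{cls.__name__} does not understand {selector}")
    return method

def send(receiver, selector, *args):
    """Message send = lookup() followed by apply()."""
    return lookup(receiver, selector)(receiver, *args)

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def magnitude(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5

p = Point(3, 4)
print(send(p, 'magnitude'))       # 5.0 -- lookup happens at send time

# Early binding: resolve once ("link-time"), then apply() is a plain call.
bound = Point.magnitude
print(bound(p))                   # 5.0
```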

Best,
Jakob

On 12.02.12 03:20, Steve Wart wrote:
 I think of OO as an organization mechanism. It doesn't add
 fundamentally to computation, but it allows complexity to be managed
 more easily.

 On Sat, Feb 11, 2012 at 5:23 PM, Kurt Stephens k...@kurtstephens.com wrote:


 COLAs or CLOAs? : are lambda systems fundamentally simpler than
 object systems?

 Should Combined-Object-Lambda-Architecture really be
 Combined-Lambda-Object-Architecture?

 Ian Piumarta's IDST bootstraps an object system, then a compiler,
 then a lisp evaluator.  Maru bootstraps a lisp evaluator, then
 crafts an object system, then a compiler.  Maru is much smaller
 and more elegant than IDST.

 Are object systems necessarily more complex than lambda
 evaluators?  Or is this just another demonstration of how Lisp
 code/data unification is more powerful?

 If message send and function calls are decomposed into lookup()
 and apply(), the only difference between basic OO message-passing
 and function calling is lookup(): the former is late-bound, the
 latter is early-bound (in the link editor, for example).  Is OO
 lookup() the sole complicating factor?  Is a lambda-oriented
 compiler fundamentally less complex than an OO compiler?
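The lookup()/apply() decomposition can be made concrete. The following is a sketch of my own (not TORT or IDST code) in which the only difference between an early-bound call and a late-bound send is *when* lookup() runs:

```python
# Early-bound call: lookup happens once, "at link time".
# Late-bound send: lookup happens at every call, against the
# receiver's (mutable) method table -- the OO complication.

def apply_method(fn, receiver, *args):
    return fn(receiver, *args)

def lookup(receiver_class, selector, method_tables):
    """Late-bound lookup: walk the class chain at call time."""
    cls = receiver_class
    while cls is not None:
        table, parent = method_tables[cls]
        if selector in table:
            return table[selector]
        cls = parent
    raise AttributeError(selector)

def send(receiver, selector, method_tables, *args):
    fn = lookup(receiver["class"], selector, method_tables)  # on every call
    return apply_method(fn, receiver, *args)

# A tiny two-class system: Pixel inherits 'area' from Point.
method_tables = {
    "Point": ({"area": lambda self: self["w"] * self["h"]}, None),
    "Pixel": ({}, "Point"),   # no methods of its own
}
p = {"class": "Pixel", "w": 3, "h": 4}
print(send(p, "area", method_tables))  # -> 12
```

An "early-bound" caller would run lookup() once, cache the resulting function, and thereafter use apply_method() directly; everything else stays identical, which is the point of the question above.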

 I took the object-lambda approach in TORT
 (http://github.com/kstephens/tort), tried to keep the OO kernel
 small, and delayed the compiler until after the lisp evaluator.
 The object system started out tiny, but supporting the lisp
 evaluator, created in an OO style (on which the REPL and compiler
 are built), required a lot of basic foundational object
 functionality.  Despite its name, TORT is no longer tiny; I
 probably didn't restrain myself enough, and it tries too hard to
 support C extension and dynamic linking.

 Did Gregor Kiczales, Ian, and others stumble upon the benefits of
 lisp-object bootstrapping vs. object-lisp bootstrapping?  I've
 written object-oriented LISPs before
 (http://github.com/kstephens/ll, based on ideas from OAKLISP).  Do
 OO techniques make language implementation feel easier in the
 beginning, only to complect later on?

  Just some ideas,
  Kurt Stephens

 http://kurtstephens.com/node/154






 ___
 fonc mailing list
 fonc@vpri.org
 http://vpri.org/mailman/listinfo/fonc


___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Error trying to compile COLA

2012-02-28 Thread Jakob Praher
Dear Alan,

On 28.02.12 14:54, Alan Kay wrote:
 Hi Ryan

 Check out Smalltalk-71, which was a design to do just what you suggest
 -- it was basically an attempt to combine some of my favorite
 languages of the time -- Logo and Lisp, Carl Hewitt's Planner, Lisp
 70, etc.
do you have a detailed documentation of Smalltalk-71 somewhere?
Something like a Smalltalk-71 for Smalltalk-80 programmers :-)
In the early history of Smalltalk you mention it as

 It was a kind of parser with object-attachment that executed tokens
directly.

From the examples I think that do 'expr' evaluates expr using a
previously defined to 'ident' :arg1 .. :argN body rule.

As an example, do 'factorial 3' should evaluate to 6, given:

to 'factorial' 0 is 1
to 'factorial' :n do 'n*factorial n-1'
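Under that reading, evaluation is rewriting by ordered, pattern-matched rules. A speculative reconstruction of the dispatch in Python (my own sketch of the idea, not actual Smalltalk-71 semantics):

```python
# Rules are tried in order: a literal pattern element must match
# exactly, a ':name' element binds the corresponding argument.
rules = [
    (("factorial", 0), lambda env: 1),
    (("factorial", ":n"),
     lambda env: env["n"] * evaluate(("factorial", env["n"] - 1))),
]

def match(pattern, term):
    if len(pattern) != len(term):
        return None
    env = {}
    for p, t in zip(pattern, term):
        if isinstance(p, str) and p.startswith(":"):
            env[p[1:]] = t          # pattern variable binds
        elif p != t:
            return None             # literal must match exactly
    return env

def evaluate(term):
    for pattern, body in rules:
        env = match(pattern, term)
        if env is not None:
            return body(env)
    raise ValueError(f"no rule matches {term}")

print(evaluate(("factorial", 3)))  # -> 6
```

Note that this sketch, like most rewriting systems, works on a pre-parsed (prefix) representation, which is exactly the contrast with Smalltalk-71's concrete-syntax rules raised below.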

What about arithmetic and precedence: what part of the language was built
into the system?
- :var denotes a variable, whereas var denotes the instantiated value of
:var in the expression, e.g. :n vs. 'n-1'
- '' denotes simple tokens (in the head) as well as expressions (in
the body)?
- to and do are keywords
- () can be used for precedence

You described evaluation as straightforward pattern-matching.
It somehow reminds me of a term rewriting system; e.g. 'hd' ('cons' :a
:b) '←' :c is a structured term.
I know rewriting systems that first parse into an abstract
representation (e.g. prefix form) and transform on the abstract syntax,
whereas in Smalltalk-71 the concrete syntax seems to be used in the rules.

Also it seems redundant to have both:
to 'hd' ('cons' :a :b) do 'a'
and
to 'hd' ('cons' :a :b) '←' :c do 'a ← c'

Is this made to ensure that the left-hand side of ← has to be a hd
('cons' :a :b) expression?

Best,
Jakob


 This never got implemented because of a bet that turned into
 Smalltalk-72, which also did what you suggest, but in a less
 comprehensive way -- think of each object as a Lisp closure that could
 be sent a pointer to the message and could then parse-and-eval that. 

 A key to scaling -- that we didn't try to do -- is semantic typing
 (which I think is discussed in some of the STEPS material) -- that is:
 to be able to characterize the meaning of what is needed and produced
 in terms of a description rather than a label. Looks like we won't get
 to that idea this time either.

 Cheers,

 Alan

 
 *From:* Ryan Mitchley ryan.mitch...@gmail.com
 *To:* fonc@vpri.org
 *Sent:* Tuesday, February 28, 2012 12:57 AM
 *Subject:* Re: [fonc] Error trying to compile COLA

 On 27/02/2012 19:48, Tony Garnock-Jones wrote:

 My interest in it came out of thinking about integrating pub/sub
 (multi- and broadcast) messaging into the heart of a language.
 What would a Smalltalk look like if, instead of a strict unicast
 model with multi- and broadcast constructed atop (via
 Observer/Observable), it had a messaging model capable of
 natively expressing unicast, anycast, multicast, and broadcast
 patterns? 


  I've wondered whether pattern matching shouldn't be a foundation of
  method resolution (akin to binding with backtracking in Prolog):
  if a multicast message matches, the method is invoked (with much
  less specificity than traditional method resolution by
  name/token). This is maybe closer to the biological model of a
  cell-surface receptor.

  Of course, complexity is an issue with this approach (potentially
  NP-complete).

 Maybe this has been done and I've missed it.
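As a thought experiment, such receptor-style resolution might look like the following sketch (entirely hypothetical names; predicates over message content replace selector lookup, and every matching handler fires):

```python
# Handlers register a predicate over the whole message; a message is
# delivered to every handler whose predicate matches (multicast),
# rather than to the single method named by a selector.

handlers = []

def receptor(predicate):
    """Register a handler guarded by a pattern over the message."""
    def register(fn):
        handlers.append((predicate, fn))
        return fn
    return register

def broadcast(message):
    results = []
    for predicate, fn in handlers:
        if predicate(message):       # match on content, not on name
            results.append(fn(message))
    return results

@receptor(lambda m: m.get("kind") == "temperature" and m["value"] > 30)
def overheating(m):
    return f"cooling on ({m['value']}C)"

@receptor(lambda m: m.get("kind") == "temperature")
def logger(m):
    return f"logged {m['value']}C"

print(broadcast({"kind": "temperature", "value": 35}))
# -> ['cooling on (35C)', 'logged 35C']
```

The complexity worry above shows up immediately: every send scans every predicate, and with richer patterns (unification, backtracking) the matching itself can dominate.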








___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Natural Language Wins

2013-04-04 Thread Jakob Praher
On 04.04.13 22:53, John Carlson wrote:

 Natural languages include tenses.  What computer systems have a wide
 variety of tenses?

John McCarthy analyzed this in his description of Elephant 2000 [1], in
the sentence "Algolic programs refer to the past via variables, arrays
and other data structures."

The maths vs. natural language discussion boils down to the
interpretation of meaning. In natural language, the meaning of an
expression is typically the sender's intent to create that meaning in
the world of the receiver. In How to Do Things with Words, J. L. Austin
showed [2] that we use language to do things as well as to assert
things. This interpretation of the meaning of language is called the
theory of speech acts. Mathematics, on the other hand, is a formal
language, and every expression is (or should be) based on well-defined
definitions and on theorems proven from axioms and laws. Attention: I am
not saying that one cannot express speech-act models formally; one has
to take the participating agents' knowledge, goals, and beliefs into
account.

With Elephant 2000, John envisioned a system that works based on speech
acts [3]. He writes further: "The nature of the interaction arises from
the fact that the different agents have different goals, knowledge and
capabilities, and an agent's achieving its goals requires interaction
with others. The nature of the required interactions determines the
speech acts required. Many facts about what speech acts are required are
independent of whether the agent is man or machine."

Best,
Jakob

[1] -
http://www-formal.stanford.edu/jmc/elephant/node3.html#SECTION0003
[2] - http://en.wikipedia.org/wiki/How_to_Do_Things_with_Words
[3] -
http://www-formal.stanford.edu/jmc/elephant/node2.html#SECTION0002


___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc