Re: [fonc] Error trying to compile COLA

2012-02-27 Thread Martin Baldan
David,

Thanks for the link. Indeed, now I see how to run eval with the .l example
files. There are also .k files; I don't know how they differ from the .l
files, except that .k files are run with "./eval filename.k" while .l files
are run with "./eval repl.l filename.l", where filename is the name of the
file. Both kinds seem to be made of Maru code.

I still don't know how to go from here to a Frank-like GUI. I'm reading
other replies which seem to point that way. All tips are welcome ;)

-Martin


On Mon, Feb 27, 2012 at 3:54 AM, David Girle davidgi...@gmail.com wrote:

 Take a look at the page:

 http://piumarta.com/software/maru/

 it has the original version you have + current.
 There is a short readme in the current version with some examples that
 will get you going.

 David



Re: [fonc] Error trying to compile COLA

2012-02-27 Thread Reuben Thomas
On 27 February 2012 14:01, Martin Baldan martino...@gmail.com wrote:

 I still don't know how to go from here to a Frank-like GUI. I'm reading
 other replies which seem to point that way. All tips are welcome ;)

And indeed, maybe any discoveries could be written up at one of the wikis?

http://vpri.org/fonc_wiki/index.php/Main_Page
http://www.vpri.org/vp_wiki/index.php/Main_Page

There's a lot of exciting stuff to learn about here, but the tedious
details of how to build it are not among them!

-- 
http://rrt.sc3d.org


Re: [fonc] Error trying to compile COLA

2012-02-27 Thread Alan Kay
Hi Julian

I should probably comment on this, since it seems that the STEPS reports 
haven't made it clear enough.

STEPS is a science experiment, not an engineering project. 


It is not at all about making and distributing an operating system etc., but 
about trying to investigate the tradeoffs between problem oriented languages 
that are highly fitted to problem spaces vs. what it takes to design them, 
learn them, make them, integrate them, add pragmatics, etc.

Part of the process is trying many variations in interesting (or annoying) 
areas. Some of these have been rather standalone, and some have had some 
integration from the start.

As mentioned in the reports, we made Frank -- tacking together some of the POLs 
that were done as satellites -- to try to get a better handle on what an 
integration language might be like that is much better than the current use of 
Squeak. It has been very helpful to get something that is evocative of the 
whole system working in a practical enough manner to use it (instead of PPT 
etc.) to give talks that involve dynamic demos. We got some good ideas from this.


But this project is really about following our noses, partly via getting 
interested in one facet or another (since there are really too many for just a 
few people to cover all of them). 


For example, we've been thinking for some time that the pretty workable DBjr 
system that is used for visible things -- documents, UI, etc. -- should be 
almost constructable by hand if we had a better constraint system. This would 
be the third working DBjr made by us ...


And -- this year is the 50th anniversary of Sketchpad, which has also got us 
re-thinking about some favorite old topics, etc.

This has led us to start putting constraint engines into STEPS, thinking about 
how to automatically organize various solvers, what kinds of POLs would be nice 
to make constraint systems with, UIs for same, and so forth. Intellectually 
this is kind of interesting because there are important overlaps between the 
functions + time stamps approach of many of our POLs and with constraints and 
solvers.

This looks very fruitful at this point!


As you said at the end of your email: this is not an engineering project, but a 
series of experiments.


One thought we had about this list is that it might lead others to conduct 
similar experiments. Just to pick one example: Reuben Thomas' thesis Mite (ca 
2000) has many good ideas that apply here. To quote from the opening: "Mite is 
a virtual machine intended to provide fast language and machine-neutral 
just-in-time translation of binary-portable object code into high quality 
native code, with a formal foundation." So one interesting project could be to 
try going from Nile down to a CPU via Mite. Nile is described in OMeta, so 
this could be a graceful transition, etc.

In any case, we spend most of our time trying to come up with ideas that might 
be powerful for systems design and ways to implement them. We occasionally 
write a paper or an NSF report. We sometimes put out code so people can see 
what we are doing. But what we will put out at the end of this period will be 
very different -- especially in the center -- than what we did for the center 
last year.

Cheers and best wishes,

Alan






 From: Julian Leviston jul...@leviston.net
To: Fundamentals of New Computing fonc@vpri.org 
Sent: Saturday, February 25, 2012 6:48 PM
Subject: Re: [fonc] Error trying to compile COLA
 

As I understand it, Frank is an experiment that is an extended version of DBJr 
that sits atop lesserphic, which sits atop gezira, which sits atop nile, which 
sits atop maru, all of which utilise ometa and the Worlds idea.


If you look at the http://vpri.org/html/writings.php page you can see a 
pattern of progression that has emerged to the point where Frank exists. From 
what I understand, maru is the finalisation of what began as pepsi and coke. 
Maru is a simple s-expression language, in the same way that pepsi and coke 
were. In fact, it looks to have the same syntax. "Nothing" is the layer 
underneath, essentially a symbolic computer sitting between maru and 
the actual machine code (sort of like an LLVM assembler, if I've understood it 
correctly).


They've hidden Frank in plain sight. He's a patch-together of all their 
experiments so far... which I'm sure you could do if you took the time to 
understand each of them and had the inclination. They've been publishing as 
much as they could all along. The point, though, is you have to understand 
each part. It's no good if you don't understand it.


If you know anything about Alan & VPRI's work, you'd know that their focus is 
on getting this stuff in front of as many children as possible, because 
they have so much more ability to connect to the heart of a problem than 
adults. (Nothing to do with age - talking about minds, not bodies here.) 
Adults usually get in the way with their stuff - their 

Re: [fonc] Error trying to compile COLA

2012-02-27 Thread Steve Wart
Just to zero in on one idea here


  Anyway I digress... have you had a look at this file?:

  http://piumarta.com/software/maru/maru-2.1/test-pepsi.l

 Just read the whole thing - I found it fairly interesting :) He's built
 pepsi on maru there... that's pretty fascinating, right? Built a micro
 smalltalk on top of the S-expression language... and then does a Fast
 Fourier Transform test using it...

   my case: looked some, but not entirely sure how it works though.


See the comment at the top:

./eval repl.l test-pepsi.l

eval.c is written in C, it's pretty clean code and very cool. Then eval.l
does the same thing in a lisp-like language.

I was playing with The Little Schemer with my son this weekend - if you fire
up the repl, the cons, car, cdr stuff all works as expected.

Optionally check out the wikipedia article on PEGs and look at the COLA
paper if you can find it.

Anyhow, it's all self-contained, so if you can read C code and understand a
bit of Lisp, you can watch how the syntax morphs into Smalltalk. Or any
other language you feel like writing a parser for.

This is fantastic stuff.

Steve


Re: [fonc] Error trying to compile COLA

2012-02-27 Thread David Harris
Alan ---

I appreciate both your explanation and what you are doing.  Of course
jealousy comes into it, because you guys appear to be having a lot of fun
mixed in with your hard work, and I would love to be part of that.  I know
where I would be breaking down the doors if I were starting a masters or
doctorate.  However, I made my choices a long time ago, and so will have
to live vicariously through your reports.  The constraint system, a la
Sketchpad, is a laudable experiment and I would love to see a
hand-constructible DBjr.  You seem to be approaching a much more
understandable and malleable system, and achieving more of the promise of
computers as imagined in the sixties and seventies, rather than the more
mundane and opaque conglomerate that is generally the case now.

Keep up the excellent work,
David


On Monday, February 27, 2012, Alan Kay alan.n...@yahoo.com wrote:
 Hi Julian
 I should probably comment on this, since it seems that the STEPS reports
haven't made it clear enough.
 STEPS is a science experiment, not an engineering project. [...]

Re: [fonc] Error trying to compile COLA

2012-02-27 Thread Tony Garnock-Jones
Hi Alan,

On 27 February 2012 11:32, Alan Kay alan.n...@yahoo.com wrote:

 [...] a better constraint system. [...] This has led us to start putting
 constraint engines into STEPS, thinking about how to automatically organize
 various solvers, what kinds of POLs would be nice to make constraint
 systems with, UIs for same, and so forth.


Have you looked into the Propagators of Radul and Sussman? For example,
http://dspace.mit.edu/handle/1721.1/44215. Radul's approach is closely related
to dataflow, with a lattice defined at each node in the graph for
integrating the messages that are sent to it. He's built FRP systems, type
checkers, type inferencers, abstract interpretation systems and lots of
other fun things in a nice, simple way, out of this core construct that
he's placed near the heart of his language's semantics.
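
For anyone on the list who hasn't read the thesis, here is a minimal sketch
of the core construct -- in Python rather than Radul's Scheme, and purely
illustrative (the names and the interval example are mine, not from the
thesis): cells merge incoming information with a lattice join, and
propagators re-fire until the network reaches a fixpoint.

# Minimal propagator-network sketch (illustrative only; the real system
# is in Scheme and far richer). A cell merges new information into its
# current content with a lattice join; propagators re-run until nothing
# changes (iterate-to-fixpoint).

class Cell:
    def __init__(self):
        self.content = None            # None = "nothing known yet"
        self.listeners = []            # propagators to wake on change

    def add_content(self, info, agenda):
        merged = info if self.content is None else merge(self.content, info)
        if merged != self.content:
            self.content = merged
            agenda.extend(self.listeners)

def merge(a, b):
    # Lattice join for intervals (lo, hi): intersection narrows knowledge.
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    assert lo <= hi, "contradiction"
    return (lo, hi)

def propagator(inputs, output, fn, agenda):
    def run():
        if all(c.content is not None for c in inputs):
            output.add_content(fn(*[c.content for c in inputs]), agenda)
    for c in inputs:
        c.listeners.append(run)
    agenda.append(run)

def run_to_fixpoint(agenda):
    while agenda:
        agenda.pop()()                 # cells re-schedule listeners as needed

# Example: fahrenheit = celsius * 9/5 + 32, running in both directions.
agenda = []
c, f = Cell(), Cell()
propagator([c], f, lambda i: (i[0] * 9 / 5 + 32, i[1] * 9 / 5 + 32), agenda)
propagator([f], c, lambda i: ((i[0] - 32) * 5 / 9, (i[1] - 32) * 5 / 9), agenda)
c.add_content((0.0, 100.0), agenda)    # partial knowledge at one end
f.add_content((50.0, 122.0), agenda)   # partial knowledge at the other
run_to_fixpoint(agenda)
print(c.content, f.content)            # (10.0, 50.0) (50.0, 122.0)

Note how partial information entering at either end narrows both cells
consistently; that is the lattice-based integration doing its work.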

My interest in it came out of thinking about integrating pub/sub (multi-
and broadcast) messaging into the heart of a language. What would a
Smalltalk look like if, instead of a strict unicast model with multi- and
broadcast constructed atop (via Observer/Observable), it had a messaging
model capable of natively expressing unicast, anycast, multicast, and
broadcast patterns? Objects would be able to collaborate on responding to
requests... anycast could be used to provide contextual responses to
requests... concurrency would be smoothly integrable... more research to be
done :-)
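
As a toy illustration of what that might look like (everything here is
hypothetical and invented for the sketch, not a worked-out design): a single
send primitive whose cast mode decides which subscribers see the message.

# Toy sketch of the routing idea; Python standing in for the imagined
# Smalltalk. One send primitive, four cast modes.
import itertools, random

class Exchange:
    def __init__(self):
        self.subs = {}                        # topic -> list of handlers

    def subscribe(self, topic, handler):
        self.subs.setdefault(topic, []).append(handler)

    def send(self, topic, msg, mode="unicast"):
        handlers = self.subs.get(topic, [])
        if mode == "broadcast":               # every subscriber, every topic
            handlers = list(itertools.chain.from_iterable(self.subs.values()))
        elif mode == "anycast":               # any one subscriber may answer
            handlers = [random.choice(handlers)] if handlers else []
        elif mode == "unicast":               # exactly one receiver expected
            assert len(handlers) == 1, "unicast needs a unique receiver"
        # "multicast" falls through: all subscribers on this topic
        return [h(msg) for h in handlers]

ex = Exchange()
ex.subscribe("time", lambda m: "12:00 UTC")   # two objects collaborate
ex.subscribe("time", lambda m: "13:00 CET")
print(ex.send("time", "?", mode="anycast"))   # one contextual response
print(ex.send("time", "?", mode="multicast")) # all responses collected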

Regards,
  Tony
-- 
Tony Garnock-Jones
tonygarnockjo...@gmail.com
http://homepages.kcbbs.gen.nz/tonyg/


Re: [fonc] Error trying to compile COLA

2012-02-27 Thread BGB

On 2/27/2012 10:30 AM, Steve Wart wrote:

Just to zero in on one idea here



Anyway I digress... have you had a look at this file?:

http://piumarta.com/software/maru/maru-2.1/test-pepsi.l

Just read the whole thing - I found it fairly interesting :) He's
built pepsi on maru there... that's pretty fascinating, right?
Built a micro smalltalk on top of the S-expression language...
and then does a Fast Fourier Transform test using it...


my case: looked some, but not entirely sure how it works though.


See the comment at the top:
./eval repl.l test-pepsi.l
eval.c is written in C, it's pretty clean code and very cool. Then 
eval.l does the same thing in a lisp-like language.


Was playing with the Little Schemer with my son this weekend - if you 
fire up the repl, cons, car, cdr stuff all work as expected.




I realized I could rip the filename off the end of the URL to get the 
directory listing, and grabbed the C file.


initial/quick observations:
apparently it uses the Boehm GC;
the type system works a bit differently than my stuff, but seems to expose a 
vaguely similar interface (except I tend to put 'dy' on the front of 
everything here, so dycar(), dycdr(), dycaddr(), and most 
predicates have names like dyconsp() and similar, and often I 
type-check using strings rather than an enum, ...);
the parser works a bit differently than my S-Expression parser (mine 
tend to be a bit more if/else, and read characters typically either 
from strings or stream objects);

it uses ANSI codes with raw escape characters (my text editor was not 
entirely happy);
mixed tabs and spaces lead to not-very-good formatting;
it is a simplistic interpreter, albeit it is not entirely clear how the 
built-in functions get dispatched;

...

a much more significant difference:
in my code, this sort of functionality is spread over many different 
areas (over several different DLLs), so one wouldn't find all of it in 
the same place.

it will likely require more looking to figure out how the parser or syntax 
changing works (none of my parsers do this; most are fixed-form and 
typically shun context-dependent parsing).



some of my earlier/simpler interpreters were like this though, vs newer 
ones which tend to have a longer multiple-stage translation pipeline, 
and which make use of bytecode.



Optionally check out the wikipedia article on PEGs and look at the 
COLA paper if you can find it.




PEGs: apparently I may have been using them informally already (thinking 
they were EBNF), although I haven't used them for directly driving a 
parser. Typically, they have been used occasionally for describing 
elements of the syntax (in documentation and similar), at least when not 
using the lazier system of giving syntax via tables of examples.

it may require more looking to try to better clarify the difference between 
a PEG and EBNF...
(the only difference I saw listed was that PEG choice is ordered, but I would 
have assumed that a parser based on EBNF would have been implicitly 
ordered anyways, hmm...).
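
For what it's worth, the practical difference is that a PEG's ordered choice
commits to the first alternative that matches and never reconsiders it,
whereas EBNF/CFG alternation is unordered and admits ambiguity. A tiny
hand-rolled matcher (my own toy in Python, nothing to do with Maru's
implementation) shows where the two part ways:

# PEG-style combinators: a parser maps (text, pos) to a new pos, or
# None on failure.
def lit(s):
    return lambda text, pos: pos + len(s) if text.startswith(s, pos) else None

def ordered_choice(*alts):
    def parse(text, pos):
        for alt in alts:               # try alternatives strictly in order
            result = alt(text, pos)
            if result is not None:
                return result          # commit: no backtracking past here
        return None
    return parse

def seq(*parts):
    def parse(text, pos):
        for part in parts:
            pos = part(text, pos)
            if pos is None:
                return None
        return pos
    return parse

# grammar: S <- ('a' / 'ab') 'c'
S = seq(ordered_choice(lit("a"), lit("ab")), lit("c"))
print(S("ac", 0))    # 2    -> matches
print(S("abc", 0))   # None -> 'a' matched first and committed; 'b' != 'c'
# In EBNF, S ::= ("a" | "ab") "c" accepts both strings, since alternation
# is unordered and a general CFG parser may try the 'ab' branch too.

So an unambiguous grammar driven top-to-bottom by recursive descent does
behave like a PEG; the difference only bites when a committed earlier
alternative hides a longer later one, as above.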



Anyhow, it's all self-contained, so if you can read C code and 
understand a bit of Lisp, you can watch how the syntax morphs into 
Smalltalk. Or any other language you feel like writing a parser for.


This is fantastic stuff.



following the skim and some more looking, I think I have a better idea of 
how it works.

I will infer:
the Lisp-like code at the top defines behavior;
the syntax in the middle defines the syntax (as the comment says);
(somehow) the parser invokes the new syntax, internally converting it 
into the Lisp-like form, which is what gets executed.



so, seems interesting enough...


if so, my VM is vaguely similar, albeit without the syntax definition or 
variable parser (the parser for my script language is fixed-form and 
written in C, but does parse to a Scheme-like AST system).


the assumption would have been that if someone wanted a parser for a new 
language, they would write one, assuming the semantics mapped tolerably 
to the underlying VM (exactly matching the semantics of each language 
would be a little harder though).


theoretically, nothing would really prevent writing a parser in the 
scripting language, just I had never really considered doing so (or, for 
that matter, even supporting user-defined syntax elements in the main 
parser).



the most notable difference between my ASTs and Lisp or Scheme:
all forms are special forms, and function calls need to be made via a 
special form (this was mostly to help better detect problems; see the 
sketch below);
operators were also moved to special forms, for similar reasons;
there are lots more special forms, most mapping to HLL constructs (for, 
while, break, continue, ...);

...
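
To make that concrete, here is a hypothetical sketch (the tag names are
invented for illustration and are not my actual VM's): every node is a
tagged special form, and even a call goes through an explicit form, so a
malformed head is caught immediately rather than being silently treated as
a function position.

# Hypothetical AST-evaluator sketch: all forms are special forms.
def eval_ast(node, env):
    tag = node[0]
    if tag == "lit":                   # ("lit", value)
        return node[1]
    if tag == "var":                   # ("var", name)
        return env[node[1]]
    if tag == "add":                   # operator as its own special form
        return eval_ast(node[1], env) + eval_ast(node[2], env)
    if tag == "call":                  # calls only via an explicit form
        fn = env[node[1]]
        args = [eval_ast(a, env) for a in node[2:]]
        return fn(*args)
    raise ValueError("unknown special form: %r" % (tag,))

env = {"x": 3, "sq": lambda v: v * v}
# sq(x + 1)  ==  ("call", "sq", ("add", ("var", "x"), ("lit", 1)))
print(eval_ast(("call", "sq", ("add", ("var", "x"), ("lit", 1))), env))  # 16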

as-is, there are also a large number of bytecode operations, many 
related to common special cases.

for example, a recent addition called jmp_cond_sweq reduces several 
instructions related to switch into a single operation, partly 
intended for micro-optimizing (why use 3 opcodes when only 1 is needed?), 
and also partly intended to be used as a VM 

Re: [fonc] Error trying to compile COLA

2012-02-27 Thread Alan Kay
Hi Tony

Yes, I've seen it. As Gerry says, it is an extension of Guy Steele's thesis. 
When I read this, I wished for a more interesting, comprehensive and 
wider-ranging and -scaling example to help think with. 


One reason to put up with some of the problems of defining things using 
constraints is that if you can organize things well enough, you get super 
clarity and simplicity and power.

They definitely need a driving example that has these traits. There is a 
certain tinge of the Turing Tarpit to this paper.

With regard to objects, my current prejudice is that objects should be able to 
receive messages, but should not have to send to explicit receivers. This is a 
kind of multi-cast I guess (but I think of it more like publish/subscribe).


Cheers,

Alan






 From: Tony Garnock-Jones tonygarnockjo...@gmail.com
To: Alan Kay alan.n...@yahoo.com; Fundamentals of New Computing 
fonc@vpri.org 
Sent: Monday, February 27, 2012 9:48 AM
Subject: Re: [fonc] Error trying to compile COLA
 

Hi Alan,


Have you looked into the Propagators of Radul and Sussman? For example, 
http://dspace.mit.edu/handle/1721.1/44215. [...]




Re: [fonc] Error trying to compile COLA

2012-02-27 Thread David Girle
I am interested in the embedded uses of Maru, so I cannot comment on
how to get from here to a Frank-like GUI.  I have no idea how many
others on this list are interested in the Internet of Things (IoT),
but I expect parts of Frank will be useful in that space.  Maybe 5kLOC
will bring up a connected, smart sensor system, rather than the 20kLOC
target VPRI have in mind for a programming system.

David

On Mon, Feb 27, 2012 at 7:01 AM, Martin Baldan martino...@gmail.com wrote:
 David,

 Thanks for the link. Indeed, now I see how to run eval with the .l example
 files. [...]



Re: [fonc] Error trying to compile COLA

2012-02-27 Thread BGB

On 2/27/2012 10:31 AM, David Harris wrote:

Alan ---

I appreciate both your explanation and what you are doing. [...] You seem 
to be approaching a much more understandable and malleable system, and 
achieving more of the promise of computers as imagined in the sixties 
and seventies, rather than what seems to be the more mundane and 
opaque conglomerate that is generally the case now.




WRT the mundane opaque conglomerate:
I had sort of assumed that is how things always were and (presumably) 
always would be, just with the assumption that things were becoming more 
open due to both the existence of FOSS and the rising popularity of 
scripting languages (Scheme, JavaScript, Python, ...).

this is in contrast to the huge expensive closed systems of the past 
(where the only people who really knew how any of it worked were the 
vendors, and most of this information was kept secret).


I had sort of guessed that the push towards small closed embedded 
systems (such as smart-phones) was partly a move to try to 
promote/regain vendor control over the platforms (vs the relative 
user-freedom present on PCs).



(I don't know if I will ever go for a masters or doctorate though, as I 
have been in college long enough just going for an associates degree, 
and assume I should be trying to find some way to get a job or money or 
similar...).




Keep up the excellent work,
David


On Monday, February 27, 2012, Alan Kay alan.n...@yahoo.com wrote:

 Hi Julian
 I should probably comment on this, since it seems that the STEPS 
reports haven't made it clear enough.

 STEPS is a science experiment not an engineering project.



I had personally sort of assumed this, which is why I had thought it 
acceptable to mention my own ideas and efforts as well; that would be a 
bit more off-topic if the focus were on a single product or piece of 
technology (like, say, hanging around on the LLVM or Mono lists and 
writing about code generators or VM technology in general, rather than 
sticking to LLVM or Mono, which are the focus of those lists).


but, there are lots of tradeoffs, say, between work on stuff in general and 
the pursuit of trying to get a marketable product put together, so trying 
for the latter to some extent impedes the former in my case.


although, I would personally assume everyone decides and acts 
independently, in pursuit of whatever is of most benefit to themselves, 
and I make no claim to have any sort of absolute position.


(decided to leave out me going off into philosophy land...).


but, anyways, I tend to prefer the personal freedom to act in ways which 
I believe to be in my best interests, rather than being endlessly judged 
by others for random development choices (such as choosing to write a 
piece of code to do something when there is a library for that). Like, 
if I can easily enough throw together a piece of code to do something, 
why should I necessarily subject myself to the hassles of a dependency 
on some random 3rd-party library, and why do other people feel so 
compelled to make derisive comments about such a decision?


and, I would assume likewise for others: if it works, who cares?
if I write a piece of code, what reason would I have to think this 
somehow obligates other people to use it, and what reason do other 
people seem to have to believe that I think it does?
what if I just do something, and people might consider using 
it if they find it useful, can agree with the license terms, ... ?
and, if in fact it sucks too hard for anyone else to have much reason to 
care, or is ultimately a dead end, what reason do they have to care?

...

but, I don't think any of it means I have to keep it all secret either; I 
just don't really understand people sometimes... (though I guess I am 
getting kind of burnt out on people so often being judgmental...).


or, at least, this is how I see things.


 [...]

Re: [fonc] Error trying to compile COLA

2012-02-27 Thread BGB

On 2/27/2012 1:27 PM, David Girle wrote:

I am interested in the embedded uses of Maru, so I cannot comment on
how to get from here to a Frank-like GUI.  I have no idea how many
others on this list are interested in the Internet of Things (IoT),
but I expect parts of Frank will be useful in that space.  Maybe 5kLOC
will bring up a connected, smart sensor system, rather than the 20kLOC
target VPRI have in mind for a programming system.

David


IoT: I had to look it up, but it sounds like something which could easily 
turn very cyber-punky or end up being abused in some sort of dystopian 
future scenario: accidentally touch some random object and suddenly a 
person has a price on their head and police jumping in through their 
window armed with automatic weapons or something...


and escape is difficult as doors will automatically lock on their 
approach, and random objects will fly into their path as they try to 
make a run for it, ... (because reality itself has something akin to the 
Radiant AI system from Oblivion or Fallout 3).


(well, ok, not that I expect something like this would necessarily 
happen... or that the idea is necessarily a bad idea...).



granted, as for kloc:
the code has to go somewhere, and I don't think 5 kloc is going to work.

looking at the Maru stuff from earlier, I would have to check, but I 
suspect it may already go over that budget (by quickly looking at a few 
files and adding up the line counts).



admittedly, I don't as much believe in the tiny kloc goal, since as-is, 
getting a complete modern computing system down into a few Mloc would 
already be a bit of an achievement (vs, say, a 14 Mloc kernel running a 
4 Mloc web browser, on a probably 10 Mloc GUI framework, all being 
compiled by a 5 Mloc C compiler, add another 1 Mloc if one wants a 3D 
engine, ...).



yes, one can make systems much smaller, but typically at a cost in terms 
of functionality: say, a small OS kernel that only runs on a 
single hardware configuration, a compiler that only supports a single 
target, a web browser which only supports very minimal functionality, ...


absent a clearly different strategy (what the VPRI people seem to be 
pursuing), the above outcome would not be desirable, and it is generally 
much more desirable to have a feature-rich system, even if potentially 
the LOC counts are far beyond the ability of any given person to 
understand (and if the total LOC for the system, is likely, *huge*...).


very coarse estimates:
a Linux installation DVD is 3.5 GB;
assume for a moment that nearly all of this is (likely) compressed 
program-binary code, and that code tends to compress to approx 
1/4 its original size with Deflate;

so, probably 14 GB of binary code;
my approx 1 Mloc app compiles to about 16.5 MB of DLLs;
assuming everything else holds (and the basic assumptions are correct), 
this would work out to ~849 Mloc.


(a more realistic estimate would need to find how much is program code 
vs data files, and maybe find a better estimate of the binary-size to 
source-LOC mapping).
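
spelled out, the arithmetic behind that figure (the inputs are guesses, 
not measurements):

# Back-of-envelope estimate, spelled out. Assumes ~4:1 Deflate on
# binaries and my own ~1 Mloc -> 16.5 MB of DLLs as the density yardstick.
dvd_gb = 3.5                        # installation DVD, assumed ~all code
binary_mb = dvd_gb * 1000 * 4       # undo ~4:1 compression -> ~14000 MB
mloc_per_mb = 1.0 / 16.5            # observed: 1 Mloc compiles to 16.5 MB
print(binary_mb * mloc_per_mb)      # ~848.5, i.e. the "~849 Mloc" figure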



granted, there is probably a lot of redundancy which could likely be 
eliminated, and if one assumes it is a layered tower strategy (a large 
number of rings, with each layer factoring out most of what resides 
above it), then likely a significant reduction would be possible.


the problem is, one is still likely to have a fairly large initial 
wind-up time, so ultimately the resulting system is still likely to 
be pretty damn large (assuming it can do everything a modern OS does, 
and more, it is probably well into the Mloc range).



but, I could always be wrong here...



On Mon, Feb 27, 2012 at 7:01 AM, Martin Baldan martino...@gmail.com wrote:

David,

Thanks for the link. Indeed, now I see how to run eval with the .l example
files. [...]


Re: [fonc] Error trying to compile COLA

2012-02-27 Thread Alan Kay
Hi Tony

I like what the BOOM/BLOOM people are doing quite a bit. Their version of 
Datalog + Time is definitely in accord with lots of our prejudices ...

Cheers,

Alan





 From: Tony Garnock-Jones tonygarnockjo...@gmail.com
To: Alan Kay alan.n...@yahoo.com 
Cc: Fundamentals of New Computing fonc@vpri.org 
Sent: Monday, February 27, 2012 1:44 PM
Subject: Re: [fonc] Error trying to compile COLA
 

On 27 February 2012 15:09, Alan Kay alan.n...@yahoo.com wrote:

Yes, I've seen it. As Gerry says, it is an extension of Guy Steele's thesis. 
When I read this, I wished for a more interesting, comprehensive and 
wider-ranging and -scaling example to help think with.

For me, the moment of enlightenment was when I realized that by using a 
lattice at each node, they'd abstracted out the essence of 
iterate-to-fixpoint that's disguised within a number of the examples I 
mentioned in my previous message. (Particularly the frameworks of abstract 
interpretation.)

I'm also really keen to try to relate propagators to Joe Hellerstein's recent 
work on BOOM/BLOOM. That team has been able to implement the Chord DHT in 
fewer than 50 lines of code. The underlying fact-propagation system of their 
language integrates with a Datalog-based reasoner to permit terse, dense 
reasoning about distributed state.
 
One reason to put up with some of the problems of defining things using 
constraints is that if you can organize things well enough, you get super 
clarity and simplicity and power.

Absolutely. I think Hellerstein's Chord example shows that very well. So I 
wish it had been an example in Radul's thesis :-)
 
With regard to objects, my current prejudice is that objects should be able 
to receive messages, but should not have to send to explicit receivers. This 
is a kind of multi-cast I guess (but I think of it more like 
publish/subscribe).


I'm nearing the point where I can write up the results of a chunk of my 
current research. We have been using a pub/sub-based virtual machine for 
actor-like entities, and have found a few cool uses of non-point-to-point 
message passing that simplify implementation of complex protocols like DNS and 
SSH.

Regards,
  Tony
-- 
Tony Garnock-Jones
tonygarnockjo...@gmail.com
http://homepages.kcbbs.gen.nz/tonyg/




Re: [fonc] Error trying to compile COLA

2012-02-27 Thread Julian Leviston
Structural optimisation is not compression. Lurk more.

Julian

On 28/02/2012, at 3:38 PM, BGB wrote:

 granted, I remain a little skeptical.
 
 I think there is a bit of a difference though between, say, a log table, and 
 a typical piece of software.
 a log table is, essentially, almost pure redundancy, hence why it can be 
 regenerated on demand.
 
 a typical application is, instead, a big pile of logic code for a wide range 
 of behaviors and for dealing with a wide range of special cases.
 
 
 executable math could very well be functionally equivalent to a highly 
 compressed program, but note in this case that one needs to count both the 
 size of the compressed program, and also the size of the program needed to 
 decompress it (so, the size of the system would also need to account for 
 the compiler and runtime).
 
 although there is a fair amount of redundancy in typical program code (logic 
 that is often repeated, & duplicated effort between programs, ...), 
 eliminating this redundancy would still have a bounded reduction in total 
 size.
 
 increasing abstraction is likely to, again, be ultimately bounded (and, 
 often, abstraction differs primarily in form, rather than in essence, from 
 that of moving more of the system functionality into library code).
 
 
 much like with data compression, the concept commonly known as the Shannon 
 limit may well still apply (itself setting an upper limit to how much is 
 expressible within a given volume of code).

___
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


Re: [fonc] Error trying to compile COLA

2012-02-27 Thread BGB

On 2/27/2012 10:08 PM, Julian Leviston wrote:

Structural optimisation is not compression. Lurk more.


I will probably drop this, as arguing about all of it is likely pointless 
and counter-productive.

but, is there any particular reason why similar rules and 
restrictions wouldn't apply?

(I personally suspect that something similar applies to nearly all forms of 
communication, including written and spoken natural language, and a 
claim that some X can be expressed in Y units does seem rather 
like a compression-style claim.)



but, anyways, here is a link to another article:
http://en.wikipedia.org/wiki/Shannon%27s_source_coding_theorem
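
and, as a rough illustration of the information-theoretic point itself (only 
of the analogy -- it proves nothing about program code as such): random data 
is already at the Shannon floor and won't compress, while near-pure 
redundancy collapses to almost nothing.

# Quick demo: lossless compression cannot beat the source entropy.
import os, zlib

random_data = os.urandom(4000)      # near-maximal entropy per byte
redundant = b"abab" * 1000          # almost pure redundancy, like a log table
for name, data in [("random", random_data), ("redundant", redundant)]:
    print(name, len(data), "->", len(zlib.compress(data, 9)), "bytes")
# typically: random 4000 -> slightly *more* than 4000 bytes (at the floor);
# redundant 4000 -> a few dozen bytes (regenerable on demand)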

