Re: [fonc] languages

2011-06-05 Thread Florin Mateoc
I would object to the claim that complaints about the non-standard precedence 
are somehow characteristic of the culture of if(.

The clash is with math, not with another programming language. And it clashes 
with a well-established convention in math, therefore with the 
readability/expressibility of math formulae. Of course, a mathematician can 
agree that these are only conventions, but a mathematician already thinks in a 
highly abstract way, whereas for Smalltalk the argument was made that this would 
somehow help kids think more abstractly and better get the concept of 
precedence. But kids do not think abstractly. Furthermore, the precedence of + 
and * is not so much about the behavior of arithmetic operators. Over whatever 
mathematical structure we define them, they are the very operators used to 
define the notion of distributivity, which is closely related to the notion of 
precedence. Saying that addition distributes over multiplication instead of 
that multiplication distributes over addition is not more abstract, it just 
confuses the notions. We might as well start writing Smalltalk with a lowercase 
first letter in all identifiers followed by all caps, aND cLAIM tHAT tHIS wILL 
hELP kIDS tHINK mORE aBSTRACTLY aND sEE tHAT tHE wAY wE cAPITALIZE iS oNLY 
a cONVENTION.
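
A minimal illustration of the clash, using standard Smalltalk-80 evaluation 
(it can be tried in any workspace):

    "Binary selectors evaluate strictly left to right, so + and * have equal precedence."
    2 + 3 * 4.      "answers 20, parsed as (2 + 3) * 4"
    2 + (3 * 4).    "answers 14 -- the mathematical reading has to be parenthesized"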

As for the other operators, if they do not have some pre-defined/embedded 
precedence, they might as well evaluate left to right. But the few that do have 
one would have warranted an exception.
I was thinking recently that operators would actually be a perfect use case for 
multimethods in Smalltalk, and that the precedence problem is more a consequence 
of the lack of multimethods. Anyway, for numbers, the matter of behavior 
responsibility is also questionable. And since binary selectors are already 
recognized as different in the language, implementing operators as multimethods 
could be done with minimal impact on readability. This approach could even be 
extended to support multikeyword multimethods by having the first keyword start 
with a binary selector character.
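
For comparison, below is a rough sketch of the double-dispatch boilerplate that 
such multimethods would fold away. The selector names are made up for 
illustration and are not any particular dialect's actual coercion methods:

    "The first send dispatches on the left operand's class, the second on the
     right operand's, so only the 'leaf' method finally knows both classes --
     which is what a multimethod would establish in a single dispatch."
    Integer >> + aNumber
        ^ aNumber addToInteger: self

    Float >> + aNumber
        ^ aNumber addToFloat: self

    Integer >> addToInteger: anInteger
        "Integer + Integer: both classes now known"
        ^ self primitiveIntegerAdd: anInteger

    Float >> addToInteger: anInteger
        "Integer + Float: anInteger is the left operand, self the right"
        ^ anInteger asFloat primitiveFloatAdd: self

    "... plus Integer >> addToFloat:, Float >> addToFloat:, and so on for every
     pair of classes -- the n-squared spread a multimethod would avoid."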

Best,
Florin





From: Alan Kay alan.n...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sat, June 4, 2011 2:46:33 PM
Subject: Re: [fonc] languages


Smalltalk was certainly not the first attempt -- and -- the most versatile 
Smalltalk in this area was the first Smalltalk and also the smallest.


I personally think expressibility is not just semantic, but also syntactic. 
Many different styles of programming have been realized in Lisp, but many to 
most of them suffer from the tradeoffs of the uniform parentheses-bound notation 
(there are positive aspects of this also, because the uniformity does remove one 
kind of mystery). 


The scheme that Dan Ingalls devised for the later Smalltalks overlapped with 
Lisp's, because Dan wanted a programmer to be able to parse any Smalltalk 
program at sight, no matter how much the semantics had been extended. 
Similarly, there was a long debate about whether to put in normal precedence 
for the common arithmetic operators. The argument that won was based on the APL 
argument that if you have lots of operators then precedence just gets confusing, 
so just associate to the right or left. However, one of the big complaints about 
Smalltalk-80 from the culture that thinks parentheses are a good idea after if, 
is that it has a non-standard precedence for + and *.
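
Concretely, that scheme has just three precedence levels -- unary, then binary 
strictly left to right, then keyword -- so any expression can be parsed at 
sight (a minimal workspace illustration):

    "Unary messages bind tightest; binary messages go left to right; keyword
     messages take whole unary/binary expressions as their arguments."
    3 + 4 factorial.                  "answers 27: parsed as 3 + (4 factorial)"
    Dictionary new at: 1 + 2 put: 3.  "parsed as (Dictionary new) at: (1 + 2) put: 3"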

A more interesting tradeoff perhaps is that between Tower of Babel and local 
high expressibility -- for example, when you decide to try lots of DSLs (as in 
STEPS). Each one has had the virtue of being very small and very clear about 
what is being expressed. At the meta level, the OMeta definitions of the syntax 
part are relatively small and relatively clear. But I think a big burden does 
get placed on the poor person from outside who is, on the one hand, presented 
with not a lot of code that does do a lot, but who, on the other hand, has to 
learn 4 or 5 languages to actually understand it.

People of my generation (50 years ago) were used to learning and using many 
syntaxes (e.g. one might learn as many as 20 or more machine code/assembler 
languages, plus 10 or more HLLs, both kinds with more variability in form and 
intent than today). Part of this could have stemmed from the high percentage of 
math people involved in computing back then -- part of that deal is learning 
to handle many kinds of mathematical notations, etc. 


Things seem different today for most programmers.

In any case, one of the things we learned from Smalltalk-72 is that even good 
language designers tend to create poor extensions during the heat of programming 
and debugging. And that an opportunity for cohesion in an extensible language is 
rarely seized. (Consider just how poor is the cohesion in a much smaller part of 
all this -- polymorphism -- even though it is of great benefit to everyone to 
have really strong (and few) 

Re: [fonc] languages

2011-06-05 Thread Alan Kay
Yep, and yep

Cheers,

Alan





From: Florin Mateoc fmat...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 3:51:23 PM
Subject: Re: [fonc] languages


But wasn't APL called a "write-only" language, which would make it in a way a 
polar opposite of Smalltalk?

I agree that it is not about consequences of message sending. And, while I 
also agree that uniformity/simplicity are also virtues, I think it is more 
useful to explicitly state that there are things which are truly different. 
Especially in an object system which models the world. Numbers would be in that 
category; they deserve to be treated specially. In the same vein, I think 
mathematical operators deserve special treatment, and not just from an 
under-the-covers optimization point of view.

Thank you,
Florin





From: Alan Kay alan.n...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 2:30:23 PM
Subject: Re: [fonc] languages


Check out APL, designed by a very good mathematician, to see why having no 
special precedences has merit in a language with lots of operators.

However, I think that we should have used the standard precedences in 
Smalltalk. 
Not from the math argument, or from a kids argument, but just because most 
conventional routes deposit the conventions on the travelers.

The arguments either way don't have much to do with consequences of message 
sending, because what can be sent as a canonical form could be the abstract 
syntax packaging. Prolog had an idea -- that we thought about to some extent -- 
of being able to specify right and left precedences, but this was rejected as 
leading to real needless complexities.

Cheers,

Alan





From: Florin Mateoc fmat...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 11:17:04 AM
Subject: Re: [fonc] languages


I would object to the claim that complaints about the non-standard precedence 
are somehow characteristic of the culture of if(.

The clash is with math, not with another programming language. And it clashes 
with a well-established convention in math, therefore with the 
readability/expressibility of math formulae. Of course, a mathematician can 
agree that these are only conventions, but a mathematician already thinks in a 
highly abstract way, whereas for Smalltalk the argument was made that this would 
somehow help kids think more abstractly and better get the concept of 
precedence. But kids do not think abstractly. Furthermore, the precedence of + 
and * is not so much about the behavior of arithmetic operators. Over whatever 
mathematical structure we define them, they are the very operators used to 
define the notion of distributivity, which is closely related to the notion of 
precedence. Saying that addition distributes over multiplication instead of 
that multiplication distributes over addition is not more abstract, it just 
confuses the notions. We might as well start writing Smalltalk with a lowercase 
first letter in all identifiers followed by all caps, aND cLAIM tHAT tHIS wILL 
hELP kIDS tHINK mORE aBSTRACTLY aND sEE tHAT tHE wAY wE cAPITALIZE iS oNLY 
a cONVENTION.

As for the other operators, if they do not have some pre-defined/embedded 
precedence, they might as well evaluate left to right. But the few that do have 
one would have warranted an exception.
I was thinking recently that operators would actually be a perfect use case for 
multimethods in Smalltalk, and that the precedence problem is more a consequence 
of the lack of multimethods. Anyway, for numbers, the matter of behavior 
responsibility is also questionable. And since binary selectors are already 
recognized as different in the language, implementing operators as multimethods 
could be done with minimal impact on readability. This approach could even be 
extended to support multikeyword multimethods by having the first keyword start 
with a binary selector character.

Best,
Florin





From: Alan Kay alan.n...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sat, June 4, 2011 2:46:33 PM
Subject: Re: [fonc] languages


Smalltalk was certainly not the first attempt -- and -- the most versatile 
Smalltalk in this area was the first Smalltalk and also the smallest.


I personally think expressibility is not just semantic, but also syntactic. 
Many different styles of programming have been realized in Lisp, but many to 
most of them suffer from the tradeoffs of the uniform parentheses-bound notation 
(there are positive aspects of this also, because the uniformity does remove one 
kind of mystery). 


The scheme that Dan Ingalls devised for the later Smalltalks overlapped with 
Lisp's, because Dan wanted a programmer to be able to parse any Smalltalk 
program at sight, no matter how much the semantics had been extended.  
Similarly, there was a long debate 

[fonc] Re: Electrical Actors?

2011-06-05 Thread Dale Schumacher
On Sun, Jun 5, 2011 at 5:23 PM, Casey Ransberger
casey.obrie...@gmail.com wrote:

 Has anyone taken the actor model down to the metal?

If someone has, I would sure like to hear about it!  There was the
Apiary machine, but I don't think that was ever physically built, only
simulated.

This is a concept that has been kicked around several times over the
last couple of years among some of my collaborators.  It is one of the
reasons why I ported the actor core runtime to the Arduino, to get a
step closer to the metal.

The SEND and BECOME primitives seem fairly straight-forward to
translate to hardware.  It is the CREATE primitive that I struggle
with.  Since we can't actually create new hardware elements, it
seems like it would have to be virtualized in some way.  Perhaps it
would be sufficient to virtualize it the same way we virtualize
processes, simulating multi-processing on a smaller number of cores.

Maybe there would be some way to activate latent nodes of processing
power, injecting them with their initial behavior as a way of
breathing life into them.  It could be just a matter of allocating
new actors the way we allocate memory.  Each hardware node could have
a capacity of available actors who only need a script to become alive.
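
A toy sketch of that allocation idea, written here in plain Smalltalk purely 
for illustration -- the class and selector names are invented, and nothing 
below reflects the actual Arduino actor runtime:

    "Each hardware node owns a fixed pool of dormant actor slots.  CREATE just
     hands a behavior (a 'script') to the next free slot, the way an allocator
     hands out memory; SEND and BECOME operate on a slot that is already alive."
    Object subclass: #ActorSlot
        instanceVariableNames: 'behavior'
        classVariableNames: ''
        package: 'Actor-Sketch'.

    ActorSlot >> isFree
        ^ behavior isNil

    ActorSlot >> become: aBlock
        "BECOME: replace this actor's behavior"
        behavior := aBlock

    ActorSlot >> send: aMessage
        "SEND: deliver a message (synchronously here, for simplicity)"
        behavior ifNotNil: [behavior value: aMessage]

    Object subclass: #HardwareNode
        instanceVariableNames: 'slots'
        classVariableNames: ''
        package: 'Actor-Sketch'.

    HardwareNode >> capacity: n
        slots := (1 to: n) collect: [:i | ActorSlot new]

    HardwareNode >> create: aBlock
        "CREATE: breathe life into a latent slot, failing when the node is full"
        | slot |
        slot := slots detect: [:each | each isFree] ifNone: [^ self error: 'node is full'].
        slot become: aBlock.
        ^ slot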

I would love to explore this idea further and hear how you would
consider approaching the problem.



Re: [fonc] languages

2011-06-05 Thread BGB

On 6/5/2011 4:48 PM, Steve Wart wrote:

I like both Smalltalk and APL. I disagree with the assumption that
operator precedence is a big hurdle for people learning Smalltalk. At
least I find mathematical expressions in Smalltalk to be clearer than
their counterparts in Lisp. I like the following example:

[:n :k | (1 to: k) inject: 1 into: [:c :i | c * (n - k + i / i)]]

(defn choose [n k] (reduce (fn [c i] (* c (/ (+ (- n k) i) i))) 1
(range 1 (+ k 1))))



both are generally hard to understand, especially for someone not 
already familiar with either language.
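
For what it's worth, the Smalltalk version leans on exactly the left-to-right 
rule discussed above; a minimal workspace reading (the temporary name is just 
for illustration):

    "n - k + i / i parses left to right as ((n - k) + i) / i, the factor the
     binomial recurrence needs; / on integers answers an exact Fraction."
    | choose |
    choose := [:n :k | (1 to: k) inject: 1 into: [:c :i | c * (n - k + i / i)]].
    choose value: 5 value: 2.    "answers 10"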




Okay maybe they're both hard to understand; nobody said math was easy.
Lisp has seen a huge resurgence in popularity thanks to Clojure.
Smalltalk has also seen nice growth, although on a much smaller scale,
and sadly it's not generally considered viable for enterprise software
development anymore (which is generally the kind of code that matters
to me, boring as it is). But math operators are a red herring. No
programming language really does math well (except maybe APL).
Accountants, engineers and scientists have got on well enough using
whatever lets them do their calculations, but by and large these
operations are a very small part of any reasonably-sized program.


it adds up quickly though...

a little more pain here, a little more pain there, and eventually there 
is pain all around.
much of what is done in creating a language or API isn't really solving big 
hard problems; the vast majority is actually cutting off sharp corners and 
edge cases, making everything a little nicer and a little more generic, 
which is then done recursively (and often over many layers).




After spending the better part of the past year poring over a very
large Smalltalk code base, I think the biggest conceptual barrier is
that understanding Smalltalk code requires tools that leverage the
language metadata to dynamically analyze what's going on (I'm talking
about menu commands to search for senders and implementers of various
methods, and similar beasts). I think these tools offer a mechanism
that will eventually give you a conceptual understanding of what the
code is doing. Maybe that can be formalized or be proved equivalent or
superior to the explicit type information provided in more
conventional programming languages, maybe not.


dunno... if these could be applied to aid with more conventional 
languages, maybe all the better.


it is worth noting though that many IDEs already do provide a lot of 
tools to aid with things.




Personally I don't think grep and javadoc are better, but the vast
majority of programmers in the world must disagree with me.


many people go for tools like Visual Studio and Eclipse, which do 
provide a lot of niceties (auto-complete, ability to see 
documentation-comments and possible argument lists as tool-tips, ...).


granted, I still use grep sometimes, but mostly because I do most of my 
coding in C with standalone (non-IDE) tools (mostly since, IMO, IDEs often 
provide more drawbacks than they are worth, usually because they are 
not terribly flexible or customizable).


maybe 90% of the time, one can remember where the code they are looking 
for is located.
granted, this is subject to the use of good code organization and naming 
practices.


I would personally like to see an IDE which was:
more-or-less language-neutral, to whatever extent that is practical (more 
like traditional standalone editors);
not tied to or hard-coded for particular tools or build configurations 
(nearly everything would be actions tied to high-level scripts, which 
would be customizable per-project, and ideally in a readily 
human-editable form);
not tied to a particular operating system;
...



Type systems are for reasoning about code, whereas most programs are
written with a computational intent that is generally not formalized
or even formalizable. While it's nice to have programs that can be
formally proved, if you can't prove that your specification is correct
too, there's not much point in it. Ultimately what matters is fitness
for purpose, a big part of which is social utility and communicating
the intent to someone far removed from the original implementation.


yeah.

usually though, if one knows that the program works, and is reasonably 
bug free, this is sufficient.


the extra mile, namely making code entirely bug-free, universally 
portable, ... is not generally worthwhile for most things (since it adds 
far more cost for relatively little apparent gain).


so, most things are tradeoffs...

if it crashes sometimes, but these are at most a minor 
inconvenience, or the thing will not build or work on some uncommon 
target, then it may well be more cost-effective not to bother (and focus 
effort instead on any more immediate concerns).




In short, it's the libraries and how you can manage the dependencies
amongst your units of code that really matter most.



ok.



Cheers,
Steve

On Sun, Jun 5, 2011 at 3:55 PM, Alan 

Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread David Leibs
I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  

Alan Perlis quote: "A language that doesn't affect the way you think about 
programming, is not worth knowing."

There is some old analysis out there that indicates that APL is naturally very 
parallel. Willhoft (1991) claimed that 94 of the 101 primitive operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 

R. G. Willhoft, "Parallel expression in the APL2 language," IBM Syst. J. 30 
(1991), no. 4, 498–512.


-David Leibs



Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread Alan Kay
Hi David

I've always been very fond of APL also -- and a slightly better and more 
readable syntax could be devised these days now that things don't have to be 
squeezed onto an IBM Selectric golfball ...

Cheers,

Alan





From: David Leibs david.le...@oracle.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 7:06:55 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  

Alan Perlis quote:  A language that doesn't affect the way you think about 
programming, is not worth knowing.

There is some old analysis out there that indicates that APL is naturally very 
parallel. Willhoft (1991) claimed that 94 of the 101 primitive operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 

R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
(1991), no. 4, 498–512.


-David Leibs


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread Alan Kay
I think this one was derived from Phil Abrams' Stanford (and SLAC) PhD thesis 
on dynamic analysis and optimization of APL -- a very nice piece of work! 
(Maybe in the early 70s or late 60s?)

Cheers,

Alan





From: David Pennell pennell.da...@gmail.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 7:33:40 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

HP had a version of APL in the early 80's that included structured conditional 
statements and where performance didn't depend on cramming your entire program 
into one line of code.  Between the two, it was possible to create reasonably 
readable code.  That version of APL also did some clever performance 
optimizations by manipulating array descriptors instead of just using brute force.

APL was the first language other than Fortran that I learned - very eye opening.


-david


On Sun, Jun 5, 2011 at 9:13 PM, Alan Kay alan.n...@yahoo.com wrote:

Hi David

I've always been very fond of APL also -- and a slightly better and more 
readable syntax could be devised these days now that things don't have to be 
squeezed onto an IBM Selectric golfball ...

Cheers,

Alan





 From: David Leibs david.le...@oracle.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 7:06:55 PM
Subject: Re:  Terseness, precedence, deprogramming (was Re: [fonc] languages)


I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  


Alan Perlis quote:  A language that doesn't affect the way you think about 
programming, is not worth knowing.


There is some old analysis out there that indicates that APL is naturally very 
parallel. Willhoft (1991) claimed that 94 of the 101 primitive operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 


R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
(1991), no. 4, 498–512.




-David Leibs




Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread David Harris
Alan-

I expect you lost a few readers there.  I have fond memories of APL on an
IBM 360/145 with APL microcode support and Selectric terminals.

David


On Sun, Jun 5, 2011 at 7:13 PM, Alan Kay alan.n...@yahoo.com wrote:

 Hi David

 I've always been very fond of APL also -- and a slightly better and more
 readable syntax could be devised these days now that things don't have to be
 squeezed onto an IBM Selectric golfball ...

 Cheers,

 Alan

 --
 From: David Leibs david.le...@oracle.com
 To: Fundamentals of New Computing fonc@vpri.org
 Sent: Sun, June 5, 2011 7:06:55 PM
 Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

 I love APL!  Learning APL is really all about learning the idioms and how
 to apply them.  This takes quite a lot of training time.   Doing this kind
 of training will change the way you think.

 Alan Perlis quote:  A language that doesn't affect the way you think about
 programming, is not worth knowing.

 There is some old analysis out there that indicates that APL is naturally
 very parallel. Willhoft (1991) claimed that 94 of the 101 primitive
 operations in APL2 could be implemented in parallel and that 40-50% of APL
 code in real applications was naturally parallel.

 R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30
 (1991), no. 4, 498–512.


 -David Leibs




Language in Test (was Re: [fonc] languages)

2011-06-05 Thread Casey Ransberger
I'm actually not talking about the potty mouths:)

APL is up there on my list now, but it hasn't knocked Prolog out of the top 
slot. 

I've done a bunch of test automation. I really enjoy testing because on a good 
day it can approach something reminiscent of science, but OTOH the test code I 
ended up wrangling (often my own code) wound up being the worst sort of mess, 
for a few reasons. Non-test code that I've worked on or composed myself has 
always been a lot better, for reasons I don't totally understand yet. 

I can toss some guesses out there:

One is that people who do automation are often outnumbered 5:1 or worse by the 
people making the artifacts under examination, such that there's too much to do 
in order to do anything very well.

Another is, testing often fails to strike decision makers as important enough 
to invest much in, so you end up hacking your way around blocking issues with 
the smallest kludge you can think of, instead of instrumenting proper hooks 
into the thing under test, which usually takes a little longer, and risks 
further regression in the context of a release schedule. 

Things I learned from Smalltalk and Lisp have been really useful for reducing 
the amount of horror in the test code I've worked on, but it's still kind of 
bad. Actually I was inspired to look for an EDSL in my last adventure that 
would cut down on the cruft in the test code there, which was somewhat inspired 
by STEPS, and did seem to actually help quite a bit. 

Use of an EDSL proved very useful, in part just because most engineering orgs 
I've been in don't seem to want to let me use Lisp. 

Being able to claim honestly that I'd implemented what I'd done in Ruby seemed 
to help justify the unorthodox approach to my managers. I did go out of my way 
to explain that what I'd done was compose a domain specific language, and this 
did not seem to get the same kind of resistance, just a few raised eyebrows and 
funny looks. 

I keep getting the feeling that the best tool for the job of encoding and 
running invariants might be a Prolog, and so this one is currently at the top 
of my list of things to understand deeply.  

Anyone else here ever deal with a bunch of automation? Ever find a way to apply 
what you know about programming languages themselves in the context of software 
verification? Because I would *love* to hear about that!

On Jun 5, 2011, at 7:06 PM, David Leibs david.le...@oracle.com wrote:

 I love APL!  Learning APL is really all about learning the idioms and how to 
 apply them.  This takes quite a lot of training time.   Doing this kind of 
 training will change the way you think.  
 
 Alan Perlis quote:  A language that doesn't affect the way you think about 
 programming, is not worth knowing.

I love this quote. Thanks for your 

(snip)

 
 -David Leibs
 


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread BGB

On 6/5/2011 7:06 PM, David Leibs wrote:
I love APL!  Learning APL is really all about learning the idioms and 
how to apply them.  This takes quite a lot of training time.   Doing 
this kind of training will change the way you think.


Alan Perlis quote:  A language that doesn't affect the way you think 
about programming, is not worth knowing.




not everyone wants to learn new things though.

very often, people want to get the job done as their main priority, and 
being faced with new ideas or new ways of doing things is a hindrance to 
the maximization of productivity (or, people may see any new/unfamiliar 
things as inherently malevolent).


granted, these are not the sort of people one is likely to find using a 
language like APL...



a lot depends on who one's market or target audience is, and whether or 
not they like the thing in question. if it is the wrong product for the 
person, one isn't going to make a sale (and people neither like 
having something forced on them, nor buying or committing 
resources to something which is not to their liking, as, although 
maybe not immediately, this will breed frustration later...).


it doesn't mean either the product or the potential customer is bad, 
only that things need to be matched up.


it is like, you don't put sugar in the coffee of someone who likes their 
coffee black.



There is some old analysis out there that indicates that APL is 
naturally very parallel. Willhoft (1991) claimed that 94 of the 101 
primitive operations in APL2 could be implemented in parallel and 
that 40-50% of APL code in real applications was naturally parallel.


R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 
30 (1991), no. 4, 498–512.




not personally dealt with APL, so I don't know.

I sort of like having the ability to write code with asynchronous 
operations, but this is a little different. I guess a task for myself 
would be to determine whether or not what I am imagining as 'async' is 
equivalent to the actor model, hmm...


decided to leave out a more complex elaboration.




Re: [fonc] Re: Electrical Actors?

2011-06-05 Thread Max OrHai
You might get a kick out of this toy model I made to demonstrate how a mesh
(or cloud) of minimal hardware actors can work together to compute. It's
the latest in a series of explorations of the particle / field concept...

http://cs.pdx.edu/~orhai/mesh-sort

I think there's a lot that can be done with fine-grained homogeneous
self-contained hardware in quantity, and I may get around to building a
poor man's Connection Machine out of a bucketful of microcontrollers one
of these days. The AVR is quite a capable machine for  $5 apiece!

-- Max

On Sun, Jun 5, 2011 at 6:44 PM, David Pennell pennell.da...@gmail.com wrote:

 Note that you can create new HW in a cloud environment.


 On Sun, Jun 5, 2011 at 8:13 PM, Casey Ransberger 
 casey.obrie...@gmail.com wrote:

 Thank you for your reply, comments inline.

 On Jun 5, 2011, at 4:25 PM, Dale Schumacher dale.schumac...@gmail.com
 wrote:

  On Sun, Jun 5, 2011 at 5:23 PM, Casey Ransberger
  casey.obrie...@gmail.com wrote:
 
  Has anyone taken the actor model down to the metal?
 
  If someone has, I would sure like to hear about it!  There was the
  Apiary machine, but I don't think that was ever physically built, only
  simulated.

 Googling...

 
 (snip)

  The SEND and BECOME primitives seem fairly straight-forward to
  translate to hardware.  It is the CREATE primitive that I struggle
  with.

  Since we can't actually create new hardware elements

 (snip)

 Oh, yeah. That makes sense.

  Maybe there would be some way to activate latent nodes of processing
  power, injecting them with their initial behavior as a way of
  breathing life into them.

 I really like this idea.

  It could be just a matter of allocating
  new actors the way we allocate memory.  Each hardware node could have
  a capacity of available actors who only need a script to become alive.

 This is not far off from what I was already daydreaming about. When I
 started I thought those guys looked like a kind of regular object animator
 that would light up when something was bound. I'd likely have to cache the
 ones that didn't fit on the chip somewhere.

 Maybe to deal with concurrency I should really start thinking of them as
 actor animators.

 I'm sure there's a way to pull this off. Even if it's by having a lot of
 FPGAs on the logic board so that I can compensate for reconfiguration
 latency by switching between them, but I don't think that idea fits any goal
 around a parsimonious architecture, which is one thing that I'm after. The
 synchronization problems I'd expect also seem awful, unless someone out
 there has thought a bunch about doing a low-level TeaTime (or what have
 you.)

 So I'm really hoping I can find a general thing that I can just place
 many identical copies of in the die or whatever it is we use now... ahem.
 I am such a noob! And then just swap them out to main memory or a local
 cache when I run out.

  I would love to explore this idea further and hear how you would
  consider approaching the problem.

 I will definitely CC you if I think I've gotten somewhere with it. Feel
 free to send me a note if you have any big aha-moments, because I have a
 tiny slab of time to run at that fence before I'm going to have to get back
 to work, and any/all help that I can get will be much appreciated.

 If I made it, I'd likely build a couple of boxes and try to pass them off
 as art (like what one buys for the wall), but my plan is to make everything
 you need ("IP cores" appears to be the term of industry) to do it yourself
 available under the MIT license if and when I've made some actual progress.

 I reckon I have a better shot at getting to actually use it if I just
 give it away!


RE: [fonc] Alternative Web programming models?

2011-06-05 Thread david hussman
They have their payment story together. They will contact you with all the
arrangements. I think they cover flights and hotel and pay different fees
for different services (e.g. teaching a course versus giving a talk). I may
be off base if you are only giving a talk because I have always done the
combo deal.

I suggest you ping Lee directly. Unlike other conference organizers we know,
he will respond.

-Original Message-
From: fonc-boun...@vpri.org [mailto:fonc-boun...@vpri.org] On Behalf Of
Michael Forster
Sent: Friday, June 03, 2011 1:19 PM
To: Fundamentals of New Computing
Subject: Re: [fonc] Alternative Web programming models?

On Fri, Jun 3, 2011 at 12:58 PM, C. Scott Ananian csc...@laptop.org wrote:
[...]

 The web is not *only* an OS.  It also provides the backing data for a 
 very large unstructured database.  Google of course realize this, as 
 their company rests on a search engine.  The semantic web folks have 
 tried in vain to get people to add more structure to the database.
 What the web OS must do is allow the efficient export of additional 
 *unstructured and ad hoc* data.  HTML+CSS web applications today are
[...]

Sorry for the tangent, but there is no such thing as an unstructured
database.  Whether talking about the logical or physical level, a database
is a specification of data structure (and constraints upon that data).  Dr.
Kay once characterised computing as a "pop culture",
and statements such as the above reflect that.

Regards,

Mike



RE: [fonc] Alternative Web programming models?

2011-06-05 Thread david hussman
Please forgive this email.

-Original Message-
From: fonc-boun...@vpri.org [mailto:fonc-boun...@vpri.org] On Behalf Of
david hussman
Sent: Monday, June 06, 2011 12:02 AM
To: 'Fundamentals of New Computing'
Subject: RE: [fonc] Alternative Web programming models?

They have their payment story together. They will contact you with all the
arrangements. I think they cover flights and hotel and pay different fees
for different services (e.g. teaching a course versus giving a talk). I may
be off base if you are only giving a talk because I have always done the
combo deal.

I suggest you ping Lee directly. Unlike other conference organizers we know,
he will respond.

-Original Message-
From: fonc-boun...@vpri.org [mailto:fonc-boun...@vpri.org] On Behalf Of
Michael Forster
Sent: Friday, June 03, 2011 1:19 PM
To: Fundamentals of New Computing
Subject: Re: [fonc] Alternative Web programming models?

On Fri, Jun 3, 2011 at 12:58 PM, C. Scott Ananian csc...@laptop.org wrote:
[...]

 The web is not *only* an OS.  It also provides the backing data for a 
 very large unstructured database.  Google of course realize this, as 
 their company rests on a search engine.  The semantic web folks have 
 tried in vain to get people to add more structure to the database.
 What the web OS must do is allow the efficient export of additional 
 *unstructured and ad hoc* data.  HTML+CSS web applications today are
[...]

Sorry for the tangent, but there is no such thing as an unstructured
database.  Whether talking about the logical or physical level, a database
is a specification of data structure (and constraints upon that data).  Dr.
Kay once characterised computing as a "pop culture",
and statements such as the above reflect that.

Regards,

Mike
