Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-06 Thread C. Scott Ananian
On Sun, Jun 5, 2011 at 8:13 PM, Casey Ransberger
casey.obrie...@gmail.com wrote:

 Another approach I think is really cool is actually just using mathematical
 notation as one representation of what's otherwise basically an s-expr, in
 which case I think one is having some cake and eating it too. I've been
 playing with Mathematica a bit, which seems to do some of that. Kind of like
 flipping between the piano roll and the notation view in a music program. I
 also think their big algorithm library is a really lovely idea... If it
 doesn't necessarily separate meaning from optimization, it at least seems
 like it could help separate math from "glue and plumbing," which would just be
 a godsend in my work, where it often feels like I have to
 read-between-the-lines to find an algorithm in a piece of code I'm looking
 at.


You would like Fortress. Here is an example of runnable Fortress code:
http://labs.oracle.com/projects/plrg/faq/NAS-CG.pdf
(There is also an ASCII-only syntax for us luddites.)
  --scott
-- 
  ( http://cscott.net )


Re: [fonc] languages

2011-06-06 Thread Casey Ransberger
I've heard of an IDE called VisualAge (I think?) that was written in Smalltalk 
but could parse and to a degree reason about other languages, but I've never 
seen it. 

Have you looked for that thing, or was it just not so great?

On Jun 5, 2011, at 11:55 PM, BGB cr88...@gmail.com wrote:

 On 6/5/2011 11:03 PM, C. Scott Ananian wrote:
 
 On Sun, Jun 5, 2011 at 8:35 PM, BGB cr88...@gmail.com wrote:
 I would personally like to see an IDE which was:
 more-or-less language neutral, to what extent this was practical (more like 
 traditional standalone editors);
 not tied to or hard-coded for particular tools or build configurations 
 (nearly everything would be actions tied to high-level scripts, which 
 would be customizable per-project, and ideally in a readily human-editable 
 form);
 not being tied to a particular operating system;
 ...
 
 This is Eclipse.  Granted, it's an IDE which is designed-by-committee and 
 hard to love, but it answers all of your requirements.
   --scott
 
 
 I don't believe Eclipse is it, exactly...
 it handles multiple languages, yes, and can be used with multiple operating 
 systems, and supports multiple compiler backends, ...
 
 however, AFAIK, pretty much all of the logic is written in Java-based 
 plugins, which is not ideal (and so, essentially the logic is tied to Eclipse 
 itself, and not to the individual projects).
 
 
 I was imagining something a little different here, such as the project 
 control files being more like Makefiles or Bash-scripts, and so would be 
 plain-text and attached to the project (along with the source files), where 
 it is possible to control things much more precisely per-project. more 
 precisely, I had imagined essentially a hybrid of Makefiles and Bash.
 
 also imagined was the possibility of using JavaScript (or similar) as the 
 build-control language, just using JS in a manner similar to Make+Bash, 
 likely with some special-purpose API functionality (to make it more usable 
 for Make-like purposes).
 
 a difficulty with JS though is that, normally, IDEs like things to be fairly 
 declarative, and JS code is not declarative, unless the JS is split into 
 multiple parts:
 info about the project proper is stored in a JSON-based format, and then any 
 build logic is JS files attached to the project.
 
 so, the IDE would mostly just manage files and editors, and invoke the 
 appropriate scripts as needed, and many IDE actions essentially just call 
 functions, and so one causes something to happen by replacing the default 
 action functions (such as in a script loaded by the project file).
 
 actually, conceptually I like the JS route more, even if it would likely be a 
 little more verbose than a Bash-like syntax.
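
 a rough sketch of the general shape I have in mind (the file names, the 
 little "actions" API, and the use of Node-style require() are all just 
 illustrative stand-ins here, not an actual design):

 // project.json -- the declarative part, read directly by the IDE:
 //   { "name": "mygame",
 //     "sources": ["src/main.c", "src/render.c"],
 //     "output": "mygame",
 //     "build": "build.js" }

 // build.js -- the imperative part, invoked by the IDE's Build/Clean actions
 var fs = require('fs');
 var execSync = require('child_process').execSync;

 var project = JSON.parse(fs.readFileSync('project.json', 'utf8'));

 // default action table; the IDE's buttons just call into this, and a
 // project script can replace entries to change the behavior per-project
 var actions = {
     build: function () {
         var objs = project.sources.map(function (src) {
             var obj = src.replace(/\.c$/, '.o');
             execSync('gcc -c ' + src + ' -o ' + obj);  // make-like rule, spelled out in JS
             return obj;
         });
         execSync('gcc ' + objs.join(' ') + ' -o ' + project.output);
     },
     clean: function () {
         execSync('rm -f src/*.o ' + project.output);
     }
 };

 actions.build();  // the IDE would really do: actions[whateverTheUserClicked]()

 the point being that the project info stays declarative and diff-able, while 
 anything procedural lives in an ordinary script sitting next to the sources.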
 
 
 IMO, the next best alternative is SciTE, so what I was imagining would be a 
 more expanded version of SciTE.
 
 then there is also CMake, ...
 
 there is also SCons, which is conceptually related to the prior idea, but it 
 is based on Python.
 
 
 but, for the most part, I have mostly just ended up sticking with good old 
 text editors and makefiles, as these have served me well, despite their 
 drawbacks (the cost of switching to an alternative strategy likely being 
 somewhat higher than that of doing nothing and staying with the present 
 strategy). IOW, the "if it ain't broke, don't fix it" strategy...
 
 
 or such...
 


Re: [fonc] languages

2011-06-06 Thread BGB

On 6/6/2011 12:29 AM, Casey Ransberger wrote:
I've heard of an IDE called VisualAge (I think?) that was written in 
Smalltalk but could parse and to a degree reason about other 
languages, but I've never seen it.


Have you looked for that thing, or was it just not so great?



not really looked at VisualAge...

was intending here to look some at Code::Blocks, since I got thinking 
about IDEs some.



personally, I have not been as much into Smalltalk, due mostly to my 
apparent inability to understand it when looking at it (look at syntax 
reference, look at code, feel utterly confused as to just what I am 
looking at...). (like, somehow, my brain can't really parse it or make 
much sense of it).


I have taken some ideas off of both Smalltalk and Self, although in the 
form of a language I can more easily understand...


granted, there are other languages like this (like my apparent inability 
to really make sense of Haskell either).


although, I can generally read/understand Forth and PostScript 
acceptably well, so I really don't know sometimes.



although, recently, when writing documentation for some parts of my 
language, I made an observation:

str="Hello";
s=str;
while(*s)
    printf("%c", *s++);
printf("\n");

basically, along the lines of:
"holy crap... my language still retains a fair amount in common with C"

this was not originally intended (mostly, I was trying to implement 
ECMA-262 and add ActionScript- and Java-like features); it is just that the 
combination of little things (implementing stuff, thinking "oh well, 
this would be nifty...") happens to allow a few C-like constructions to 
be written.


also:
buf=new char[256];
str="Hello";
t=buf; s=str;
while(*t++=*s++);

funny how this works sometimes...


or such...


 On Jun 5, 2011, at 11:55 PM, BGB cr88...@gmail.com wrote:



On 6/5/2011 11:03 PM, C. Scott Ananian wrote:
 On Sun, Jun 5, 2011 at 8:35 PM, BGB cr88...@gmail.com wrote:


I would personally like to see an IDE which was:
more-or-less language neutral, to what extent this was practical
(more like traditional standalone editors);
not tied to or hard-coded for particular tools or build
configurations (nearly everything would be actions tied to
high-level scripts, which would be customizable per-project, and
ideally in a readily human-editable form);
not being tied to a particular operating system;
...


This is Eclipse.  Granted, it's an IDE which is 
designed-by-committee and hard to love, but it answers all of your 
requirements.

  --scott



I don't believe Eclipse is it, exactly...
it handles multiple languages, yes, and can be used with multiple 
operating systems, and supports multiple compiler backends, ...


however, AFAIK, pretty much all of the logic is written in Java-based 
plugins, which is not ideal (and so, essentially the logic is tied to 
Eclipse itself, and not to the individual projects).



I was imagining something a little different here, such as the 
project control files being more like Makefiles or Bash-scripts, and 
so would be plain-text and attached to the project (along with the 
source files), where it is possible to control things much more 
precisely per-project. more precisely, I had imagined essentially a 
hybrid of Makefiles and Bash.


also imagined was the possibility of using JavaScript (or similar) as 
the build-control language, just using JS in a manner similar to 
Make+Bash, likely with some special-purpose API functionality (to 
make it more usable for Make-like purposes).


a difficulty with JS though is that, normally, IDEs like things to be 
 fairly declarative, and JS code is not declarative, unless the JS is 
split into multiple parts:
info about the project proper is stored in a JSON-based format, and 
then any build logic is JS files attached to the project.


so, the IDE would mostly just manage files and editors, and invoke 
the appropriate scripts as needed, and many IDE actions essentially 
just call functions, and so one causes something to happen by 
replacing the default action functions (such as in a script loaded by 
the project file).


actually, conceptually I like the JS route more, even if it would 
likely be a little more verbose than a Bash-like syntax.



IMO, the next best alternative is SciTE, so what I was imagining 
would be a more expanded version of SciTE.


then there is also CMake, ...

there is also SCons, which is conceptually related to the prior idea, 
 but it is based on Python.



but, for the most part, I have mostly just ended up sticking with 
good old text editors and makefiles, as these have served me well, 
despite their drawbacks (the cost of switching to an alternative 
strategy likely being somewhat higher than that of doing nothing and 
 staying with the present strategy). IOW, the "if it ain't broke, don't 
 fix it" strategy...



or such...


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-06 Thread Alan Kay
Yep ...

As Abrams pointed out, "Beating" should be pronounced "Bee-Ating" because it was 
a promotion scheme that reminded him of the beatification process in the path 
towards sainthood ...

Cheers,

Alan





From: David Leibs david.le...@oracle.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 9:59:33 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

Alan,
Your memory for great dissertations is amazing.  I don't think the Phil Abrams 
APL machine was ever actually built, but it had some really good techniques for 
making APL efficient, colorfully named "beating" and "drag-along".

-djl


On Jun 5, 2011, at 7:50 PM, Alan Kay wrote:

I think this one was derived from Phil Abrams' Stanford (and SLAC) PhD thesis 
on 
dynamic analysis and optimization of APL -- a very nice piece of work! (Maybe 
in 
the early 70s or late 60s?)

Cheers,

Alan





From: David Pennell pennell.da...@gmail.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 7:33:40 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

HP had a version of APL in the early 80's that included structured 
conditional 
statements and where performance didn't depend on cramming your entire program 
into one line of code.  Between the two, it was possible to create reasonably 
readable code.  That version of APL also did some clever performance 
optimizations by manipulating array descriptors instead of just using brute force.


APL was the first language other than Fortran that I learned - very eye 
opening.



-david


On Sun, Jun 5, 2011 at 9:13 PM, Alan Kay alan.n...@yahoo.com wrote:

Hi David

I've always been very fond of APL also -- and a slightly better and more 
readable syntax could be devised these days now that things don't have to be 
squeezed onto an IBM Selectric golfball ...

Cheers,

Alan





From: David Leibs david.le...@oracle.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 7:06:55 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)


I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  


Alan Perlis quote:  A language that doesn't affect the way you think about 
programming, is not worth knowing.


There is some old analysis out there that indicates that APL is naturally 
very 
parallel.  Willhoft-1991 claimed that  94 of the 101 primitives operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 


R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
(1991), no. 4, 498–512.




-David Leibs




Re: [fonc] languages

2011-06-06 Thread K. K. Subramaniam
On Sunday 05 Jun 2011 12:16:33 AM Alan Kay wrote:
 People of my generation (50 years ago) were used to learning and using
 many  syntaxes (e.g. one might learn as many as 20 or more machine
 code/assembler languages, plus 10 or more HLLs, both kinds with more
 variability in form and intent than today).
Learning multiple languages didn't stop with your generation ;-). In the 80s, 
the fashion of the day was not only to learn many languages but also invent 
your own! One of the languages was called JOVIAL, an acronym for Jules Own 
Version of Algol!

;-) .. Subbu



Re: [fonc] languages

2011-06-06 Thread Alan Kay
Hi Subbu

Check out when Jules Schwartz actually did Jovial. And the acronym was actually 
Jules' Own Version of the International Algebraic Language

Cheers,

Alan





From: K. K. Subramaniam kksubbu...@gmail.com
To: fonc@vpri.org
Cc: Alan Kay alan.n...@yahoo.com
Sent: Mon, June 6, 2011 8:34:08 AM
Subject: Re: [fonc] languages

On Sunday 05 Jun 2011 12:16:33 AM Alan Kay wrote:
 People of my generation (50 years ago) were used to learning and using
 many  syntaxes (e.g. one might learn as many as 20 or more machine
 code/assembler languages, plus 10 or more HLLs, both kinds with more
 variability in form and intent than today).
Learning multiple languages didn't stop with your generation ;-). In the 80s, 
the fashion of the day was not only to learn many languages but also invent 
your own! One of the languages was called JOVIAL, an acronym for Jules Own 
Version of Algol!

;-) .. Subbu


Re: [fonc] languages

2011-06-06 Thread K. K. Subramaniam
Alan,

Thanks for the correction. IAL was one of the proposed names for ALGOL, 
wasn't it?

The reason why this name popped up from my grad days was because something as 
complicated as designing a new programming language was considered a fun thing 
to do. It wasn't as much fun for those who had to maintain programs written in 
them years down the line ;-). Gerald Weinberg's parody - Levine, the great 
Tailor - should serve as a lesson even today.

Subbu

On Monday 06 Jun 2011 9:37:02 PM Alan Kay wrote:
 Hi Subbu
 
 Check out when Jules Schwartz actually did Jovial. And the acronym was
 actually Jules' Own Version of the International Algebraic Language
 
 Cheers,
 
 Alan
 
 
 
 
 
 From: K. K. Subramaniam kksubbu...@gmail.com
 To: fonc@vpri.org
 Cc: Alan Kay alan.n...@yahoo.com
 Sent: Mon, June 6, 2011 8:34:08 AM
 Subject: Re: [fonc] languages
 
 On Sunday 05 Jun 2011 12:16:33 AM Alan Kay wrote:
  People of my generation (50 years ago) were used to learning and using
  many  syntaxes (e.g. one might learn as many as 20 or more machine
  code/assembler languages, plus 10 or more HLLs, both kinds with more
  variability in form and intent than today).
 
 Learning multiple languages didn't stop with your generation ;-). In the
 80s, the fashion of the day was not only to learn many languages but also
 invent your own! One of the languages was called JOVIAL, an acronym for
 Jules Own Version of Algol!
 
 ;-) .. Subbu



Re: [fonc] languages

2011-06-06 Thread Casey Ransberger
Inline

On Jun 6, 2011, at 10:48 AM, Alan Kay alan.n...@yahoo.com wrote:

 It was ... and is mostly associated with what came to be called Algol 58, but 
 not Algol 60.
 
 Another way to look at it is that almost all systems are difficult to 
 maintain down the line -- partly because they were not designed with this in 
 mind -- and this is true for most programming languages. However, I don't 
 think this is necessary, but more an artifact of incomplete design.

This, and design drift, wherein over time various forms of pseudo-arch get 
piled up and end up jutting out at weird angles:)

 Cheers,
 
 Alan
 


Re: [fonc] languages

2011-06-06 Thread David Barbour
On Sat, Jun 4, 2011 at 10:44 AM, Julian Leviston jul...@leviston.net wrote:

 Is a language I program in necessarily limiting in its expressibility?


Yes. All communication architectures are necessarily limiting in their
expressiveness (in the sense defined by Matthias Felleisen). For example, one
can't easily introduce reactivity, concurrency, and constraint models to
languages not already designed for them. Even with extensible syntax, you
might be forced to re-design, re-implement, and re-integrate all the
relevant libraries and services from scratch to take advantage of a new
feature. Limitations aren't always bad things, though (e.g. when pursuing
security, scalability, safety, resilience, modularity, extensibility,
optimizations). We can benefit greatly from favoring 'principle of least
power' in our language designs.



 Is there an optimum methodology of expressing algorithms (ie nomenclature)?



No. From Kolmogorov complexity and pigeon-hole principles, we know that any
given language must make tradeoffs in how efficiently it expresses a given
behavior.  The language HQ9+ shows us that we can (quite trivially) optimize
expression of any given behavior by tweaking the language. Fortunately,
there are a lot of 'useless' algorithms that we'll never need to express.
Good language design is aimed at optimizing, abstracting, and refactoring
expression of useful, common behaviors, even if at some expense to rare or
less useful behaviors.



Is there a good or bad way of expressing intent?


There are effective and ineffective ways of expressing intent.

We certainly want to minimize boiler-plate and noise. If our languages
impose semantic properties (such as ordering of a collection) where we
intend none, we have semantic noise. If our languages impose syntactic
properties (such as  semicolons) where they have no meaning to the
developer, we have syntactic noise. If our languages fail to abstract or
refactor some pattern, we get boiler-plate (and recognizable 'design
patterns').

But we also don't want to sacrifice performance, security, modularity, et
cetera. So sometimes we take a hit on how easily we can express intent.



is there a way of patterning a language of programming such that it can
 extend itself infinitely, innately?


Yes. But you must sacrifice various nice properties (e.g. performance,
securability, modularity, composition) to achieve it.

If you're willing to sacrifice ad-hoc extension of cross-cutting features
(e.g. reactivity, concurrency, failure handling, auditing, resource
management) you can still achieve most of what you want, and embed a few
frameworks and EDSLs (extensible syntax or partial evaluation) to close the
remaining expressiveness gap. If you have a decent concurrency and
reactivity model, you should even be able to abstract and compose IOC
frameworks as though they were normal objects.


Re: [fonc] languages

2011-06-06 Thread BGB

On 6/6/2011 6:05 PM, David Barbour wrote:



On Sat, Jun 4, 2011 at 10:44 AM, Julian Leviston jul...@leviston.net wrote:


Is a language I program in necessarily limiting in its expressibility?


Yes. All communication architectures are necessarily limiting in their 
expressiveness (in the sense defined by Matthias Felleisen). For 
 example, one can't easily introduce reactivity, concurrency, and 
 constraint models to languages not already designed for them. Even 
 with extensible syntax, you might be forced to re-design, 
 re-implement, and re-integrate all the relevant libraries and services 
from scratch to take advantage of a new feature. Limitations aren't 
always bad things, though (e.g. when pursuing security, scalability, 
safety, resilience, modularity, extensibility, optimizations). We can 
benefit greatly from favoring 'principle of least power' in our 
language designs.


interesting... the principle of least power is something I hadn't 
really thought about previously...





Is there an optimum methodology of expressing algorithms (ie
nomenclature)? 



No. From Kolmogorov complexity and pigeon-hole principles, we know 
that any given language must make tradeoffs in how efficiently it 
expresses a given behavior.  The language HQ9+ shows us that we can 
(quite trivially) optimize expression of any given behavior by 
tweaking the language. Fortunately, there are a lot of 'useless' 
algorithms that we'll never need to express. Good language design is 
aimed at optimizing, abstracting, and refactoring expression of 
useful, common behaviors, even if at some expense to rare or less 
useful behaviors.


yeah...

I think many mainstream languages show this property, as they will often 
be specialized for some sets of tasks, but more far reaching features 
(ability to extend the syntax or core typesystem, ...) are generally absent.





Is there a good or bad way of expressing intent?


There are effective and ineffective ways of expressing intent.

We certainly want to minimize boiler-plate and noise. If our languages 
impose semantic properties (such as ordering of a collection) where we 
intend none, we have semantic noise. If our languages impose syntactic 
properties (such as  semicolons) where they have no meaning to the 
developer, we have syntactic noise. If our languages fail to abstract 
or refactor some pattern, we get boiler-plate (and recognizable 
'design patterns').


But we also don't want to sacrifice performance, security, modularity, 
et cetera. So sometimes we take a hit on how easily we can express intent.




yeah...

usually with semicolons, it is either semicolons or significant 
line-breaks (or heuristics which try to guess whether a break was intended).

semicolons then are the lesser of the evils.

granted, yes, one wouldn't need either if the syntax were designed in a 
way where statements and expressions were naturally self-terminating, 
however, with common syntax design, this is often not the case, and so 
extra symbols are needed mostly as separators or to indicate the 
syntactic structure.




is there a way of patterning a language of programming such that
it can extend itself infinitely, innately?


Yes. But you must sacrifice various nice properties (e.g. performance, 
securability, modularity, composition) to achieve it.


If you're willing to sacrifice ad-hoc extension of cross-cutting 
features (e.g. reactivity, concurrency, failure handling, auditing, 
resource management) you can still achieve most of what you want, and 
embed a few frameworks and EDSLs (extensible syntax or partial 
evaluation) to close the remaining expressiveness gap. If you have a 
decent concurrency and reactivity model, you should even be able to 
abstract and compose IOC frameworks as though they were normal objects.




yep, and often a lot of this isn't terribly useful in practice.

and, likewise, a lot of advanced functionality can be added more narrowly:
API functionality;
special purpose attributes or modifiers;
...


personally, I keep around a few high power concepts, but these are far 
less than I could do.


for example, I had gotten in arguments with someone before about my 
languages' lack of macro facilities or user-defined syntax extensions 
(or, at least, in-language syntax extensions).


this was partly because both would open up additional and somewhat more 
awkward issues, for example, macros (in the Common Lisp sense) could 
risk exposing an uncomfortable number of implementation details. 
likewise goes for extensible syntax.


some basic amount of extension is possible though mostly by registering 
callbacks, and at most levels of the tower it is possible to register 
new callbacks for new functionality (this is actually how a fair amount 
of the VM itself is implemented).
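
for example, a rough sketch of the basic pattern (the names here are made up 
purely for illustration, with JS used as a stand-in notation):

// the core keeps a table of handlers keyed by operation name and type tag;
// outside code extends behavior by registering into the table, rather than
// by modifying the core itself
var handlers = {};

function registerHandler(op, type, fn) {
    handlers[op + ':' + type] = fn;
}

function dispatch(op, type) {
    var fn = handlers[op + ':' + type];
    if (!fn) throw new Error('no handler for ' + op + ' on ' + type);
    return fn.apply(null, Array.prototype.slice.call(arguments, 2));
}

// built-in behavior gets registered the same way user extensions would be
registerHandler('print', 'fixnum', function (x) { return String(x); });
registerHandler('print', 'symbol', function (x) { return "'" + x.name; });

// later, new functionality is bolted on without touching the core
registerHandler('print', 'vec2', function (v) { return '(' + v.x + ', ' + v.y + ')'; });

console.log(dispatch('print', 'vec2', { x: 1, y: 2 }));  // prints "(1, 2)"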


most things are generally in a form more like "how do I perform 
operation X given Y?", so lots of registering handlers for various 
operations, and 

Re: [fonc] languages

2011-06-05 Thread Florin Mateoc
I would object to the claim that complaints about the non-standard precedence 
are somehow characteristic of the culture of "if("

The clash is with math, not with another programming language. And it clashes 
with a well-established convention in math, therefore with the 
readability/expressibility of math formulae. Of course, a mathematician can 
agree that these are only conventions, but a mathematician already thinks in a 
highly abstract way, whereas for Smalltalk the argument was made that this 
would 
somehow help kids think more abstractly and better get the concept of 
precedence. But kids do not think abstractly. Furthermore, the precedence of + 
and * is not so much about the behavior of arithmetic operators. Regardless of 
what mathematical structure we define them over, they are the very operators used to 
define the notion of distributivity, closely related to the notion of 
precedence. Saying that "addition distributes over multiplication" instead of 
"multiplication distributes over addition" is not more abstract; it just 
confuses the notions. We might as well start writing Smalltalk with lower caps 
as the first letter in all identifiers followed by all caps. aND cLAIM tHAT 
tHIS 
wILL hELP kIDS  tHINK mORE aBSTRACTLY aND sEE tHAT tHE wAY wE cAPITALIZE iS 
oNLY 
a cONVENTION.
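
To make the clash concrete, take a small expression and read it both ways:

2 + 3 * 4
conventional precedence:             2 + (3 * 4) = 14
Smalltalk-80 (strict left-to-right): (2 + 3) * 4 = 20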

As for the other operators, if they do not have some pre-defined/embedded 
precedence, they might as well use left to right. But the few of them that do 
would have warranted an exception.
I was thinking recently that operators would actually be a perfect use case for 
multimethods in Smalltalk, and that the precedence problem is more a 
consequence 
of the lack of multimethods. Anyway for numbers the matter of behavior 
responsibility is also questionable. And since binary selectors are already 
recognized as different in the language, implementing operators as multimethods 
could be done with minimal impact for readability. This approach could even be 
extended to support multikeyword multimethods by having the first keyword start 
with a binary selector character.
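
The dispatch itself would not be exotic; sketched outside Smalltalk (in 
JavaScript, with invented names) it is just a lookup keyed on both operand 
types:

var plusTable = {};

function definePlus(typeA, typeB, fn) {
    plusTable[typeA + ',' + typeB] = fn;
}

function plus(a, b) {
    var fn = plusTable[a.type + ',' + b.type];
    if (!fn) throw new Error('no + defined for ' + a.type + ',' + b.type);
    return fn(a, b);
}

// each pairing gets its own method, so mixed-type cases are spelled out
// instead of being pushed through coercion protocols on one receiver
definePlus('int',   'int',   function (a, b) { return { type: 'int',   val: a.val + b.val }; });
definePlus('int',   'float', function (a, b) { return { type: 'float', val: a.val + b.val }; });
definePlus('float', 'int',   function (a, b) { return { type: 'float', val: a.val + b.val }; });

console.log(plus({ type: 'int', val: 2 }, { type: 'float', val: 3.5 }));  // { type: 'float', val: 5.5 }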

Best,
Florin





From: Alan Kay alan.n...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sat, June 4, 2011 2:46:33 PM
Subject: Re: [fonc] languages


Smalltalk was certainly not the first attempt -- and -- the most versatile 
Smalltalk in this area was the first Smalltalk and also the smallest.


I personally think expressibility is not just semantic, but also syntactic. 
Many 
different styles of programming have been realized in Lisp, but many to most 
of them suffer from the tradeoffs of the uniform parentheses bound notation 
(there are positive aspects of this also because the uniformity does remove one 
kind of mystery). 


The scheme that Dan Ingalls devised for the later Smalltalks overlapped with 
Lisp's, because Dan wanted a programmer to be able to parse any Smalltalk 
program at sight, no matter how much the semantics had been extended.  
Similarly, there was a long debate about whether to put in normal precedence 
for the common arithmetic operators. The argument that won was based on the APL 
argument that if you have lots of operators then precedence just gets 
confusing, 
so just associate to the right or left. However, one of the big complaints 
about 
Smalltalk-80 from the culture that thinks parentheses are a good idea after 
if, is that it has a non-standard precedence for + and * 

A more interesting tradeoff perhaps is that between Tower of Babel and local 
high expressibility -- for example, when you decide to try lots of DSLs (as in 
STEPS). Each one has had the virtue of being very small and very clear about 
what is being expressed. At the meta level, the OMeta definitions of the syntax 
part are relatively small and relatively clear. But I think a big burden does 
get placed on the poor person from outside who is on the one hand presented 
with 
not a lot of code that  does do a lot, but they have to learn 4 or 5 
languages 
to actually understand it.

People of my generation (50 years ago) were used to learning and using many 
syntaxes (e.g. one might learn as many as 20 or more machine code/assembler 
languages, plus 10 or more HLLs, both kinds with more variability in form and 
intent than today). Part of this could have stemmed from the high percentage of 
math people involved in computing back then -- part of that deal is learning 
to handle many kinds of mathematical notations, etc. 


Things seem different today for most programmers.

In any case, one of the things we learned from Smalltalk-72 is that even good 
language designers tend to create poor extensions during the heat of 
programming 
and debugging. And that an opportunity for cohesion in an extensible language 
is 
rarely seized. (Consider just how poor is the cohesion in a much smaller part 
of 
all this -- polymorphism -- even though it is of  great benefit to everyone to 
have really strong (and few

Re: [fonc] languages

2011-06-05 Thread Alan Kay
Yep, and yep

Cheers,

Alan





From: Florin Mateoc fmat...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 3:51:23 PM
Subject: Re: [fonc] languages


But wasn't APL called a write-only language, which would make it in a way a 
polar opposite of Smalltalk?

I agree that it is not about consequences of message sending. And, while I 
also agree that uniformity/simplicity are also virtues, I think it is more 
useful to explicitly state that there are things which are truly different. 
Especially in an object system which models the world. Numbers would be in that 
category, they deserve to be treated specially. In the same vein, I think 
mathematical operators deserve special treatment, and not just from an under 
the covers, optimization point of view.

Thank you,
Florin





From: Alan Kay alan.n...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 2:30:23 PM
Subject: Re: [fonc] languages


Check out APL, designed by a very good mathematician, to see why having no 
special precedences has merit in a language with lots of operators.

However, I think that we should have used the standard precedences in 
Smalltalk. 
Not from the math argument, or from a kids argument, but just because most 
conventional routes deposit the conventions on the travelers.

The arguments either way don't have much to do with  consequences of message 
sending because what can be sent as a canonical form could be the abstract 
syntax packaging. Prolog had an idea -- that we thought about to some extent -- 
of being able to specify right and left precedences, but this was rejected as 
leading to real needless complexities.

Cheers,

Alan





From: Florin Mateoc fmat...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 11:17:04 AM
Subject: Re: [fonc] languages


I would object to the claim that complaints about the non-standard precedence 
are somehow characteristic to the culture of if(

The clash is with math, not with another programming language. And it clashes 
with a well-established convention in math, therefore with the 
readability/expressibility of math formulae. Of course, a mathematician can 
agree that these are only conventions, but a mathematician already thinks in a 
highly abstract way, whereas for Smalltalk the argument was made that this 
would 
somehow help kids think more abstractly and better get the concept of 
precedence. But kids do not think abstractly. Furthermore, the precedence of + 
and * is not so much about the behavior of arithmetic operators. Regardless 
over 
what mathematical structure we define them, they are the very  operators used 
to 
define the notion of distributivity, closely related to the notion of 
precedence. Saying that addition distributes over multiplication instead of 
multiplication distributes over addition is not more abstract, it just 
confuses the notions. We might as well start writing Smalltalk with lower caps 
as the first letter in all identifiers followed by all caps. aND cLAIM tHAT 
tHIS 
wILL hELP kIDS  tHINK mORE aBSTRACTLY aND sEE tHAT tHE wAY wE cAPITALIZE iS 
oNLY 
a cONVENTION.

As for the other operators, if they do not have some pre-defined/embedded 
precedence, they might as well use left to right. But the few of them that do 
would have warranted an exception.
I was thinking recently that operators would actually be a perfect use case for 
multimethods in Smalltalk, and that the precedence problem is more a 
consequence 
of the lack of multimethods. Anyway for numbers the matter of behavior 
responsibility is also questionable. And  since binary selectors are already 
recognized as different in the language, implementing operators as multimethods 
could be done with minimal impact for readability. This approach could even be 
extended to support multikeyword multimethods by having the first keyword start 
with a binary selector character.

Best,
Florin





From: Alan Kay alan.n...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sat, June 4, 2011 2:46:33 PM
Subject: Re: [fonc] languages


Smalltalk was certainly not the first attempt -- and -- the most versatile 
Smalltalk in this area was the first Smalltalk and also the smallest.


I personally think expressibility is not just semantic, but also syntactic. 
Many 
different styles of programming have been realized in Lisp, but many to most 
of them suffer from the tradeoffs of the uniform parentheses bound notation 
(there are positive aspects of this also because the uniformity does remove one 
kind of mystery). 


The scheme that Dan Ingalls devised for the later Smalltalks overlapped with 
Lisp's, because Dan wanted a programmer to be able to parse any Smalltalk 
program at sight, no matter how much the semantics had been extended.  
Similarly, there was a long debate

Re: [fonc] languages

2011-06-05 Thread BGB
 Alan Kay alan.n...@yahoo.com wrote:

Yep, and yep

Cheers,

Alan


From: Florin Mateoc fmat...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 3:51:23 PM
Subject: Re: [fonc] languages

But wasn't APL called a write-only language, which would make it in a way
a polar opposite of Smalltalk?

I agree that it is not about consequences of message sending. And, while I
also agree that uniformity/simplicity are also virtues, I think it is more
useful to explicitly state that there are things which are truly
different. Especially in an object system which models the world. Numbers
would be in that category, they deserve to be treated specially. In the
same vein, I think mathematical operators deserve special treatment, and
not just from an under the covers, optimization point of view.

Thank you,
Florin


From: Alan Kay alan.n...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 2:30:23 PM
Subject: Re: [fonc] languages

Check out APL, designed by a very good mathematician, to see why having no
special precedences has merit in a language with lots of operators.

However, I think that we should have used the standard precedences in
Smalltalk. Not from the math argument, or from a kids argument, but just
because most conventional routes deposit the conventions on the travelers.

The arguments either way don't have much to do with  consequences of
message sending because what can be sent as a canonical form could be the
abstract syntax packaging. Prolog had an idea -- that we thought about to
some extent -- of being able to specify right and left precedences, but this
was rejected as leading to real needless complexities.

Cheers,

Alan


From: Florin Mateoc fmat...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 11:17:04 AM
Subject: Re: [fonc] languages

I would object to the claim that complaints about the non-standard
precedence are somehow characteristic to the culture of if(

The clash is with math, not with another programming language. And it
clashes with a well-established convention in math, therefore with the
readability/expressibility of math formulae. Of course, a mathematician can
agree that these are only conventions, but a mathematician already thinks in
a highly abstract way, whereas for Smalltalk the argument was made that this
would somehow help kids think more abstractly and better get the concept of
precedence. But kids do not think abstractly. Furthermore, the precedence of
+ and * is not so much about the behavior of arithmetic operators.
Regardless over what mathematical structure we define them, they are the
very operators used to define the notion of distributivity, closely related
to the notion of precedence. Saying that addition distributes over
multiplication instead of multiplication distributes over addition is not
more abstract, it just confuses the notions. We might as well start writing
Smalltalk with lower caps as the first letter in all identifiers followed by
all caps. aND cLAIM tHAT tHIS wILL hELP kIDS tHINK mORE aBSTRACTLY aND sEE
tHAT tHE wAY wE cAPITALIZE iS oNLY a cONVENTION.

As for the other operators, if they do not have some pre-defined/embedded
precedence, they might as well use left to right. But the few of them that
do would have warranted an exception.
I was thinking recently that operators would actually be a perfect use case
for multimethods in Smalltalk, and that the precedence problem is more a
consequence of the lack of multimethods. Anyway for numbers the matter of
behavior responsibility is also questionable. And since binary selectors are
already recognized as different in the language, implementing operators as
multimethods could be done with minimal impact for readability. This
approach could even be extended to support multikeyword multimethods by
having the first keyword start with a binary selector character.

Best,
Florin


From: Alan Kay alan.n...@yahoo.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sat, June 4, 2011 2:46:33 PM
Subject: Re: [fonc] languages

Smalltalk was certainly not the first attempt -- and -- the most versatile
Smalltalk in this area was the first Smalltalk and also the smallest.

I personally think expressibility is not just semantic, but also syntactic.
Many different styles of programming have been realized in Lisp, but many
to most of them suffer from the tradeoffs of the uniform parentheses bound
notation (there are positive aspects of this also because the uniformity
does remove one kind of mystery).

The scheme that Dan Ingalls devised for the later Smalltalks overlapped with
Lisp's, because Dan wanted a programmer to be able to parse any Smalltalk
program at sight, no matter how much the semantics had been extended.
Similarly, there was a long debate about whether to put in normal
precedence

Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread David Leibs
I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  

Alan Perlis quote:  A language that doesn't affect the way you think about 
programming, is not worth knowing.

There is some old analysis out there that indicates that APL is naturally very 
parallel.  Willhoft-1991 claimed that 94 of the 101 primitive operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 

R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
(1991), no. 4, 498–512.


-David Leibs



Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread Alan Kay
Hi David

I've always been very fond of APL also -- and a slightly better and more 
readable syntax could be devised these days now that things don't have to be 
squeezed onto an IBM Selectric golfball ...

Cheers,

Alan





From: David Leibs david.le...@oracle.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 7:06:55 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  

Alan Perlis quote:  A language that doesn't affect the way you think about 
programming, is not worth knowing.

There is some old analysis out there that indicates that APL is naturally very 
parallel.  Willhoft-1991 claimed that  94 of the 101 primitives operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 

R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
(1991), no. 4, 498–512.


-David Leibs


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread Alan Kay
I think this one was derived from Phil Abrams' Stanford (and SLAC) PhD thesis 
on 
dynamic analysis and optimization of APL -- a very nice piece of work! (Maybe 
in 
the early 70s or late 60s?)

Cheers,

Alan





From: David Pennell pennell.da...@gmail.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 7:33:40 PM
Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

HP had a version of APL in the early 80's that included structured 
conditional 
statements and where performance didn't depend on cramming your entire program 
into one line of code.  Between the two, it was possible to create reasonably 
readable code.  That version of APL also did some clever performance 
optimizations by manipulating array descriptors instead of just using brute force.

APL was the first language other than Fortran that I learned - very eye opening.


-david


On Sun, Jun 5, 2011 at 9:13 PM, Alan Kay alan.n...@yahoo.com wrote:

Hi David

I've always been very fond of APL also -- and a slightly better and more 
readable syntax could be devised these days now that things don't have to be 
squeezed onto an IBM Selectric golfball ...

Cheers,

Alan





 From: David Leibs david.le...@oracle.com
To: Fundamentals of New Computing fonc@vpri.org
Sent: Sun, June 5, 2011 7:06:55 PM
Subject: Re:  Terseness, precedence, deprogramming (was Re: [fonc] languages)


I love APL!  Learning APL is really all about learning the idioms and how to 
apply them.  This takes quite a lot of training time.   Doing this kind of 
training will change the way you think.  


Alan Perlis quote:  A language that doesn't affect the way you think about 
programming, is not worth knowing.


There is some old analysis out there that indicates  that APL is naturally 
very 
parallel.  Willhoft-1991 claimed that  94 of the 101 primitives operations in 
APL2 could be implemented in parallel and that 40-50% of APL code in real 
applications was naturally parallel. 


R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30 
(1991), no. 4, 498–512.




-David Leibs




Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread David Harris
Alan-

I expect you lost a few readers there.  I have fond memories of APL on an
IBM 360/145 with APL microcode support and Selectric terminals.

David


On Sun, Jun 5, 2011 at 7:13 PM, Alan Kay alan.n...@yahoo.com wrote:

 Hi David

 I've always been very fond of APL also -- and a slightly better and more
 readable syntax could be devised these days now that things don't have to be
 squeezed onto an IBM Selectric golfball ...

 Cheers,

 Alan

 --
 From: David Leibs david.le...@oracle.com
 To: Fundamentals of New Computing fonc@vpri.org
 Sent: Sun, June 5, 2011 7:06:55 PM
 Subject: Re: Terseness, precedence, deprogramming (was Re: [fonc]
 languages)

 I love APL!  Learning APL is really all about learning the idioms and how
 to apply them.  This takes quite a lot of training time.   Doing this kind
 of training will change the way you think.

 Alan Perlis quote:  A language that doesn't affect the way you think about
 programming, is not worth knowing.

 There is some old analysis out there that indicates that APL is naturally
 very parallel.  Willhoft-1991 claimed that  94 of the 101 primitives
 operations in APL2 could be implemented in parallel and that 40-50% of APL
 code in real applications was naturally parallel.

 R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 30
 (1991), no. 4, 498–512.


 -David Leibs




Language in Test (was Re: [fonc] languages)

2011-06-05 Thread Casey Ransberger
I'm actually not talking about the potty mouths:)

APL is up there on my list now, but it hasn't knocked Prolog out of the top 
slot. 

I've done a bunch of test automation. I really enjoy testing because on a good 
day it can approach something reminiscent of science, but OTOH the test code I 
ended up wrangling (often my own code) wound up the worst sort of mess, for a 
few reasons. Not-test code that I've worked on or composed myself has always 
been a lot better, for reasons I don't totally understand yet. 

I can toss some guesses out there:

One is that people who do automation are often outnumbered 5:1 or worse by the 
people making the artifacts under examination, such that there's too much to do 
in order to do anything very well.

Another is, testing often fails to strike decision makers as important enough 
to invest much in, so you end up hacking your way around blocking issues with 
the smallest kludge you can think of, instead of instrumenting proper hooks 
into the thing under test, which usually takes a little longer, and risks 
further regression in the context of a release schedule. 

Things I learned from Smalltalk and Lisp have been really useful for reducing 
the amount of horror in the test code I've worked on, but it's still kind of 
bad. Actually I was inspired to look for an EDSL in my last adventure that 
would cut down on the cruft in the test code there, which was somewhat inspired 
by STEPS, and did seem to actually help quite a bit. 

Use of an EDSL proved very useful, in part just because most engineering orgs 
I've been in don't seem to want to let me use Lisp. 

Being able to claim honestly that I'd implemented what I'd done in Ruby seemed 
to help justify the unorthodox approach to my managers. I did go out of my way 
to explain that what I'd done was compose a domain specific language, and this 
did not seem to get the same kind of resistance, just a few raised eyebrows and 
funny looks. 
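
The general shape of it, sketched here in JavaScript rather than the Ruby I 
actually used (all of the names below are invented for illustration):

// a tiny embedded test DSL: plain functions and data, no custom parser
function scenario(name, steps) {
    var ctx = {};
    steps.forEach(function (step) { step(ctx); });
    console.log('ok: ' + name);
}

// each "word" of the DSL is just a function that returns a step
function login(user) {
    return function (ctx) { ctx.user = user; };
}
function addToCart(item) {
    return function (ctx) { (ctx.cart = ctx.cart || []).push(item); };
}
function expectCartSize(n) {
    return function (ctx) {
        if ((ctx.cart || []).length !== n) throw new Error('expected cart size ' + n);
    };
}

// the test itself reads close to the domain instead of the plumbing
scenario('guest adds one item', [
    login('guest'),
    addToCart('book'),
    expectCartSize(1)
]);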

I keep getting the feeling that the best tool for the job of encoding and 
running invariants might be a Prolog, and so this one is currently at the top 
of my list of things to understand deeply.  

Anyone else here ever deal with a bunch of automation? Ever find a way to apply 
what you know about programming languages themselves in the context of software 
verification? Because I would *love* to hear about that!

On Jun 5, 2011, at 7:06 PM, David Leibs david.le...@oracle.com wrote:

 I love APL!  Learning APL is really all about learning the idioms and how to 
 apply them.  This takes quite a lot of training time.   Doing this kind of 
 training will change the way you think.  
 
 Alan Perlis quote:  A language that doesn't affect the way you think about 
 programming, is not worth knowing.

I love this quote. Thanks for your 

(snip)

 
 -David Leibs
 


Re: Terseness, precedence, deprogramming (was Re: [fonc] languages)

2011-06-05 Thread BGB

On 6/5/2011 7:06 PM, David Leibs wrote:
I love APL!  Learning APL is really all about learning the idioms and 
how to apply them.  This takes quite a lot of training time.   Doing 
this kind of training will change the way you think.


Alan Perlis quote:  A language that doesn't affect the way you think 
about programming, is not worth knowing.




not everyone wants to learn new things though.

very often, people want to get the job done as their main priority, and 
being faced with new ideas or new ways of doing things is a hindrance to 
the maximization of productivity (or, people may see any new/unfamiliar 
things as inherently malevolent).


granted, these are not the sort of people one is likely to find using a 
language like APL...



a lot depends on who one's market or target audience is, and whether or 
not they like the thing in question. if it is the wrong product for the 
wrong person, one isn't going to make a sale (and people neither like 
having something being forced on them, nor buying or committing 
resources to something which is not to to their liking, as although 
maybe not immediate, this will breed frustration later...).


it doesn't mean either the product or the potential customer is bad, 
only that things need to be matched up.


it is like, you don't put sugar in the coffee of someone who likes their 
coffee black.



There is some old analysis out there that indicates that APL is 
naturally very parallel.  Willhoft-1991 claimed that  94 of the 101 
primitives operations in APL2 could be implemented in parallel and 
that 40-50% of APL code in real applications was naturally parallel.


R. G. Willhoft, Parallel expression in the apl2 language, IBM Syst. J. 
30 (1991), no. 4, 498–512.




not personally dealt with APL, so I don't know.

I sort of like having the ability to write code with asynchronous 
operations, but this is a little different. I guess a task for myself 
would be to determine whether or not what I am imagining as 'async' is 
equivalent to the actor model, hmm...
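
a rough sketch of the distinction I am wondering about (made-up example, 
using JS-style notation):

// "async" as I usually mean it: kick off operations and let them finish
// whenever, with nothing serializing access to shared state in between,
// e.g.:  startLoadTexture("foo.png", function(tex) { world.skin=tex; });

// actor-ish version: the object owns its state and handles one message at
// a time from a queue, so state is only ever touched between messages.
function makeActor(state, behavior) {
    var queue = [], busy = false;
    function pump() {
        while (queue.length) { behavior(state, queue.shift()); }
        busy = false;
    }
    return {
        send: function (msg) {
            queue.push(msg);
            if (!busy) { busy = true; setTimeout(pump, 0); }
        }
    };
}

var counter = makeActor({ n: 0 }, function (st, msg) {
    if (msg === 'inc') st.n++;
    if (msg === 'show') console.log(st.n);
});
counter.send('inc'); counter.send('inc'); counter.send('show');  // prints 2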


decided to leave out a more complex elaboration.

