I'd like to say I agree with Alan on the syntactic component, not just in terms of absolute expressibility, but also in the way programmers think about how to go about a task (and that's one reason I'm not a fan of Perl for large-scale programming - reading other people's code is made even harder if there are many ways to skin a cat!). DSLs in fact help focus on the task at hand. I guess we do think modally - look at iPhone apps: the reason they are popular with the general public is that they make tasks more modal, i.e. simpler.
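(As a toy illustration of that point - hypothetical names, sketched in plain Python rather than any real DSL toolkit - an embedded DSL exposes only the few verbs a task needs, so there tends to be one obvious way to say each thing:)

```python
# Toy embedded DSL for describing a build pipeline - hypothetical names,
# just to show the shape: a handful of chainable verbs and nothing else.

class Pipeline:
    def __init__(self):
        self.steps = []

    def compile(self, target):
        self.steps.append(f"compile {target}")
        return self  # chaining keeps the "sentence" reading left to right

    def test(self, suite):
        self.steps.append(f"test {suite}")
        return self

    def describe(self):
        return " -> ".join(self.steps)

print(Pipeline().compile("core").test("unit").describe())
# prints: compile core -> test unit
```

A reader only has to know those three verbs to read any pipeline written in it - which is exactly the "fewer ways to skin a cat" property.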
Not every language is Turing-complete, i.e. will allow you full expressibility; however, even with Turing-complete languages you usually end up in a Turing tarpit if you step far outside the language's strengths, i.e. it's so complex to implement that it's not worth the effort. DSLs are the natural answer again, as per STEPS - write your DSLs in the Turing-complete language of your choice.

I don't know if there's an optimum way of expressing algorithms - you'll find that some algorithms are much easier to express and think about in a functional language than in, say, an imperative one (and vice versa). Expressing intent is an interesting one, as I'd advocate that it includes documentation (e.g. Javadoc) and versioning (i.e. feature creep). You can certainly get languages that express intent: for example, with Maude (equational programming) your mathematical spec IS the program. Donald Knuth's literate programming is an attempt to meld documentation with the language (personally I find it a bit wrong-headed in its prefer-documentation-over-code approach, but it is certainly unique).

Lisp, Smalltalk and Maude are good examples of cleanly extensible languages - what you're talking about is clean reflection, something that can only be done with dynamic languages. Interestingly, Maude does pattern matching for both equational and non-equational programming, which kind of indicates that there are at least practical limitations to a purely equational approach.

OMeta is really an example of a meta-language, i.e. it converts BETWEEN languages. I'm guessing it's not Turing-complete on its own, as it relies on a host language to do some of the heavy lifting once it's found a syntax match. It is semantically restricted to deterministic, i.e. non-ambiguous, language syntax, which is by and large what you want. Not so good for natural-language modelling, though.
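To make the functional-versus-imperative point concrete, here's the same job (sorting) sketched both ways - in plain Python for neutrality, not any particular language's idiom. The recursive version reads almost like the equational spec of sorting; the loop version spells out the machine steps with mutable state:

```python
def qsort_functional(xs):
    # Reads like the mathematical definition:
    # sort(xs) = sort(smaller) ++ [pivot] ++ sort(larger)
    if not xs:
        return []
    pivot, rest = xs[0], xs[1:]
    return (qsort_functional([x for x in rest if x < pivot])
            + [pivot]
            + qsort_functional([x for x in rest if x >= pivot]))

def sort_imperative(xs):
    # In-place insertion sort: the "how", step by step.
    xs = list(xs)
    for i in range(1, len(xs)):
        key = xs[i]
        j = i - 1
        while j >= 0 and xs[j] > key:
            xs[j + 1] = xs[j]  # shift larger elements rightwards
            j -= 1
        xs[j + 1] = key
    return xs

data = [3, 1, 4, 1, 5, 9, 2, 6]
print(qsort_functional(data) == sort_imperative(data) == sorted(data))
# prints: True
```

Neither is "more expressive" in the Turing sense - they compute the same function - but each makes a different aspect of the algorithm easy to think about.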
On 4 June 2011 19:46, Alan Kay <[email protected]> wrote:

> Smalltalk was certainly not the first attempt -- and -- the most versatile Smalltalk in this area was the first Smalltalk and also the smallest.
>
> I personally think expressibility is not just semantic, but also syntactic. Many different styles of programming have been realized in Lisp, but "many to most" of them suffer from the tradeoffs of the uniform parentheses bound notation (there are positive aspects of this also because the uniformity does remove one kind of mystery).
>
> The scheme that Dan Ingalls devised for the later Smalltalks overlapped with Lisp's, because Dan wanted a programmer to be able to parse any Smalltalk program at sight, no matter how much the semantics had been extended. Similarly, there was a long debate about whether to put in "normal" precedence for the common arithmetic operators. The argument that won was based on the APL argument that if you have lots of operators then precedence just gets confusing, so just associate to the right or left. However, one of the big complaints about Smalltalk-80 from the culture that thinks parentheses are a good idea after "if", is that it has a non-standard precedence for + and * ....
>
> A more interesting tradeoff perhaps is that between Tower of Babel and local high expressibility -- for example, when you decide to try lots of DSLs (as in STEPS). Each one has had the virtue of being very small and very clear about what is being expressed. At the meta level, the OMeta definitions of the syntax part are relatively small and relatively clear. But I think a big burden does get placed on the poor person from outside who is on the one hand presented with "not a lot of code that does do a lot", but they have to learn 4 or 5 languages to actually understand it.
>
> People of my generation (50 years ago) were used to learning and using many syntaxes (e.g. one might learn as many as 20 or more machine code/assembler languages, plus 10 or more HLLs, both kinds with more variability in form and intent than today). Part of this could have stemmed from the high percentage of "math people" involved in computing back then -- part of that deal is learning to handle many kinds of mathematical notations, etc.
>
> Things seem different today for most programmers.
>
> In any case, one of the things we learned from Smalltalk-72 is that even good language designers tend to create poor extensions during the heat of programming and debugging. And that an opportunity for cohesion in an extensible language is rarely seized. (Consider just how poor is the cohesion in a much smaller part of all this -- polymorphism -- even though it is of great benefit to everyone to have really strong (and few) polymorphisms.)
>
> Cheers,
>
> Alan
>
> ------------------------------
> *From:* Julian Leviston <[email protected]>
> *To:* Fundamentals of New Computing <[email protected]>
> *Sent:* Sat, June 4, 2011 10:44:07 AM
> *Subject:* [fonc] languages
>
> Hi,
>
> Is a language I program in necessarily limiting in its expressibility?
>
> Is there an optimum methodology of expressing algorithms (ie nomenclature)? Is there a good or bad way of expressing intent? Are there any intent languages in existence? Are there any pattern or algorithm languages? Is a programming language necessarily these two combined?
>
> These are the questions I've been finding myself pondering lately.
>
> For example, expressing object oriented concepts and patterns in C, while possible, proves rather "uncomfortable". Some things are almost impossible unless one "builds a world" inside C, but this is essentially building another language and using C as the meta-platform for this language, no? This would have to do with the fact that the design intent of the language didn't have this as its original intent, surely?
>
> Is there a way of patterning a language of programming such that it can extend itself infinitely, innately? Was smalltalk the first attempt at this? Does it fail by being too "large" in structural organisation?
>
> In other words, would a "language" (or exploratory platform for programming) inherently require being "ridiculously simple" in terms of its structure in order to fully be able to represent any other "language" (or rather than language, simply more complicated structures) clearly?
>
> Is Ometa an example of this?
>
> Julian.
> _______________________________________________
> fonc mailing list
> [email protected]
> http://vpri.org/mailman/listinfo/fonc
_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc
