On 9/24/2014 6:39 PM, David Leibs wrote:
I think Stephen is misrepresenting the Wolfram Language when he says it is a big language. He is really talking about the built-in library, which is indeed huge. The language proper is actually simple, powerful, and lispy.
-David


I think it is partly a matter of size along two axes:
features built into the language core and the language's core syntax;
features that can be built on top of the language via libraries and extensibility mechanisms.

a lot of mainstream languages have tended to be bigger in terms of built-in features and basic syntax (ex: C++ and C#); a lot of other languages have leaned more on extensibility features, with less distinction between library code and the core language.

of course, if a language generally has neither, it tends to be regarded as a "toy language".

more so if the implementation lacks the scalability to support a reasonably sized set of library facilities (say, for example, if it always loads from source and loaded code carries a relatively high overhead).


sometimes, it isn't as clear-cut as "apparent complexity" == "implementation complexity".

for example, a more complex-looking language can reduce to a simpler underlying architecture (say, when much of the language is just syntax sugar); OTOH, a simple-looking language can have a considerably more complicated implementation (say, because a lot of complex analysis and internal machinery is needed to make it work acceptably).
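as a concrete illustration of the "largely syntax sugar" case (my sketch, in Python): a for-loop looks like a dedicated language feature, but the implementation reduces it to calls on the iterator protocol, so the visible construct adds almost nothing to the core.

```python
# A for-loop such as:
#     for x in xs:
#         total += x
# is sugar for explicit iterator-protocol calls:

def sum_desugared(xs):
    total = 0
    it = iter(xs)           # loop setup: obtain an iterator
    while True:
        try:
            x = next(it)    # advance; StopIteration signals the end
        except StopIteration:
            break
        total += x          # the original loop body
    return total

print(sum_desugared([1, 2, 3, 4]))  # → 10
```

the surface syntax is richer than the machinery underneath; the compiler just rewrites one into the other.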

in many cases, the way things are represented in the high-level language vs nearer the underlying implementation may be somewhat different, so the representational complexity may be reduced at one point and expanded at another.


another related factor I have seen is whether the library API design focuses more on core abstractions and building things from these, or on a large number of specific use-cases. for example, Java having classes for nearly every way a person might want to read/write a file, as opposed to, say, a more generic multipurpose IO interface.
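the contrast can be sketched in a few lines (hypothetical function names, and Python rather than Java, purely for brevity):

```python
import io

# use-case-oriented style: one entry point per anticipated scenario
# (hypothetical names, echoing the spread of Java's FileReader /
#  BufferedReader / Scanner / Files.readAllLines / ...)
def read_text_file(path):
    with open(path, "r") as f:
        return f.read()

def read_file_lines(path):
    with open(path, "r") as f:
        return f.readlines()

# abstraction-oriented style: one generic operation over anything readable;
# files, sockets, and in-memory buffers all reuse the same code path
def read_all(stream):
    return stream.read()

print(read_all(io.StringIO("hello")))  # works on an in-memory stream
```

the first style enumerates scenarios up front; the second composes, since anything with a `read()` method can flow through the one generic interface.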


generally, though, complexity has tended to be less of an issue than utility and performance. for most things, it is preferable to have a more useful language, even at the cost of a more complex compiler, at least up to the point where the added complexity outweighs any marginal gains in utility or performance.

where is this point exactly? it is subject to debate.


On Sep 24, 2014, at 3:32 PM, Reuben Thomas <r...@sc3d.org> wrote:

On 24 September 2014 23:20, Tim Olson <tim_ol...@att.net> wrote:

    Interesting talk by Stephen Wolfram at the Strange Loop conference:

    https://www.youtube.com/watch?v=EjCWdsrVcBM

    He goes in the direction of creating a "big" language, rather
    than a small kernel that can be built upon, like Smalltalk, Maru,
    etc.


Smalltalk and Maru are rather different: Ian Piumarta would argue, I suspect, that the distinction between "small" and "large" languages is an artificial one imposed by most languages' inability to change their syntax. Smalltalk can't, but Maru can. Here we see Ian making Maru understand Smalltalk, ASCII state diagrams, and other things:

https://www.youtube.com/watch?v=EGeN2IC7N0Q

That's the sort of small kernel you could build Wolfram on.

Racket is a production-quality example of the same thing: http://racket-lang.org

--
http://rrt.sc3d.org
_______________________________________________
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc


