Douglas Roberts wrote:
LISP was nice, but the virtual machine and garbage collection made it
a non-player in the modern HPC computing arena.
I'd argue that the Lisp way is a reasonable fit for HPC.
Certainly virtual machines are not needed to implement Lisp-like
languages (some systems are bytecoded, but not all), and it is
*garbage* that is the problem, not garbage collection. Programmers
generate garbage by not paying attention to how things get allocated.
If you run malloc/free thousands of times in various-sized pieces,
you'll get fragmented heaps and slow performance too. Garbage
collectors simply allow for poor practices, but don't require them.
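To make that concrete, here's a minimal Common Lisp sketch (function
names and sizes are hypothetical) of the difference between code that
generates garbage and code that doesn't. The standard TIME macro
reports bytes consed, so the garbage is directly visible:

```lisp
;; Consing version: allocates a fresh list on every call.
;; The intermediate list is pure garbage once the caller is done.
(defun squares-consing (xs)
  (mapcar (lambda (x) (* x x)) xs))

;; Allocation-conscious version: writes into a preallocated buffer,
;; so steady-state calls cons nothing at all.
(defun squares-in-place (src dst)
  (declare (type (simple-array double-float (*)) src dst))
  (dotimes (i (length src) dst)
    (setf (aref dst i) (* (aref src i) (aref src i)))))

;; (time (squares-consing big-list))  ; reports bytes consed per call
;; (time (squares-in-place src dst))  ; reports little or no consing
```

Same algorithm, same collector; the only difference is whether the
programmer paid attention to allocation.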
It's not uncommon for millions of dollars to be spent in a relatively
short period on development of HPC codes. After all, the computers
themselves can cost hundreds of millions. Inefficient codes, or codes
poorly tuned to the target computer architectures, simply waste money.
Some of the people who work on these codes know the models, while
others must know the details of particular computer architectures. Any
black box, like a compiler, is a potential correctness or performance
mistake waiting to happen. (This is of course not to say that
compilers aren't crucial.)
Experienced Lisp programmers, often being anti-authoritarian types, tend
to hate black boxes and rigid abstractions. If a language feature like
C++ templates is needed, the Lisp programmer writes Lisp macros to
implement it, injecting type annotations as necessary, and tweaks the
macros interactively until the generated native code is just right.
Yes -- they read and consider both their macro output, and native code
the compiler generates from that. Imagine! This is in contrast to
Java, where the culture is more about insulating the programmer from
having to know how anything in particular works or plays out. And these
days, with Domain Specific Languages all the rage (in ABM and
elsewhere), it's funny that Lispers have been doing that for 40 years!
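A sketch of the macros-as-templates workflow, with hypothetical names
throughout: the macro stamps out a monomorphic, type-declared function
for each element type, much as a C++ template instantiation would, and
the standard MACROEXPAND-1 and DISASSEMBLE let you read both the macro
output and the native code the compiler makes of it:

```lisp
;; Hypothetical "template": generate a typed dot product for a given
;; element type, with declarations injected into the expansion.
(defmacro define-typed-dot (name element-type)
  `(defun ,name (a b n)
     (declare (type (simple-array ,element-type (*)) a b)
              (type fixnum n)
              (optimize (speed 3) (safety 0)))
     (let ((acc (coerce 0 ',element-type)))
       (declare (type ,element-type acc))
       (dotimes (i n acc)
         (incf acc (* (aref a i) (aref b i)))))))

;; "Instantiate" it for two element types:
(define-typed-dot dot/double double-float)
(define-typed-dot dot/single single-float)

;; Interactive inspection, exactly as described above:
;; (macroexpand-1 '(define-typed-dot dot/double double-float))
;; (disassemble #'dot/double)
```

The point is the loop: expand, read, disassemble, tweak the macro, and
repeat until the generated code is right.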
Finally, most HPC codes have a set of 'kernels' where all of the compute
time goes, and the rest of the code could run at about any speed and not
become a bottleneck. Languages like Common Lisp are fine for this,
because there exist native code compilers that *can* be directed to
generate fast code, but in general won't try very hard to make vague
code be fast.
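For example (a sketch; function names are hypothetical), the same
computation written vaguely and written with directions to the
compiler:

```lisp
;; Vague code: no declarations, so the compiler must emit generic
;; arithmetic that handles any number type. Fine outside the kernels.
(defun norm2-vague (v)
  (reduce #'+ (map 'vector #'* v v)))

;; Directed code: type and optimize declarations let a native-code
;; compiler open-code unboxed float arithmetic in the kernel.
(defun norm2-fast (v)
  (declare (type (simple-array double-float (*)) v)
           (optimize (speed 3)))
  (let ((acc 0d0))
    (declare (type double-float acc))
    (dotimes (i (length v) acc)
      (incf acc (* (aref v i) (aref v i))))))

;; (disassemble #'norm2-fast) shows the tight machine code; a compiler
;; like SBCL will also print efficiency notes at (speed 3) wherever it
;; could not make the vague version fast.
```

Both give the same answer; only the kernel needs the fast one.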
Unfortunately, Lisp culture is a poor fit for conservative HPC culture,
the latter being very focused on production capability and on fixed,
clearly-communicated requirements.
Marcus
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org