>     Could Haskell ever be used for serious scientific computing?
>     What would have to be done to promote it from a modelling
>     tool to a more serious number-crunching engine?  Maybe not
>     necessarily competing with Cray, but not lagging terribly
>     far behind the other languages?

Short answer: I think so, but it's a lot of work.

Going head to head with FORTRAN is very challenging.  FORTRAN
compilers deal with a first-order, strict language with explicit
storage management -- and they have a two-decade lead.

First, you absolutely have to store arrays of unboxed values;
having arrays of pointers to thunks is a real killer.  The Clean
people have done a lot of good work on this, and with a bit of
multi-parameter type class magic I think one can use unboxed arrays
without polluting your program too much.
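
For concreteness, here is a sketch of that style, using GHC's
Data.Array.Unboxed interface -- whose IArray class is exactly that
sort of multi-parameter magic:

    import Data.Array.Unboxed

    -- A flat array of unboxed Doubles: elements stored directly,
    -- no pointers, no thunks.
    xs :: UArray Int Double
    xs = listArray (0, 9) [0.5 * fromIntegral i | i <- [0 .. 9 :: Int]]

    -- Indexing yields a plain Double, and the overloading keeps
    -- the source code looking quite ordinary.
    total :: Double
    total = sum [xs ! i | i <- indices xs]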

I just do not know how important update in place is.  With programs
that manipulate a few very large arrays, the detailed storage
management of those arrays becomes rather important.  Sisal did an
incredibly good job of this.  Laziness makes it harder.  Clean lets
the programmer get update in place via uniqueness types.
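
In GHC one can at least ask for update in place explicitly, using
the ST monad and a mutable unboxed array instead of uniqueness
types; a sketch:

    import Control.Monad (forM_)
    import Data.Array.ST (newListArray, readArray, writeArray,
                          runSTUArray)
    import Data.Array.Unboxed (UArray)

    -- Square every element of an array, updating each slot in
    -- place; no intermediate array is allocated.
    squared :: UArray Int Double
    squared = runSTUArray $ do
      arr <- newListArray (0, 999) (map fromIntegral [0 .. 999 :: Int])
      forM_ [0 .. 999] $ \i -> do
        x <- readArray arr i
        writeArray arr i (x * x)
      return arr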

Declarative languages *ought* to give a big handle on optimisation.
FORTRAN compilers spend a lot of time deriving a functional program
from the imperative one they started with, but they have to make
conservative approximations.  So in principle we might do better.
I know of four encouraging examples:
        Sisal
        NESL
        SAC
        FISh

SAC is the least well known, but they have now successfully
implemented 'with-loop folding'.  This amounts to eliminating
intermediate arrays in the same sort of way as we eliminate
intermediate lists.  It's cool.  And they beat FORTRAN.  (See
their IFL'98 paper.)
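
The list version of the same trick, for comparison -- a sketch of
what short-cut deforestation does in GHC:

    -- Two loops and an intermediate list, as written...
    sumOfSquares :: [Double] -> Double
    sumOfSquares xs = sum (map (\x -> x * x) xs)

    -- ...and, roughly, the single loop that fusion rewrites it
    -- to.  With-loop folding does the same for SAC's arrays.
    sumOfSquares' :: [Double] -> Double
    sumOfSquares' = go 0
      where
        go acc []       = acc
        go acc (x : xs) = go (acc + x * x) xs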

All of these languages are restricted in some way.  All also have
extensions aimed specifically at array computations.  Haskell,
Clean, and ML are less specialised, and are therefore harder to
compile as well.


Another approach is to compete not head-to-head on speed, but on
cunning.  Get a good library of numeric procedures (e.g. Matlab's),
interface it to Haskell, and use Haskell as the glue code to make
it really fast to write complex numerical algorithms.  99% of the
time will still be spent in the library, so the speed of the
Haskell implementation is not very important.  This looks like a
jolly productive line to me.
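
Haskell's foreign-function interface makes such bindings short.  A
sketch, binding the BLAS dot-product routine ddot (assuming a
Fortran BLAS is linked in):

    {-# LANGUAGE ForeignFunctionInterface #-}
    import Foreign.C.Types (CDouble, CInt)
    import Foreign.Marshal.Array (withArray)
    import Foreign.Marshal.Utils (with)
    import Foreign.Ptr (Ptr)

    -- Fortran passes every argument by reference, hence the Ptrs.
    foreign import ccall unsafe "ddot_"
      c_ddot :: Ptr CInt -> Ptr CDouble -> Ptr CInt
             -> Ptr CDouble -> Ptr CInt -> IO CDouble

    -- The Haskell glue: marshal two lists, then let BLAS crunch.
    dot :: [Double] -> [Double] -> IO Double
    dot xs ys =
      withArray (map realToFrac xs) $ \px ->
      withArray (map realToFrac ys) $ \py ->
      with (fromIntegral (length xs)) $ \pn ->
      with 1 $ \pinc ->
        fmap realToFrac (c_ddot pn px pinc py pinc)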

So in principle, yes.  But it takes a big investment, and there is
a long hill to climb before people start to take notice of you.
Still, to change the metaphor, I think there's some fertile
territory there.

Incidentally, if anyone wants to work with us to make GHC do a
better job of scientific computation (and it's currently nowhere
near good), I'd be glad to work with them.

Simon

