There have been a number of letters recently on the subject of Haskell's
fitness for scientific computing, following the request of

Jan Skibinski <[EMAIL PROTECTED]>

>     Could Haskell ever be used for serious scientific computing?
>     What would have to be done to promote it from a modelling
>     tool to a more serious number crunching engine? Maybe not
>     necessarily competing with Cray, but not terribly lagging
>     far behind the other languages?
> ...


The letters mention Fortran, large matrices, and approximate solutions
of differential equations.
But the meaning of the words `scientific computing' in programming has
changed, perhaps 90%, since the 1960s-1970s.
Now it mainly means *symbolic* (not approximate) computation, the kind
that scientists and engineers usually do on paper.
And the programs for this are the computer algebra systems:
AXIOM, Maple, MuPAD, Reduce ...
They deal with symbolic differential operators, decompose algebraic
varieties, integrate functions symbolically or approximately, and so
on.
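As a toy illustration of this symbolic style (my own minimal sketch,
not taken from any of those systems): an expression type and the
derivative operator are a few lines in Haskell.

```haskell
-- A minimal expression type in one variable, and its symbolic derivative.
data Expr = Const Double      -- a numeric constant
          | Var               -- the variable x
          | Add Expr Expr     -- sum
          | Mul Expr Expr     -- product
  deriving Show

-- d/dx, by the sum and product rules; no simplification is attempted.
deriv :: Expr -> Expr
deriv (Const _) = Const 0
deriv Var       = Const 1
deriv (Add f g) = Add (deriv f) (deriv g)
deriv (Mul f g) = Add (Mul (deriv f) g) (Mul f (deriv g))

-- Evaluate an expression at a point, to check the result numerically.
eval :: Expr -> Double -> Double
eval (Const c) _ = c
eval Var       x = x
eval (Add f g) x = eval f x + eval g x
eval (Mul f g) x = eval f x * eval g x
```

For example, `deriv (Mul Var Var)` is the (unsimplified) derivative of
x^2, and evaluating it at 3 gives 6.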
The `number crunching engine' for tasks like inverting a large float
matrix constitutes maybe 5% of the scientific computing matter
(still, it is important).

So testing fitness for scientific computing will rather mean
programming, say, polynomial factorization in Haskell, or maybe the
logical resolution method, and comparing its performance with the
AXIOM, Maple, and MuPAD implementations.
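A factorizer is too long to show here, but the algebra such a benchmark
starts from fits in a few lines. A sketch, under the assumed (and
naive) representation of dense univariate polynomials over Q as
coefficient lists, lowest degree first:

```haskell
-- Dense univariate polynomials over Q: [a0, a1, a2, ...] means
-- a0 + a1*x + a2*x^2 + ...
type Poly = [Rational]

-- Coefficient-wise addition; the shorter list is padded implicitly.
addP :: Poly -> Poly -> Poly
addP xs []         = xs
addP [] ys         = ys
addP (x:xs) (y:ys) = (x + y) : addP xs ys

-- Multiplication by distributing the head coefficient and shifting
-- (prepending 0 multiplies by x).
mulP :: Poly -> Poly -> Poly
mulP [] _      = []
mulP (x:xs) ys = addP (map (x *) ys) (0 : mulP xs ys)
```

For example, `mulP [1,1] [1,1]` computes (1+x)^2 and yields [1,2,1].
Real computer algebra systems use far more refined representations, of
course; this only shows how naturally the domain maps onto Haskell.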


------------------
Sergey Mechveliani
[EMAIL PROTECTED]







