Here is the second part of my answer. First, I want to tell you that
my feelings on the subject are exactly like yours, i.e., (a) I think that
it is possible to use Haskell to build packages like Matlab, Labview,
Maple, Mathematica, etc.; (b) there are clear advantages in doing this.
However, I want to add that I have no idea how to build a Haskell
compiler. Therefore, my opinion is worth little, even less than an
educated guess.

Let's go back to computer algebra. What Prof. Malaquias achieved was
basically a system that puts expressions into a canonical form. Since equal
expressions have the same canonical form, if I put expression E1 into
canonical form, and do the same to expression E2, I can compare them. I do
not know much about computer algebra, but Prof. Malaquias claims that this
is the most difficult part of any computer algebra system. Once you have
this, people can easily implement integration (he implemented an
integration system to show how easy it is, once expressions are in
canonical form), differential equations, etc.
Since Malaquias knows a lot about
computer algebra (he implemented the CICS system in Scheme), I deem that
he must know what he is saying. Now, Jerzy Karczmarczuk thinks that
Haskell is not good for computer algebra. He argues that it would be
difficult to scrutinize 'intermediate expressions' in Haskell. Prof.
Malaquias does not see any problem with this. He created a type
'Expression'. A value of type Expression is an expression in canonical
form, plus an environment that tells where the expression is being used.
In any case, I am trying to contact Prof. Malaquias about the possibility
of sending his monograph (~100 pages) to you (if you want to see it, of
course). There, he addresses a lot of issues and presents complete
algorithms.
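To make the canonical-form idea concrete, here is a toy sketch in Haskell. The Expr type, the flattening rules and all the names are my own invention, just to illustrate how comparing canonical forms decides equality; Prof. Malaquias's actual design is surely different and far more complete.

```haskell
-- Toy sketch of the canonical-form idea (not Malaquias's design):
-- nested sums and products are flattened and their operands sorted,
-- so syntactically different but equal expressions coincide.
import Data.List (sort)

data Expr = Var String
          | Const Integer
          | Add [Expr]   -- n-ary sum
          | Mul [Expr]   -- n-ary product
          deriving (Eq, Ord, Show)

-- Put an expression into canonical form: canonicalise the children,
-- flatten nested Add/Mul nodes, and sort the operands.
canon :: Expr -> Expr
canon (Add es) = Add (sort (concatMap unAdd (map canon es)))
  where unAdd (Add xs) = xs
        unAdd e        = [e]
canon (Mul es) = Mul (sort (concatMap unMul (map canon es)))
  where unMul (Mul xs) = xs
        unMul e        = [e]
canon e = e

-- Equal expressions have the same canonical form.
equalExpr :: Expr -> Expr -> Bool
equalExpr a b = canon a == canon b
```

For instance, `equalExpr (Add [Var "x", Var "y"]) (Add [Var "y", Var "x"])` is True, although the two trees differ syntactically.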

Now, you also tell me about Numerical Recipes in Haskell (or Clean, since
they are dialects of the same language :). This is quite a different
problem, IMHO, and a much easier one. You are not the only one whom Haskell
people have advised not to use Haskell for numerical problems. The same
happened to me. However, I am not sure whether this advice is sound,
because:

1- Thorsten Zoerner (a PhD student, I believe) implemented a linear algebra
system in Clean. Remember that Clean and Haskell are dialects of the same
language. His implementation is not highly optimized, and works only for
real numbers. However, it is quite complete, and fast enough. It is faster
than similar libraries in Oberon (I performed the test in POW, and sent the
result to the Oberon people), SmallEiffel under lcc, or unoptimized gcc.
It is about 50% slower than optimized gcc. Carl Ross added complex numbers
to Zoerner's libraries, and optimized the code (basically, he removed the
multiple loops). The modified library was 20% faster.
Zoerner's library has slices, like MatLab. Ross used it to implement the
Simplex method (an algorithm used in linear programming), and got very
good results.
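I do not have Zoerner's code at hand, so just to show what I mean by MatLab-like slices, here is a hedged sketch in Haskell over a matrix stored as a list of rows. It is my own toy code; the real library certainly uses a more efficient representation.

```haskell
-- Hypothetical illustration of MatLab-style slicing, A(i:j, k:l),
-- over a matrix stored as a list of rows (0-based, inclusive bounds).
-- Not Zoerner's actual Clean library.
type Matrix a = [[a]]

slice :: Int -> Int -> Int -> Int -> Matrix a -> Matrix a
slice i j k l m =
  [ take (l - k + 1) (drop k row)          -- keep columns k..l
  | row <- take (j - i + 1) (drop i m) ]   -- keep rows i..j
```

For example, `slice 1 2 0 1 [[1,2,3],[4,5,6],[7,8,9]]` yields `[[4,5],[7,8]]`, the lower-left 2x2 block.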

2- Richard O'Keefe received a Neural Network from a collaborator and
started to play around with it. The original Neural Network was written in
C, and took 400 ms to run a certain benchmark (O'Keefe does not specify
which one). A partially optimized Clean version (O'Keefe says in his
postings that he does not know how to do optimizations like strictness
declarations) could run the benchmark in about 300 ms. Then, O'Keefe
optimized the C program, adding inline code. The new C program could do
the job in about 100 ms. As you can see, the functional program is not
very far from C.
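For readers who wonder what 'strictness declarations' are: in a lazy language, a fold can pile up unevaluated thunks unless the accumulator is forced. A minimal illustration of my own (nothing to do with O'Keefe's actual network):

```haskell
-- sumLazy builds a chain of thunks ((0 + x1) + x2) + ... which is
-- only forced at the very end; on long lists this costs space.
sumLazy :: [Double] -> Double
sumLazy = go 0
  where go acc []     = acc
        go acc (x:xs) = go (acc + x) xs

-- sumStrict forces the accumulator at every step with seq -- the
-- kind of optimization O'Keefe says he did not know how to add.
sumStrict :: [Double] -> Double
sumStrict = go 0
  where go acc []     = acc
        go acc (x:xs) = let acc' = acc + x
                        in acc' `seq` go acc' xs
```

Both compute the same sum; only the space behaviour differs. Clean expresses the same thing with strictness annotations on the arguments.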

3- People at Edinburgh University develop mechanical arms for amputees.
One of these arms was developed by Soares, Alcimar B., Brash, Harry M.,
and Gow, David. The arm is very light, and uses shape memory alloys that
imitate animal muscles. To control the arm, Alcimar collects
electromyographic signals from the shoulder of the amputee. Dr.
Paschoarelli discovered an algorithm that retrieves a set of parameters
from the signals. The parameters are fed to a neural network, which
recognizes the movement that the amputee wishes to perform. Until Dr.
Paschoarelli's algorithm, the signal was fed directly to the Neural
Network, with a failure rate of 25%. Dr. Paschoarelli makes a convolution
of the signal with a Wiener estimator (a finite impulse response filter).
This produces a modified LMS estimator. He keeps all the convolution
results in a list of vectors. The list has as many elements as the signal
has samples, each element being a vector. Then he applies statistical
methods to choose (from this huge list) the best representation of the
signal. Believe me: it is a lot of calculations, involving matrices,
vectors, and lists. All these calculations must be done in real time (the
amputees feel discomfort if the system takes more than 0.5 seconds to
respond). However, the failure rate dropped to less than 2%! It is the
best result in the literature. Now the surprise: the prototype was written
in Hugs, and the final system is in Clean. Prof. Paschoarelli and Prof.
Alcimar will submit the results to the Journal of Biomedical Engineering,
and to the Journal of Functional Programming in a week or two. If the
papers are accepted, you can see how one can use functional programming
to solve a number-crunching problem.
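Just to fix ideas about the convolution step, here is a toy sliding-dot-product in Haskell. The filter is a parameter here, not Dr. Paschoarelli's actual Wiener estimator, and the real system surely uses arrays rather than lists.

```haskell
-- Toy sketch: slide a finite-impulse-response filter h over a signal
-- x, padding the tail with zeros so the output has the same length
-- as x.  (Strictly this is cross-correlation; true convolution would
-- reverse h first.)  Not Dr. Paschoarelli's actual code.
convolve :: [Double] -> [Double] -> [Double]
convolve h x =
  [ sum (zipWith (*) h (drop n padded)) | n <- [0 .. length x - 1] ]
  where padded = x ++ replicate (length h - 1) 0
```

For example, `convolve [1,1] [1,2,3]` gives `[3,5,3]`: each output sample is the dot product of the filter with a window of the signal.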

P. S. As soon as I contact Prof. Malaquias, I will ask him to send you
a copy of his monograph. If you don't want it, please let me know.
