Dear Haskellers:
        
    Could Haskell ever be used for serious scientific computing?
    What would have to be done to promote it from a modelling
    tool to a more serious number-crunching engine? It would not
    necessarily have to compete with Cray, but it should not lag
    terribly far behind the other languages.
        
    I hope some of you have answers, ideas, or experiences
    that could shed some light here. Below are some rather
    unorganized thoughts and more detailed questions relating
    to this topic.
    
 1. We tend to kid ourselves, hiding behind "Pseudoknot benchmarks"
    or quoting how fast Sisal is. In fact, Haskell does not rank
    highly in the benchmarks, and Sisal is not the language of
    our choice, for many reasons.
          
     The announcement of FISh [FISH] claims that FISh is about
     two orders of magnitude faster than Haskell, two to four
     times faster than OCaml, and even faster than C, thanks to
     its code optimization. Two orders of magnitude faster than
     Haskell?
          
 2. It appears to me that over the years several people have
    tried using Haskell for scientific purposes, become
    disillusioned somehow, and either tried some improvements
    to Haskell, built something different, or abandoned the
    idea altogether. I have seen quite ambitious projects, such
    as "Functional Programming for Finite Element Analysis"
    in Miranda [LIU], but the authors' conclusions were quite
    discouraging.
           
     What happened to FSC (Functional Scientific Computing)
     by Chris Angus [ANGUS], which was supposed to address some
     inefficiencies of Haskell? Is it still alive? I do not know
     how closely FSC is supposed to relate to Haskell. Is it a
     sort of Haskell extension, or a completely new language
     that merely uses Haskell as a development tool?
    
     Do you agree with Chris's conclusions in his presentation
     at SciTools'96, mainly that strictness is needed for speed
     [ANGUS1]? (A small strict-versus-lazy sketch follows at the
     end of this point.)
          
     Two years have passed since SciTools'96. This is quite a
     long time -- considering how fast new ideas evolve nowadays.
     Are there any new developments, any new conclusions, any
     new directions that would be worthwhile to adopt for
     Haskell?
          
     Why the need for SAC (Single Assignment C), aside from the
     promotional point that it is still a kind of C? Is there
     something fundamentally inefficient in Haskell that forces
     some people to choose inelegant practicality over Haskell's
     elegance?
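
     To make the strictness point concrete, here is a minimal
     sketch (my own illustration, not from [ANGUS1]): the lazy
     accumulator defers every addition, while forcing each
     partial sum with seq keeps the loop in constant space.

        -- A lazy accumulator builds the suspended expression
        -- ((0 + x1) + x2) + ... that is collapsed only at the
        -- very end; on a long list this can exhaust memory.
        sumLazy :: [Double] -> Double
        sumLazy = go 0
          where
            go acc []     = acc
            go acc (x:xs) = go (acc + x) xs

        -- Forcing the accumulator with seq evaluates each
        -- partial sum immediately, in constant space.
        sumStrict :: [Double] -> Double
        sumStrict = go 0
          where
            go acc []     = acc
            go acc (x:xs) = let acc' = acc + x
                            in  acc' `seq` go acc' xs

     (A good strictness analyser can sometimes rescue sumLazy on
     its own, which is part of what makes the question interesting.)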
          
 3. Some researchers conclude that Haskell is not gaining
    popularity for scientific applications simply because of
    the legacy of Fortran algorithms, which mostly rely on
    arrays, indexing and destructive updating. Yet some
    algorithms, such as LU decomposition, can be formulated
    without destructive updating, thus circumventing one of
    the most time-consuming obstacles. (A small sketch of such
    a formulation follows at the end of this point.)
          
     Some authors [ELLMEN] suggest using indexed lists for
     certain algorithms, instead of arrays. These are supposed
     to speed up some algorithms somehow. Are any such modules
     and/or benchmarks available?
          
     Some people talk about using unboxed arrays to speed up
     computations. Are there any pointers to their usage in
     Haskell? (A small sketch follows at the end of this point.)
          
     Does anyone working on the theory of parallel computation
     have any experience with speed improvements for classical
     Haskell algorithms? The polytope model, perhaps?
          
     Would the Psi-calculus be of any help in cranking up some
     Haskell computations on arrays? Or is it just something
     that helps organize indexing and provides the programmer
     with a set of primitives that avoid error-prone indexing
     schemes?
         
     Is the idea of shapes worth investigating in this context,
     or is it just a concept introduced for other reasons?
          
     Have David S. Wise's quadtrees ([WISE] and [WISE1]) proved
     to be of any importance to scientific computing in Haskell?
     Among other things, the quadtree algorithms are supposed to
     improve array-updating schemes. Judging from the publication
     dates (1992, and 1995 with a paper version of the latter
     scheduled for 1998), this idea is still very much alive.
     Has anyone benefited from it yet? (A small sketch of the
     representation follows below.)
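
     Regarding LU decomposition without destructive updates,
     here is a minimal sketch (my own toy formulation on plain
     lists, without pivoting, so it is an illustration rather
     than a serious kernel): each step peels off the first row
     and column and recurses on a freshly built Schur complement
     instead of updating the matrix in place.

        type Matrix = [[Double]]

        -- lu a = (l, u) with a == l * u, where l is unit lower
        -- triangular and u is upper triangular.  No pivoting,
        -- square input assumed.
        lu :: Matrix -> (Matrix, Matrix)
        lu []             = ([], [])
        lu ((p:w) : rows) = (lRows, uRows)
          where
            ms       = [ head r / p | r <- rows ]     -- multipliers v/p
            schur    = [ zipWith (\a b -> a - m*b) (tail r) w
                       | (m, r) <- zip ms rows ]      -- A' - (v/p)w^T
            (l', u') = lu schur                       -- recurse, no updates
            lRows    = (1 : map (const 0) w) : zipWith (:) ms l'
            uRows    = (p : w)               : map (0 :) u'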
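
     On unboxed arrays, a minimal GHC-oriented sketch (assuming
     GHC's Data.Array.Unboxed library): a UArray Int Double
     stores its elements flat, with no heap-allocated box and no
     thunk per element, so indexing is a plain memory read.

        import Data.Array.Unboxed (UArray, listArray, bounds, (!))

        -- Dot product over unboxed arrays: v ! i returns a raw
        -- Double directly, with no pointer chasing.
        dotU :: UArray Int Double -> UArray Int Double -> Double
        dotU v w = sum [ v ! i * w ! i | i <- [lo .. hi] ]
          where (lo, hi) = bounds v

        main :: IO ()
        main = print (dotU (listArray (0, 2) [1, 2, 3])
                           (listArray (0, 2) [4, 5, 6]))  -- 32.0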
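
     And on the quadtrees of [WISE], a minimal sketch of the
     representation as I understand it (the simplifications are
     mine): a 2^k x 2^k matrix is a zero block, a 1 x 1 leaf, or
     four quadrants. An update then rebuilds only the O(log n)
     nodes on one root-to-leaf path and shares every other
     quadrant with the old matrix, which is exactly the improved
     array-updating scheme mentioned above.

        data QT a = Zero                             -- all-zero block, any size
                  | Leaf a                           -- 1 x 1 block
                  | Quad (QT a) (QT a) (QT a) (QT a) -- NW, NE, SW, SE

        -- Blockwise addition: whole zero quadrants are shared
        -- untouched instead of being traversed element by element.
        addQ :: Num a => QT a -> QT a -> QT a
        addQ Zero     m        = m
        addQ m        Zero     = m
        addQ (Leaf x) (Leaf y) = Leaf (x + y)
        addQ (Quad a b c d) (Quad e f g h) =
            Quad (addQ a e) (addQ b f) (addQ c g) (addQ d h)
        addQ _ _ = error "addQ: mismatched block sizes"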
        

    And please, do not argue that Haskell is only a general
    purpose language. After all, most of the arguments I found
    on the Internet for using Haskell at all were of the kind:
    "Look Ma, how nicely it resembles the scientific notation!"
          
    Jan   
           

    References:

[LIU]    - Junxian Liu, Paul Kelly, Stuart Cox. Functional
           Programming for Finite Element Analysis, Department of
           Computing, Imperial College.                   
[FISH]   - http://www-staff.socs.uts.EDU.AU:8080/~cbj/FISh/
[ANGUS]  - http://www.ncl.ac.uk/~n4521971/work.html
[ANGUS1] - http://www.oslo.sintef.no/SciTools96/noframes/Contrib/angus/p2.html
[ELLMEN] - Nils Ellmenreich and Christian Lengauer. On Indexed Data
           Structures and Functional Matrix Algorithms, Faculty of
           Mathematics and Informatics, University of Passau
[WISE]   - David S. Wise. Matrix Algorithms using Quadtrees, Invited Talk,
           ATABLE-92, Technical Report 357, Computer Science Department,
           Indiana University, June 1992.
[WISE1]  - David S. Wise. Undulant-Block Elimination and
           Integer-Preserving Matrix Inversion, Technical Report 418,
           Computer Science Department, Indiana University, August 1995.
                              


