Hi Chide,

For data-intensive programming it is worthwhile to design your data structures carefully; the right choice depends on your application, of course. Clean offers unique arrays, which might be well suited here. Uniqueness imposes restrictions on the way you write your algorithms, but the benefit is that all updates are done in place, i.e. without creating garbage. When you use an unboxed array the elements become strict, and access and storage are very efficient.
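To illustrate the in-place updates that uniqueness makes possible, here is a minimal sketch (the function name `setElem` is mine, not from a library). Because the array argument has the uniqueness attribute `*`, the compiler knows there is only one reference to it, so the update can safely overwrite the element instead of copying the array:

module inplace

import StdEnv

// Destructively set element i of a unique unboxed Int array.
setElem :: *{#Int} !Int !Int -> *{#Int}
setElem a i x = {a & [i] = x}

Start :: {#Int}
Start = setElem {0, 0, 0} 1 42   // {0,42,0}, without copying the array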

Below is an example of a program that creates a matrix of the suggested size and applies a function to all its elements. My old laptop executes this in 0.25 s.

Have fun,

Pieter

module matrixtest

import StdEnv

// a Matrix is a unique unboxed array of unboxed Int arrays
:: *Matrix :== {#{#Int}}

// matrix n m: m rows of n elements; row i contains [i..i+n-1]
matrix :: !Int !Int -> Matrix
matrix n m = {{j \\ j <- [i..i+n-1]} \\ i <- [0..m-1]}

// apply f to every element of the matrix
mapMatrix :: (Int -> Int) !Matrix -> Matrix
mapMatrix f matrix = {{f e \\ e <-: a} \\ a <-: matrix}

test = (mapMatrix ((+) 7) myMatrix).[m-1].[n-1]
where
    myMatrix = matrix n m
    n = 30; m = 100000

Start = test

On 15/12/2011 10:57 AM, Groenouwe, C. wrote:
I'm trying to find out:
- How, in general, functional programming languages perform on data-intensive tasks (manipulating large datasets, e.g. doing some statistical analysis on a table with 100,000 instances and 30 columns), with regard to speed and memory usage
- Which functional language performs best?

A quick glance at the following benchmark gave me the impression that Clean and Caml perform best with regard to memory consumption:

http://shootout.alioth.debian.org/

Is that true?

Additional question: which functional language best exploits (hardware) parallelism when running on a multi-core CPU (or multiple CPUs)?

Thanks in advance,

Chide

_______________________________________________
clean-list mailing list
[email protected]
http://mailman.science.ru.nl/mailman/listinfo/clean-list