Just to add to what Zdenek wrote:

The linear complexity of string concatenation in a naïve implementation
(not having access to an extra-language "end-of-list" in the "diff list"
sense...) makes the total complexity O(n^2), since the number of conses
generated is thus

        sum [1 .. n]

which, obviously, is (1+n)*n/2. For the n=50000 of the example this
gives "sum [1 .. 50000]" => "1250025000", i.e. well over one billion
conses. This is why "++" is right associative :-)
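
To see the asymmetry concretely, here is a small sketch of my own (the
names slowConcat and fastConcat are illustrative, not from the original
example): building a 50000-character string with left-nested versus
right-nested "++".

        -- Left-nested (++) re-copies the whole accumulated prefix at
        -- every step; right-nested (++) only ever copies one-character
        -- pieces.
        slowConcat, fastConcat :: String
        slowConcat = foldl (++) "" (replicate 50000 "x")  -- about n^2/2 conses
        fastConcat = foldr (++) "" (replicate 50000 "x")  -- about n conses

With these definitions loaded, "length fastConcat" is instant, while
"length slowConcat" grinds through well over a billion conses (and,
being left-nested, may exhaust the stack in Hugs just like the example).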

This time complexity alone cannot make Hugs crash, though, unless the
GC is defective and has trouble tidying up after each round of "++".

The space complexity, which in the example (given a proper GC) reduces
to the maximum execution stack space, is what kills Hugs. The problem is
that the string concatenation is not the last call, so there is no room
for last call optimization.
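
As a toy illustration of the last-call point (my own functions, not part
of the example): in the first definition below the recursive call is the
last call and runs in constant stack space, while in the second the
pending "+" is the outermost call, so one stack entry per recursion
level stays alive until the bottom of the recursion is reached.

        countDown :: Int -> Int
        countDown 0 = 0
        countDown n = countDown (n - 1)   -- last call: constant stack

        sumDown :: Int -> Int
        sumDown 0 = 0
        sumDown n = n + sumDown (n - 1)   -- "+" is outermost: O(n) stack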

If you want to mimic the space behaviour of the example while actually
calculating the number of conses required, try evaluating the
"isomorphic" expression

        last $ scanl1 (+) [1 .. 50000]

It may well crash Hugs, running out of execution stack space, for the
same reason as the original example. It is the "last", by the way, that
causes the stack build-up: it skips past the intermediate sums without
evaluating them, so the final element must be forced as one deeply
nested chain of additions.
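
For contrast, the same sum can be computed with a strict, genuinely
tail-recursive loop (a sketch of my own; sumStrict is not a standard
function). Forcing the accumulator with "seq" at every step stops the
additions from piling up, so the stack stays flat:

        sumStrict :: Integer -> [Integer] -> Integer
        sumStrict acc []     = acc
        sumStrict acc (x:xs) = let acc' = acc + x
                               in acc' `seq` sumStrict acc' xs

        -- sumStrict 0 [1 .. 50000]  =>  1250025000, in constant stack space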

I hope this was helpful.

Regarding "do":

It is easy to get the feeling that the recursive call in

        recurse = do
                        f x
                        recurse

is the last call, but this is just an illusion created by the
iterative-looking layout of "do". Sometimes the monads lure us back into
old iterative thought patterns...

Taking away the syntactic sugar, we end up with

        recurse = f x >> recurse

This reveals that there can be no last call optimization here: the last
call is to ">>", and the recursive call to recurse is merely its second
argument.
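
A concrete instance, with "putStrLn" standing in for "f x" purely for
illustration:

        echoForever :: IO ()
        echoForever = do
                putStrLn "tick"
                echoForever

        -- desugars to
        echoForever' :: IO ()
        echoForever' = putStrLn "tick" >> echoForever'
        -- the outermost call on the right-hand side is ">>", not
        -- echoForever'.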

Regards,

David
