On Mon, 29 Jun 1998, S. Alexander Jacobson wrote:
> this:
> foldl version                                 foldl' version
> 
> = foldl (+) 0 [1..10000]              = foldl' (+) 0 [1..10000]
> = foldl (+) (0+1) [2..10000]          = foldl' (+) 1 [2..10000]
> = foldl (+) ((0+1)+2) [3..10000]      = foldl' (+) 3 [3..10000]
> 
> It should really evaluate like this:
> 
> = foldl (+) 0 [1..10000]
> = 0 + foldl (+) 1 [2..10000]
> = (0+1)+ foldl (+) 2 [3..10000]
> = ((0+1)+2) + foldl (+) 3 [4..10000]
> 
> If Haskell evaluated foldl in the second manner then foldl (+/) 0 [1..n]
> would be constant time rather than O(n) and could be applied to infinite
> sequences.

I don't think the claim is backed up by the reduction sequence, is it?
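
(For reference, the definitions being traced above are roughly these; just
a sketch from memory, and the seq-forcing foldl' is a library variant rather
than anything in the Haskell 98 Prelude:

  foldl :: (a -> b -> a) -> a -> [b] -> a
  foldl f z []     = z
  foldl f z (x:xs) = foldl f (f z x) xs    -- the (f z x) thunk is just passed on

  foldl' :: (a -> b -> a) -> a -> [b] -> a
  foldl' f z []     = z
  foldl' f z (x:xs) = let z' = f z x in z' `seq` foldl' f z' xs

  foldr :: (a -> b -> b) -> b -> [a] -> b
  foldr f z []     = z
  foldr f z (x:xs) = f x (foldr f z xs)    -- outermost application appears at once

Note that foldl never demands its accumulator; it threads the thunk through
to the end of the list, which is why the last element ends up in the
outermost application.)
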
The reason that foldl (+/) 0 [1..n] ==> 5 (say) is that

(foldl (+/) 0 [1..n-1]) +/ n == 5

not that, since +/ happens to be completely associative, we have

  0 +/ (foldl (+/) 1 [2..n]) == 5,

isn't it? (E.g. try

  let f s n = "("++s++show n++")" in foldl f "0" [1..5]

to see the pattern.) So in order to get constant time evaluation of _the
mathematical expression_

  (...((0 +/ 1) +/ 2) ... +/ n)

our first re-write rule would have to take
  foldl (+/) 0 [1..n]

to

  (foldl (+/) 0 [1..n-1]) +/ n,
 
which is a bit tricky since standard lists take O(n) time just to find the
last element. On the other hand, if we want to evaluate the mathematical
expression

  (0 +/ (1 +/ ...((n-1) +/ n)...))

we can use foldr (+/) n [0..n-1], which does take only O(1) steps to reach
its outermost application of +/, and so can cope with infinite lists.
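
For a concrete (and purely illustrative; the names are mine) example of
that: foldr hands back f x (foldr f z xs) after a single step, so an
operator that is lazy in its second argument can stop early, while the
corresponding foldl never returns, since its outermost application would
involve the last element of a list that has no last element:

  anyEven :: [Integer] -> Bool
  anyEven = foldr (\x rest -> even x || rest) False

  -- anyEven [1..]                ==> True, after looking at 1 and 2 only
  -- take 3 (foldr (:) [] [1..])  ==> [1,2,3]
  -- foldl (\acc x -> even x || acc) False [1..]   never returns
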
(Try it in hugs +s.) What makes this all look mysterious is that +/
happens to be totally associative, _which needn't be the case_, so the
human brain does lots of high school algebra on it without realising it.
Indeed the functions foldr and foldl would be pointless duplication if it
were guaranteed that every function was totally associative.
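
To convince yourself that they really are different animals, try them with
something non-associative (again just an illustration, the names are mine):

  leftBrackets  = foldl (\s n -> "(" ++ s ++ show n ++ ")") "0" [1..5]
                  -- "(((((01)2)3)4)5)"
  rightBrackets = foldr (\n s -> "(" ++ show n ++ s ++ ")") "0" [1..5]
                  -- "(1(2(3(4(50)))))"

  -- similarly  foldl (-) 0 [1,2,3] == -6   but   foldr (-) 0 [1,2,3] == 2
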

I hope this helps.


