Abhay Parvate wrote:
I think I would like to make another note: when we talk about the complexity
of a function, we are talking about the time taken to completely evaluate
the result. Otherwise any expression in Haskell would be O(1), since it just
creates a thunk.

I don't like this notion of complexity, since it is ill-suited to the analysis of composite expressions in Haskell. For example,

  repeat 42

has infinite complexity according to your definition (it doesn't even terminate if completely evaluated), but

  take 5 $ repeat 42

has only constant complexity, even when fully evaluated. It is not clear how to reuse a finding about the complexity of (repeat 42) when determining the complexity of (take 5 $ repeat 42).
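
For concreteness, a quick check in GHCi (sketching the output I would expect):

  ghci> take 5 $ repeat 42
  [42,42,42,42,42]

  ghci> length (repeat 42)   -- diverges: length demands the entire spine

Laziness means (take 5) forces only the first five cons cells of the infinite list, so the composition is cheap even though its producer, fully evaluated, is not.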

Instead, my view of complexity in lazy languages treats the demand made by the rest of the program as a variable. For example, (repeat 42) needs O(n) steps to produce the first n elements of its output. Now, (take 5 x) restricts x to its first 5 elements, so (take 5 $ repeat 42) needs only O(min(n, 5)) = O(1) steps to produce the first n elements of its output.
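
One can watch this demand-bounded behaviour directly by instrumenting the producer. A small sketch (repeatTraced is my own illustrative name, not a library function):

  import Debug.Trace (trace)

  -- Like repeat, but announces every cons cell it produces, making the
  -- consumer's demand visible on stderr.
  repeatTraced :: Show a => a -> [a]
  repeatTraced x = go (1 :: Int)
    where
      go n = trace ("producing element " ++ show n) (x : go (n + 1))

  main :: IO ()
  main = print (take 5 (repeatTraced 42))

Running this prints exactly five "producing element" messages: (take 5) bounds the producer's work, matching the O(min(n, 5)) count above.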

Can this intuitive view be generalized to arbitrary datatypes (instead of just lists), and has it been formalized somewhere?
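
To make the question concrete, here is the kind of generalization I have in mind, sketched for an infinite binary tree (all names below are mine, chosen for the example):

  data Tree a = Leaf | Node a (Tree a) (Tree a)

  -- An infinite tree of copies of x; the tree analogue of repeat.
  always :: a -> Tree a
  always x = Node x (always x) (always x)

  -- Keep only the top d levels; the tree analogue of take.
  prune :: Int -> Tree a -> Tree a
  prune d _ | d <= 0   = Leaf
  prune _ Leaf         = Leaf
  prune d (Node x l r) = Node x (prune (d - 1) l) (prune (d - 1) r)

  -- Counts nodes, forcing the pruned tree completely.
  size :: Tree a -> Int
  size Leaf         = 0
  size (Node _ l r) = 1 + size l + size r

  main :: IO ()
  main = print (size (prune 3 (always 42)))   -- prints 7

Here (always x) needs O(n) steps to produce the first n demanded constructors, and (prune d) demands at most 2^d - 1 nodes, so the composition, fully evaluated, takes O(min(n, 2^d)) steps. The role played by "the first n elements" for lists is played by whatever notion of demanded prefix fits the datatype.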

  Tillmann