On 9/10/06, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
> The GHC documentation says that (evaluate a) is not the same as (a
> `seq` return a).  Can someone explain the difference to me, or point
> me to a place where it is explained?

(evaluate a) is "weaker" than (a `seq` return a). When a is _|_,
(a `seq` return a) is itself _|_ as a value, whereas (evaluate a) is not
_|_: it is a perfectly good IO action that throws an exception when
executed.

Okay...  I think maybe I understand now.  To check my understanding,
is the following a correct description?

* (return a) = produces an IO action which, when executed, returns a
 in the IO monad without evaluating it.  Thus, if a is undefined, it
 doesn't throw an exception unless and until its value is actually
 used.

* (a `seq` return a) = forces a (to weak head normal form) as soon as
 the expression itself is evaluated, then produces an IO action which,
 when executed, returns a.  Thus, if a is undefined, the whole
 expression is _|_ right away.

* (evaluate a) = produces an IO action which, when *executed*, forces
 a (to weak head normal form) and returns the result in the IO monad.
 Thus, if a is undefined, it throws an exception if and when the IO
 action in question is executed.
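If that reading is right, it can be checked directly. Here is a minimal
self-contained sketch (my own test harness, not from the thread),
assuming a = undefined :: () and a hypothetical helper `throws` that
runs an action under Control.Exception.try and reports whether it threw:

```haskell
import Control.Exception (SomeException, evaluate, try)

a :: ()
a = undefined

-- Run an action under 'try' and report whether it raised an exception.
throws :: IO () -> IO Bool
throws act = do
  r <- try act
  case (r :: Either SomeException ()) of
    Left _  -> return True
    Right _ -> return False

main :: IO ()
main = do
  b0 <- throws (return a)          -- a is never forced
  b1 <- throws (a `seq` return a)  -- the expression is _|_; raised when run
  b2 <- throws (evaluate a)        -- real action; throws when executed
  print (b0, b1, b2)               -- prints (False,True,True)
```

The seq version is caught here too, because try forces the action thunk
while running it, so the _|_ surfaces inside try's scope.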

If this is correct, then the way I would explain your examples is
as follows (assuming a = undefined).  The "1" version is _|_ outright:
its exception is raised as soon as the expression e 1 is forced, before
any IO action runs.

a = undefined

e 0 = return a
e 1 = a `seq` return a
e 2 = evaluate a    -- evaluate comes from Control.Exception

t x = e x >> return ()
-- t 0 == return ()
-- t 1 == throwIO something
-- t 2 == throwIO something

Here the IO action is executed, so 2 throws an exception; but its
value is never used, so 0 doesn't.
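This behaviour of t is easy to check. A self-contained sketch (the
`throws` helper is my own scaffolding around the definitions above,
with a = undefined :: ()):

```haskell
import Control.Exception (SomeException, evaluate, try)

a :: ()
a = undefined

e :: Int -> IO ()
e 0 = return a
e 1 = a `seq` return a
e 2 = evaluate a
e _ = error "no such case"

t :: Int -> IO ()
t x = e x >> return ()

-- Report whether running the action raises a catchable exception.
throws :: IO () -> IO Bool
throws act = either (const True) (const False)
         <$> (try act :: IO (Either SomeException ()))

main :: IO ()
main = do
  bs <- mapM (throws . t) [0, 1, 2]
  print bs  -- prints [False,True,True]: only t 0 runs cleanly
```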

u x = e x `seq` return ()
-- u 0 == return ()
-- u 1 == undefined
-- u 2 == return ()

Here the IO action is never executed: seq merely forces it to weak
head normal form and discards it in favour of (return ()).  So neither
0 nor 2 throws an exception, while 1 is _|_ because forcing e 1 itself
diverges.
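A self-contained sketch of the u cases (the `throws` helper is my own
scaffolding, with a = undefined :: ()):

```haskell
import Control.Exception (SomeException, evaluate, try)

a :: ()
a = undefined

e :: Int -> IO ()
e 0 = return a
e 1 = a `seq` return a
e 2 = evaluate a
e _ = error "no such case"

u :: Int -> IO ()
u x = e x `seq` return ()

-- Report whether running the action raises a catchable exception.
throws :: IO () -> IO Bool
throws act = either (const True) (const False)
         <$> (try act :: IO (Either SomeException ()))

main :: IO ()
main = do
  bs <- mapM (throws . u) [0, 1, 2]
  print bs  -- prints [False,True,False]: only u 1 is _|_
```

Forcing (evaluate a) to WHNF in u 2 just builds the action without
running it, so nothing is thrown there.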

v x = catch (e x) (\_ -> return ()) >>= print
-- v 0 == throwIO something
-- v 1 == print ()
-- v 2 == print ()

Here, again, the IO action is executed, so 2 throws an exception at
that point, which gets caught and so the result is replaced with ().
But 0 executes with no problem and returns a, lazily evaluated, which
thereby slips out from under the `catch' to throw an error when its
value is actually used, later, by `print'.
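A self-contained sketch of the v cases (my own `throws` helper; note
that with Control.Exception.catch the handler needs an explicit
SomeException type, and a = undefined :: ()):

```haskell
import Control.Exception (SomeException, catch, evaluate, try)

a :: ()
a = undefined

e :: Int -> IO ()
e 0 = return a
e 1 = a `seq` return a
e 2 = evaluate a
e _ = error "no such case"

-- Swallow any exception raised while the wrapped action runs.
handler :: SomeException -> IO ()
handler _ = return ()

v :: Int -> IO ()
v x = catch (e x) handler >>= print

-- Report whether running the action raises a catchable exception.
throws :: IO () -> IO Bool
throws act = either (const True) (const False)
         <$> (try act :: IO (Either SomeException ()))

main :: IO ()
main = do
  bs <- mapM (throws . v) [0, 1, 2]
  print bs  -- prints [True,False,False]: only v 0 escapes the catch
```

Cases 1 and 2 each print () on the way through, because the handler
substitutes a well-defined () for the failed action before print runs.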

Is that all correct?

Thanks for your help!
Mike
_______________________________________________
Haskell-Cafe mailing list
[email protected]
http://www.haskell.org/mailman/listinfo/haskell-cafe
