http://haskell.org/haskellwiki/GHC:FAQ (search for cse)
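For context, that FAQ entry says GHC performs only limited common subexpression elimination, because introducing sharing can change a program's space behaviour. A minimal sketch of the kind of case in question (the function name mean is an illustration of mine, not taken from the FAQ):

  -- If GHC commoned up the two occurrences of [1 .. n], the shared list
  -- would be kept alive between the call to sum and the call to length,
  -- so a constant-space loop would instead retain the whole list.
  mean :: Int -> Double
  mean n = fromIntegral (sum [1 .. n]) / fromIntegral (length [1 .. n])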
| -----Original Message-----
| From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Bjorn Lisper
| Sent: 23 February 2006 10:28
| To: [EMAIL PROTECTED]
| Cc: Haskell@haskell.org
| Subject: Re: [Haskell] Compilation of big, computed tables
|
| Chris Kuklewicz:
| >Stefan Karrmann wrote:
| >> Dear all,
| >>
| >> can GHC compile huge tables into efficient code if they are constant at
| >> compile time?
| >>
| >> Two examples may clarify the question:
| >>
| >> big1 :: UArray Int Char
| >> big1 = array (0,1000) $! map (\i -> (i,toEnum i)) [0..1000]
| >>
| >> big2 = sum [0..10000] :: Int -- == 50005000 == n*(n+1)/2 where n = 10000
| >>
| >
| >GHC does not compute such values at compile time because
| >*) the computation may not finish in reasonable time (infinite/not halting)
| >*) the computation may generate a run-time error and crash the compilation
| >(divide by zero, pattern match failure, etc.)
| >*) Haskell is supposed to be lazy; things are computed when needed, not before
|
| Does this mean that GHC does not evaluate constant subexpressions at
| compile time? Or does it evaluate only some subclass of surely terminating
| and non-erroneous subexpressions at compile time?
|
| Constant subexpression evaluation is a common compiler optimization
| technique, so I would be a bit surprised if GHC does not do it.
|
| Björn Lisper
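For anyone trying Stefan's examples: they compile as given once the array import is added. As an editorial aside not taken from the thread, the usual way to force a value such as big2 to be evaluated during compilation is a Template Haskell splice; a minimal sketch, assuming GHC with the TemplateHaskell extension (the module name BigTables and the name big2' are mine):

  {-# LANGUAGE TemplateHaskell #-}
  module BigTables where

  import Data.Array.Unboxed (UArray, array)
  import Language.Haskell.TH.Syntax (lift)

  -- Stefan's examples, with the import they need.
  big1 :: UArray Int Char
  big1 = array (0,1000) $! map (\i -> (i,toEnum i)) [0..1000]

  big2 = sum [0..10000] :: Int   -- computed at run time (once, as a CAF)

  -- Compile-time evaluation via a splice: the sum is computed while the
  -- module is being compiled and the resulting literal is placed in the code.
  big2' :: Int
  big2' = $(lift (sum [0..10000 :: Int]))

Note that big1 cannot be spliced the same way, since UArray has no Lift instance; for a large array the table would have to be rebuilt at run time from a spliced list of elements.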