Brandon S. Allbery KF8NH wrote:
On 2008 May 13, at 17:01, Andrew Coppin wrote:
That definition of mean is wrong because it traverses the list twice.
(Curiosity: would traversing it twice in parallel work any better?)
As for the folds - I always *always* mix up foldl and foldr.
It might work better but
andrewcoppin:
Brandon S. Allbery KF8NH wrote:
On 2008 May 13, at 17:01, Andrew Coppin wrote:
That definition of mean is wrong because it traverses the list twice.
(Curiosity: would traversing it twice in parallel work any better?)
As for the folds - I always *always* mix up foldl and foldr.
Yes, using
Advanced technology ought to look like unpredictable magic.
My experience with lazy evaluation is such that every time a program is
slower or bulkier than I presumed, it is not arbitrariness, it is
something new to learn.
My experience with GHC is such that every surprise it gives me is a
On Mon, May 12, 2008 at 08:01:53PM +0100, Andrew Coppin wrote:
I offer up the following example:
mean xs = sum xs / fromIntegral (length xs)
Now try, say, mean [1.. 1e9], and watch GHC eat several GB of RAM. (!!)
I don't see why the performance implications of this program are surprising.
Just ask any
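Rough arithmetic for where that memory goes (my estimate, assuming 64-bit GHC and a fully materialized lazy [Double]): each element costs a 3-word cons cell plus a 2-word boxed Double, about 40 bytes in total, so 1e9 elements is on the order of 40 GB if the whole list is live at once. And the whole list does stay live: length xs still needs it after sum xs has walked it, so none of the cells can be garbage collected in between.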
gale:
Andrew Coppin wrote:
I offer up the following example:
mean xs = sum xs / fromIntegral (length xs)
Now try, say, mean [1.. 1e9], and watch GHC eat several GB of RAM. (!!)
If we now rearrange this to
mean = (\(s,n) -> s / n) . foldr (\x (s,n) -> let s' = s+x; n' = n+1 in
         s' `seq` n' `seq` (s', n')) (0,0)
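The seqs alone don't rescue this version: the step function pattern-matches on the pair (s,n), which makes it strict in foldr's recursive result, so the fold must recurse all the way to the end of the list before the first addition can happen. Schematically:

  foldr f z [x1, x2, ..., xn]  =  x1 `f` (x2 `f` (... (xn `f` z)))

and with f strict in its second argument the whole right-nested tower is pushed onto the stack, which is O(n) space however strict the accumulator is.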
I don't know why, but perhaps beginners expect too much from laziness, almost to the level of magic (me too, in the beginning!). In an
eager language, a function like
mean :: (Fractional a) => [a] -> a
expects the *whole* list before it can calculate the mean, and the question
of the
On 2008 May 12, at 22:18, Jeff Polakow wrote:
Then, I immediately blow my stack if I try something like:
mean [1..1000000].
The culprit is actually sum, which is defined in the base libraries as either a foldl or a direct recursion, depending on a compiler flag. In either case, the
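Either way, the problem with a lazy foldl here is the classic thunk chain; assuming the plain definition sum = foldl (+) 0 with no added strictness:

  sum [1..5] = foldl (+) 0 [1..5]
             = ((((0 + 1) + 2) + 3) + 4) + 5

None of the additions are performed until the final result is demanded, so the left-nested chain of (+) thunks grows linearly with the list, and forcing it at the end is what blows the stack.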
On Tue, May 13, 2008 at 2:20 AM, Don Stewart [EMAIL PROTECTED] wrote:
Note the use of strict pairs. Key to ensuring the accumulators end up in registers. The performance difference here is due to foldl' (and all left folds) not fusing in normal build/foldr fusion.
The vector version
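For readers wondering what "strict pairs" look like in source, here is a minimal sketch (the one-off accumulator type P and this particular mean are my illustration, not code from the thread):

  import Data.List (foldl')

  data P = P !Double !Double   -- strict fields: both accumulators are
                               -- forced every time a P is built

  mean :: [Double] -> Double
  mean xs = total / count
    where
      P total count = foldl' step (P 0 0) xs
      step (P s n) x = P (s + x) (n + 1)

foldl' forces each intermediate P to weak head normal form, and the strict fields then force the sums inside it, so no thunks accumulate. With optimisation on (and -funbox-strict-fields, or a GHC recent enough to unbox small strict fields by default) the two Doubles can indeed live in registers.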
Jeff Polakow wrote:
[...] This can be easily fixed by defining a suitable strict sum:
sum' = foldl' (+) 0
and now sum' has constant space. We could try to redefine mean using
sum':
mean1 xs = sum' xs / fromIntegral (length xs)
but this still gobbles up memory. The reason is that xs is shared by both sum' and length, so the entire list is retained until the second traversal finishes.
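The fix that several posts converge on is to compute both accumulators in one traversal. A sketch in the same style as the thread's code (mean2 is my name for it, assuming Double elements and an Int count):

  import Data.List (foldl')

  mean2 :: [Double] -> Double
  mean2 xs = s / fromIntegral n
    where
      (s, n) = foldl' step (0, 0 :: Int) xs
      step (a, c) x = let a' = a + x
                          c' = c + 1
                      in a' `seq` c' `seq` (a', c')

Because there is only one traversal, the front of the list can be collected as soon as step has consumed it, and the seqs keep both accumulators evaluated, so the whole thing runs in constant space.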
Darrin Thompson wrote:
These tricks going into Real World Haskell?
Some will, yes.
For example, the natural and naive way to write Andrew's mean function
doesn't involve tuples at all: simply tail recurse with two accumulator
parameters, and compute the mean at the end. GHC's strictness
Hello,
For example, the natural and naive way to write Andrew's mean function
doesn't involve tuples at all: simply tail recurse with two accumulator
parameters, and compute the mean at the end. GHC's strictness analyser
does the right thing with this, so there's no need for seq, $!, or the
On Tuesday 13 May 2008, Jeff Polakow wrote:
Is this the code you mean?
meanNat = go 0 0 where
    go s n [] = s / n
    go s n (x:xs) = go (s+x) (n+1) xs
If so, bang patterns are still required in ghc-6.8.2 to run in constant memory:
meanNat = go 0 0 where
    go !s !n [] = s / n
    go !s !n (x:xs) = go (s+x) (n+1) xs
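A minimal driver for trying these versions out (the main wrapper is mine, not from the thread); compile with -O2 and run the binary with +RTS -s to see total allocation and peak residency (newer GHCs may need -rtsopts at link time):

  main :: IO ()
  main = print (meanNat [1 .. 1e9 :: Double])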
Don Stewart wrote:
Andrew, would you say you understand the original problem of why
mean xs = sum xs / fromIntegral (length xs)
was a bad idea now? Or why the left folds were a better solution?
That definition of mean is wrong because it traverses the list twice.
(Curiosity: would traversing it twice in parallel work any better?)
On 2008 May 13, at 17:01, Andrew Coppin wrote:
That definition of mean is wrong because it traverses the list
twice. (Curiosity: would traversing it twice in parallel work any
better?) As for the folds - I always *always* mix up foldl and foldr.
It might work better but you're still wasting a core that
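For the "traverse it twice in parallel" curiosity, a sketch of what that could look like using par and pseq from the parallel package (my illustration, not code from the thread):

  import Control.Parallel (par, pseq)

  meanPar :: [Double] -> Double
  meanPar xs = s `par` (n `pseq` (s / n))
    where
      s = sum xs
      n = fromIntegral (length xs)

The space behaviour then depends on whether the two traversals keep pace with each other: only the cells between the two traversal fronts are retained. So at best it halves the time, while still burning the extra core Brandon mentions.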
As a beginner, I had found the behaviour quite unpredictable. But with time
I found that I could reason out the behaviour with my slowly growing
knowledge of laziness. I don't spot all the places in my program that will be slow while I am writing it, but post facto many things become clear. (And
Don Stewart wrote:
jeff.polakow:
Hello,
One frequent criticism of Haskell (and by extension GHC) is that it has
unpredictable performance and memory consumption. I personally do not find
this to be the case. I suspect that most programmer confusion is rooted in
shaky
On Mon, 2008-05-12 at 20:01 +0100, Andrew Coppin wrote:
In short, as a fairly new Haskell programmer, I find it completely
impossible to write code that doesn't crawl along at a snail's pace.
Even when I manage to make it faster, I usually have no clue why. (E.g.,
adding a seq to a
Andrew Coppin wrote:
I offer up the following example:
mean xs = sum xs / fromIntegral (length xs)
Now try, say, mean [1.. 1e9], and watch GHC eat several GB of RAM. (!!)
If we now rearrange this to
mean = (\(s,n) -> s / n) . foldr (\x (s,n) -> let s' = s+x; n' = n+1 in
         s' `seq` n' `seq` (s', n')) (0,0)
Hello,
One frequent criticism of Haskell (and by extension GHC) is that it has
unpredictable performance and memory consumption. I personally do not find
this to be the case. I suspect that most programmer confusion is rooted in
shaky knowledge of lazy evaluation; and I have been able to fix,
jeff.polakow:
Hello,
One frequent criticism of Haskell (and by extension GHC) is that it has
unpredictable performance and memory consumption. I personally do not find
this to be the case. I suspect that most programmer confusion is rooted in
shaky knowledge of lazy
On Fri, May 09, 2008 at 02:24:12PM -0700, Don Stewart wrote:
jeff.polakow:
Hello,
One frequent criticism of Haskell (and by extension GHC) is that it has
unpredictable performance and memory consumption. I personally do not find
this to be the case. I suspect that most programmer