I have a simple program that first generates a large (~500 MB) file of random
numbers and then reads the numbers back to find their sum.
It uses Data.Binary and Data.ByteString.Lazy.
The problem is that when the program tries to read the data back, it very
quickly consumes all available memory.
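The pattern described might look roughly like the sketch below. This is not the poster's actual code: `Word64` values stand in for the random numbers, and `nums.bin` is a hypothetical file name, chosen only to keep the example self-contained.

```haskell
-- Sketch of the described pattern: encode a list of numbers to a file
-- with Data.Binary, then read it back and sum it.
import qualified Data.ByteString.Lazy as BL
import Data.Binary (encode, decode)
import Data.Word (Word64)

main :: IO ()
main = do
  let xs = [1 .. 1000 :: Word64]        -- stand-in for the random numbers
  BL.writeFile "nums.bin" (encode xs)   -- serialize the whole list
  bs <- BL.readFile "nums.bin"          -- lazy read
  let ys = decode bs :: [Word64]        -- deserialize it again
  print (sum ys)                        -- prints 500500
```

With a ~500 MB file, whether this runs in constant space depends entirely on whether the decoding step can consume the lazy ByteString incrementally.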
Excerpts from Grigory Sarnitskiy's message of Mon Sep 14 16:05:41 +0200 2009:
> I have a simple program that first generates a large (~500 MB) file of
> random numbers and then reads the numbers back to find their sum.
> It uses Data.Binary and Data.ByteString.Lazy.
I do think that this is due to
sargrigory:
> I have a simple program that first generates a large (~500 MB) file
> of random numbers and then reads the numbers back to find their sum.
> It uses Data.Binary and Data.ByteString.Lazy.
> The problem is when the program tries to read the data back it quickly
> (really quickly)
I have tweaked this program in a few ways for you.
The big mistake (and the reason it runs out of space) is that you call
ByteString.Lazy.length to compute the block size. length must walk the whole
string, so this forces the entire file into memory -- no benefits of lazy IO.
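One way to avoid that is to never ask for the total length at all, and instead walk the lazy ByteString with a strict accumulator, decoding one fixed-width value at a time. The sketch below is mine, not the tweaked program from this thread; it assumes the file is a flat sequence of big-endian `Word64`s, and `sumWords` and `nums.bin` are made-up names.

```haskell
{-# LANGUAGE BangPatterns #-}
-- Sum a file of big-endian Word64s in constant space: consume the lazy
-- ByteString 8 bytes at a time instead of forcing it with BL.length.
import qualified Data.ByteString.Lazy as BL
import Data.Binary.Get (runGet, getWord64be)
import Data.Binary.Put (runPut, putWord64be)
import Data.Word (Word64)

sumWords :: BL.ByteString -> Word64
sumWords = go 0
  where
    go !acc bs                            -- strict accumulator: no thunk pileup
      | BL.null bs = acc
      | otherwise  =
          let (w, rest) = BL.splitAt 8 bs -- peel off one 8-byte value
          in go (acc + runGet getWord64be w) rest

main :: IO ()
main = do
  -- write a small sample file so the demo is self-contained
  BL.writeFile "nums.bin" (runPut (mapM_ putWord64be [1 .. 1000]))
  bs <- BL.readFile "nums.bin"
  print (sumWords bs)                     -- prints 500500; never calls BL.length
```

Because only the current 8-byte slice is ever forced, earlier chunks of the file can be garbage-collected as the fold proceeds.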
As a separate matter, calling 'appendFile . encode'