Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-28 Thread Henning Thielemann
On Tue, 27 Aug 2013, John Lato wrote: [1] Most people are physically incapable of reading documents that explain why what they want to do won't work.  Even if people did read the documentation, I suspect that the people most in need of the information would be the least likely to understand

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-27 Thread Niklas Hambüchen
Thanks for your examples. On 27/08/13 13:59, Albert Y. C. Lai wrote: The correct fix is to raise the stack cap, not to avoid using the stack. Indeed, ghci raises the stack cap so high I still haven't fathomed where it is. This is why you haven't seen a stack overflow in ghci for a long
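The "stack cap" here is the GHC RTS maximum stack size. For a compiled program it can be raised at run time, provided the binary was built with -rtsopts (the default limit differs between GHC versions; Overflow.hs below is just a placeholder name for the program under discussion):

    $ ghc -rtsopts Overflow.hs
    $ ./Overflow +RTS -K512m -RTS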

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-27 Thread Patrick Palka
On Mon, Aug 26, 2013 at 4:46 AM, Niklas Hambüchen m...@nh2.me wrote: On #haskell we recently had a discussion about the following: import System.Random list <- replicateM 1000000 randomIO :: IO [Int] I would think that this gives us a list of a million random Ints. In fact, this is

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-27 Thread Niklas Hambüchen
On 27/08/13 20:37, Patrick Palka wrote: You can use ContT to force the function to use heap instead of stack space, e.g. runContT (replicateM 1000000 (lift randomIO)) return That is interesting, and works. Unfortunately, its mere existence will not fix sequence, mapM etc. in base.
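For reference, Patrick's suggestion as a complete program, a minimal sketch assuming the transformers package (the main wrapper and the final print are mine):

    import Control.Monad (replicateM)
    import Control.Monad.Trans.Class (lift)
    import Control.Monad.Trans.Cont (runContT)
    import System.Random (randomIO)

    main :: IO ()
    main = do
      -- ContT accumulates the pending continuations on the heap, so this
      -- completes where plain replicateM in IO overflows the stack.
      list <- runContT (replicateM 1000000 (lift randomIO)) return :: IO [Int]
      print (length list)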

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-27 Thread Tom Ellis
On Mon, Aug 26, 2013 at 12:05:14PM -0700, Bryan O'Sullivan wrote: On Mon, Aug 26, 2013 at 1:46 AM, Niklas Hambüchen m...@nh2.me wrote: This is because sequence is implemented as sequence (m:ms) = do x <- m; xs <- sequence ms; return
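The quoted definition, reconstructed with its base case filled in (this is the shape of the code under discussion, not a proposed fix). The recursive call is not a tail call: after it returns, each level still has to cons its element onto the result, and in a strict monad like IO those pending steps occupy the stack, one frame per list element.

    import Prelude hiding (sequence)

    sequence :: Monad m => [m a] -> m [a]
    sequence []     = return []
    sequence (m:ms) = do
      x  <- m
      xs <- sequence ms   -- not a tail call: the (x:) cons is still pending
      return (x:xs)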

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-27 Thread John Lato
IMHO it's perfectly reasonable to expect sequence/replicateM/mapM to be able to handle a list of ~1e6 elements in the Unescapable Monad (i.e. IO). All the alternate implementations in the world won't be as handy as Prelude.sequence, and no amount of documentation will prevent people from running

[Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-26 Thread Niklas Hambüchen
On #haskell we recently had a discussion about the following: import System.Random list <- replicateM 1000000 randomIO :: IO [Int] I would think that this gives us a list of a million random Ints. In fact, this is what happens in ghci. But with ghc we get: Stack space overflow: current
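A self-contained version of the snippet above (the main wrapper and the print are mine). Compiled with ghc it overflows the default stack limit as described; in ghci, which raises the limit, it completes.

    import Control.Monad (replicateM)
    import System.Random (randomIO)

    main :: IO ()
    main = do
      list <- replicateM 1000000 randomIO :: IO [Int]
      print (length list)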

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-26 Thread Niklas Hambüchen
As an example that this actually causes problems in production code, I found this in the wild: https://github.com/ndmitchell/shake/blob/e0e0a43/Development/Shake/Database.hs#L394 -- Do not use a forM here as you use too much stack space bad <- (\f -> foldM f [] (Map.toList status)) $
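The trick behind that comment, sketched as a general stand-in for mapM (the name mapMAcc and the final reverse are mine, not shake's code): accumulate with an explicit tail-recursive loop so nothing is left pending after each recursive call, at the cost of a reverse at the end.

    mapMAcc :: Monad m => (a -> m b) -> [a] -> m [b]
    mapMAcc f = go []
      where
        go acc []     = return (reverse acc)
        go acc (x:xs) = do
          y <- f x
          go (y : acc) xs   -- tail call: constant stack in IO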

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-26 Thread Bryan O'Sullivan
On Mon, Aug 26, 2013 at 1:46 AM, Niklas Hambüchen m...@nh2.me wrote: This is because sequence is implemented as sequence (m:ms) = do x <- m; xs <- sequence ms; return (x:xs) and uses stack space when used on some [IO a]. This problem
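For context on why the replicateM example from the original post hits the same limit: in the base library of that era, replicateM was (roughly) defined in terms of sequence, so it inherits the same stack behaviour.

    -- Roughly the Control.Monad definition of the time (an assumption
    -- from memory, not quoted from the thread):
    replicateM :: Monad m => Int -> m a -> m [a]
    replicateM n x = sequence (replicate n x)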

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-26 Thread Niklas Hambüchen
Maybe an unlimited stack size should be the default? As far as I understand, the only negative effect would be that some programming mistakes would not result in a stack overflow. However, I doubt the usefulness of that: * It already depends a lot on the optimisation level * If you do the same

Re: [Haskell-cafe] sequence causing stack overflow on pretty small lists

2013-08-26 Thread Albert Y. C. Lai
On 13-08-26 04:46 AM, Niklas Hambüchen wrote: Effectively, sequence is a partial function. (Note: We are not trying to obtain a lazy list of random numbers, use any kind of streaming, or the like. We want the list in memory and use it.) We noticed that this problem did not happen if sequence