On Fri, 2004-12-03 at 13:21 +0100, Fred Nicolier wrote:
> I'm a little bit confused by the following error messages:
>
> \begin{code}
> import Array
> type SignalOf = Array Int
>
> funcSignal :: (Num a, Num b) => (Int, Int) -> (a -> b) -> SignalOf b
> funcSignal b f = listArray b [f (fromIntegral k) | k <- range b]
>
> instance (Num a) => Num (SignalOf a) where
>     fromInteger k = funcSignal (minB, maxB) (const i)
>       where minB = (minBound :: Int) + 1
>             maxB = (maxBound :: Int) - 1
>             i    = fromInteger k
> \end{code}
>
> When testing in ghci:
>
> *Main> let x = 1 :: SignalOf Int
> Loading package haskell98 ... linking ... done.
> *Main> bounds x
> zsh: segmentation fault (core dumped)  ghci infArray.hs
>
> Is it not possible to define a really huge array?
No. The array really does get created with that huge number of elements. The Haskell Array type is lazy in its *elements*, not in its keys. The underlying implementation is something like a C array of pointers to (not necessarily yet evaluated) values. The Array type is just your ordinary O(1) indexable structure.

If you want a data structure that works well with sparse keys, you'll need to look beyond the simple Array type. FiniteMap would work fine for this. I can't immediately think of a collection data structure combining O(1) lookup with a sparse key space, however - that sounds quite tricky.

Duncan
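P.S. In case it helps, here is roughly the kind of thing I mean by a sparse, map-based signal. This is only a sketch against Data.Map (the hierarchical libraries' successor to FiniteMap); the SparseSignal type and the mkSignal / sampleAt / setSample names are made up for illustration, not anything you'll find in a library.

\begin{code}
import qualified Data.Map as Map

-- A "signal" that stores only the samples that have actually been set,
-- plus a default value for every other index.
data SparseSignal a = SparseSignal
    { defaultValue :: a              -- value at every unstored index
    , samples      :: Map.Map Int a  -- explicitly stored samples
    }

-- A signal that is 'd' everywhere; nothing gets allocated per index.
mkSignal :: a -> SparseSignal a
mkSignal d = SparseSignal d Map.empty

-- Lookup is O(log n) in the number of *stored* samples, independent of
-- how large the index range is.
sampleAt :: SparseSignal a -> Int -> a
sampleAt s k = Map.findWithDefault (defaultValue s) k (samples s)

-- Store a single sample.
setSample :: Int -> a -> SparseSignal a -> SparseSignal a
setSample k v s = s { samples = Map.insert k v (samples s) }
\end{code}

With a representation like this, fromInteger could just be mkSignal . fromInteger: the constant value is stored once, and nothing anywhere near the size of the full Int range ever gets allocated.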