On Mon, Apr 07, 2008 at 04:52:51AM -0700, John Meacham wrote:
> On Mon, Apr 07, 2008 at 04:45:31AM -0700, David Roundy wrote:
> > I wonder about the efficiency of this implementation.  It seems that
> > for most uses the size of a Nat n is O(n), which means that in
> > practice you probably can't use it for large numbers.
> >
> > e.g. it seems like
> >
> >     last [1..n :: Nat]
> >
> > should use O(n) space, whereas
> >
> >     last [1..n :: Integer]
> >
> > should take O(1) space.  And I can't help but imagine that there
> > must be scenarios where the memory use goes from O(N) to O(N^2),
> > which seems pretty drastic.  I imagine this is an inherent cost of
> > lazy numbers?  Which is probably why they're not a reasonable
> > default, since poor space use is often far more devastating than
> > simple inefficiency.  And of course they are also not always more
> > efficient than strict numbers.
>
> Oh, yes.  I certainly wouldn't recommend them as some sort of
> default; they were sort of a fun project and might come in handy some
> day.
>
> By efficient, I meant more efficient than the standard lazy number
> formulation of
>
>     data Num = Succ Num | Zero
>
> not more efficient than strict types, which it very much is not. :)
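[For readers following along, here is a minimal sketch of the standard lazy Peano formulation quoted above. The type is renamed Nat (rather than the quoted Num) to avoid clashing with the Num class, and the helpers add, natLength, and atLeast are illustrative names, not from any library. It shows both the cost being discussed (unary, so O(n) space) and the payoff (a comparison against a small bound forces only a few Succ constructors, so it terminates even on an infinite list).]

```haskell
-- Sketch of the standard lazy naturals: a unary (Peano) representation.
data Nat = Zero | Succ Nat

-- Lazy addition: the outer Succ constructors of the result can be
-- consumed before the second argument is ever forced.
add :: Nat -> Nat -> Nat
add Zero     n = n
add (Succ m) n = Succ (add m n)

-- Conversion helpers for demonstration only.
fromInt :: Int -> Nat
fromInt 0 = Zero
fromInt k = Succ (fromInt (k - 1))

toInt :: Nat -> Int
toInt Zero     = 0
toInt (Succ n) = 1 + toInt n

-- A list's length as a lazy Nat: exactly the (1+1+...+1+0) shape
-- discussed in this thread, built one Succ at a time by foldr.
natLength :: [a] -> Nat
natLength = foldr (\_ n -> Succ n) Zero

-- Comparing against a bound forces at most k constructors, so this
-- terminates even when the Nat is the length of an infinite list.
atLeast :: Int -> Nat -> Bool
atLeast 0 _        = True
atLeast _ Zero     = False
atLeast k (Succ n) = atLeast (k - 1) n

main :: IO ()
main = do
  print (toInt (add (fromInt 2) (fromInt 3)))  -- prints 5
  print (atLeast 3 (natLength [1..]))          -- prints True, and terminates
```

[Note that the strict alternative, length [1..] > 3 with an Int length, would diverge here; that is the laziness/space trade-off the thread is about.]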
Ah, that makes sense.  And yes, they're definitely more efficient than
that. :)

The trouble (I suppose) is that in the common case where lazy naturals
are called for, namely situations where you compute (1+1+1+1...+1+0),
you can't be more space-efficient than the standard lazy numbers
without losing some of the laziness.  Or at least, I can't see how you
could do so.
--
David Roundy
Department of Physics
Oregon State University

_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe