On Thursday 10 April 2008 01:20:49 pm Matt Amos wrote:
> I'm trying to break out of my imperative mind-set by learning Haskell
> in small snippets. After a few successes I've hit a bit of a roadblock
> with one of the classic dynamic programming problems, the longest
> increasing subsequence:
>
> http://en.wikipedia.org/wiki/Longest_increasing_subsequence
>
> The most efficient algorithm relies on destructive updates, so a simple
> translation doesn't seem possible. I've been trying to think of a way of
> removing the need for destructive updates while keeping the efficiency,
> but so far without much luck. Ideally, I'd like to avoid threading the
> whole thing with a state monad, since then it would just be an
> imperative algorithm in disguise.
>
> Any suggestions would be very gratefully appreciated.
Memoization is a nice way to implement dynamic programming algorithms in Haskell. Basically, you arrange things so that the underlying runtime does the destructive updates for you.

http://www.haskell.org/haskellwiki/Memoization

> Cheers,
>
> Matt
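For concreteness, here is a small sketch of that idea applied to the original problem: the classic O(n^2) recurrence for the longest increasing subsequence, memoized with a lazily built immutable array so each table entry is evaluated at most once. This is not the O(n log n) patience-sorting algorithm the Wikipedia page describes, and the names (lisLength, table, best) are just my own.

import Data.Array

-- Length of the longest strictly increasing subsequence.
-- The O(n^2) dynamic program, memoized with a lazy immutable
-- array: each entry of 'table' is computed at most once, on
-- demand, by the runtime.
lisLength :: Ord a => [a] -> Int
lisLength [] = 0
lisLength xs = maximum (elems table)
  where
    n   = length xs
    arr = listArray (0, n - 1) xs

    -- table ! i = length of the longest increasing subsequence
    -- ending at position i
    table :: Array Int Int
    table = listArray (0, n - 1) [ best i | i <- [0 .. n - 1] ]

    best i = 1 + maximum (0 : [ table ! j
                              | j <- [0 .. i - 1]
                              , arr ! j < arr ! i ])

In GHCi, lisLength [3,1,4,1,5,9,2,6] gives 4. Because the array is lazy, demanding one entry forces only the entries it depends on, so the RTS effectively plays the role of the destructive update.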