#3909: Priority queues in containers
-------------------------------------------+--------------------------------
Reporter: LouisWasserman | Owner: LouisWasserman
Type: feature request | Status: assigned
Priority: normal | Milestone:
Component: libraries (other) | Version: 6.12.1
Keywords: containers, priority queue | Difficulty:
Os: Unknown/Multiple | Testcase:
Architecture: Unknown/Multiple | Failure: None/Unknown
-------------------------------------------+--------------------------------
Comment(by milan):
Replying to [comment:20 LouisWasserman]:
> Even lazy pairing heaps have the O(n) worst-case delete-min performance.
> Amortization isn't going to cut it.
Have you looked at the implementation? Leaving me aside, even Okasaki
conjectures that I inserts, M merges and D delete-mins take O(I + D log N)
in the persistent setting (a "proof by authority" :)
The trick is the following -- lazy pairing heaps use a back-to-front
variant (both the pairing pass and the final merge are performed from back
to front). When the root gains a third child, the newest two are merged
and the result is merged into the oldest child immediately, but lazily.
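For concreteness, the suspended-merge trick looks roughly like this in
Haskell (a sketch after Okasaki's lazy pairing heaps, with names of my
own choosing, not the proposed containers code; the back-to-front variant
described above only changes the order in which the merges are performed):

{{{
-- A node holds at most one "pending" child plus a lazily merged rest.
data Heap a = Empty | Node a (Heap a) (Heap a)

merge :: Ord a => Heap a -> Heap a -> Heap a
merge h Empty = h
merge Empty h = h
merge h1@(Node x _ _) h2@(Node y _ _)
  | x <= y    = link h1 h2
  | otherwise = link h2 h1

-- link attaches the larger-rooted heap 'a' under the root.  When a
-- pending child 'b' is already present, the two newest children are
-- merged and the result is merged into the rest 'm' -- immediately,
-- but the thunk is only forced on demand (Haskell laziness memoizes
-- it, so every version sharing the node pays at most once).
link :: Ord a => Heap a -> Heap a -> Heap a
link (Node x Empty m) a = Node x a m
link (Node x b m)     a = Node x Empty (merge (merge a b) m)
link Empty            _ = Empty  -- unreachable from merge

insert :: Ord a => a -> Heap a -> Heap a
insert x = merge (Node x Empty Empty)

deleteMin :: Ord a => Heap a -> Maybe (a, Heap a)
deleteMin Empty        = Nothing
deleteMin (Node x b m) = Just (x, merge b m)
}}}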
So if you do a sequence of inserts followed by a delete-min, all previous
versions of the heap are affected -- performing a delete-min in any
version (except the latest) will then take O(1).
As I said, there is no solid proof, but nobody has found a sequence of
operations in the persistent setting that would break the claimed bound
either.
>
> Wikipedia suggested that skew heaps usually outperformed leftist heaps.
> I wrote a reasonably refined implementation of skew heaps and got
>
> Times (ms)
> min mean +/-sd median max
> Pairing: 8.000 12.401 4.274 12.001 28.002
> Binomial: 24.001 31.842 6.200 28.002 44.003
> Skew: 24.001 31.922 6.280 32.002 48.003
>
> for sorting of random data. Also, more importantly, skew heaps don't
> even support amortized O(1) insertion, let alone worst-case. Binomial
> heaps support amortized O(1) insertion, worst-case O(log n). There's a
> variant of binomial heaps which supports worst-case O(log n) insertion,
> but it's not very pretty to code, and it would make an already large
> amount of code blow up hugely. I'm willing to settle for the amortized
> bounds there.
>
> Conclusion: I'm inclined to stick with binomial heaps. Further
> experimentation has indicated that there isn't much difference between
> lazy and strict sorting speeds, so I'm inclined to stick with the strict
> implementation -- again for the crowd who needs real-time performance.
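(For reference, the amortized O(1) insert bound on binomial heaps quoted
above comes from the binary-increment structure: a heap is a list of
trees in strictly increasing rank, and inserting is carrying in binary,
so n inserts do O(n) total link work. A minimal sketch, with illustrative
names rather than the actual proposed code:

{{{
data Tree a = Tree a [Tree a]   -- root and children
type BHeap a = [(Int, Tree a)]  -- (rank, tree), ranks strictly increasing

-- Link two trees of equal rank, keeping the smaller root on top.
link :: Ord a => Tree a -> Tree a -> Tree a
link t1@(Tree x c1) t2@(Tree y c2)
  | x <= y    = Tree x (t2 : c1)
  | otherwise = Tree y (t1 : c2)

-- Carry a tree into the heap, linking on rank collisions.
insTree :: Ord a => (Int, Tree a) -> BHeap a -> BHeap a
insTree t [] = [t]
insTree t@(r, t1) ts@((r', t2) : rest)
  | r < r'    = t : ts
  | otherwise = insTree (r + 1, link t1 t2) rest

insert :: Ord a => a -> BHeap a -> BHeap a
insert x = insTree (0, Tree x [])
}}}
)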
I would probably stick with lazy pairing heaps, and if the time bounds
were discovered to be wrong, switch to the binomial implementation.
The only issue with lazy pairing heaps is the amortization (leaving the
conjecture aside). I personally think the performance win justifies it.
Of course, providing a real-time implementation, preferably with the same
API, is a great idea. Maybe in another package?
And one last thing -- do you think {{{ViewQ}}} is a good idea? I would
prefer {{{extract}}} to return {{{Maybe (a, PQueue a)}}}. I agree it is
not as elegant in pattern matching, but
* if the module is imported qualified, you have to write
{{{PQueue.(:^) a queue}}} with the new-style qualified operator, which is
horrible
* you cannot use monadic combinators; if {{{extract}}} returns in the
{{{Maybe}}} monad, you can.
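To illustrate the second point -- here only the type of {{{extract}}}
matches the proposal; the {{{PQueue}}} below is a toy sorted-list
stand-in, not a real heap:

{{{
import Data.List (unfoldr)

-- Toy stand-in, just to exercise the Maybe-returning interface.
newtype PQueue a = PQueue [a]  -- invariant: kept sorted

extract :: PQueue a -> Maybe (a, PQueue a)
extract (PQueue [])       = Nothing
extract (PQueue (x : xs)) = Just (x, PQueue xs)

-- unfoldr composes with extract directly, which a view type would not:
toSortedList :: PQueue a -> [a]
toSortedList = unfoldr extract

-- And the Maybe monad handles the too-few-elements case cleanly:
twoSmallest :: PQueue a -> Maybe (a, a)
twoSmallest q = do
  (x, q') <- extract q
  (y, _)  <- extract q'
  return (x, y)
}}}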
--
Ticket URL: <http://hackage.haskell.org/trac/ghc/ticket/3909#comment:21>
GHC <http://www.haskell.org/ghc/>
The Glasgow Haskell Compiler
_______________________________________________
Glasgow-haskell-bugs mailing list
[email protected]
http://www.haskell.org/mailman/listinfo/glasgow-haskell-bugs