On Mon, Apr 25, 2011 at 4:26 PM, Ted Dunning <[email protected]> wrote:
>
> Pedantically speaking, that is.  If we're talking worst-case bounds, then the
> worst case comes into the conversation, though it be of little practical
> import.


If we're already talking about probabilistic algorithms, talking about worst
case starts to get a little silly, IMO.

> Indeed.  I have never implemented anything else.  Note that the worst case
> is still linear in n.  It just has the full k log k multiplier.  Normally
> the multiplier is much less but it seems (after a full 10 seconds of
> reflection) that you still have a total of O(n + log n log k) if you cache
>

I think that's O(n + log(n) * k * log(k)), because you have to re-heapify,
but yeah, same effect:
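For concreteness, the heap-based top-k selection being discussed can be sketched as below. This is a minimal illustration, not Mahout's actual implementation; the function name `top_k` and the test data are my own:

```python
import heapq
import random

def top_k(items, k):
    """Keep the k largest items seen so far in a min-heap of size k.

    Each incoming item is first compared against the heap's minimum
    (O(1)); only items that beat it trigger an O(log k) heap update,
    so the common case is a single comparison per element.
    """
    heap = []
    for x in items:
        if len(heap) < k:
            heapq.heappush(heap, x)
        elif x > heap[0]:
            heapq.heapreplace(heap, x)
    # One final O(k log k) sort to report results in descending order.
    return sorted(heap, reverse=True)

data = [random.random() for _ in range(100000)]
print(top_k(data, 5))
```

In the worst case (input arriving in ascending order) every element displaces the minimum, giving the full n log k cost; on typical input almost all elements fail the single comparison.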

> BUT
>
> As Jake says, this is all really just O(n) and is so fast that none of this
> matters to the user.
>

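The claim that the k log k multiplier rarely matters can be checked empirically: on randomly ordered input, the element at position i displaces the heap minimum with probability k/i, so the expected number of O(log k) heap updates is roughly k ln(n/k), not n. A quick simulation (my own sketch, not from this thread):

```python
import heapq
import math
import random

def count_heap_updates(n, k, trials=5):
    """Average number of heap updates while selecting top-k from a
    randomly ordered stream of n distinct values."""
    total = 0
    for _ in range(trials):
        data = list(range(n))
        random.shuffle(data)
        heap, updates = [], 0
        for x in data:
            if len(heap) < k:
                heapq.heappush(heap, x)
            elif x > heap[0]:
                heapq.heapreplace(heap, x)
                updates += 1
        total += updates
    return total / trials

n, k = 100000, 100
# Observed updates cluster around the k*ln(n/k) prediction (~691 here),
# a tiny fraction of the n comparisons.
print(count_heap_updates(n, k), k * math.log(n / k))
```

So the n comparisons dominate and the whole thing behaves as O(n) in practice, which is the point being made above.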