On Sunday, 17 January 2016 at 03:26:54 UTC, Andrei Alexandrescu wrote:
On 1/16/16 9:37 PM, Timon Gehr wrote:
Ivan's analysis suggests that even something significantly larger, like
n/log(n)² might work as an upper bound for k.

I'm not clear on how you got to that boundary. There are a few implementation details to be minded as well (quickly and carefully approximating the log etc). Could you please make a commented pull request against my own? Thanks! -- Andrei

The average case is O(n + k log n log k) for small enough k. Since log k ≤ log n, the second summand is at most k log²(n), so any k below roughly n / log²(n) keeps it smaller than the first.
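To illustrate the threshold numerically, here is a small sketch that plugs k = n / log²(n) into both summands of the cost model. The cost function and constant-free comparison are illustrative assumptions, not measurements; base-2 logs are used, which only shifts constants.

```python
import math

def topn_average_cost(n, k):
    """Both summands of the assumed average-case model O(n + k log n log k),
    with constant factors ignored (illustrative only)."""
    return n, k * math.log2(n) * math.log2(max(k, 2))

for n in (10**4, 10**6, 10**8):
    k = int(n / math.log2(n) ** 2)  # the proposed upper bound for k
    linear_term, second_term = topn_average_cost(n, k)
    # For k ~ n / log^2(n), the k-term stays below the linear term.
    print(f"n={n:>9}  k={k:>7}  n-term={linear_term:.0f}  k-term={second_term:.0f}")
```

For n = 10^6 this gives k ≈ 2517 and a k-term around 5.7·10^5, comfortably below n.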

Of course, the constants hidden by the asymptotic analysis, as well as handling the worst case, may make this notion impractical. Measurement is the way :) . The worst case is just a decreasing sequence, or an almost-decreasing one if we want to defeat branch prediction as well.
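A quick sketch of such a benchmark input, assuming the decreasing-sequence worst case described above; the function name and `jitter` parameter are illustrative, not from the thread:

```python
import random

def worst_case_input(n, jitter=0.0, seed=0):
    """Strictly decreasing sequence (the assumed worst case for
    partition-based topN), with an optional fraction `jitter` of
    random swaps so branch prediction cannot lock onto the
    monotone comparison pattern."""
    rng = random.Random(seed)
    a = list(range(n, 0, -1))  # n, n-1, ..., 1
    for _ in range(int(jitter * n)):
        i, j = rng.randrange(n), rng.randrange(n)
        a[i], a[j] = a[j], a[i]
    return a
```

One would benchmark the candidate topN against both `worst_case_input(n)` and, say, `worst_case_input(n, jitter=0.05)` for the branch-prediction-hostile variant.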
