On 11/30/15 5:07 PM, Andrei Alexandrescu wrote:
> On 11/30/15 4:58 PM, Steven Schveighoffer wrote:
>> What about selecting a random element in 0..k/2 instead of 0..k-1?
>
> I think complexity would stay the same. Choosing a tighter range puts a
> greater weight on the last search than on recent searches.

In the case where you search for a very small number of elements, it puts an upper bound on how many searches it takes for them to make it to the front (O(log n) instead of O(n)).

> One thing I like is that I choose 0..k, not 0..k-1, which means it's
> possible that the element stays put (no change in position). That
> reduces thrashing for the top (most frequently searched) few elements.

I think the effect is insignificant, depending on the number of "frequently searched" elements. But you could tune it, say to 8 elements:

const upperBound = max(k/2, min(8, k));

There are a lot of tuning options that can be played with; probably the best way to "prove" what is best is to just run some experiments :)

-Steve
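[For reference, a minimal Python sketch of the heuristic being discussed — move a found element to a random earlier position rather than straight to the front. The function name and list-based layout are illustrative only, not the actual implementation; Steve's tightened upper bound is shown as a commented-out alternative.]

```python
import random

def self_organizing_find(items, value, rng=random):
    """Linear search; on a hit at index k, move the element to a
    random index in 0..k inclusive (so it may also stay put)."""
    k = items.index(value)          # raises ValueError on a miss
    j = rng.randint(0, k)           # Andrei's variant: 0..k inclusive
    # Steve's tuned variant would tighten the range instead, e.g.:
    # j = rng.randint(0, max(k // 2, min(8, k)))
    items.insert(j, items.pop(k))   # move the found element to index j
    return j                        # the element's new position
```

Because `j <= k`, each hit can only move an element toward the front; with the 0..k/2 variant, repeated hits halve the position each time, which is where the O(log n) bound above comes from.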
