On Thu, Dec 31, 2020 at 8:43 AM James Oldfield
<james.oldfi...@cantab.net> wrote:
>
> 1. My first suggestion is to replace the odd cache logic with a
> straightforward classic LRU cache. I'm hoping this is uncontroversial,
> but I can imagine there might be programs that somehow depend on the
> current behaviour. But my suspicion is that many more suffer and just no
> one looked hard enough to notice.
>

LRU has a worst-case scenario too, and implementing LRU requires extra memory.
For example, functools.lru_cache() uses a doubly-linked list internally.
There are LRU-like algorithms, such as clock (second chance), but
all of these algorithms have worst-case scenarios anyway.

Another option is a random eviction algorithm: it chooses a random entry to evict.

* Frequently used keys have a high chance of a cache hit, as with LFU
(least frequently used).
* When the set of frequently used keys changes over time, the old
frequently used keys are eventually evicted. (It behaves something like
least recently/frequently used?)
* Minimal overhead for managing keys.

So I think a random eviction algorithm is better than LRU.
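To make the idea concrete, here is a minimal sketch of what such a decorator
could look like. The name random_cache and its maxsize parameter are
hypothetical (this is not an existing functools API); it keys the cache on
positional arguments only, for brevity:

```python
import random
from functools import wraps

def random_cache(maxsize=128):
    """Hypothetical decorator: cache results, evicting a uniformly
    random entry when the cache is full."""
    def decorator(func):
        cache = {}   # maps argument tuples to results
        keys = []    # parallel list of keys, for O(1) random choice

        @wraps(func)
        def wrapper(*args):
            if args in cache:
                return cache[args]
            result = func(*args)
            if len(cache) >= maxsize:
                # Pick a random victim; swap-remove keeps eviction O(1).
                i = random.randrange(len(keys))
                victim = keys[i]
                keys[i] = keys[-1]
                keys.pop()
                del cache[victim]
            cache[args] = result
            keys.append(args)
            return result
        return wrapper
    return decorator
```

Note there is no bookkeeping on cache hits at all, which is where the
"minimal overhead" point above comes from: LRU must reorder its linked
list on every hit, while random eviction does nothing.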

Off topic: Can we add random_cache to functools, like lru_cache?

Regards,

-- 
Inada Naoki  <songofaca...@gmail.com>
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/PM64J2I77AIOMN7XXZSY2F6O4SFXEEHX/
Code of Conduct: http://python.org/psf/codeofconduct/
