Raymond Hettinger added the comment:

Sorry Constantin, I am rejecting this proposal and any variants of it.

* As Nick pointed out in the referenced thread, we provide two tools: a 
functools caching decorator that is tightly focused on the task of caching 
function calls, and a collections OrderedDict that is a general-purpose data 
store suitable for implementing custom LRU logic when needed (see the sketch 
below).
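
For illustration only, here is a minimal sketch of the two tools (the 
function and keys below are placeholders, not a recommended API):

    from collections import OrderedDict
    from functools import lru_cache

    # Tool 1: a transparent caching decorator -- the wrapped
    # function's logic is untouched.
    @lru_cache(maxsize=32)
    def parse(line):
        return line.split()

    # Tool 2: an OrderedDict as the basis for custom LRU logic.
    d = OrderedDict()
    d['a'] = 1
    d['b'] = 2
    d.move_to_end('a')       # mark 'a' as most recently used
    d.popitem(last=False)    # evict the least recently used key, 'b'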

* As John pointed out, this proposal isn't even fit for the original use case.  
Any speed benefit of a tail recursion optimization gets immediately wiped out 
by the overhead of an LRU cache decorator, and the clarity of the original 
code starts to get lost in the extra code to call cache_get() and cache_put() 
-- remember, the decorator was designed to wrap around a function without 
having to change the logic inside it.  Also, the optimization itself is 
fundamentally suspect because it changes the semantics of the language 
(thereby defeating all decorators that supply wrapper functions, including 
memoization decorators, call loggers, precondition/postcondition checkers, 
subscription notifiers, type checkers, etc).  A sketch of the wrapper 
mechanics follows below.
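
To make the wrapper point concrete, here is an illustrative sketch (fib is 
just an example): the recursive calls resolve through the module-level name, 
which is bound to the memoized wrapper, so an optimization that bypassed that 
lookup would also bypass the cache and every other wrapping decorator.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def fib(n):
        if n < 2:
            return n
        # Each recursive call goes through the name 'fib', which now
        # refers to the memoized wrapper, so prior results are reused.
        return fib(n - 1) + fib(n - 2)

    fib(30)    # runs in O(n) calls instead of O(2**n)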

* The essence of this proposal is at odds with what the functools caching 
decorator is all about -- providing a cache for function calls.  The proposal 
bypasses the function call itself, making it more difficult to reason about 
what is in the cache (i.e. the results of previous function calls) and mucking 
up the cache hit/miss statistics, which stop being meaningful (see the example 
below).
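
For instance, the statistics only make sense when every cached entry came 
from an actual call (an illustrative example):

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def square(x):
        return x * x

    square(3); square(3); square(4)
    print(square.cache_info())
    # CacheInfo(hits=1, misses=2, maxsize=None, currsize=2)

Entries inserted or fetched behind the decorator's back would break that 
correspondence between the counters and actual calls.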

* The proposal also adds API complexity (making it less like a function 
decorator and more like an exposed data store such as an ordered dictionary).  
And, it creates a new control-flow exception, NotInCache.  Collectively, these 
changes make the decorator slower, harder to maintain, harder to learn, harder 
to keep free of reentrancy and locking problems, and harder to reason about, 
yet they provide little if any incremental benefit over using an OrderedDict 
directly.

* To the extent there are valid niche use cases, we shouldn't try to cater to 
all of them.  Good standard library API design stays focused on serving the 
common case as well as possible, leaving the rest to OrderedDict or a 
third-party package (you're welcome to publish one to see if it actually 
serves a real need).

* In designing the cache, I surveyed previously published memoization 
decorators and did not find the proposed feature.  That means it is false to 
state that every memoization decorator *must have* some functionality to 
insert or look up entries while bypassing the function call that was intended 
to be cached.  If it really were "must have" behavior, it would already be 
commonplace.  IMO, cache_put() is bad design (would you want your database 
cache to return something that was never in the database, or your disk cache 
to return values that had never been written to disk?).

* Hopefully, this message explains my reasoning clearly, so you will 
understand why I'm closing this one.  That said, judging by the insistent 
wording of your posts, I don't expect that you will be convinced.  Your mental 
model of caching tools is very different from what functools.lru_cache was 
intended to accomplish.  To the extent that you don't really want a 
transparent function decorator and would prefer a full-featured class (with 
methods for getting, setting, deleting, re-ordering, listing, monitoring, 
etc.), I recommend that you write one and post it to the Python Package Index.  
The collections.OrderedDict() class should easily handle the LRU storage logic 
(a starting-point sketch is below), so all you have to do is specify your API 
and decide which parts of the store you want to expose.
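
As a minimal, purely illustrative starting point (the class name and methods 
here are made up, not a proposed API):

    from collections import OrderedDict

    class LRUStore:
        'Hypothetical LRU mapping built on an OrderedDict.'

        def __init__(self, maxsize=128):
            self.maxsize = maxsize
            self._data = OrderedDict()

        def get(self, key, default=None):
            try:
                self._data.move_to_end(key)     # freshen on access
            except KeyError:
                return default
            return self._data[key]

        def put(self, key, value):
            self._data[key] = value
            self._data.move_to_end(key)         # newest entry goes last
            if len(self._data) > self.maxsize:
                self._data.popitem(last=False)  # evict the oldest entry

From there, deletion, iteration, statistics, and locking are a matter of 
deciding how much of the store to expose.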

----------
resolution:  -> rejected
status: open -> closed

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue23030>
_______________________________________