[issue23030] lru_cache manual get/put

2014-12-14 Thread Constantin

Constantin added the comment:

I understand your decision. Even though it makes my life a little bit harder, 
it is definitely not the end of the world, the end of Python or even the end 
for my libtco project.

--

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue23030
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue23030] lru_cache manual get/put

2014-12-13 Thread Constantin

Constantin added the comment:

It may be the case that an lru_cache does not provide the best strategy for 
reliably caching many base cases in recursively written code. I suggest that 
someday we think about a different caching paradigm which fits this purpose and 
add it to functools, e.g. as functools.recopt_cache. This cache would then 
implement the same interface as lru_cache, so all code currently using 
lru_cache could benefit from recopt_cache with just a one-line change.

Furthermore, by designing this interface, it becomes more likely that user-defined 
caching decorators will be compatible.

Please remember: My suggestion isn't just about lru_cache, but about an 
interface for caching decorators.

--




[issue23030] lru_cache manual get/put

2014-12-13 Thread Raymond Hettinger

Raymond Hettinger added the comment:

Sorry Constantin, I am rejecting this proposal or any variants of it.

* As Nick pointed out in the referenced thread, we provide two tools: a 
functools caching decorator that is tightly focused on the task of caching 
function calls, and a collections.OrderedDict that is a general-purpose data 
store suitable for implementing custom LRU logic when needed.

* As John pointed out, this proposal isn't even fit for the original use case.  
Any speed benefits of a tail recursion optimization get immediately wiped out 
by the overhead of an LRU cache decorator, and the clarity of the original code 
starts to get lost in the extra code to call cache_get() and cache_put() -- 
remember, the decorator was designed to wrap around a function without having to 
change the logic inside it.  Also, the optimization itself is fundamentally 
suspect because it changes the semantics of the language (thereby defeating all 
decorators that supply wrapper functions, including memoization decorators, call 
loggers, precondition/postcondition checkers, subscription notifiers, type 
checkers, etc.).

* The essence of this proposal is at odds with what the functools caching 
decorator is all about -- providing a cache for function calls.  The proposal 
bypasses the function call itself, making it more difficult to reason about 
what is in the cache (i.e. the result of previous function calls) and mucking 
up the cache hit/miss statistics, which stop being meaningful.

* The proposal also adds API complexity (making it less like a function 
decorator and more like an exposed data store such as an ordered dictionary).  
And it creates a new control-flow exception, NotInCache.  Collectively, these 
changes make the decorator slower, harder to maintain, harder to learn, harder 
to keep free of reentrancy and locking problems, and harder to reason about, 
yet they don't provide much if any incremental benefit over using an 
OrderedDict directly.

* To the extent there are valid niche use cases, we shouldn't try to cater to 
all of them.  Good standard library API design stays focused on serving the 
common case as well as possible, leaving the rest to OrderedDict or a 
third-party package (you're welcome to publish one to see if it actually serves 
a real need).

* In designing the cache, I surveyed previously published memoization 
decorators and did not find the proposed feature.  That means it is false 
to state that every memoization decorator *must have* some functionality to 
insert or look up entries while bypassing the function call that was intended to 
be cached.  If it really were must-have behavior, it would have already 
been commonplace.  IMO, cache_put() is bad design (would you want your database 
cache to return something that was never in the database, or your disk cache to 
return values that had never been written to disk?).

* Hopefully, this message explains my reasoning clearly, so you will understand 
why I'm closing this one.  That said, judging by the insistent wording of your 
posts, I don't expect that you will be convinced.  Your mental model of caching 
tools is very different from what the functools.lru_cache was intended to 
accomplish.  To the extent that you don't really want a transparent function 
decorator and would prefer a full-featured class (with methods for getting, 
setting, deleting, re-ordering, listing, monitoring, etc), I recommend that you 
write one and post it to the Python Package Index.  The 
collections.OrderedDict() class should easily handle the LRU storage logic, so 
all you have to do is specify your API and decide which parts of the store you 
want to expose.
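
[Editorial aside: the OrderedDict-based LRU storage recommended above can be sketched in a few lines. This is illustrative only; the LRU class and its method names are made up for this sketch, not stdlib API.]

```python
from collections import OrderedDict

class LRU:
    """Minimal LRU store built on OrderedDict (a sketch, not stdlib API)."""
    def __init__(self, maxsize=128):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def get(self, key, default=None):
        try:
            self._data.move_to_end(key)   # mark as most recently used
            return self._data[key]
        except KeyError:
            return default

    def put(self, key, value):
        self._data[key] = value
        self._data.move_to_end(key)
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict least recently used
```

This is the kind of custom LRU logic that OrderedDict makes straightforward, without touching the lru_cache decorator itself.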

--
resolution:  -> rejected
status: open -> closed




[issue23030] lru_cache manual get/put

2014-12-11 Thread Constantin

New submission from Constantin:

In an effort for improved communication between stacked decorators, I would 
like to propose that all decorators providing caching mechanisms should provide 
the functions cache_info, cache_clear, cache_get and cache_put. The standard 
lib only provides functools.lru_cache as a caching decorator, which already 
implements the former two. I have attached a patch to provide the latter two.

On python-ideas there was also the idea to create a cache data structure and 
then have a decorator take its data structure as an argument. But it was argued 
that this could lead to some optimizations of the caching being impossible.

I think my suggested approach should not pose a problem for highly optimized 
caching code, because every cache - no matter how optimized - must have some 
functionality to add something to the cache and look up something in the cache.
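
[Editorial aside: a toy decorator showing what the proposed four-function interface might look like in use. This is an illustrative sketch only, not the code from lru_get_put.patch; the decorator name memo_cache is invented here.]

```python
import functools

def memo_cache(func):
    """Toy caching decorator exposing the four proposed hooks
    (illustrative sketch; not the actual patch code)."""
    cache = {}
    hits = misses = 0

    @functools.wraps(func)
    def wrapper(*args):
        nonlocal hits, misses
        try:
            result = cache[args]
            hits += 1
        except KeyError:
            result = cache[args] = func(*args)
            misses += 1
        return result

    # The four functions the proposal asks every caching decorator to offer:
    wrapper.cache_info = lambda: (hits, misses, len(cache))
    wrapper.cache_clear = cache.clear
    wrapper.cache_get = lambda args, default=None: cache.get(args, default)
    wrapper.cache_put = lambda args, value: cache.__setitem__(args, value)
    return wrapper
```

With such an interface, a caller could seed base cases via cache_put() before the decorated function ever runs, and probe the cache via cache_get() without triggering a computation.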

--
components: Library (Lib)
files: lru_get_put.patch
keywords: patch
messages: 232476
nosy: ConnyOnny
priority: normal
severity: normal
status: open
title: lru_cache manual get/put
type: enhancement
versions: Python 3.5
Added file: http://bugs.python.org/file37415/lru_get_put.patch




[issue23030] lru_cache manual get/put

2014-12-11 Thread Ethan Furman

Changes by Ethan Furman et...@stoneleaf.us:


--
nosy: +ethan.furman




[issue23030] lru_cache manual get/put

2014-12-11 Thread Ethan Furman

Changes by Ethan Furman et...@stoneleaf.us:


--
nosy: +ncoghlan, rhettinger




[issue23030] lru_cache manual get/put

2014-12-11 Thread R. David Murray

R. David Murray added the comment:

It would be helpful (and provide a better record) if you could provide a link 
to the python-ideas discussion that endorses your idea.

--
nosy: +r.david.murray




[issue23030] lru_cache manual get/put

2014-12-11 Thread Ethan Furman

Ethan Furman added the comment:

When reading this thread, keep in mind that most of it was taken up with 
rejecting exposing the underlying data structure, which is not what this patch 
does.

https://mail.python.org/pipermail/python-ideas/2014-December/030230.html

--




[issue23030] lru_cache manual get/put

2014-12-11 Thread Josh Rosenberg

Josh Rosenberg added the comment:

Manually adding to the cache seems of limited utility for the proposed recursion 
base case approach when your cache is actually operating in LRU mode (maxsize 
isn't None). You can inject your base cases all you want, but unless you wrap 
every call to the decorated function with another function that reinserts the 
base cases every time (and even that won't always work), you can't actually 
rely on the cache containing your base cases; they could be aged off at any 
time.

I've been bad and (for fun) used inspect to directly find the cache closure 
variable in an lru_cache wrapped function to inject base cases, but it only 
works if your cache is unbounded. Trying to make it actually work reliably in 
LRU mode would complicate matters. I suppose a collections.ChainMap (where one 
is the LRU map, and the other is an unbounded dictionary solely for injected 
entries) might work, but it also starts making the cache more expensive to use 
in general for a really limited use case.
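
[Editorial aside: the ChainMap idea above can be sketched as follows. This is illustrative only; the real lru_cache stores entries in an internal linked structure, and the eviction machinery is omitted here.]

```python
from collections import ChainMap, OrderedDict

# Normal cache writes land in the bounded LRU store (eviction not shown),
# while injected base cases live in a separate unbounded dict that
# lookups fall back to.
lru = OrderedDict()    # first map: ordinary cached results, subject to eviction
injected = {}          # second map: manually injected entries, never evicted
cache = ChainMap(lru, injected)

injected[(0,)] = 0     # inject a base case; safe from aging off
cache[(5,)] = 120      # a normal write goes into the first map, `lru`
```

ChainMap writes always target the first map, so ordinary cache entries stay under LRU control while lookups transparently fall through to the injected entries.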

--
nosy: +josh.r




[issue23030] lru_cache manual get/put

2014-12-11 Thread Raymond Hettinger

Changes by Raymond Hettinger raymond.hettin...@gmail.com:


--
assignee:  -> rhettinger
priority: normal -> low
