New submission from Chiu-Hsiang Hsu:

Currently, lru_cache always constructs a plain Python dictionary inside the 
wrapper to use as its cache dict. IMHO, it would be much more flexible to let 
users specify their own cache dict, so they can use any kind of dict-like class 
as the cache. That way users can plug in any dictionary implementation and 
persist the results in any form they want.

For example:

Use an OrderedDict:

.. code-block:: python

    from functools import lru_cache
    from collections import OrderedDict

    @lru_cache(maxsize=None, cache=OrderedDict())
    def func(*args, **kwargs):
        pass
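
One appeal of letting the cache be any dict-like class (my reading of the request, not something spelled out above) is that the mapping can carry its own policy. Below is a minimal sketch, assuming the proposed cache keyword exists, of a size-bounded OrderedDict subclass that could be handed to the decorator; BoundedCache and capacity are names invented for the illustration.

.. code-block:: python

    from collections import OrderedDict

    class BoundedCache(OrderedDict):
        """Dict-like cache that drops its oldest entry once it holds
        more than `capacity` items (illustrative class, not an existing API)."""

        def __init__(self, capacity=128):
            super().__init__()
            self.capacity = capacity

        def __setitem__(self, key, value):
            super().__setitem__(key, value)
            if len(self) > self.capacity:
                self.popitem(last=False)  # evict the oldest insertion

    # with the proposed keyword this could then be passed in as:
    #
    #     @lru_cache(maxsize=None, cache=BoundedCache(capacity=256))
    #     def func(*args, **kwargs):
    #         pass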


Save the cache with pickle:

.. code-block:: python

    import atexit
    import os
    import pickle
    from functools import lru_cache

    filename = "cache.pickle"
    cache = {}

    def load_cache():
        global cache
        if os.path.isfile(filename):
            with open(filename, "rb") as f:
                cache = pickle.load(f)

    def store_cache():
        with open(filename, "wb") as f:
            pickle.dump(cache, f)

    load_cache()
    atexit.register(store_cache)  # write the cache back out on interpreter exit

    @lru_cache(maxsize=None, cache=cache)
    def func(*args, **kwargs):
        pass
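
For context, here is a rough, standalone sketch of the shape such an API could take, written as a plain memoizing decorator rather than the real lru_cache. It is only a guess at the behaviour being requested (the attached patch may do it differently), it only covers the unbounded maxsize=None case, and simple_cache is a made-up name.

.. code-block:: python

    from functools import wraps

    def simple_cache(cache=None):
        """Toy stand-in for the requested behaviour: memoize results into
        a caller-supplied mapping, falling back to a plain dict."""
        if cache is None:
            cache = {}

        def decorator(func):
            @wraps(func)
            def wrapper(*args, **kwargs):
                # naive key; the real lru_cache builds keys with functools._make_key
                key = (args, tuple(sorted(kwargs.items())))
                try:
                    return cache[key]
                except KeyError:
                    result = func(*args, **kwargs)
                    cache[key] = result
                    return result
            wrapper.cache = cache  # expose the mapping for pickling/inspection
            return wrapper
        return decorator

    # usage mirroring the pickle example above:
    #
    #     @simple_cache(cache=cache)
    #     def func(*args, **kwargs):
    #         pass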

----------
components: Library (Lib)
files: functools.lru_cache-user-specified-cachedict.patch
keywords: patch
messages: 258001
nosy: wdv4758h
priority: normal
severity: normal
status: open
title: functools.lru_cache user specified cachedict support
type: enhancement
versions: Python 3.6
Added file: 
http://bugs.python.org/file41584/functools.lru_cache-user-specified-cachedict.patch

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue26082>
_______________________________________