[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-24 Thread Raymond Hettinger


Raymond Hettinger added the comment:

[Andrew Svetlov]
> A third-party library should either copy all these 
> implementation details or import a private function from stdlib 

OrderedDict provides just about everything needed to roll your own lru cache variants.  
It simply isn't true that this can only be done efficiently in the standard library.
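To illustrate the point, here is a minimal sketch of an LRU cache rolled from OrderedDict. The name lru_cache_od is made up for this example, and it deliberately skips the keyword-argument handling, typed keys, and thread safety that the real functools.lru_cache provides:

```python
from collections import OrderedDict
from functools import wraps

def lru_cache_od(maxsize=128):
    """Toy LRU cache built on OrderedDict; positional, hashable args only."""
    def decorating_function(func):
        cache = OrderedDict()

        @wraps(func)
        def wrapper(*args):
            if args in cache:
                cache.move_to_end(args)       # mark as most recently used
                return cache[args]
            result = func(*args)
            cache[args] = result
            if len(cache) > maxsize:
                cache.popitem(last=False)     # evict least recently used
            return result

        return wrapper
    return decorating_function

calls = []

@lru_cache_od(maxsize=2)
def square(x):
    calls.append(x)
    return x * x

square(2)
square(2)        # served from the cache; square() body runs only once
print(calls)     # → [2]
```

The move_to_end/popitem(last=False) pair is all the bookkeeping an LRU policy needs, which is the sense in which OrderedDict "provides just about everything".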


[Serhiy]
> it would be simpler to add a decorator which wraps the result 
> of an asynchronous function into an object which can be awaited
> more than once:

This is much more sensible.

> It can be combined with lru_cache, cached_property, or any third-party
> caching decorator. No access to the internals of the cache is needed.

Right.  The premise that this can only be done in the standard library was 
false.

> async_lru_cache() and async_cached_property() can be written 
> using that decorator. 

The task becomes trivially easy :-)  


[Andrew Svetlov]
> Pull Request is welcome!

ISTM it was premature to ask for a PR before the idea had been thought through.  
We risk wasting a user's time or committing too early, before simpler, 
better-designed alternatives emerge.

--

___
Python tracker 

___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-24 Thread Joongi Kim


Change by Joongi Kim:


--
nosy: +achimnol




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-23 Thread Serhiy Storchaka


Serhiy Storchaka added the comment:

async_lru_cache() and async_cached_property() can be written using that 
decorator. The implementation of async_lru_cache() is complicated because the 
interface of lru_cache() is complicated. But it is simpler than using 
_lru_cache_wrapper().

def async_lru_cache(maxsize=128, typed=False):
    if callable(maxsize) and isinstance(typed, bool):
        user_function, maxsize = maxsize, 128
        return lru_cache(maxsize, typed)(reawaitable(user_function))

    def decorating_function(user_function):
        return lru_cache(maxsize, typed)(reawaitable(user_function))

    return decorating_function

def async_cached_property(user_function):
    return cached_property(reawaitable(user_function))
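To see the sketch above run end to end, here is a self-contained version. The CachedAwaitable and reawaitable helpers are minimal stand-ins for the decorator described in the other message (a production version would also need to handle cached exceptions and concurrent first awaits), and fetch/main are hypothetical names for the demo:

```python
import asyncio
from functools import lru_cache, wraps

_sentinel = object()

class CachedAwaitable:
    """Runs the wrapped coroutine on the first await; replays the result after."""
    def __init__(self, coro):
        self._coro = coro
        self._result = _sentinel

    def __await__(self):
        if self._result is _sentinel:
            self._result = yield from self._coro.__await__()
        return self._result

def reawaitable(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return CachedAwaitable(func(*args, **kwargs))
    return wrapper

def async_lru_cache(maxsize=128, typed=False):
    if callable(maxsize) and isinstance(typed, bool):
        # Bare @async_lru_cache form: "maxsize" is actually the function.
        user_function, maxsize = maxsize, 128
        return lru_cache(maxsize, typed)(reawaitable(user_function))

    def decorating_function(user_function):
        return lru_cache(maxsize, typed)(reawaitable(user_function))

    return decorating_function

@async_lru_cache(maxsize=32)
async def fetch(key):
    await asyncio.sleep(0)   # stand-in for real I/O
    return key.upper()

async def main():
    # Both awaits hit the same cached CachedAwaitable; the body runs once.
    return await fetch("spam"), await fetch("spam")

print(asyncio.run(main()))   # → ('SPAM', 'SPAM')
```

The callable(maxsize) check is what lets the decorator mirror lru_cache's two calling conventions, with and without parentheses.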

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-23 Thread Serhiy Storchaka


Serhiy Storchaka added the comment:

I think that it would be simpler to add a decorator which wraps the result of 
an asynchronous function into an object which can be awaited more than once:

def reawaitable(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return CachedAwaitable(func(*args, **kwargs))
    return wrapper

It can be combined with lru_cache, cached_property, or any third-party caching 
decorator. No access to the internals of the cache is needed.

@lru_cache()
@reawaitable
async def coro(...):
    ...

@cached_property
@reawaitable
async def prop(self):
    ...
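CachedAwaitable itself is not shown in the message. One possible sketch of its shape (my assumption, ignoring exception caching and concurrent first awaits) that makes the decorator runnable:

```python
import asyncio
from functools import wraps

_sentinel = object()

class CachedAwaitable:
    """Awaitable that drives its coroutine once and replays the result."""
    def __init__(self, coro):
        self._coro = coro
        self._result = _sentinel

    def __await__(self):
        if self._result is _sentinel:
            # The first await actually runs the wrapped coroutine.
            self._result = yield from self._coro.__await__()
        return self._result

def reawaitable(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return CachedAwaitable(func(*args, **kwargs))
    return wrapper

@reawaitable
async def compute():
    await asyncio.sleep(0)
    return 42

async def main():
    awaitable = compute()
    # Without the wrapper, the second await would raise RuntimeError
    # ("cannot reuse already awaited coroutine").
    return await awaitable, await awaitable

print(asyncio.run(main()))   # → (42, 42)
```

Because the second await returns the stored result without yielding, a cache such as lru_cache can hand out the same wrapper object repeatedly.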

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-23 Thread Tzu-ping Chung


Tzu-ping Chung added the comment:

Another thing to point out is that the existing third-party solutions (both 
alru_cache and cached_property) only work for asyncio, so the stdlib version 
(as implemented now) would be an improvement. And if the position is that 
improvements should only be submitted to third-party solutions, I would have to 
disagree: both lru_cache and cached_property had third-party solutions 
predating their stdlib implementations, and it is a double standard, IMO, if an 
async solution is kept out while the sync version is kept in.

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-23 Thread Andrew Svetlov


Andrew Svetlov added the comment:

Thanks, Raymond.

I agree that caching of iterators and generators is out of the issue scope.

Also, I agree that a separate async cache decorator should be added. I prefer 
`async_lru_cache` (and maybe `async_cache` for API symmetry). We already have 
`contextmanager` and `asynccontextmanager` in contextlib, along with 
`closing` / `aclosing`, `ExitStack` / `AsyncExitStack`, etc.

`async_lru_cache` should have the same arguments as accepted by `lru_cache` but 
work with async functions.

I think this function should be part of the stdlib because the implementation 
shares the *internal* `_lru_cache_wrapper` that does all the dirty work (and 
has a C accelerator). A third-party library would have to either copy all these 
implementation details or import a private function from the stdlib and keep 
its fingers crossed that the private API stays backward compatible in future 
Python versions.

Similar reasons were applied to contextlib async APIs.

Third parties can provide different features (time-to-live, expiration events, 
etc.) and can be async-framework specific (working with asyncio or trio only); 
I don't care about those extensions here.

My point is: the stdlib has built-in lru cache support, and I love it. Let's 
add exactly what we already have for sync functions, but for async ones.

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-22 Thread Raymond Hettinger


Change by Raymond Hettinger:


--
title: Add a async variant of lru_cache for coroutines. -> Add an async variant 
of lru_cache for coroutines.




[issue46622] Add a async variant of lru_cache for coroutines.

2022-02-22 Thread Raymond Hettinger


Raymond Hettinger added the comment:

If this goes forward, my strong preference is to have a separate async_lru() 
function, just like the referenced external project.

For non-async uses, overloading the current lru_cache makes it confusing to 
reason about. It becomes harder to describe clearly what the caches do or to 
show a pure Python equivalent.  People are already challenged to digest the 
current capabilities and already find it difficult to reason about when 
references are held.  I don't want to add complexity, expand the docs to be 
more voluminous, cover new corner cases, or add examples that don't make sense 
to non-async users (i.e. the vast majority of Python users).  Nor do I want to 
update the recipes for lru_cache variants to all be async-aware.  So, please 
keep this separate (it is okay to share some of the underlying implementation, 
but the APIs should be distinct).

Also, as a matter of fair and equitable policy, I am concerned about taking the 
core of a popular external project and putting it in the standard library.  
That would tend to kill the external project, either stealing all its users or 
leaving it as something that offers only a few incremental features beyond 
those in the standard library.  That is profoundly unfair to the people who 
created, maintained, and promoted the project.

Various SC members periodically voice a desire to move functionality *out* of 
the standard library and into PyPI rather than the reverse.  If a popular 
external package is meeting needs, perhaps it should be left alone.

As noted by the other respondents, caching sync and async iterators/generators 
is venturing out on thin ice.  Even if it could be done reliably (which is 
dubious), it is likely to be bug bait for users.  Remember, we already get 
people who try to cache time(), random(), or other impure functions.  They 
cache methods and cannot understand why a reference to the instance is held.  
Assuredly, coroutines and futures will encounter even more misunderstandings. 

Also, automatic reiterability is a can of worms and would likely require a PEP. 
Every time the subject has come up before, we've decided not to go there.  
Let's not make a tool that makes users' lives worse.  Not everything should be 
cached.  Even if we can, it doesn't mean we should.

--
title: Support  decorating a coroutine with functools.lru_cache -> Add a async 
variant of lru_cache for coroutines.
