On 7/6/20 11:03 AM, Zac Medico wrote:
> On 7/6/20 10:30 AM, Chun-Yu Shei wrote:
>> I finally got a chance to try Sid's lru_cache suggestion, and the
>> results were really good.  Simply adding it to catpkgsplit, and
>> moving the body of use_reduce into a separate function (one that
>> accepts tuples instead of unhashable lists/sets) and decorating that
>> with lru_cache, yields a similar 40% overall speedup for the upgrade
>> case I tested.  It seems that even a relatively small cache size
>> (1,000 entries) gives quite a speedup, even though in the use_reduce
>> case the cache size eventually reaches almost 20,000 entries if no
>> limit is set.  With these two changes, adding caching to
>> match_from_list didn't seem to make much, if any, difference.
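
For reference, the pattern being described looks roughly like this
minimal sketch (the bodies are toy placeholders and the signatures are
simplified; the real catpkgsplit() and use_reduce() in Portage are more
involved):

    import functools

    @functools.lru_cache(maxsize=1000)
    def catpkgsplit(mypkg):
        # Toy stand-in for the real parser; repeated lookups of the
        # same atom are now served straight from the cache.
        cat, _, pkg_ver = mypkg.partition("/")
        pkg, _, ver = pkg_ver.rpartition("-")
        return cat, pkg, ver

    def use_reduce(depstr, uselist=(), masklist=()):
        # Public wrapper: callers may still pass lists/sets, which are
        # unhashable, so convert them to tuples before calling the
        # cached helper.
        return _use_reduce_cached(depstr, tuple(uselist), tuple(masklist))

    @functools.lru_cache(maxsize=1000)
    def _use_reduce_cached(depstr, uselist, masklist):
        # The original use_reduce() body would move here; every
        # argument is now hashable, so lru_cache can memoize the
        # (potentially expensive) dependency-string parse.  Note that
        # callers must not mutate a cached result.
        return [tok for tok in depstr.split() if tok not in masklist]
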
> 
> That's great!
> 
>> The catch is that lru_cache is only available in Python 3.2 and
>> later, so would it make sense to add a dummy lru_cache implementation
>> for Python < 3.2 that does nothing?  There is also a
>> backports-functools-lru-cache package already available in the
>> Portage tree, but that would add an additional external dependency.
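
A no-op fallback of the kind proposed here could be as small as the
following sketch (it only covers the usual @lru_cache(maxsize=N) call
form, and where it would live in the tree is left open):

    try:
        from functools import lru_cache
    except ImportError:
        # Python < 3.2: provide a decorator with the same calling
        # convention that returns the wrapped function unchanged,
        # i.e. it performs no caching at all.
        def lru_cache(maxsize=128, typed=False):
            def decorator(func):
                return func
            return decorator
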
>>
>> I agree that refactoring could yield an even bigger gain, but
>> hopefully this can be implemented as an interim solution to speed up
>> the common emerge case of resolving upgrades.  I'm happy to submit new
>> patches for this, if someone can suggest how to best handle the Python
>> < 3.2 case. :)
>>
>> Thanks,
>> Chun-Yu
> 
> We can safely drop support for Python < 3.6 at this point.
> Alternatively, we could add a compatibility shim for Python 2.7 that
> does not perform any caching, but I really don't think it's worth the
> trouble to support it any longer.

We've dropped Python 2.7, so now the minimum version is Python 3.6.
-- 
Thanks,
Zac
