Hi Dawn,
From your config, any time you open a new searcher you auto-warm the *most
recent 100 keys* of your cache (while discarding the other entries).
How often do you open a searcher? (soft commit, or hard commit with
openSearcher="true"?)

From your comment "found that the Filter cache can not be released": when
do you expect Solr to release the memory associated with the filter cache?

A Caffeine cache evicts entries that have not been used for
maxIdleTimeSec="600":

"Specifies that each entry should be automatically removed from the cache
once a fixed duration has elapsed after the entry's creation, the most
recent replacement of its value, or its last access. Access time is reset
by all cache read and write operations."
com.github.benmanes.caffeine.cache.Caffeine#expireAfterAccess(java.time.Duration)
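As a quick experiment (just a sketch, values are illustrative and should be
tuned for your workload), you could disable auto-warming entirely so no keys
are carried over into the new searcher's cache, and see whether the retained
memory still grows:

```xml
<!-- filterCache with auto-warming disabled: nothing is copied from the
     old searcher's cache when a new searcher opens, so old entries become
     eligible for GC once the old searcher is closed -->
<filterCache class="solr.CaffeineCache"
             maxRamMB="200"
             maxIdleTimeSec="600"
             cleanupThread="true"
             autowarmCount="0"/>
```

If memory stabilises with autowarmCount="0", the growth you saw was most
likely entries kept alive by warming across frequent searcher reopens rather
than a leak in the cache itself.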

Cheers
--------------------------
Alessandro Benedetti
Apache Lucene/Solr Committer
Director, R&D Software Engineer, Search Consultant

www.sease.io


On Fri, 23 Jul 2021 at 03:18, Dawn <limingni...@live.com> wrote:

> Hi:
> solr 8.7.0
> My online service, memory continues to grow, dump memory found that the
> Filter cache can not be released, has been occupying memory.
>
>         The filter cache class is CaffeineCache I tried to adjust the GC
> policy and Filter cache parameter(maxRamMB maxIdleTimeSec cleanupThread
> autowarmCount), but it doesn't solve the problem.
>
>         Are there any other parameters about the cache that can be
> adjusted?
>
>
>
> <filterCache class="solr.CaffeineCache"
>                  maxRamMB="200"
>                  maxIdleTimeSec="600"
>                  cleanupThread="true"
>                  autowarmCount="100"/>