Hi guys, I ran some tests a week ago on the Mavibot partition. It appears that we have a serious performance issue when adding entries, as soon as the JVM memory is completely eaten. What happens is that we start to discard some pages, as they are held by WeakReferences, and the GC has to do quite an expensive processing to get back some free memory. This slows down the add operation so much (by at least a factor of 10) that it's not possible to keep going with WeakReferences (or SoftReferences).
At this point, the best solution would be to use a cache to replace this mechanism. There are different things we can cache :
o PageIo : it's useless, as we still have to deserialize them afterward
o Nodes and Leaves : they contain references to other Nodes or Leaves, and for Leaves, references to data. Holding the upper Nodes would save us a lot of time
o Data : the ultimate objects to cache : having the object in cache spares us the time it takes to search for it in a BTree

A few other things to consider :
o we need a versioned cache : all the elements may be versioned
o each BTree may be cached, with a different configuration (some BTrees may be used rarely, others may be heavily used, requiring a more aggressive cache configuration)
o the cache must not use locks, but CAS (Compare-And-Swap) for better performance

At this point, I suggest using EhCache as a first approach, in order to leverage the cache we already use in the server, but we may want to design our own LRU cache (something that should not be too complex to implement).

Thoughts ?

--
Regards,
Cordialement,
Emmanuel Lécharny
www.iktek.com
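P.S. To make the "our own LRU cache" idea concrete, here is a minimal sketch of what a versioned, lock-free cache could look like, built only on CAS-based JDK structures (ConcurrentHashMap, AtomicLong, volatile fields) rather than locks. All names (VersionedLruCache, VKey, etc.) are hypothetical, and the O(n) eviction scan is just for illustration; a real implementation would keep an ordered structure for eviction.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

/** Hypothetical sketch of a versioned LRU cache using CAS-based
 *  structures instead of locks. Not a production implementation. */
class VersionedLruCache<K, V> {
    /** Composite key: element key plus version, since all elements may be versioned. */
    private record VKey<K>(K key, long version) {}

    private static final class Entry<V> {
        final V value;
        volatile long lastUse; // logical clock tick, updated on each hit
        Entry(V value, long tick) { this.value = value; this.lastUse = tick; }
    }

    private final ConcurrentHashMap<VKey<K>, Entry<V>> map = new ConcurrentHashMap<>();
    private final AtomicLong clock = new AtomicLong(); // CAS-incremented logical clock
    private final int maxSize; // per-BTree configurable size

    VersionedLruCache(int maxSize) { this.maxSize = maxSize; }

    /** Returns the cached value for (key, version), or null on a miss. */
    V get(K key, long version) {
        Entry<V> e = map.get(new VKey<>(key, version));
        if (e == null) {
            return null;
        }
        e.lastUse = clock.incrementAndGet(); // mark as recently used
        return e.value;
    }

    /** Caches a value under (key, version), evicting the oldest entry if full. */
    void put(K key, long version, V value) {
        map.put(new VKey<>(key, version), new Entry<>(value, clock.incrementAndGet()));
        if (map.size() > maxSize) {
            evictOldest();
        }
    }

    /** Simple linear scan for the least-recently-used entry (sketch only). */
    private void evictOldest() {
        VKey<K> victim = null;
        long oldest = Long.MAX_VALUE;
        for (Map.Entry<VKey<K>, Entry<V>> me : map.entrySet()) {
            if (me.getValue().lastUse < oldest) {
                oldest = me.getValue().lastUse;
                victim = me.getKey();
            }
        }
        if (victim != null) {
            map.remove(victim);
        }
    }
}
```

Each BTree would get its own instance with its own maxSize, which covers the per-BTree configuration point above; two versions of the same key coexist until one is evicted.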
