I think you mean RamUsageEstimator (in Lucene's test-framework)?

It's entirely possible it fails to dig into Maps correctly with newer Java
releases; maybe Dawid or Uwe would know?

Mike McCandless

http://blog.mikemccandless.com


On Tue, Dec 4, 2018 at 12:18 PM Michael Sokolov <msoko...@gmail.com> wrote:

> Hi, I'm using RamUsageCrawler to size some things, and I find it seems
> to underestimate the size of Map (eg HashMap and ConcurrentHashMap).
> This is using a Java 10 runtime, with code compiled to Java 8. I
> looked at the implementation and it seems as if for JRE classes, when
> JRE >= 9, we can no longer use reflection to size them accurately?
> Instead the implementation estimates the map size by treating it as an
> array of keys and values plus some constant header size. But this
> seems to neglect the size of the HashMap$Node (in the case of HashMap
> - I haven't looked at ConcurrentHashMap or TreeMap or anything). In my
> case, I have a great many maps of a relatively small number of shared
> keys and values, so the crawler seems to be wildly under-counting. I'm
> comparing to sizes gleaned from heap dumps, eclipse mat, and OOM
> events.
>
> I wonder if we can improve on the estimates for Maps and other
> Collections?
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
> For additional commands, e-mail: java-user-h...@lucene.apache.org
>
>
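The under-count described above can be sketched in a few lines. This is a hypothetical illustration, not Lucene's actual RamUsageEstimator logic: the constants assume a 64-bit JVM with compressed oops (16-byte object headers, 4-byte references), and the per-entry cost models HashMap$Node's fields (int hash, plus key, value, and next references).

```java
// Hypothetical sketch of why a "keys + values" estimate under-counts a
// HashMap. Assumes a 64-bit JVM with compressed oops: 16-byte object
// headers, 4-byte references. Not Lucene's actual RamUsageEstimator code.
public class MapOverheadSketch {

    static final long OBJECT_HEADER = 16;  // mark word + compressed class pointer
    static final long REF = 4;             // compressed oop
    static final long INT = 4;

    /** Estimate that treats the map as a header plus an array of key/value refs. */
    static long naiveEstimate(int entries) {
        return OBJECT_HEADER + 2L * entries * REF;
    }

    /** Adds the per-entry HashMap$Node: header + int hash + key/value/next refs. */
    static long withNodeOverhead(int entries) {
        long node = OBJECT_HEADER + INT + 3 * REF;  // ~32 bytes per entry
        // The bucket table still holds one reference per slot; for simplicity
        // assume one bucket per entry (real tables are power-of-two sized).
        return OBJECT_HEADER + (long) entries * REF + (long) entries * node;
    }

    public static void main(String[] args) {
        int n = 1_000;
        System.out.println("naive estimate: " + naiveEstimate(n) + " bytes");
        System.out.println("with Node cost: " + withNodeOverhead(n) + " bytes");
    }
}
```

Under these assumptions the per-entry Node roughly quadruples the estimate, which is consistent with the "wildly under-counting" behavior reported when comparing against heap dumps.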