[ https://issues.apache.org/jira/browse/HBASE-1590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12725469#action_12725469 ]

Jonathan Gray commented on HBASE-1590:
--------------------------------------

Let's keep it open; I'd like to get Erik's input tomorrow.

We need to address the two issues above, and this is as fine a place as any.

> Extend TestHeapSize and ClassSize to do "deep" sizing of Objects
> ----------------------------------------------------------------
>
>                 Key: HBASE-1590
>                 URL: https://issues.apache.org/jira/browse/HBASE-1590
>             Project: Hadoop HBase
>          Issue Type: Improvement
>    Affects Versions: 0.20.0
>            Reporter: Jonathan Gray
>             Fix For: 0.20.0
>
>
> As discussed in HBASE-1554 there is a bit of a disconnect between how 
> ClassSize calculates the heap size and how we need to calculate heap size in 
> our implementations.
> For example, the LRU block cache can be sized via ClassSize, but it is a 
> shallow sizing.  There is a backing ConcurrentHashMap that is the largest 
> memory consumer.  However, ClassSize only counts that as a single reference.  
> But in our heapSize() reporting, we want to include *everything* within that 
> Object.
> This issue is to resolve that dissonance.  We may need to create an 
> additional ClassSize.estimateDeep(), we may need to rethink our HeapSize 
> interface, or maybe just leave it as is.  The two primary goals of all this 
> testing are to 1) ensure that if something is changed and the sizing is not 
> updated, our tests fail, and 2) ensure our sizing is as accurate as possible.
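
The shallow-vs-deep gap described above can be sketched as follows. This is a minimal illustration, not HBase's actual ClassSize API: the byte constants, the `Cache` class, and its accounting are all assumed values chosen to make the arithmetic concrete.

```java
import java.util.HashMap;
import java.util.Map;

public class DeepSizeSketch {
    // Simplified 64-bit layout assumptions (illustrative only, not real JVM sizes)
    static final long OBJECT_HEADER = 16;
    static final long REFERENCE = 8;
    static final long MAP_OVERHEAD = 64;  // assumed fixed cost of an empty map
    static final long MAP_ENTRY = 48;     // assumed per-entry cost

    interface HeapSize {
        long heapSize();
    }

    static class Cache implements HeapSize {
        final Map<String, byte[]> backing = new HashMap<>();

        // Shallow sizing: header plus one reference -- the backing map
        // counts for only 8 bytes, however large it grows.
        long shallowSize() {
            return OBJECT_HEADER + REFERENCE;
        }

        // Deep sizing: also charge the map itself and each of its entries,
        // as a hypothetical ClassSize.estimateDeep() would.
        @Override
        public long heapSize() {
            return shallowSize() + MAP_OVERHEAD + (long) backing.size() * MAP_ENTRY;
        }
    }

    public static void main(String[] args) {
        Cache cache = new Cache();
        cache.backing.put("block-1", new byte[0]);
        cache.backing.put("block-2", new byte[0]);
        System.out.println(cache.shallowSize()); // 24
        System.out.println(cache.heapSize());    // 24 + 64 + 2*48 = 184
    }
}
```

The point of the sketch is that the two numbers diverge as soon as the map holds anything, which is exactly why a test pinned to the shallow value would not fail when the backing structure changes.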

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
