I think that in many cases estimating the memory taken may be quite easy (when there are no circular references). Then it's easy to write a utility that analyzes the object using reflection, or to implement an interface that has a size() method. If you know the structure of the bean and it uses well-known types (strings, primitives, maybe dates/timestamps), estimating the size is as easy as writing hashCode :)
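For example, a minimal sketch of such an interface and a bean that estimates its own size from well-known field types (the names and the per-field overhead constants below are assumptions for illustration, not exact JVM figures):

interface SizeEstimable {
    long sizeInBytes();
}

class Customer implements SizeEstimable {
    private String name;
    private int age;
    private java.util.Date created;

    private static final long OBJECT_OVERHEAD = 16; // assumed header + padding
    private static final long REFERENCE_SIZE  = 8;  // assumed reference size

    public long sizeInBytes() {
        long size = OBJECT_OVERHEAD;                 // this object itself
        size += REFERENCE_SIZE
              + (name == null ? 0 : OBJECT_OVERHEAD + name.length() * 2L);
        size += 4;                                   // the int field
        size += REFERENCE_SIZE
              + (created == null ? 0 : OBJECT_OVERHEAD + 8); // Date wraps a long
        return size;
    }
}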
Michal Malecki
----- Original Message -----
Sent: Friday, July 22, 2005 3:15 PM
Subject: Re: Estimating amount of memory need for cache

Thanks for the info Michal,

I suppose you'd then have to double the expected size of each string in your model to make the estimate safer.

Too bad there isn't something along the lines of System.sizeOf(Object obj).
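Lacking that, one rough workaround (just a sketch, and only approximate, since GC timing and allocation noise skew the numbers) is to compare used heap before and after building a batch of the objects and divide:

public class SizeProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        rt.gc();
        long before = rt.totalMemory() - rt.freeMemory();

        Object[] holder = new Object[10000];
        for (int i = 0; i < holder.length; i++) {
            holder[i] = new String(new char[20]);  // stand-in for one cached value
        }

        rt.gc();
        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("approx bytes per object: "
            + (after - before) / holder.length);
    }
}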

Cheers,
Clinton

On 7/22/05, Michal Malecki <[EMAIL PROTECTED]> wrote:
Hello,
I think it's not quite so, e.g. all characters in memory are 2 bytes, while serialized they are in UTF-8.
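A quick illustration of that difference (the in-memory figure counts only the char data, not the String or object overhead):

public class CharSizes {
    public static void main(String[] args) throws Exception {
        // An ASCII string: 2 bytes per char in memory, 1 byte per char in UTF-8.
        String s = "hello world";
        System.out.println("in-memory char data: " + s.length() * 2 + " bytes");
        System.out.println("UTF-8 serialized:    " + s.getBytes("UTF-8").length + " bytes");
    }
}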
here is an article about it:
Regards
Michal Malecki

You could serialize the results to disk.  That would give you a pessimistic estimate of the size they will be in memory.  That is, the cost of serialization will be higher than the amount of RAM used, so you can feel relatively confident that they will be smaller in memory.
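A minimal sketch of that idea, writing the serialized form to an in-memory buffer instead of disk and counting the bytes (this assumes the cached rows are Serializable, and ObjectOutputStream adds some stream overhead of its own):

import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializedSizeEstimator {
    // Number of bytes the object graph occupies once serialized, used here
    // as a pessimistic stand-in for its heap footprint.
    public static long serializedSize(Serializable obj) throws Exception {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(buffer);
        out.writeObject(obj);
        out.close();
        return buffer.size();
    }
}

For a cached query result you would pass in the whole result list and divide by the row count.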

Cheers,
Clinton

On 7/21/05, Nathan Maves <[EMAIL PROTECTED]> wrote:
What is the best and most accurate way to calculate the space needed
to cache a query?

Say I have a query that returns 5 strings of 20 chars each, on average.

The query returns 5000 rows.
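(A rough back-of-the-envelope figure, assuming 2 bytes per char and ignoring per-object overhead: 5000 rows x 5 strings x 20 chars x 2 bytes = 1,000,000 bytes, so roughly 1 MB of character data, with the real footprint somewhat higher once object headers and references are counted.)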

I know this seems high but just humor me and remember I have access
to big servers here at Sun :)

Nathan

