There are other tools than jhat for browsing a heap dump; they stream
the heap dump instead of loading it fully into memory as jhat does.
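For example, with a HotSpot JDK you can get a quick picture of the heap without jhat at all. This is a minimal sketch: `<cassandra-pid>` is a placeholder for the Cassandra process id, and Eclipse MAT is one streaming analyzer among several.

```shell
# Print a histogram of live objects, largest first -- often enough
# to spot what is filling the heap without a full dump.
jmap -histo:live <cassandra-pid> | head -n 25

# Take a binary heap dump for offline analysis with a tool such as
# Eclipse MAT, which indexes the .hprof file on disk instead of
# loading it all into memory the way jhat does.
jmap -dump:live,format=b,file=cassandra-heap.hprof <cassandra-pid>
```

Note that both commands pause the target JVM while they run, so expect a latency blip on a live node.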

Kind regards,

Benoit.

2010/4/3 Weijun Li <weiju...@gmail.com>:
> I'm running a test to write 30 million columns (700 bytes each) to Cassandra:
> the process ran smoothly for about 20 million, then the heap usage suddenly
> jumped from 2GB to 3GB, which is the JVM's upper limit. From this point
> Cassandra freezes for a long time (terrible latency, no response to nodetool,
> so I have to stop the import client) before it comes back to normal. It's a
> single-node cluster with a JVM maximum heap size of 3GB. So what could cause
> this spike? What kind of tool can I use to find out which objects are filling
> the additional 1GB of heap? I did a heap dump but couldn't get jhat to work
> to browse the dumped file.
>
> Thanks,
>
> -Weijun
>