Have a look at either Eclipse Memory Analyzer (they have a standalone
version of the memory analyzer) or YourKit Java Profiler (commercial,
but with an evaluation license). I have successfully loaded and browsed
heap dumps bigger than the available memory on the system.
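
For example, with the standalone Memory Analyzer you can pre-parse the
dump headlessly and then browse the generated indices, which needs far
less memory than the dump itself. Something like the following (untested
here; the dump file name and the PID are placeholders, adjust them to
your setup):

    # take a dump from the running Cassandra JVM, if you don't have one yet
    jmap -dump:format=b,file=cassandra-heap.hprof <cassandra-pid>

    # build MAT's index files on disk; the UI then opens them quickly
    ./ParseHeapDump.sh cassandra-heap.hprof

If parsing itself runs out of memory, raise the -Xmx value in
MemoryAnalyzer.ini. And if you still want to try jhat, you can give its
JVM more heap, e.g.: jhat -J-Xmx4g cassandra-heap.hprof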

Regards,

Benoit

2010/4/3 Weijun Li <weiju...@gmail.com>:
> Thank you Benoit. I did a search but couldn't find any of the tools you
> mentioned. Both jhat and NetBeans load the entire dump file into memory. Do
> you know the name of a tool that requires less memory to view the dump file?
> Thanks,
> -Weijun
>
> On Sat, Apr 3, 2010 at 12:55 AM, Benoit Perroud <ben...@noisette.ch> wrote:
>>
>> There are other tools than jhat for browsing a heap dump, which stream
>> the heap dump instead of loading it fully into memory like jhat does.
>>
>> Kind regards,
>>
>> Benoit.
>>
>> 2010/4/3 Weijun Li <weiju...@gmail.com>:
>> > I'm running a test that writes 30 million columns (700 bytes each) to
>> > Cassandra. The process ran smoothly for about 20 million columns, then
>> > heap usage suddenly jumped from 2GB to 3GB, which is the JVM's upper
>> > limit. From that point on, Cassandra froze for a long time (terrible
>> > latency, no response from nodetool, so I had to stop the import client)
>> > before it came back to normal. It's a single-node cluster with a JVM
>> > maximum heap size of 3GB. So what could cause this spike? What kind of
>> > tool can I use to find out which objects are filling the additional 1GB
>> > of heap? I did a heap dump but couldn't get jhat to work to browse the
>> > dumped file.
>> >
>> > Thanks,
>> >
>> > -Weijun
>> >
>
>
