Matthew Toseland wrote:

> [B [I [C [[B almost certainly mean byte[], int[], char[], byte[][].

Good to know.
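
For reference, the mapping can be checked with a throwaway snippet (nothing
from the freenet code, just a sanity check):

    public class ArrayNames {
        public static void main(String[] args) {
            // The JVM's internal descriptors for array classes, as they
            // appear in hprof/jhat histograms.
            System.out.println(byte[].class.getName());   // prints [B
            System.out.println(int[].class.getName());    // prints [I
            System.out.println(char[].class.getName());   // prints [C
            System.out.println(byte[][].class.getName()); // prints [[B
        }
    }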

> I didn't get jmap to work (maybe because I was using java 5), but I did
> do some invasive profiling (with stack traces), and used that to
> identify and eliminate some high-churn objects; if the current problem is
> that too much garbage collection is occurring (causing 100% cpu usage),
> this is most likely caused by too many objects being allocated per second.

FWIW, I have the OoM problem but not the high-CPU problem (see my graphs,
pending moderation).

Though, as the OoM point gets closer, the JVM will attempt a full GC each
time it would otherwise run out of memory, so I'd say that in the final
moments of a node the CPU churn due to GC will be very high (I'll try to
capture those moments with jconsole, a very nice tool).
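
In case it helps anyone watch the same thing: on top of the -Xloggc option
quoted below, something along these lines should show the GC thrashing live
(the wrapper.java.additional indices are placeholders for whatever slots are
free in your wrapper.conf, and <pid> is the node's java process):

    # extra GC detail via the wrapper config (standard HotSpot flags)
    wrapper.java.additional.5=-verbose:gc
    wrapper.java.additional.6=-XX:+PrintGCDetails
    wrapper.java.additional.7=-XX:+PrintGCTimeStamps

    # or live from a shell, one sample per second (jstat ships with the JDK)
    jstat -gcutil <pid> 1000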

> Anyway my original trace is here:
> http://amphibian.dyndns.org/java.hprof.multi-day.no-logging.1011.txt
> 
> This is produced by options:
> wrapper.java.additional.3=-Xloggc:freenet.loggc
> wrapper.java.additional.4=-Xrunhprof:heap=all,format=a,depth=12,lineno=y,doe=y,gc_okay=y
> 
> (Set logging to NORMAL if you haven't already; heavy logging with
> profiling makes the node break)
> 
> On Sun, Feb 04, 2007 at 06:29:04PM +0100, Jano wrote:
>> Jano wrote:
>> 
>> > Matthew Toseland wrote:
>> 
>> >> How to identify such misuse?
>> > 
>> > Memory profiling, I'd say. Though I have never done it with java.
>> 
>> I've started toying with jmap/jhat after upgrading to java6. This node is
>> a linux one. Here's a heap dump at node start:
>> 
>> Top 12 Class histogram:
>> 
>> Class                                   Instance Count  Total Size
>> class [B                                        379353  18473891
>> class [I                                        315048  12805068
>> class [C                                        56549   11786932
>> class [[B                                       6137    3190228
>> class [Lcom.sleepycat.je.tree.Node;             6135    3190200
>> class [Ljava.util.HashMap$Entry;                33158   2950160
>> class [J                                        2037    2091504
>> class freenet.client.async.SingleBlockInserter  30965   1795970
>> class [[I                                       20812   1415092
>> class java.util.HashMap$Entry                   66778   1068448
>> class java.util.HashMap                         33144   1060608
>> class java.lang.String                          56831   909296
>> 
>> Top 10 class instance counts (excluding platform):
>> 
>> 31204 instances of class freenet.client.FailureCodeTracker
>> 30965 instances of class freenet.client.async.SingleBlockInserter
>> 20604 instances of class freenet.support.io.ReadOnlyFileSliceBucket
>> 16213 instances of class com.sleepycat.je.tree.LN
>> 10413 instances of class freenet.support.io.FileBucket
>> 10403 instances of class freenet.crypt.ciphers.Rijndael
>> 10367 instances of class freenet.support.io.PaddedEphemerallyEncryptedBucket
>> 10355 instances of class freenet.support.io.DelayedFreeBucket
>> 9947 instances of class freenet.keys.FreenetURI
>> 6157 instances of class com.sleepycat.je.latch.Java5SharedLatchImpl
>> 
>> Total instances pending finalization: 0
>> 
>> The strange short names are, it seems, "platform classes", though I'm not
>> sure what that means exactly (apparently java.* and javax.* classes, but
>> those strange names disappear as well when platform classes are excluded).
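
For anyone wanting to reproduce these listings on a Java 6 node, roughly the
following should do it (<pid> and the dump filename are placeholders; the
histogram and instance-count pages are then browsable at
http://localhost:7000/ once jhat has loaded the dump):

    # quick class histogram straight from the running JVM
    jmap -histo <pid>

    # full binary dump for jhat to analyse
    jmap -dump:format=b,file=freenet-heap.bin <pid>
    jhat -J-Xmx512m freenet-heap.bin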
>> 
>> And here's the heap dump at the moment of node death (node configured
>> with 128m max mem):
>> 
>> Class                                   Instance Count  Total Size
>> class [B                                        771728  34242096
>> class [C                                        113887  19922068
>> class [I                                        439531  18758092
>> class [[B                                       9527    4953028
>> class [Lcom.sleepycat.je.tree.Node;             9525    4953000
>> class [Ljava.util.HashMap$Entry;                50089   4642912
>> class [J                                        3030    3111184
>> class com.sleepycat.je.tree.LN                  173461  2254993
>> class [[I                                       28586   1943756
>> class java.lang.String                          115987  1855792
>> class freenet.client.async.SingleBlockInserter  31080   1802640
>> class java.util.HashMap$Entry                   108956  1743296
>> 
>> 173461 instances of class com.sleepycat.je.tree.LN
>> 31413 instances of class freenet.client.FailureCodeTracker
>> 31080 instances of class freenet.client.async.SingleBlockInserter
>> 20613 instances of class freenet.support.io.ReadOnlyFileSliceBucket
>> 17564 instances of class freenet.keys.FreenetURI
>> 14291 instances of class freenet.crypt.ciphers.Rijndael
>> 14246 instances of class freenet.support.io.PaddedEphemerallyEncryptedBucket
>> 11793 instances of class freenet.support.io.FileBucket
>> 10358 instances of class freenet.support.io.DelayedFreeBucket
>> 10065 instances of class freenet.support.LRUQueue$QItem
>> 
>> Total instances pending finalization: 0
>> 
>> From these numbers it seems that sleepycat is leaking memory notably (the
>> LN objects). Someone familiar with BDB could perhaps tell from this
>> whether it is a bug in BDB itself or mismanagement in freenet (unclosed
>> whatevers?).
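
If it really is the LN objects, jhat's OQL page (/oql/ on the same
http://localhost:7000/ interface) might help find what is holding on to
them; something like the query below, though I'm writing the syntax from
memory, so it may need tweaking:

    select referrers(n) from com.sleepycat.je.tree.LN n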
>> 
>> I'm now running with 256m to see if the difference becomes even more
>> apparent. Later I'll send some graphs obtained with jconsole that pretty
>> much confirm there is a leak going on (i.e. if we find and fix the leak,
>> freenet should run comfortably in 128m or less).
>> 


