Christopher,

thanks for your comprehensive response!
More comments below ...

----- Original Message -----
From: Christopher Schultz <[EMAIL PROTECTED]>
To: Tomcat Users List <[EMAIL PROTECTED]>
Sent: Wednesday, October 29, 2003 3:29 PM
Subject: Re: memory-leaks in servlets, tool for tracing ?


> Grisi,
>
> > our TC-based webapplication performs well but the java-processes
> > concerned are showing increasing memory usage over time. For tracing
> > we already stripped the app down to the very basic to get a clue.
> > Wasn't successful enough.
>
> Have you looked at the memory over a long time, including several large
> GCs? I've had to do this in the past, and it's no fun without a
> profiler.

No, it isn't, indeed. I've been checking Linux's 'top' over a period of
time, and a memory leak is apparent. Unfortunately our app makes use of a
self-developed JNI module (in C), which makes it even more difficult
because the C code also runs in the Java context.

> Unfortunately, profiling tools can get expensive. Anyone know
> of any decent OSS ones out there?

'JProbe' and 'OptimizeIt!' are really no bargains!
>
> When I've done this sans profiler in the past, I did two things:
>
> 1. Turn on verbose GC for the VM (dumps GC stats to stdout)
> 2. Write a program to parse the GC lines and graph them over time
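
A quick sketch of such a parser (assuming JDK 1.4's java.util.regex and
the "[GC 32768K->8192K(65536K), 0.0123 secs]" line format the Sun VM
prints with -verbose:gc; the class name is my own invention):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    /** Reads -verbose:gc output on stdin and prints one line per
        collection: "n beforeK afterK totalK", ready for gnuplot
        or a spreadsheet. */
    public class GCLogParser {
        // Matches "[GC 32768K->8192K(65536K), ..." and "[Full GC ...".
        private static final Pattern GC_LINE =
            Pattern.compile("\\[(?:Full )?GC (\\d+)K->(\\d+)K\\((\\d+)K\\)");

        public static void main(String[] args) throws Exception {
            BufferedReader in =
                new BufferedReader(new InputStreamReader(System.in));
            int n = 0;
            String line;
            while ((line = in.readLine()) != null) {
                Matcher m = GC_LINE.matcher(line);
                if (m.find())
                    System.out.println(n++ + " " + m.group(1) + " "
                        + m.group(2) + " " + m.group(3));
            }
        }
    }

Plotting the second column (memory still in use after each GC) over time
gives curves like the ones below.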
>
> We got a curve that looked like this:
>
> ____________________________  <- max memory
>                      /\     /
> used memory-/\     /  \___/
>      /\     /  \___/
>     /  \___/
> --/                         ^ OutOfMemoryError
>
> This indicated that we were really screwing up somewhere.
>
> Had the curve looked like this, we would be happy:
>
> ____________________________ <- max memory
>       /\       /\       /\
>      /  \     /  \     /  \
>     /    \   /    \   /    \
>    /      \_/      \_/      \
> _/
>
>
> ... or even with high-frequency perturbations in there (usually from
> minor GCs happening periodically).
>
> It often helps to set the initial heap size and the maximum heap size to
> the same value (usually something like 256MB, 512MB, or 1024MB). Just
> remember that the higher it is, the longer you'll have to wait for a
> full GC.
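
For example, with Tomcat one could set (CATALINA_OPTS is picked up by
Tomcat's startup scripts; the sizes here are just placeholders):

    CATALINA_OPTS="-Xms512m -Xmx512m -verbose:gc"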
>
> GCs don't always free everything they can. If they did, they'd take
> forever. It's only when the VM gets near its maximum heap size that the
> GC panics and goes on a collection rampage.
>
> If you ever get an OutOfMemoryError, go and get a thread-dump. On UNIX,
> you can send the VM a QUIT signal using kill or by pressing CTRL-\ if
> the VM is running in a terminal.
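
For example, assuming the Tomcat VM's pid is 12345 (a made-up pid):

    kill -QUIT 12345    # same as kill -3; stacks go to the VM's stdout

With Tomcat's standard scripts, stdout usually ends up in logs/catalina.out.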

Good idea; I also achieved this by shrinking the available memory on the
machine until Java gave up with an out-of-memory error.
The problem is that the dump written contains only the hex addresses of
the modules involved; there is no way to trace those addresses back to the
functions or even the modules involved.
Even more difficult: the root cause given by this dump is not the
function(s)/method(s) causing the memory problems.

>
> You'll get lots of good information including the number of threads and
> where they are. You might find that there are threads there that you
> thought had terminated long ago. Old active threads are always a source
> of tied-up memory.
>
> > Does anybody have experience with a profiling toolkit which she/he
> > can suggest?

There is a tool provided by Sun itself called 'HAT' (Heap Analysis Tool),
http://java.sun.com/people/billf/heap/
but it seems to be applicable only to stand-alone Java programs, not
servlets. With Tomcat I got errors. Maybe it would help in other cases.
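
If memory serves, HAT works on binary hprof heap dumps, which the Sun VM
can produce for a stand-alone program with something like the following
(option spelling from memory, so treat it as a hint only):

    java -Xrunhprof:heap=all,format=b,file=heap.hprof MyApp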

>
> I have some experience with Borland's OptimizeIt, and I've even recently
> installed it on Linux and run Tomcat 4.1 through it. I was able to
> determine something about the VM and Tomcat that might help you. I
> thought I had a memory leak, too.

In our case it was helpful to run java with the -Xincgc option, which
causes the GC to collect smaller amounts of memory, but to do so more
often. This gives a more accurate impression of the current memory status.

What we did is the following (a sketch of a stand-alone driver follows
below):
- separate the JNI module and write a little main() frame around it to
  run it alone;
- compile this against the 'dmalloc' library to identify memory not being
  freed there;
- run the servlet in one scenario (without JNI) and examine the few
  methods being touched;
- rewrite the Java code in question;
- test its memory behaviour;
- iterate with the next scenario, etc.
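
To illustrate the first and last steps, here is the rough shape of such a
stand-alone driver (the native method and library names are made up; note
that Runtime only sees the Java heap, so C-side leaks still have to be
watched via dmalloc or 'top'):

    /** Hypothetical driver exercising a JNI module in isolation. */
    public class JniLeakDriver {
        // made-up native entry point standing in for the real module
        private static native void doWork();

        static {
            System.loadLibrary("ourmodule");   // loads libourmodule.so
        }

        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            for (int i = 0; i < 100000; i++) {
                doWork();
                if (i % 1000 == 0)
                    System.out.println(i + ": used="
                        + (rt.totalMemory() - rt.freeMemory()) / 1024 + "K");
            }
        }
    }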

This kind of work could be more satisfying, but at least it is
straightforward.

> It turns out that when you recompile your servlets and Tomcat does an
> auto-reload (I have it configured that way), a new ClassLoader gets
> installed to hold the classes loaded for the new context. However, the
> classes loaded from the old context stick around because the JVM doesn't
> want to discard java.lang.Class objects in case they're useful in the
> future.
>
> This increased the number of java.lang.Class objects by about 600 every
> time I recompiled. After many compile-deploy-reload cycles, my VM was
> hogging all my memory and lots of CPU. Perhaps this is your problem, too?
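
For anyone hunting this kind of leak: one classic pattern that pins an old
webapp ClassLoader looks like the following (purely illustrative names; a
class living in a *shared* loader keeps a static reference to a webapp
object, so that object's Class, and with it the whole old loader, can
never be collected):

    import java.util.ArrayList;
    import java.util.List;

    /** Illustrative only: imagine this deployed in shared/lib. */
    public class SharedRegistry {
        private static final List LISTENERS = new ArrayList();

        /** Called from webapp code; the stored object keeps its Class
            and ClassLoader reachable across webapp reloads. */
        public static void register(Object listener) {
            LISTENERS.add(listener);
        }
    }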
>
> -chris


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
