RE: memory leak on context reload or stop/start? [was Re: tracking memory usage over time]

2003-03-14 Thread Shapira, Yoav

Howdy,
Can you elaborate on your findings regarding log4j's memory leak?  A new
thread might be better for this.  Thanks,

Yoav Shapira
Millennium ChemInformatics


-Original Message-
From: Aditya [mailto:[EMAIL PROTECTED]
Sent: Thursday, March 13, 2003 11:58 PM
To: Tomcat Developers List
Subject: memory leak on context reload or stop/start? [was Re: tracking
memory usage over time]

Just to follow up: we have found a few things that were causing this
leak, two that were particular to our setup, but the third seems to be
a Tomcat problem (4.1.20 with Jasper2):

1) log4j was eating up a lot of memory and there was a slow leak. Since
it wasn't strictly required, we've stopped using it and the largest
leak
stopped.

2) we are using jdbcpool from
http://www.bitmechanic.com/projects/jdbcpool/ (it is the only
connection pool we could find that can be instantiated
programmatically from within a context without having to define a pool
in advance via JNDI -- we give each context its own database and
therefore its own pool), which doesn't seem to have a clean way to
stop the pool manager thread when a context is stopped/reloaded. We've
worked around this (see the listener sketch after this list); however,
the memory leak remains and appears to be caused by context
reloads/stops-starts.

3) there seems to be a leak caused by reloading or stopping/starting a
context (we have an automatic httpunit test that builds a jar file
periodically and makes sure it is working in a context). We don't see
the memory leak unless one or more JSPs are compiled before the
context is reloaded or stopped/started.
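
For reference, here is a minimal sketch of the kind of context listener we
use for the workaround in 2). The pool class below is just a stub standing
in for whatever jdbcpool actually exposes -- its real class and method names
may well differ, so treat those names as assumptions:

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

// Registered in the context's web.xml via a <listener> element (Servlet 2.3,
// which Tomcat 4.x supports). PoolHandle is a hypothetical stand-in for the
// real pool object; jdbcpool's actual API may differ.
public class PoolLifecycleListener implements ServletContextListener {

    public static class PoolHandle {
        public void start() { /* create connections, spawn the manager thread */ }
        public void shutDown() { /* stop the manager thread, close connections */ }
    }

    private PoolHandle pool;

    public void contextInitialized(ServletContextEvent sce) {
        // Create the per-context pool programmatically -- no JNDI definition needed.
        pool = new PoolHandle();
        pool.start();
        sce.getServletContext().setAttribute("db.pool", pool);
    }

    public void contextDestroyed(ServletContextEvent sce) {
        // Without this, the pool's background thread keeps the old webapp's
        // classloader reachable across reloads, which shows up as a leak.
        if (pool != null) {
            pool.shutDown();
            pool = null;
        }
    }
}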

Is there some particular section of the code we should be examining to
track this further?

Adi

 On Tue, 25 Feb 2003 22:08:41 -0600, Glenn Nielsen
[EMAIL PROTECTED]
said:
 Aditya wrote: Glenn, several months ago you had posted a URL to a
 document (at kinetic.more.net if I remember correctly) where you
 talked about having to restart your production Tomcat(s) every 4
 weeks or so due to Heap exhaustion. Is that still the case? If so
 what causes the heap exhaustion?


 I think that part of the heap problem for me was the recent bug in
 Jasper which I fixed where a number of resources such as Node trees
 from a JSP page compile were not dereferenced between compiles.
 This was fixed in Jasper before the 4.1.20 release.

 We've looked high and low, with JProbe etc, and we still can't find
 where the leak is. We're having to restart a Tomcat (4.1.20) with
 -Xms and -Xmx both set to 256M every 4 days or so.


 Does the increase in memory usage correlate with an increased number
 of connectors due to a spike in request volume?

 Perhaps you should try increasing the heap size.

 Regards,

 Glenn








Re: memory leak on context reload or stop/start? [was Re: tracking memory usage over time]

2003-03-13 Thread Uddhav Shirname
The release notes for Tomcat 4.1.18 mention a known memory leak with
JSPs. I am not aware of its status with Tomcat 4.1.20. This is what they
say:


JAVAC leaking memory:


The Java compiler leaks memory each time a class is compiled. Web applications
containing hundreds of JSP files may as a result trigger out of memory errors
once a significant number of pages have been accessed. The memory can only be
freed by stopping Tomcat and then restarting it.

The JSP command line compiler (JSPC) can also be used to precompile the
JSPs.
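
For what it's worth, a rough sketch of driving the precompiler from Java
rather than through the jspc.sh/jspc.bat wrappers might look like the
following. The option names (-uriroot, -d) are recalled from memory and may
differ between Jasper releases, so please verify them against your version:

// Hypothetical driver class; org.apache.jasper.JspC is the Jasper precompiler
// entry point, but the option names below should be checked against the
// Jasper version you actually run.
public class PrecompileJsps {
    public static void main(String[] args) throws Exception {
        org.apache.jasper.JspC.main(new String[] {
            "-uriroot", "/path/to/webapp",    // web application root to scan for JSPs
            "-d", "/path/to/generated-src"    // output directory for generated .java files
        });
    }
}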

-- Uddhav

- Original Message -
From: Aditya [EMAIL PROTECTED]
To: Tomcat Developers List [EMAIL PROTECTED]
Sent: Friday, March 14, 2003 10:27 AM
Subject: memory leak on context reload or stop/start? [was Re: tracking
memory usage over time]


 Just to follow up: we have found a few things that were causing this
 leak, two that were particular to our setup, but the third seems to be
 a Tomcat problem (4.1.20 with Jasper2):

 1) log4j was eating up a lot of memory and there was a slow leak. Since
 it wasn't strictly required, we've stopped using it and the largest leak
stopped.

 2) we are using jdbcpool from
 http://www.bitmechanic.com/projects/jdbcpool/ (it is the only
 connection pool we could find that can be instantiated
 programmatically from within a context without having to define a pool
 in advance via JNDI -- we give each context its own database and
 therefore its own pool), which doesn't seem to have a clean way to
 stop the pool manager thread when a context is stopped/reloaded. We've
 worked around this; however, the memory leak remains and appears to be
 caused by context reloads/stops-starts.

 3) there seems to be a leak caused by reloading or stopping/starting a
 context (we have an automatic httpunit test that builds a jar file
 periodically and makes sure it is working in a context). We don't see
 the memory leak unless one or more JSPs are compiled before the
 context is reloaded or stopped/started.

 Is there some particular section of the code we should be examining to
 track this further?

 Adi

  On Tue, 25 Feb 2003 22:08:41 -0600, Glenn Nielsen [EMAIL PROTECTED]
said:
  Aditya wrote: Glenn, several months ago you had posted a URL to a
  document (at kinetic.more.net if I remember correctly) where you
  talked about having to restart your production Tomcat(s) every 4
  weeks or so due to Heap exhaustion. Is that still the case? If so
  what causes the heap exhaustion?
 

  I think that part of the heap problem for me was the recent bug in
  Jasper which I fixed where a number of resources such as Node trees
  from a JSP page compile were not dereferenced between compiles.
  This was fixed in Jasper before the 4.1.20 release.

  We've looked high and low, with JProbe etc, and we still can't find
  where the leak is. We're having to restart a Tomcat (4.1.20) with
  -Xms and -Xmx both set to 256M every 4 days or so.
 

  Does the increase in memory usage correlate with an increased number
  of connectors due to a spike in request volume?

  Perhaps you should try increasing the heap size.

  Regards,

  Glenn




RE: tracking memory usage over time

2003-02-26 Thread Sachin Chowdhary
You can also use JMeter for this purpose. I have not used it myself, but I
have heard that it helps in detecting memory leaks in Tomcat. JMeter is also
a Jakarta project, so you could give it a try; it may help you.
Sachin

-Original Message-
From: Aditya [mailto:[EMAIL PROTECTED]
Sent: Wednesday, February 26, 2003 6:55 AM
To: Tomcat Developers List
Subject: Re: tracking memory usage over time


Glenn,

several months ago you had posted a URL to a document (at
kinetic.more.net if I remember correctly) where you talked about
having to restart your production Tomcat(s) every 4 weeks or so due to
Heap exhaustion. Is that still the case? If so what causes the heap
exhaustion?

We've looked high and low, with JProbe etc, and we still can't find
where the leak is. We're having to restart a Tomcat (4.1.20) with
-Xms and -Xmx both set to 256M every 4 days or so.

Thanks,
Adi

 On Fri, 14 Feb 2003 06:45:26 -0600, Glenn Nielsen [EMAIL PROTECTED]
said:
 An easier way to measure memory usage in production is to start the
 JVM which runs Tomcat with the arg -verbose:gc, this will print
 information to stdout about each garbage collection and the memory
 used.

 I doubt if the memory leak is in Tomcat itself.  The best way to
 find the memory leak in your application is to setup a test server
 and use OptimizeIt or JProbe to profile Tomcat and your web
 application.  You can use something like JMeter to simulate load.

 Regards,

 Glenn


 Aditya wrote: I have the following JSP that I hit every 5 minutes
 and stuff the returned values into a RRD (www.rrdtool.org) to
 measure the memory (heap I presume) consumption of Tomcat over
 time. Is there a better way, short of using JMX in the newer
 Tomcat builds, of doing this?  <%@ page language="java" %> <%@ page
 session="false" %> <% long free =
 java.lang.Runtime.getRuntime().freeMemory(); long total =
 java.lang.Runtime.getRuntime().totalMemory(); out.print(free + "|" +
 total + "|"); %> I can see a clear leak (about 20 contexts with a
 dozen or so hit constantly and recompiling JSPs very often) which
 necessitates (-Xmx and -Xms set to 256 MB) a restart of Tomcat every
 4 days or so (with 4.1.14). I just upgraded to 4.1.20 thinking that
 the constant compiling was the source of the leak and that doesn't
 seem to have made a difference. Running 4.1.14 under jprobe doesn't
 evidence any leaks in our JSPs/filters.  Hints on how to trace this
 leak down would be most welcome.  Thanks, Adi



Re: tracking memory usage over time

2003-02-25 Thread Aditya
Glenn,

several months ago you had posted a URL to a document (at
kinetic.more.net if I remember correctly) where you talked about
having to restart your production Tomcat(s) every 4 weeks or so due to
Heap exhaustion. Is that still the case? If so what causes the heap
exhaustion?

We've looked high and low, with JProbe etc, and we still can't find
where the leak is. We're having to restart a Tomcat (4.1.20) with
-Xms and -Xmx both set to 256M every 4 days or so.

Thanks,
Adi

 On Fri, 14 Feb 2003 06:45:26 -0600, Glenn Nielsen [EMAIL PROTECTED] said:
 An easier way to measure memory usage in production is to start the
 JVM which runs Tomcat with the arg -verbose:gc, this will print
 information to stdout about each garbage collection and the memory
 used.

 I doubt if the memory leak is in Tomcat itself.  The best way to
 find the memory leak in your application is to setup a test server
 and use OptimizeIt or JProbe to profile Tomcat and your web
 application.  You can use something like JMeter to simulate load.

 Regards,

 Glenn


 Aditya wrote: I have the following JSP that I hit every 5 minutes
 and stuff the returned values into a RRD (www.rrdtool.org) to
 measure the memory (heap I presume) consumption of Tomcat over
 time. Is there a better way, short of using JMX in the newer
 Tomcat builds, of doing this?  <%@ page language="java" %> <%@ page
 session="false" %> <% long free =
 java.lang.Runtime.getRuntime().freeMemory(); long total =
 java.lang.Runtime.getRuntime().totalMemory(); out.print(free + "|" +
 total + "|"); %> I can see a clear leak (about 20 contexts with a
 dozen or so hit constantly and recompiling JSPs very often) which
 necessitates (-Xmx and -Xms set to 256 MB) a restart of Tomcat every
 4 days or so (with 4.1.14). I just upgraded to 4.1.20 thinking that
 the constant compiling was the source of the leak and that doesn't
 seem to have made a difference. Running 4.1.14 under jprobe doesn't
 evidence any leaks in our JSPs/filters.  Hints on how to trace this
 leak down would be most welcome.  Thanks, Adi



Re: tracking memory usage over time

2003-02-25 Thread Glenn Nielsen
Aditya wrote:
Glenn,

several months ago you had posted a URL to a document (at
kinetic.more.net if I remember correctly) where you talked about
having to restart your production Tomcat(s) every 4 weeks or so due to
Heap exhaustion. Is that still the case? If so what causes the heap
exhaustion?
I think that part of the heap problem for me was the recent bug in Jasper
which I fixed where a number of resources such as Node trees from a JSP
page compile were not dereferenced between compiles.  This was fixed in
Jasper before the 4.1.20 release.
We've looked high and low, with JProbe etc, and we still can't find
where the leak is. We're having to restart a Tomcat (4.1.20) with
-Xms and -Xmx both set to 256M every 4 days or so.
Does the increase in memory usage correlate with an increased number of
connectors due to a spike in request volume?
Perhaps you should try increasing the heap size.

Regards,

Glenn



RE: tracking memory usage over time

2003-02-14 Thread Sachin Chowdhary
Hi Aditya,
There is a Borland product, Optimizeit Suite 5, which makes performance
management for Java easier. It can dramatically improve application performance
and supports the development of fast, scalable, and reliable applications.
Optimizeit Suite also offers the Automatic Memory Leak Detector (AMLD), which
lets teams that are brand new to Java track down memory leaks quickly, and it
integrates with popular application servers.

You can also find configuration advice at the link below, which is specifically
aimed at maintaining a site on Tomcat and also deals with the memory leak
problem:

http://www.cs.bris.ac.uk/Teaching/Resources/General/web/tomcat.html

I think it can help you.
Best of luck,



-Original Message-
From: news [mailto:[EMAIL PROTECTED]]On Behalf Of Aditya
Sent: Friday, February 14, 2003 10:38 AM
To: [EMAIL PROTECTED]
Subject: tracking memory usage over time


I have the following JSP that I hit every 5 minutes and stuff the
returned values into a RRD (www.rrdtool.org) to measure the memory
(heap I presume) consumption of Tomcat over time. Is there a better
way, short of using JMX in the newer Tomcat builds, of doing this?

<%@ page language="java" %>
<%@ page session="false" %>
<%
long free = java.lang.Runtime.getRuntime().freeMemory();
long total = java.lang.Runtime.getRuntime().totalMemory();
out.print(free + "|" + total + "|");
%>

I can see a clear leak (about 20 contexts with a dozen or so hit
constantly and recompiling JSPs very often) which necessitates (-Xmx
and -Xms set to 256 MB) a restart of Tomcat every 4 days or so (with
4.1.14). I just upgraded to 4.1.20 thinking that the constant
compiling was the source of the leak and that doesn't seem to have
made a difference. Running 4.1.14 under jprobe doesn't evidence any
leaks in our JSPs/filters.

Hints on how to trace this leak down would be most welcome.

Thanks,
Adi






Re: tracking memory usage over time

2003-02-14 Thread Glenn Nielsen
An easier way to measure memory usage in production is to start the JVM which
runs Tomcat with the arg -verbose:gc; this will print information
to stdout about each garbage collection and the memory used.

I doubt if the memory leak is in Tomcat itself.  The best way to find
the memory leak in your application is to setup a test server and use
OptimizeIt or JProbe to profile Tomcat and your web application.  You
can use something like JMeter to simulate load.

Regards,

Glenn


Aditya wrote:

I have the following JSP that I hit every 5 minutes and stuff the
returned values into a RRD (www.rrdtool.org) to measure the memory
(heap I presume) consumption of Tomcat over time. Is there a better
way, short of using JMX in the newer Tomcat builds, of doing this?

<%@ page language="java" %>
<%@ page session="false" %>
<%
long free = java.lang.Runtime.getRuntime().freeMemory();
long total = java.lang.Runtime.getRuntime().totalMemory();
out.print(free + "|" + total + "|");
%>

I can see a clear leak (about 20 contexts with a dozen or so hit
constantly and recompiling JSPs very often) which necessitates (-Xmx
and -Xms set to 256 MB) a restart of Tomcat every 4 days or so (with
4.1.14). I just upgraded to 4.1.20 thinking that the constant
compiling was the source of the leak and that doesn't seem to have
made a difference. Running 4.1.14 under jprobe doesn't evidence any
leaks in our JSPs/filters.

Hints on how to trace this leak down would be most welcome.

Thanks, 
Adi






Re: tracking memory usage over time

2003-02-14 Thread Peter Lin

Is there any particular reason the pages are
recompiled frequently?  If you're using Tomcat 4.1.12
or newer, it should use Ant to compile the pages,
which should get around the memory leak caused
by page compilation.

peter


--- Aditya [EMAIL PROTECTED] wrote:
 I have the following JSP that I hit every 5 minutes
 and stuff the
 returned values into a RRD (www.rrdtool.org) to
 measure the memory
 (heap I presume) consumption of Tomcat over time. Is
 there a better
 way, short of using JMX in the newer Tomcat builds,
 of doing this?
 
 <%@ page language="java" %>
 <%@ page session="false" %>
 <%
 long free = java.lang.Runtime.getRuntime().freeMemory();
 long total = java.lang.Runtime.getRuntime().totalMemory();
 out.print(free + "|" + total + "|");
 %>
 
 I can see a clear leak (about 20 contexts with a
 dozen or so hit
 constantly and recompiling JSPs very often) which
 necessitates (-Xmx
 and -Xms set to 256 MB) a restart of Tomcat every 4
 days or so (with
 4.1.14). I just upgraded to 4.1.20 thinking that the
 constant
 compiling was the source of the leak and that
 doesn't seem to have
 made a difference. Running 4.1.14 under jprobe
 doesn't evidence any
 leaks in our JSPs/filters.
 
 Hints on how to trace this leak down would be most
 welcome.
 
 Thanks, 
 Adi
 
 

 






Re: tracking memory usage over time

2003-02-14 Thread Aditya
 On Fri, 14 Feb 2003 06:45:26 -0600, Glenn Nielsen [EMAIL PROTECTED] said:
 An easier way to measure memory usage in production is to start the
 JVM which runs Tomcat with the arg -verbose:gc, this will print
 information to stdout about each garbage collection and the memory
 used.

thank you, we'll try that.

 I doubt if the memory leak is in Tomcat itself.  The best way to
 find the memory leak in your application is to setup a test server
 and use OptimizeIt or JProbe to profile Tomcat and your web
 application.  You can use something like JMeter to simulate load.

we did run 4.1.14 under JProbe and didn't find any obvious leaks in
our application/classes -- it was clearly in the Tomcat
compilation. We haven't run 4.1.20 under JProbe yet.

Thanks,
Adi

 seem to have made a difference. Running 4.1.14 under jprobe doesn't
 evidence any leaks in our JSPs/filters.  Hints on how to trace this
 leak down would be most welcome.  Thanks, Adi





tracking memory usage over time

2003-02-13 Thread Aditya
I have the following JSP that I hit every 5 minutes and stuff the
returned values into a RRD (www.rrdtool.org) to measure the memory
(heap I presume) consumption of Tomcat over time. Is there a better
way, short of using JMX in the newer Tomcat builds, of doing this?

<%@ page language="java" %>
<%@ page session="false" %>
<%
long free = java.lang.Runtime.getRuntime().freeMemory();
long total = java.lang.Runtime.getRuntime().totalMemory();
out.print(free + "|" + total + "|");
%>
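
If hitting a JSP from outside turns out to be too clumsy, one alternative I
can imagine (a sketch only, not something we actually run -- the file name
and interval below are arbitrary) is to sample Runtime from inside the JVM on
a timer and append the same "free|total|" records to a file for rrdtool to
pick up:

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Timer;
import java.util.TimerTask;

// Minimal in-JVM sampler: every interval, append "free|total|" to a log file
// that an external script can feed into rrdtool. It could be started from a
// ServletContextListener so it lives and dies with the web application.
public class MemorySampler {

    public static Timer start(final String logFile, long intervalMillis) {
        Timer timer = new Timer(true); // daemon thread, won't block JVM shutdown
        timer.schedule(new TimerTask() {
            public void run() {
                long free = Runtime.getRuntime().freeMemory();
                long total = Runtime.getRuntime().totalMemory();
                try {
                    PrintWriter out = new PrintWriter(new FileWriter(logFile, true));
                    out.println(free + "|" + total + "|");
                    out.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }, 0, intervalMillis);
        return timer;
    }

    public static void main(String[] args) {
        // Sample every 5 minutes into an arbitrary file name.
        start("memory.log", 5 * 60 * 1000L);
        try { Thread.sleep(60 * 60 * 1000L); } catch (InterruptedException e) { }
    }
}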

I can see a clear leak (about 20 contexts with a dozen or so hit
constantly and recompiling JSPs very often) which necessitates (-Xmx
and -Xms set to 256 MB) a restart of Tomcat every 4 days or so (with
4.1.14). I just upgraded to 4.1.20 thinking that the constant
compiling was the source of the leak and that doesn't seem to have
made a difference. Running 4.1.14 under jprobe doesn't evidence any
leaks in our JSPs/filters.

Hints on how to trace this leak down would be most welcome.

Thanks, 
Adi

