Re: Tomcat 5.5.23 keeps starting threads until OS runs out of memory

2009-07-14 Thread Rahman Akhlaqur

I have some more info about the issue that I think is related. I found that our 
Tomcat executable is also establishing a lot of TCP connections. Doing a simple 
netstat just after starting Tomcat and comparing it to when Tomcat's thread 
count reaches 2000, I see a lot more lines like

  TCP    127.0.0.1:4912 127.0.0.1:2170 ESTABLISHED 4064
  [tomcat5.exe]
  TCP    127.0.0.1:4912 127.0.0.1:4913 ESTABLISHED 4064
  [tomcat5.exe]
  TCP    127.0.0.1:4913 127.0.0.1:4912 ESTABLISHED 4064
  [tomcat5.exe]
  TCP    127.0.0.1:4914 127.0.0.1:4917 ESTABLISHED 4064
  [tomcat5.exe]

Is this some sort of Tomcat ping? 

Our Tomcat connectors are set up to limit the maximum number of HTTP threads, as 
below:

    <Connector port="8080" maxHttpHeaderSize="8192"
               maxThreads="250" minSpareThreads="25" maxSpareThreads="75"
               enableLookups="false" redirectPort="8443" acceptCount="100"
               connectionTimeout="2" disableUploadTimeout="true" />

    <Connector port="8443" maxHttpHeaderSize="8192"
               maxThreads="100" minSpareThreads="25" maxSpareThreads="50"
               enableLookups="false" disableUploadTimeout="true"
               connectionTimeout="2" acceptCount="100" scheme="https"
               secure="false" proxyPort="443" />


The only timeouts configured are for the HTTP requests; are there any other 
timeouts I can configure that might stop all those selector threads from 
persisting?


----- Original Message -----
From: Caldarale, Charles R chuck.caldar...@unisys.com
To: Tomcat Users List users@tomcat.apache.org
Sent: Tuesday, 14 July, 2009 2:59:36
Subject: RE: Tomcat 5.5.23 keeps starting threads until OS runs out of memory

 From: Christopher Schultz [mailto:ch...@christopherschultz.net]
 Subject: Re: Tomcat 5.5.23 keeps starting threads until OS runs out of
 memory
 
 I'm surprised you're not hitting a thread maximum in the OS
 and halting the JVM.

I'm not aware of any hard limit in Windows; regardless, hitting such a limit 
just returns an error status on the system call, not a process abort.

 Showing more of the stack trace will certainly help reveal the problem.

Not likely; that will only show the stack of the started thread, not where it 
was started from.  A heap profiler should show who created the Thread objects, 
if the profiler captures enough of the call stack at object creation time.

One would think a grep of the webapp source for calls to start() would be a 
rather quick first cut if a profiler can't be used.

If the source isn't available, then one possible way to trap the origin of the 
Thread.start() call is to enable a security manager and only allow start() 
calls from Tomcat, not webapp, code.  This would probably require several 
iterations and would be somewhat tedious.

- Chuck




RE: Tomcat 5.5.23 keeps starting threads until OS runs out of memory

2009-07-14 Thread Caldarale, Charles R
 From: Rahman Akhlaqur [mailto:aki...@yahoo.co.uk]
 Subject: Re: Tomcat 5.5.23 keeps starting threads until OS runs out of
 memory
 
 I have some more info about the issue that I think is related. I found
 our Tomcat executable is also establishing a lot of TCP connections.

Tomcat isn't, your webapp is.  Don't blame Tomcat for your misbehaving code.

 Is this some sort of tomcat ping?

No, it's your webapp opening up some sort of connection - likely RMI, judging 
from your stack trace.

 Our tomcat connectors are set up to limit the max http threads as below

That's irrelevant, since it's your webapp starting the extra threads.

 The only timeouts are for the http requests, are there any other
 timeouts I can configure that could potentially stop all those selector
 threads from persisting?

Whatever you can configure in your webapp - this is a problem with your code, 
not with Tomcat.

 - Chuck





Re: Tomcat 5.5.23 keeps starting threads until OS runs out of memory

2009-07-14 Thread Rahman Akhlaqur

I found the cause of the issue. A call was being made to a JSP that opened a 
connection to another application sitting on the same server (via orbd.exe). 
This JSP is called approximately 10 times a second (it is a monitoring JSP); 
the majority of requests are served, but occasionally a request is not served 
and a connection is left dangling. So I guess it was not Tomcat's fault in the 
end! 

I found some error messages in the catalina log alluding to a connection 
problem related to a JSP that is not actually part of the webapp, which is why 
I didn't think of it until I saw the stack trace.

To prove it was this JSP, I took out the code that creates the connection and 
found that the Windows thread count just stayed flat. Gonna have to have a word 
with the TA about this...
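
The shape of the fix will probably be something like the sketch below: keep one 
shared ORB for the whole webapp instead of letting the monitor JSP bring up its 
own ORB (and its selector thread) on every hit, and shut it down when the webapp 
stops. This is only a sketch under that assumption; the class name, host, and 
port are placeholders, not our real values.

    // Rough sketch only, assuming the JSP was creating its own ORB/connection
    // per hit; class name, host, and port are placeholders.
    import java.util.Properties;
    import org.omg.CORBA.ORB;

    public final class MonitorOrbHolder {

        private static ORB sharedOrb;

        private MonitorOrbHolder() { }

        // Lazily create a single ORB for the whole webapp; each ORB owns its
        // own selector thread, so creating one per request leaks a thread.
        public static synchronized ORB orb() {
            if (sharedOrb == null) {
                Properties props = new Properties();
                props.setProperty("org.omg.CORBA.ORBInitialHost", "localhost"); // placeholder orbd host
                props.setProperty("org.omg.CORBA.ORBInitialPort", "1050");      // placeholder orbd port
                sharedOrb = ORB.init(new String[0], props);
            }
            return sharedOrb;
        }

        // Call from a ServletContextListener.contextDestroyed() so the ORB
        // and its selector thread go away when the webapp is stopped.
        public static synchronized void shutdown() {
            if (sharedOrb != null) {
                sharedOrb.shutdown(true);   // wait for in-flight requests
                sharedOrb.destroy();
                sharedOrb = null;
            }
        }
    }

The monitor JSP would then call MonitorOrbHolder.orb() instead of creating a new 
ORB/connection on every request.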


----- Original Message -----
From: Caldarale, Charles R chuck.caldar...@unisys.com
To: Tomcat Users List users@tomcat.apache.org
Sent: Tuesday, 14 July, 2009 13:00:53
Subject: RE: Tomcat 5.5.23 keeps starting threads until OS runs out of memory

 From: Rahman Akhlaqur [mailto:aki...@yahoo.co.uk]
 Subject: Re: Tomcat 5.5.23 keeps starting threads until OS runs out of
 memory
 
 I have some more info about the issue that I think is related. I found
 our Tomcat executable is also establishing a lot of TCP connections.

Tomcat isn't, your webapp is.  Don't blame Tomcat for your misbehaving code.

 Is this some sort of tomcat ping?

No, it's your webapp opening up some sort of connection - likely RMI, judging 
from your stack trace.

 Our tomcat connectors are set up to limit the max http threads as below

That's irrelevant, since it's your webapp starting the extra threads.

 The only timeouts are for the http requests, are there any other
 timeouts I can configure that could potentially stop all those selector
 threads from persisting?

Whatever you can configure in your webapp - this is a problem with your code, 
not with Tomcat.

- Chuck





Tomcat 5.5.23 keeps starting threads until OS runs out of memory

2009-07-13 Thread Rahman Akhlaqur

Hi

I am having an issue with Tomcat starting too many selector threads. I have got 
some stack trace info about these threads as below:

SelectorThread 
sun.nio.ch.PollArrayWrapper.poll0 ( native code ) 
sun.nio.ch.PollArrayWrapper.poll ( PollArrayWrapper.java:74 ) 
sun.nio.ch.WindowsSelectorImpl.doSelect ( WindowsSelectorImpl.java:61 ) 
sun.nio.ch.SelectorImpl.lockAndDoSelect ( SelectorImpl.java:69 ) 
sun.nio.ch.SelectorImpl.select ( SelectorImpl.java:80 ) 
com.sun.corba.se.impl.transport.SelectorImpl.run ( SelectorImpl.java:249 ) 

When Tomcat starts approximately 1,800-1,900 of these selector threads, Windows 
runs out of memory and we have to restart Tomcat to restore the website.

I think these are started by our webapp, but I'm not sure how to trace this back 
to the bit of code that is doing it. Any suggestions would be appreciated.

Best Regards,
Akik Rahman







Re: Tomcat 5.5.23 keeps starting threads until OS runs out of memory

2009-07-13 Thread Christopher Schultz

Rahman,

On 7/13/2009 9:13 AM, Rahman Akhlaqur wrote:
 Hi
 
 I am having an issue with Tomcat starting too many selector threads.
 I have got some stack trace info about these threads as below:
 
 SelectorThread 
 sun.nio.ch.PollArrayWrapper.poll0 ( native code ) 
 sun.nio.ch.PollArrayWrapper.poll ( PollArrayWrapper.java:74 ) 
 sun.nio.ch.WindowsSelectorImpl.doSelect ( WindowsSelectorImpl.java:61 ) 
 sun.nio.ch.SelectorImpl.lockAndDoSelect ( SelectorImpl.java:69 ) 
 sun.nio.ch.SelectorImpl.select ( SelectorImpl.java:80 ) 
 com.sun.corba.se.impl.transport.SelectorImpl.run ( SelectorImpl.java:249 ) 

CORBA, eh? Are you starting an RMI server per request?

 When Tomcat starts approximately 1,800-1,900 of these selector threads,
 Windows runs out of memory and we have to restart Tomcat to restore the
 website.

1,900? That's a whole lotta threads. I'm surprised you're not
hitting a thread maximum in the OS and halting the JVM.

 I think these are started by our webapp, but not sure how to trace 
 this back to the bit of code that is doing this. Any suggestions
 would be appreciated.

Showing more of the stack trace will certainly help reveal the problem.
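
If a profiler isn't handy, a quick way to grab the name, state, and full stack 
of every live thread from inside the JVM is something like the sketch below 
(class name made up; Thread.getAllStackTraces() needs Java 5 or later):

    import java.util.Map;

    public class DumpAllThreads {

        // Print the name, state, and full stack of every live thread.
        public static void dump() {
            Map<Thread, StackTraceElement[]> all = Thread.getAllStackTraces();
            for (Map.Entry<Thread, StackTraceElement[]> entry : all.entrySet()) {
                Thread t = entry.getKey();
                System.err.println(t.getName() + " (" + t.getState() + ")");
                StackTraceElement[] frames = entry.getValue();
                for (int i = 0; i < frames.length; i++) {
                    System.err.println("    at " + frames[i]);
                }
            }
            System.err.println(all.size() + " live threads");
        }

        public static void main(String[] args) {
            dump();
        }
    }

Calling dump() from the webapp (or a throwaway JSP) when the thread count climbs 
would at least confirm whether all the extra threads are the CORBA SelectorImpl 
ones.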

-chris




RE: Tomcat 5.5.23 keeps starting threads until OS runs out of memory

2009-07-13 Thread Caldarale, Charles R
 From: Christopher Schultz [mailto:ch...@christopherschultz.net]
 Subject: Re: Tomcat 5.5.23 keeps starting threads until OS runs out of
 memory
 
 I'm surprised you're not hitting a thread maximum in the OS
 and halting the JVM.

I'm not aware of any hard limit in Windows; regardless, hitting such a limit 
just returns an error status on the system call, not a process abort.

 Showing more of the stack trace will certainly help reveal the problem.

Not likely; that will only show the stack of the started thread, not where it 
was started from.  A heap profiler should show who created the Thread objects, 
if the profiler captures enough of the call stack at object creation time.

One would think a grep of the webapp source for calls to start() would be a 
rather quick first cut if a profiler can't be used.
 
If the source isn't available, then one possible way to trap the origin of the 
Thread.start() call is to enable a security manager and only allow start() 
calls from Tomcat, not webapp, code.  This would probably require several 
iterations and would be somewhat tedious.
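
A minimal sketch of a logging variant of that idea is below (the class name is 
invented; nothing here is Tomcat code). Thread's constructor goes through 
SecurityManager.checkAccess(ThreadGroup), so overriding that one method is 
enough to see who is creating threads. Note that installing any security 
manager also switches on the normal permission checks, so in practice it would 
go alongside Tomcat's -security startup and catalina.policy grants.

    // Diagnostic sketch only: log, rather than block, thread creation.
    public class ThreadCreationTracer extends SecurityManager {

        public void checkAccess(ThreadGroup g) {
            // Thread's constructor routes through ThreadGroup.checkAccess(),
            // so the stack printed here shows who is about to create a thread.
            // (Other ThreadGroup operations land here too, so expect some
            // noise; filtering on the webapp's package prefix would help.)
            new Throwable("thread-group access check").printStackTrace();
        }

        public static void main(String[] args) throws InterruptedException {
            // Standalone demo: install the tracer, create a thread, and watch
            // the printed stack point back at this main() method.
            System.setSecurityManager(new ThreadCreationTracer());
            Thread t = new Thread(new Runnable() {
                public void run() { }
            });
            t.start();
            t.join();
        }
    }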

 - Chuck

