https://issues.apache.org/bugzilla/show_bug.cgi?id=53173
Robert Hardin <roberthar...@hotmail.com> changed:

           What      |Removed |Added
----------------------------------------------------------------------------
           Hardware  |PC      |All
           Version   |7.0.27  |7.0.47
           OS        |All     |Linux

--- Comment #8 from Robert Hardin <roberthar...@hotmail.com> ---
(In reply to comments #5 and #6)

A defect still exists in Tomcat 7.0.47 (most likely in the Executor serving the connector threads) where, as described in brauckmann's comment #5, all threads in the Tomcat thread pool begin hanging at peak traffic times. By "peak" I mean complete thread pool saturation.

The bug is easily reproducible when maxThreads <= maxConnections in the configuration and the threads are executing logic (JSP, servlet, etc.) that performs relatively long-running operations such as database reads or file I/O. Once the thread pool is exhausted, if maxConnections still allows new requests to be accepted and queued pending available threads, the Executor appears somehow to hang the Tomcat threads themselves. It is not the operation the threads are executing that hangs them; neither is it the number of connections still available under the connection count.
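The saturation dynamic described above can be sketched (as an analogy using plain java.util.concurrent, not Tomcat's actual Executor code) with a bounded pool whose workers all block on slow work while further submissions queue behind them, mirroring the maxThreads="5" / maxQueueSize="15" values from the sample configuration in this report:

```java
import java.util.concurrent.*;

// Analogy (not Tomcat's Executor implementation) to the sample config:
// maxThreads="5", maxQueueSize="15", with more work offered than both allow.
public class PoolSaturationDemo {
    public static void main(String[] args) throws Exception {
        BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>(15);
        ThreadPoolExecutor pool =
                new ThreadPoolExecutor(5, 5, 60, TimeUnit.SECONDS, queue);

        CountDownLatch release = new CountDownLatch(1);
        // Each task stands in for a request doing slow work (DB read, file I/O).
        Runnable slowRequest = () -> {
            try {
                release.await();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        };

        // The first 5 tasks occupy all workers; the next 15 fill the queue.
        for (int i = 0; i < 20; i++) {
            pool.execute(slowRequest);
        }
        System.out.println("queued=" + queue.size());

        // With every worker busy and the queue full, a further submission is
        // rejected outright by this plain pool; Tomcat instead leaves the
        // accepted connection pending, which is where requests pile up
        // behind a saturated pool.
        boolean rejected = false;
        try {
            pool.execute(slowRequest);
        } catch (RejectedExecutionException e) {
            rejected = true;
        }
        System.out.println("rejected=" + rejected);

        release.countDown();              // unblock the "slow" work
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
    }
}
```

The point of the sketch is that nothing here deadlocks by itself: once the slow work is released, the pool drains. The hang the report describes arises on top of this queuing behaviour, in how the connector keeps excess connections pending.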
It was stated in comment #5 that "The problem disappears when the Apache and the Tomcat parameters are adjusted so that MaxClients < maxThreads", and even the sample configuration Filip suggested in the very first comment for reproducing the bug implies that the bug manifests when the number of socket connections Tomcat accepts exceeds the number of threads available to handle incoming requests:

    <Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
              maxThreads="5" minSpareThreads="0" maxQueueSize="15"/>

    <Connector port="8080" protocol="HTTP/1.1"
               executor="tomcatThreadPool"
               connectionTimeout="10000" redirectPort="8443"
               maxConnections="30"/>

This is why adding maxConnections="-1" as an option (effectively removing any upper limit on accepted client socket connections) actually makes matters worse. After changing my configuration (which had previously specified only maxThreads="200") to maxThreads="200" with maxConnections="-1", and running a simple HTTP grinder against a small test web application on Tomcat, I brought all 200 connector threads to their knees in a matter of seconds (Tomcat completely unresponsive until it was restarted). These threads never recovered, and the back-end processes the web app had started from those threads were also hung until the Tomcat server restart.

To reproduce, write a simple webapp containing a JSP that executes the system process 'netstat -a' and routes the process stdout to the HTTP response. Then, from an exerciser client, spin up something like 500 threads, each continually submitting HTTP GETs against the Tomcat server (the web app JSP) every half second or so.

The only way I was able to keep the Tomcat server thread pool healthy and the server responsive was to ensure maxThreads > maxConnections. Setting maxConnections="-1", or maxConnections >= maxThreads, results in hung connector threads and ultimately an unresponsive Tomcat server on that connector.
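The workaround described above amounts to keeping the connection limit strictly below the pool size, so every accepted connection can immediately get a worker thread. A sketch of such a configuration, patterned on the Connector snippet in this report (the 200/100 values are illustrative, not taken from the reporter's actual config):

```xml
<!-- Workaround sketch: maxConnections strictly less than maxThreads,
     so accepted connections never queue behind an exhausted pool.
     Values are illustrative. -->
<Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
          maxThreads="200" minSpareThreads="10"/>

<Connector port="8080" protocol="HTTP/1.1"
           executor="tomcatThreadPool"
           connectionTimeout="10000" redirectPort="8443"
           maxConnections="100"/>
```

Note this trades throughput for stability: connections beyond maxConnections are refused (subject to the accept backlog) rather than queued, which is precisely what keeps the pool from saturating in the scenario described.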
Best,
Robert