Thanks very much, Chuck.
Looking in the server log did, indeed, turn up an error down in my database code. Nice theory, but fixing that changed nothing.


Here is a typical error in that log now:

2005-05-15 17:49:29 StandardWrapperValve[TopicMap]: Servlet.service() for servlet TopicMap threw exception
java.lang.OutOfMemoryError


There are no other statements in the TC log that hint at what was being tried when the error was thrown.

It throws hundreds of them.
Then Tomcat seems to die -- that is, in the Windows (XP Pro) Task Manager it stops pegging 99% of the CPU, drops to 0, then pulses slowly between about 88% and 0 -- maybe one cycle every two seconds -- with nothing moving in the Tomcat console, all the while still tossing OutOfMemoryError messages into its logfile.


One experiment I have tried is this.
I am running the stress engine on JDK 1.5 and Tomcat on JDK 1.4 -- both on the same box.


The only files I open are the three for the database, a couple of log files, a couple of Velocity macros, and the occasional GIF image.

I just ran it twice in a row without restarting TC and with 16 threads. Nothing in my log or TC's log until the end, when the results were printed: 208,000 to 210,000 milliseconds on the first run and 197,000 to 199,000 on the second.

It's now in the middle of a run with 24 threads hitting the server; thus far there has been only one "connection refused" message in each of two threads, and those didn't result in a missed URL (a URL only counts as missed after 4 failed tries). Browsing the site while the test is running remains snappy.

That run ended at an average of 741 seconds per thread. I upped it to 28 threads -- on the way to 32. Now the TC log is slowly tossing out OutOfMemoryErrors, I got one while browsing, and my tester is receiving lots of "connection refused" or "500" errors, plus just one missed URL so far (it gives up after 4 tries with a 4-second delay between them).

I think you can see the nonlinearity going on. 16 is right at the edge, 28 puts a foot in the grave, and 32 seems to take out a howitzer and start blasting.

I'm at somewhat of a loss to understand how this relates to Perm Gen space, or whether that's even the issue.

All of the errors mimic the one I pasted in above, except for the HTTP response codes shown by the error handler in my test framework, and those are only "connection refused" or "500"-type errors.

It certainly doesn't sound like heap exhaustion, but whatever it is, it seems rather nonlinear: 16 threads don't provoke any errors, 24 threads show the server stumbling but not tossing errors, and 28 threads appear to have tickled a sleeping tiger.

At 28 threads, the OOM errors are being recorded at a rate of one every 4 or so seconds.

Still don't know why inserting heap size info in catalina.bat causes Tomcat to boot, flash something, then close the console before you can read it.
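
(One guess I haven't verified: in a Windows batch file everything after the equals sign -- including a leading space and the surrounding quotes -- becomes part of the variable's value, so the JVM may be handed the whole quoted string as a single unrecognized option and exit before the console can be read. If that's the problem, the line would presumably need to be

     set CATALINA_OPTS=-Xms256m -Xmx512m -Xss512k

with no space after the equals sign and no quotes around the options. And if Perm Gen does turn out to be the culprit, I gather something like -XX:MaxPermSize=128m could be appended to that same line, though I haven't tried it yet.)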

Dunno if that helps, but that's the story from this end. I do hope that provokes further ideas.
Many thanks.
Jack


Caldarale, Charles R wrote:

From: Jack Park [mailto:[EMAIL PROTECTED]]
Subject: Tomcat 5.0.30 OutOfMemory error


It would be helpful if you told us what OS you're using, along with what
version of the JDK you have installed.  Did you check your logs for any
pertinent diagnostic information?


The java process is calling up 66,500k of heap so it strikes me that the OutOfMemory issue isn't related to the heap.


Highly likely that it's not complete heap exhaustion.  If you've
searched the archives at all, you should have found the following:

1) The OutOfMemoryError is a catch-all for exhausting not only the heap,
but also pretty much any system resource, such as the number of open
files.  You need to look at the stack trace in the logs to find out what
was being attempted at the time.

2) You can run out of Perm Gen space rather easily in any app server.
Look at the JVM documentation and the archives for details.


If I try to place
     set CATALINA_OPTS= "-Xms256m -Xmx512m -Xss512k"
in catalina.bat, Tomcat won't start at all.


Exactly what errors are displayed?

 - Chuck


