Be aware that AsyncAppender is backed by a bounded FIFO. When that FIFO
fills up, every logging call routed to that AsyncAppender instance blocks
until the FIFO has room again. A background (dispatcher) thread consumes
the entries in the FIFO and forwards them to the attached appenders.

So, if the subordinate appender is slow enough that the AsyncAppender's
FIFO fills up, your application will appear to hang as soon as any thread
makes a logging call that routes to that appender.
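
For instance, here's a minimal sketch (class names, file names, and the
buffer size are mine, not from your setup) of wiring an AsyncAppender with
an enlarged buffer around a slow subordinate appender in log4j 1.2:

    import org.apache.log4j.AsyncAppender;
    import org.apache.log4j.DailyRollingFileAppender;
    import org.apache.log4j.Logger;
    import org.apache.log4j.PatternLayout;

    public class AsyncSetup {
        public static void main(String[] args) throws Exception {
            // The (potentially slow) subordinate appender.
            DailyRollingFileAppender file = new DailyRollingFileAppender(
                    new PatternLayout("%d %-5p [%t] %c - %m%n"),
                    "server.log", "'.'yyyy-MM-dd");

            // Wrap it in an AsyncAppender; append() blocks when the
            // bounded FIFO behind it is full (the 1.2.9 behaviour).
            AsyncAppender async = new AsyncAppender();
            async.setBufferSize(4096);   // default is 128, if I recall
            async.addAppender(file);

            Logger logger = Logger.getLogger("com.example.telephony");
            logger.addAppender(async);
            logger.info("this event goes through the async FIFO");
        }
    }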

If you can determine the maximum time it takes to consume one entry from
the FIFO, the maximum sustained rate at which entries are added, and how
long that rate is sustained, then you can work out how large the FIFO must
be to keep it from filling up.
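
For example, with made-up numbers: if the subordinate appender needs at
most 2 ms per event, the dispatcher drains roughly 500 events/s; if your
threads produce a sustained 2,000 events/s in bursts of up to 3 seconds,
the backlog grows at 1,500 events/s, so you need roughly 1,500 * 3 = 4,500
slots to ride out a burst without blocking. Continuing the sketch above:

    // Back-of-the-envelope sizing; all figures are assumptions.
    int drainPerSec  = 500;    // 1000 ms / (2 ms per consumed entry)
    int arrivePerSec = 2000;   // max sustained rate of new events
    int burstSecs    = 3;      // how long that rate is sustained
    int needed = (arrivePerSec - drainPerSec) * burstSecs;   // 4500
    async.setBufferSize(Math.max(needed, 128));   // never below the default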

Tom

-----Original Message-----
From: Thomas Muller [mailto:[EMAIL PROTECTED] 
Sent: Friday, February 10, 2006 4:51 AM
To: [email protected]
Subject: Possible deadlock using AsyncAppender 


Log4j experts,

We're using Log4j (version 1.2.9) extensively in a high-performance
telephony server, and in our latest release we have seen quite a few
"frozen server" incidents, i.e. extensive deadlocks significantly
deteriorating service. I've now managed to isolate the deadlocks (using
JConsole), and it seems like /maybe/ log4j's AsyncAppender is the culprit.
We're using a number of AsyncAppenders for performance reasons.

Environment:

Log4j version: 1.2.9
Java version:  1.5.0 update 6
OS:            Win32

The following shows a thread in WAITING state, and there are a number of
other threads waiting on it (seemingly forever).

Name: Timer-0
State: WAITING on [EMAIL PROTECTED]
Total blocked: 9  Total waited: 7,025

Stack trace: 
java.lang.Object.wait(Native Method)
java.lang.Object.wait(Object.java:474)
org.apache.log4j.AsyncAppender.append(AsyncAppender.java:85)
org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:221)
org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:57)
org.apache.log4j.Category.callAppenders(Category.java:194)
org.apache.log4j.Category.forcedLog(Category.java:379)
org.apache.log4j.Category.debug(Category.java:248)
com.onrelay.pref.ldap.LDAPPreferences.createPreferenceFromAttributes(LDAPPreferences.java:961)
[clip]


A thread waiting for Timer-0:

Name: Dispatcher-Thread-10
State: BLOCKED on [EMAIL PROTECTED] owned by: Timer-0
Total blocked: 5  Total waited: 12,707

Stack trace: 
org.apache.log4j.Category.callAppenders(Category.java:192)
org.apache.log4j.Category.forcedLog(Category.java:379)
org.apache.log4j.Category.log(Category.java:826)
com.onrelay.log.LoggingOutputStream.flush(LoggingOutputStream.java:194)
java.io.PrintStream.write(PrintStream.java:414)
sun.nio.cs.StreamEncoder$CharsetSE.writeBytes(StreamEncoder.java:336)
sun.nio.cs.StreamEncoder$CharsetSE.implFlushBuffer(StreamEncoder.java:404)
sun.nio.cs.StreamEncoder.flushBuffer(StreamEncoder.java:115)
java.io.OutputStreamWriter.flushBuffer(OutputStreamWriter.java:169)
java.io.PrintStream.write(PrintStream.java:459)
java.io.PrintStream.print(PrintStream.java:602)
java.io.PrintStream.println(PrintStream.java:739)
org.apache.log4j.helpers.LogLog.debug(LogLog.java:91)
org.apache.log4j.DailyRollingFileAppender.rollOver(DailyRollingFileAppender.java:313)
org.apache.log4j.DailyRollingFileAppender.subAppend(DailyRollingFileAppender.java:343)
org.apache.log4j.WriterAppender.append(WriterAppender.java:150)
org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:221)
org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:57)
org.apache.log4j.Dispatcher.run(AsyncAppender.java:310)

Yet another:

Name: SingleConnectionTCPServer-0
State: BLOCKED on [EMAIL PROTECTED] owned by: Timer-0
Total blocked: 1  Total waited: 0

Stack trace: 
org.apache.log4j.Category.callAppenders(Category.java:192)
org.apache.log4j.Category.forcedLog(Category.java:379)
org.apache.log4j.Category.log(Category.java:844)
com.onrelay.log.LoggingElement.log(LoggingElement.java:183)
com.onrelay.log.LoggingElement.info(LoggingElement.java:90)
com.onrelay.server.SingleConnectionTCPServer$1.run(SingleConnectionTCPServer.java:69)
java.lang.Thread.run(Thread.java:595)

Yet another:

Name: Timer-4
State: BLOCKED on [EMAIL PROTECTED] owned by: Timer-0
Total blocked: 5  Total waited: 1,595

Stack trace: 
org.apache.log4j.Category.callAppenders(Category.java:192)
org.apache.log4j.Category.forcedLog(Category.java:379)
org.apache.log4j.Category.log(Category.java:844)
com.onrelay.log.LoggingElement.log(LoggingElement.java:183)
com.onrelay.log.LoggingElement.info(LoggingElement.java:90)
com.onrelay.mbx.protocol.AbstractSession.sessionIdle(AbstractSession.java:410)
com.onrelay.mbx.protocol.tic.socket.client.TicClientSession.sessionIdle(TicClientSession.java:233)
com.onrelay.mbx.protocol.AbstractSession$5.run(AbstractSession.java:620)
java.util.TimerThread.mainLoop(Timer.java:512)
java.util.TimerThread.run(Timer.java:462)

The list goes on - in this case there are 6 more threads hanging in log4j
calls with the same lines at the top of their stacks:

org.apache.log4j.Category.callAppenders(Category.java:192)
org.apache.log4j.Category.forcedLog(Category.java:379)

FYI, from the stack traces I can verify (I think) that we're talking about
different Loggers, each with its own independent AsyncAppender attached.

1. Am I correct in believing this is caused by log4j code, or could it just
as well be our own?

2. If 1 holds true, is this something someone can address, or am I on my
own in finding and fixing a possible flaw?

3. Does 1.2.13 have a fix for this particular issue?

I can provide more information if needed, e.g. log4j.xml file(s), log
files, code extracts, etc.

Any input would be much appreciated.

--

Thomas


