Bob --
Have you considered using an appender that doesn't write to any files
but simply collects all messages for later analysis?
Here's how:
1) Create an appender that simply stores all logging messages in a
static array. This appender should also provide static methods to (a)
retrieve all messages that were logged and (b) clear out the static
message array. Something like this:
package com.fwl;

import java.util.ArrayList;
import java.util.List;

import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.spi.LoggingEvent;

public class AppenderForTesting extends AppenderSkeleton {

    private static List messages = new ArrayList();

    protected void append(LoggingEvent event) {
        messages.add(event.getRenderedMessage());
    }

    public void close() {
    }

    public boolean requiresLayout() {
        return false;
    }

    public static String[] getMessages() {
        return (String[]) messages.toArray(new String[messages.size()]);
    }

    public static void clear() {
        messages.clear();
    }
}
BTW, "fwl" is my TLA for "fun with logging".
2) Configure log4j so that this appender is attached to all loggers
that are used by the target of your test. I'm taking a very simplistic
approach here and just attaching this appender to the root logger:
<log4j:configuration xmlns:log4j='http://jakarta.apache.org/log4j/'>
  <appender name="PrimaryAppender" class="com.fwl.AppenderForTesting">
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="%m%n" />
    </layout>
  </appender>
  <root>
    <level value="debug" />
    <appender-ref ref="PrimaryAppender" />
  </root>
</log4j:configuration>
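If you'd rather not maintain a separate XML file for tests, the same
wiring can be done in code. Here's a minimal sketch using the log4j 1.x
API (the class and method names here are just my suggestion, not part of
the example above); call it once before the tests run, e.g. from a base
test class:

```java
package com.fwl;

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class TestLogSetup {

    // Attach the test appender to the root logger programmatically,
    // as an alternative to the XML configuration file.
    public static void configure() {
        Logger root = Logger.getRootLogger();
        root.setLevel(Level.DEBUG);  // capture debug() messages too
        root.addAppender(new AppenderForTesting());
    }
}
```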
3) In the setUp method in your test harness, clear out the message list
that's used by the appender:
protected void setUp() throws Exception {
    super.setUp();
    AppenderForTesting.clear();
}
You may want to include this logic in the tearDown method as well,
just to ensure that the message list is kept to a minimum.
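For completeness, that tearDown counterpart would look something like
this (a sketch; the super.tearDown() call last follows the usual
JUnit 3 convention):

```java
protected void tearDown() throws Exception {
    // Drop any messages the test just recorded so they
    // can't leak into the next test.
    AppenderForTesting.clear();
    super.tearDown();
}
```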
4) In your test, after you invoke the method that should generate
logging messages, validate the actual output. Here is the code for a
simple class I wrote that just divides two numbers, along with a test
harness for that class.
package com.fwl;

import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;

public class MyMath {

    private static final Logger LOG =
            LogManager.getLogger(MyMath.class);

    public static double divide(int numerator, int denominator) {
        LOG.debug("dividing " + numerator + " by " + denominator);
        try {
            return (numerator / denominator);
        } catch (Exception e) {
            LOG.error("Unable to divide " + numerator + " by "
                    + denominator, e);
            if (denominator == 0) {
                LOG.error("You should never attempt to divide by 0");
            }
            return 0;
        }
    }
}
package com.fwl;

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import junit.framework.TestCase;

public class MyMathTest extends TestCase {

    public void testDivideDenominatorIsZero() {
        MyMath.divide(2, 0);
        String[] expectedMessages = new String[] {"dividing 2 by 0"};
        validateMessages(expectedMessages);
    }

    private void validateMessages(String[] expectedMessages) {
        // Build a list of expected messages.
        List expected = new ArrayList();
        for (int i = 0; i < expectedMessages.length; i++) {
            if (expectedMessages[i] != null) {
                expected.add(expectedMessages[i]);
            }
        }
        // Evaluate the actual messages that were recorded.
        List unexpected = new ArrayList();
        String[] actual = AppenderForTesting.getMessages();
        for (int i = 0; i < actual.length; i++) {
            if (actual[i] != null) {
                if (expected.contains(actual[i])) {
                    expected.remove(actual[i]);
                } else {
                    unexpected.add(actual[i]);
                }
            }
        }
        // Fail if any expected messages were not found.
        if (!expected.isEmpty()) {
            StringBuffer sb = new StringBuffer();
            for (Iterator iter = expected.iterator(); iter.hasNext();) {
                sb.append("\n").append((String) iter.next());
            }
            fail("did not receive the following message(s):" + sb);
        }
        // Fail if any unexpected messages were found.
        if (!unexpected.isEmpty()) {
            StringBuffer sb = new StringBuffer();
            for (Iterator iter = unexpected.iterator(); iter.hasNext();) {
                sb.append("\n").append((String) iter.next());
            }
            fail("received the following unexpected message(s):" + sb);
        }
    }
}
In the testDivideDenominatorIsZero method, we are only expecting the
message "dividing 2 by 0". However, the method under test actually
writes out two extra messages when the denominator is 0:
* "Unable to divide 2 by 0"
* "You should never attempt to divide by 0"
The validateMessages method retrieves the actual messages that were
written to the logger and will record a test failure, since the two
messages above were not included in the expected messages list.
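To make that test pass as written, you'd list all three messages that
divide(2, 0) actually logs. Note that getRenderedMessage() returns only
the message text, not the stack trace attached to the error; this
variant is a sketch based on the code above:

```java
public void testDivideDenominatorIsZero() {
    MyMath.divide(2, 0);
    // Expect every message that divide(2, 0) logs, in any order.
    validateMessages(new String[] {
        "dividing 2 by 0",
        "Unable to divide 2 by 0",
        "You should never attempt to divide by 0"
    });
}
```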
Ron Gallagher, AT&T Mobility
-----Original Message-----
From: Bob Jacobsen [mailto:[EMAIL PROTECTED]
Sent: Saturday, June 23, 2007 2:50 AM
To: Log4J Users List
Subject: JUnit testing of output logged with log4j?
I've got a bunch of code that uses log4j for logging, and a large
number of unit tests done with JUnit.
As is the nature of these tests, they test a lot of error conditions,
which results in lots of error and warning messages logged when
running the tests. Up until now, we just ignored those: If the
tests passed, nobody looked at the test output. If the tests failed,
then we started looking at the output for "abnormal errors", which is
really a mess.
My co-developers and I would like to clean this whole situation up,
so that the tests run with no error or warning messages appearing in
the output log. But we still want to _test_ the conditions that are
"errors", which in the code as it exists will emit those messages.
And we still want to see if _other_ error and warning messages start
appearing when the tests are run.
Anybody know how to Do This Right?
So far, we've got a couple suggestions:
1) Go through and find all the places where the error messages are
logged, and make them log through a service routine which can be
overridden. E.g.
class Mine {
    void doSomething() {
        ...
        if (bad) log.error("Oops");
    }
}
becomes
class Mine {
    void doSomething() {
        ...
        if (bad) report("Oops");
    }
    void report(String msg) {
        log.error(msg);
    }
}
Then we can test it with
new Mine() {
    void report(String msg) {
        if (!msg.equals("Oops"))
            // assert error
    }
};
The subclassing lets us intercept the error _before_ it gets to
log4j, and test it manually. But there's a _huge_ amount of this. It
seems a complete pain to do all this, and since we'd be putting in a
"cover" layer in front of log4j in each class, it seems far from a
best practice.
2) Instead of suppressing the "good" messages, learn to ignore them.
To do this, we'd just take a log from "normal" tests, and build into
our testing a diff of the current output against the "normal" log.
New or missing lines would be visible to the author; the same old
stuff would pass through the diff.
Of course, this is a maintenance nightmare. As code changes, we'd
have to keep the "master error log" synchronized, but it would be
easy for somebody to pass a Really Bad Thing as a "new normal error".
This is currently the most popular choice, mostly because it doesn't
require reworking the existing code, but I really don't like it.
3) Do something via log4j itself.
I'm not even sure what I'm asking here, but can we somehow
programmatically access a stream of log4j log entries, and manipulate
it?
What I'm imagining is a test like this:
a) Push all existing stuff in log to whatever outputs are configured.
b) Temporarily stop output of log entries
c) run test
d) Check through log entries accumulated since (b), which come from
the test, and remove the ones that are expected (optionally, tell
JUnit to fail if something is missing)
e) Allow all other log entries to continue to their configured outputs
If there was some way to do this, we could have "unexpected" messages
still go to the usual places, as configured, but remove "expected"
messages from the log stream.
Is there a way to do this?
Or is there a better solution to the whole need?
Thanks in advance.
Bob
--
Bob Jacobsen, UC Berkeley
[EMAIL PROTECTED] +1-510-486-7355 fax +1-510-643-8497 AIM, Skype
JacobsenRG
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]