I have set up a cluster with several machines, and everything is up and running, but I have run into a problem: my mapper/reducer class does not log anything.

The Hadoop version I use is 0.20.2. The rootLogger in log4j.properties includes DRFA (DailyRollingFileAppender) and RFA (RollingFileAppender).
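For reference, the appender setup in conf/log4j.properties looks roughly like the sketch below; the log level, file locations and rolling thresholds shown here are illustrative placeholders rather than my exact values:

    # Rough sketch of the relevant appender configuration; the level, file
    # names and rolling thresholds are placeholders, not the exact values
    log4j.rootLogger=INFO, DRFA, RFA

    # Daily rolling file appender
    log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
    log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
    log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
    log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
    log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n

    # Size-based rolling file appender
    log4j.appender.RFA=org.apache.log4j.RollingFileAppender
    log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
    log4j.appender.RFA.MaxFileSize=1MB
    log4j.appender.RFA.MaxBackupIndex=30
    log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
    log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n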
My map/reduce class looks like this:

    public class MyTest {
        public static final Log LOG = LogFactory.getLog(MyTest.class);

        public static class MiniMap extends Mapper<LongWritable, Text, Text, Text> {
            private long count = 0;

            public void map(LongWritable key, Text value, Mapper.Context ctx)
                    throws IOException, InterruptedException {
                String valueString = value.toString();
                LOG.info("XXXXXXXXXXX value string obtained: " + valueString);
                ctx.write(new LongWritable(count++), doSomething(valueString));
            }
        }

        ...

        static String fetch(String urlpath) {
            ...
            LOG.info("xxxxxxxxxxxxxx");
            ...
        }
    }

The task fails with the following error:

    java.lang.NullPointerException
        [java] at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:849)
        [java] at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:541)
        [java] at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
        [java] at myapp.MyTest$MiniMap.map(MiniMap.java:51)
        [java] at myapp.MyTest$MiniMap.map(MiniMap.java:44)

The stack trace points out the line that goes wrong, but the problem is that I do not see any of the LOG.info(...) messages getting logged. What else do I need to turn on or modify so that the log messages show up in, e.g., hadoop/logs/hadoop-...log? I appreciate any suggestions.