Hi, I am playing with a single-node Hadoop instance on Solaris 10 x64. When I run MapReduce jobs, the map tasks complete successfully, but the reduce tasks fail with the error below. I am sure the log path has enough free space and that the hadoop user has permission to write to it.
-bash-3.00$ more stderr
log4j:ERROR Failed to flush writer,
java.io.InterruptedIOException
	at java.io.FileOutputStream.writeBytes(Native Method)
	at java.io.FileOutputStream.write(FileOutputStream.java:260)
	at sun.nio.cs.StreamEncoder.writeBytes(StreamEncoder.java:202)
	at sun.nio.cs.StreamEncoder.implFlushBuffer(StreamEncoder.java:272)
	at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:276)
	at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:122)
	at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:212)
	at org.apache.log4j.helpers.QuietWriter.flush(QuietWriter.java:58)
	at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:316)
	at org.apache.log4j.WriterAppender.append(WriterAppender.java:160)
	at org.apache.hadoop.mapred.TaskLogAppender.append(TaskLogAppender.java:58)
	at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
	at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
	at org.apache.log4j.Category.callAppenders(Category.java:206)
	at org.apache.log4j.Category.forcedLog(Category.java:391)
	at org.apache.log4j.Category.log(Category.java:856)
	at org.apache.commons.logging.impl.Log4JLogger.info(Log4JLogger.java:199)
	at org.apache.hadoop.mapreduce.task.reduce.ShuffleScheduler.freeHost(ShuffleScheduler.java:345)
	at org.apache.hadoop.mapreduce.task.reduce.Fetcher.run(Fetcher.java:152)
-bash-3.00$
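From reading the trace, my guess is that a shuffle Fetcher thread gets interrupted while log4j is flushing the task log, and on Solaris an interrupted blocking file write can be reported as java.io.InterruptedIOException rather than just setting the thread's interrupt flag. Below is a minimal standalone sketch of that pattern, so it is clear what I think is happening; this is not Hadoop code, and the file path and class name are made up for illustration:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InterruptedIOException;

public class InterruptedWriteDemo {
    public static void main(String[] args) throws Exception {
        Thread writer = new Thread(() -> {
            // Keep writing to a file, analogous to log4j flushing a task log.
            try (FileOutputStream out = new FileOutputStream("/tmp/demo.log")) {
                byte[] chunk = new byte[8192];
                while (true) {
                    out.write(chunk); // blocking write
                }
            } catch (InterruptedIOException e) {
                // On Solaris JVMs an interrupt delivered during a blocking
                // file write may surface here; on other platforms the
                // interrupt may only set the thread's interrupt flag.
                System.err.println("write interrupted: " + e);
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        writer.start();
        Thread.sleep(100);
        writer.interrupt(); // analogous to a Fetcher thread being shut down
        writer.join();
    }
}

Does that match what others see on Solaris, or am I misreading the trace?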