Hi Sami,

----- Original Message -----
[EMAIL PROTECTED] wrote:
> 
> I assume the idea is that the JVM knows about hadoop.log.dir system
> property, and then log4j knows about it, too. However, it doesn't
> _always_ work.
> 
> That is, when invoking various bin/nutch commands as described in
> http://lucene.apache.org/nutch/tutorial8.html , this fails, and the
> system attempts to write to "/" which, of course, is a directory, not
> a file.
> 
Can you be more precise on this one - which commands fail? What kind 
of configuration are you running this on?


I'll have to look at another server's logs tomorrow, but I can tell you that 
the error is much like the one in 
http://issues.apache.org/jira/browse/NUTCH-307:

java.io.FileNotFoundException: / (Is a directory)
 cr06:   at java.io.FileOutputStream.openAppend(Native Method)
 cr06:   at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
 cr06:   at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
 cr06:   at org.apache.log4j.FileAppender.setFile(FileAppender.java:289)
 cr06:   at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:163)
 cr06:   at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:215)
 cr06:   at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:256)


There is really no particular configuration involved - it is just that those 
properties are unset, so when log4j has to interpret this line:

  log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}


it gets interpreted as:

  log4j.appender.DRFA.File=/

because both of those properties are undefined and expand to empty strings.
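
To double-check my reading of this, here is a small stand-alone snippet 
(just a sketch, assuming plain log4j 1.x behaviour - the class name and 
config values are made up for the test) that should reproduce the stack 
trace above when the two system properties are undefined:

// log4j 1.x expands ${...} tokens in appender options from JVM system
// properties; with hadoop.log.dir and hadoop.log.file unset, the value
// "${hadoop.log.dir}/${hadoop.log.file}" collapses to just "/".
import java.util.Properties;
import org.apache.log4j.PropertyConfigurator;

public class UnsetLogPropsDemo {
  public static void main(String[] args) {
    Properties p = new Properties();
    p.setProperty("log4j.rootLogger", "INFO,DRFA");
    p.setProperty("log4j.appender.DRFA", "org.apache.log4j.DailyRollingFileAppender");
    p.setProperty("log4j.appender.DRFA.File", "${hadoop.log.dir}/${hadoop.log.file}");
    p.setProperty("log4j.appender.DRFA.layout", "org.apache.log4j.PatternLayout");
    p.setProperty("log4j.appender.DRFA.layout.ConversionPattern", "%d %p %c - %m%n");
    // With the two properties undefined, FileAppender tries to append to "/"
    // and log4j reports the FileNotFoundException shown above.
    PropertyConfigurator.configure(p);
  }
}
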
That is exactly what happens if you follow this tutorial: 
http://lucene.apache.org/nutch/tutorial8.html
That tutorial runs the individual commands (inject, generate, fetch, etc.), 
while the 0.8 tutorial on the Wiki does not.  When you follow the 0.8 tutorial 
from the Wiki, the properties do get set somewhere along the way, so everything 
works.  So it really comes down to those two properties not being set.
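
For what it's worth, explicitly defining both properties before anything 
configures log4j works around it for me conceptually - here is a rough 
sketch (the directory and file names are just hypothetical examples; the 
command-line equivalent would be passing -Dhadoop.log.dir=... and 
-Dhadoop.log.file=... to the JVM):

import java.io.File;

public class SetLogPropsFirst {
  public static void main(String[] args) {
    // Hypothetical default: a "logs" directory under the working directory.
    String logDir = System.getProperty("user.dir") + File.separator + "logs";
    new File(logDir).mkdirs(); // make sure it exists so the appender can open a file there

    if (System.getProperty("hadoop.log.dir") == null) {
      System.setProperty("hadoop.log.dir", logDir);
    }
    if (System.getProperty("hadoop.log.file") == null) {
      System.setProperty("hadoop.log.file", "hadoop.log");
    }
    // ... only after this should anything read log4j.properties, so that
    // ${hadoop.log.dir}/${hadoop.log.file} resolves to a real file path.
  }
}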

Thanks,
Otis


