Thanks for pointing this out! I've sent two messages to the lists asking
where the Fetcher logs have disappeared to, and no one else seemed to be
experiencing this problem. Hardwiring the "log4j.appender.DRFA.File"
property to a fixed filename has solved it, and the logs are back. If
anyone finds the correct solution, please share it.
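
For anyone who wants to do the same, the change amounts to replacing the
substitution in conf/log4j.properties with a literal path; the path below
is only an example:

  log4j.appender.DRFA.File=/home/nutch/logs/hadoop.log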

-Ed

On 8/12/06, [EMAIL PROTECTED] wrote:

Hello,

I noticed the following line in conf/log4j.properties:

  log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}

I noticed that ${hadoop.log.dir}/${hadoop.log.file} sometimes gets
expanded to just "/", indicating that the two Hadoop properties there are
undefined.
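
In other words, with both properties unset, the substitutions expand to
empty strings and the line effectively becomes:

  log4j.appender.DRFA.File=/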

I also noticed that the bin/nutch script defines these two properties
while invoking Nutch, like this:

  NUTCH_OPTS="$NUTCH_OPTS -Dhadoop.log.dir=$NUTCH_LOG_DIR"
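
That line only covers hadoop.log.dir; presumably a companion line sets
hadoop.log.file the same way, something like this (the NUTCH_LOGFILE
variable name is my guess):

  NUTCH_OPTS="$NUTCH_OPTS -Dhadoop.log.file=$NUTCH_LOGFILE"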

I assume the idea is that the JVM then knows about the hadoop.log.dir
system property, and log4j picks it up from there.
However, it doesn't _always_ work.

That is, when invoking various bin/nutch commands as described in
http://lucene.apache.org/nutch/tutorial8.html , this fails, and the system
attempts to write to "/", which is, of course, a directory, not a file.
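
One workaround while debugging is to set the log location explicitly
before running the command. This assumes bin/nutch only fills in
NUTCH_LOG_DIR and NUTCH_LOGFILE when they are unset, which I haven't
verified:

  # point log4j at an explicit log file (paths are examples)
  export NUTCH_LOG_DIR=$PWD/logs
  export NUTCH_LOGFILE=hadoop.log
  bin/nutch <command>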

On the other hand, I also run Nutch using the commands described in
http://wiki.apache.org/nutch/NutchHadoopTutorial , which are slightly
different.  I noticed that when I did that, log4j worked as designed - those
two properties were defined, and logging went to logs/hadoop.log.

I'm not yet familiar enough with all the internals and configs to figure
out what to change, but could somebody more familiar with the setup fix
this?

Otis