This can be caused by multiple log4j.properties files on your classpath (e.g. in different jar files). For example, I found hadoop-mapreduce-client-jobclient-2.4.0-tests.jar on my classpath, and it contains its own log4j.properties. I had to delete it manually to get logging written to files.
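If you want to check which jars are shadowing your configuration, a small script like the one below can scan a list of jar paths for a bundled log4j.properties. This is just an illustrative helper (the function name and structure are my own, not part of Hadoop):

```python
# Hypothetical helper: scan jar files for a bundled log4j.properties,
# which can shadow the copy in Hadoop's conf directory on the classpath.
import zipfile


def jars_with_log4j(jar_paths):
    """Return the jars that contain a log4j.properties entry."""
    hits = []
    for path in jar_paths:
        try:
            with zipfile.ZipFile(path) as jar:
                # A jar is just a zip archive; look for the entry by name.
                if any(name.endswith("log4j.properties") for name in jar.namelist()):
                    hits.append(path)
        except (OSError, zipfile.BadZipFile):
            pass  # skip missing or unreadable files
    return hits
```

You can feed it the output of `hadoop classpath` split on `:` to check every jar your job actually sees.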
On Sun, May 18, 2014 at 8:57 AM, Sebastian Gäde <[email protected]> wrote:
> Hi,
>
> I'd like to log events within my MR application using Flume writing to a
> central log file. I've followed
> http://flume.apache.org/releases/content/1.4.0/FlumeUserGuide.html#log4j-appender
> putting the stuff into Hadoop's log4j.properties and copying it to Hadoop's
> conf directory on all nodes.
>
> However, my in-application events are only logged to stdout (I can see
> them using the YARN webapp) but not to Flume. Although I thought Hadoop's
> conf directory is added to the classpath, it does not seem to work for me.
>
> Could you please tell me where I should put the log4j.properties?
>
> Cheers
> Seb.
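For reference, the appender setup from the Flume user guide linked above looks roughly like this in log4j.properties (the hostname and port are placeholders for your Flume agent's Avro source):

```properties
# Route root-logger events to the Flume Log4j appender
log4j.rootLogger=INFO, flume
log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname=<flume-agent-host>
log4j.appender.flume.Port=41414
```

Note that this only takes effect if the copy of log4j.properties that actually wins on the task's classpath is the one containing these lines, which is exactly what a bundled copy inside a jar can break.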
