Please see the reply from oers in this thread:

http://stackoverflow.com/questions/9081625/override-log4j-properties-in-hadoop

On May 18, 2014, at 10:38 AM, bo yang <[email protected]> wrote:

> It might be caused by multiple log4j.properties files in your class path 
> (e.g. in different jar files). For example, I found 
> hadoop-mapreduce-client-jobclient-2.4.0-tests.jar in my class path, and it 
> contains a log4j.properties inside it. I had to manually delete it to get 
> logging written to files.
> 
> 
> On Sun, May 18, 2014 at 8:57 AM, Sebastian Gäde <[email protected]> 
> wrote:
>> Hi,
>> 
>> I'd like to log events within my MR application using Flume, writing to a 
>> central log file. Following 
>> http://flume.apache.org/releases/content/1.4.0/FlumeUserGuide.html#log4j-appender
>>  I put the appender configuration into Hadoop's log4j.properties and copied 
>> it to Hadoop's conf directory on all nodes.
>> 
>> However, my in-application events are only logged to stdout (I can see them 
>> using the YARN webapp) but not to Flume. I thought Hadoop's conf directory 
>> was added to the classpath, but that does not seem to be the case for me.
>> 
>> Could you please tell me where I should put the log4j.properties?
>> 
>> Cheers
>> Seb.
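For reference, the appender setup Sebastian describes, following the linked Flume 1.4.0 guide, would look roughly like the fragment below. The hostname and port are placeholders for wherever the Flume agent's Avro source is listening; the flume-ng-log4jappender jar (and its dependencies) must also be on the task classpath, not just on the client.

```properties
# Sketch of the Flume Log4jAppender entries from the Flume 1.4.0 user guide.
# "example-host" and 41414 are placeholders for the Flume agent's Avro source.
log4j.rootLogger=INFO, flume
log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname=example-host
log4j.appender.flume.Port=41414
# With UnsafeMode=true the application keeps running if the agent is down.
log4j.appender.flume.UnsafeMode=true
```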
> 
