On Wed, Aug 19, 2009 at 11:50 PM, Brian Bockelman<bbock...@cse.unl.edu> wrote:
> Hey Mike,
>
> Yup. We find the stock log4j needs two things:
>
> 1) Set the rootLogger manually. The way 0.19.x has the root logger set up
> breaks when adding new appenders. I.e., do:
>
> log4j.rootLogger=INFO,SYSLOG,console,DRFA,EventCounter
>
> 2) Add the headers; otherwise log4j is not compatible with syslog:
>
> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
> log4j.appender.SYSLOG.facility=local0
> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
> log4j.appender.SYSLOG.layout.ConversionPattern=%p %c{2}: %m%n
> log4j.appender.SYSLOG.SyslogHost=red
> log4j.appender.SYSLOG.threshold=ERROR
> log4j.appender.SYSLOG.Header=true
> log4j.appender.SYSLOG.FacilityPrinting=true
>
> Brian
>
> On Aug 19, 2009, at 6:32 PM, Mike Anderson wrote:
>
>> Has anybody had any luck setting up the log4j.properties file to send logs
>> to a syslog-ng server?
>> My log4j.properties excerpt:
>> log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
>> log4j.appender.SYSLOG.syslogHost=10.0.20.164
>> log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
>> log4j.appender.SYSLOG.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
>> log4j.appender.SYSLOG.Facility=HADOOP
>>
>> and my syslog-ng.conf file running on 10.0.20.164
>>
>> source s_hadoop {
>>     # message generated by Syslog-NG
>>     internal();
>>     # standard Linux log source (this is the default place for the syslog()
>>     # function to send logs to)
>>     unix-stream("/dev/log");
>>     udp();
>> };
>> destination df_hadoop { file("/var/log/hadoop/hadoop.log"); };
>> filter f_hadoop { facility(hadoop); };
>> log {
>>     source(s_hadoop);
>>     filter(f_hadoop);
>>     destination(df_hadoop);
>> };
>>
>> Thanks in advance,
>> Mike
>
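One thing worth checking in the configs quoted above: as far as I know "HADOOP" is not a standard syslog facility name, so the Facility=HADOOP line on the log4j side and the facility(hadoop) filter in syslog-ng probably won't match anything. If you follow Brian's suggestion and use local0 on both ends, the matching pieces would look roughly like this (untested sketch; the 10.0.20.164 host is just taken from Mike's excerpt):

# log4j.properties: facility must be one syslog actually knows about
log4j.appender.SYSLOG=org.apache.log4j.net.SyslogAppender
log4j.appender.SYSLOG.SyslogHost=10.0.20.164
log4j.appender.SYSLOG.Facility=LOCAL0
log4j.appender.SYSLOG.FacilityPrinting=true
log4j.appender.SYSLOG.Header=true
log4j.appender.SYSLOG.layout=org.apache.log4j.PatternLayout
log4j.appender.SYSLOG.layout.ConversionPattern=%p %c{2}: %m%n

# syslog-ng.conf: filter on the same facility
filter f_hadoop { facility(local0); };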
Mike, slightly off topic, but you can also run a log4j server, which transports the messages fired off by log4j without loss. The log4j->syslog path loses or changes some information. If anyone is interested in this, let me know and I will write something up about it.
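In the meantime, a rough sketch of what I mean (untested as pasted here; the host name, port, jar name and config file name are placeholders you'd adjust): the Hadoop nodes point a SocketAppender at a collector box, and the collector runs the SimpleSocketServer that ships with log4j 1.x, which receives the serialized logging events and writes them out using its own log4j configuration instead of flattening everything into a syslog line.

# log4j.properties on the Hadoop nodes
# (root logger set manually, per Brian's point 1)
log4j.rootLogger=INFO,SOCKET,console,DRFA,EventCounter
log4j.appender.SOCKET=org.apache.log4j.net.SocketAppender
log4j.appender.SOCKET.RemoteHost=logserver.example.com
log4j.appender.SOCKET.Port=4560
log4j.appender.SOCKET.ReconnectionDelay=10000

# on the collector, run the receiver with its own config, e.g.:
# java -cp log4j-1.2.x.jar org.apache.log4j.net.SimpleSocketServer 4560 server-log4j.properties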