I am working on a similar use-case.
I think there are two approaches you can consider:
1) The application generating syslog can run a parallel producer that
sends the syslog messages to the Kafka broker.
2) Try the Kafka log4j appender, but again it will run in parallel with
syslog.
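
For approach 2, a minimal log4j configuration might look like the sketch
below. This is only an illustration: the appender class name matches the
0.7-era Kafka log4j appender, and the BrokerList/Topic property names are
assumptions you should check against your Kafka version.

```properties
# Sketch only -- verify appender class and property names against
# your Kafka release's log4j appender before use.
log4j.rootLogger=INFO, KAFKA

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{ISO8601} %p %c - %m%n
# Assumed property names: point at your broker(s) and target topic.
log4j.appender.KAFKA.BrokerList=localhost:9092
log4j.appender.KAFKA.Topic=syslog
```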

Alternatively, I am trying the file APIs: an infinite while loop over the
syslog file, but that is very inefficient.

-navneet sharma

On Tue, Apr 24, 2012 at 1:31 AM, Flinkster <flinks...@gmail.com> wrote:

> How does LinkedIn handle syslog? Is Kafka at LinkedIn dedicated to
> application logs & metrics, with syslog messages (e.g. network, OS logs)
> handled as a separate flow and datastore?
>
> On Mon, Apr 23, 2012 at 11:26 AM, Joel Koshy <jjkosh...@gmail.com> wrote:
> > This is an interesting use-case and similar to using the log4j appender
> > for application logs. What you describe sounds reasonable: i.e., have a
> > producer process on the syslog server to send syslog messages to your
> > Kafka brokers.
> >
> >
> >> 2) How about tailing a file to a central logging like you can do with
> >> scribe/flume agents?
> >>
> >
> > You could use tail and pipe (or a named pipe) to a console-based
> > producer. I don't know enough about syslogd to tell whether that would
> > be too much of a hack, but I think the following would be more reliable
> > than tail+pipes: configure your local machines to direct syslog
> > messages to the remote server, and write a simple component that
> > listens on the syslog socket and redirects incoming messages to a
> > Kafka producer.
> >
> > Joel
>
