I've read through many sites about using Kafka for log aggregation, but I 
haven't really found anything that describes how people actually ship their 
logs into Kafka and consume them. I'm really interested in an implementation 
that would watch any kind of log (local syslogs and application logs) and ship 
it into Kafka in near real time. I think products like Logstash and Flume 
really shine in this area, since they offer a multitude of options for shipping 
any data stream into a central aggregation service.

Since Kafka is proclaimed to be far more scalable, I'm hoping there are options 
such as Logstash (http://logstash.net/docs/1.1.13/) that can vacuum up any data 
source, put it into Kafka topics, and then consume them.
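For concreteness, here is a minimal sketch of the "watch a log, ship each new line" half of the pipeline I have in mind, using only the Python standard library. In a real deployment the ship() callback would publish each line to Kafka via a client library's producer; the producer call and topic name mentioned in the comments are assumptions on my part, not something I've run.

```python
# Sketch: tail a log file and hand each complete new line to a
# callback. A real shipper would replace shipped.append with a Kafka
# producer call, e.g. producer.send("logs", line) -- that API and the
# "logs" topic are hypothetical placeholders here.
import os
import tempfile
import threading
import time

def follow(path, ship, poll_interval=0.05, max_idle_polls=20):
    """Tail `path` from its current end, calling ship(line) for each
    complete new line. Gives up after max_idle_polls empty polls so
    this demo terminates; a real shipper would loop forever."""
    idle = 0
    with open(path, "r") as f:
        f.seek(0, os.SEEK_END)           # only ship entries written from now on
        while idle < max_idle_polls:
            line = f.readline()
            if line.endswith("\n"):      # a full log record arrived
                ship(line.rstrip("\n"))
                idle = 0
            else:
                time.sleep(poll_interval)
                idle += 1

# Demo: append to a temporary "log file" while the tailer is running.
shipped = []
log = tempfile.NamedTemporaryFile("w", suffix=".log", delete=False)
log.write("old entry\n")
log.flush()

tailer = threading.Thread(target=follow, args=(log.name, shipped.append))
tailer.start()
time.sleep(0.2)                          # let the tailer seek past old entries
log.write("new entry 1\nnew entry 2\n")
log.flush()
tailer.join()
log.close()
os.unlink(log.name)

print(shipped)  # only the lines written after the tailer started
```

The part I'm unsure about is everything downstream of ship(): batching, partitioning, and handling broker failures, which is exactly where I'd hope an existing tool does the heavy lifting.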

Any suggestions?