On Tue, 26 May 2015, T-SOC Operations wrote:

Hello OSSEC fellows,



I'm struggling with the JSON syslog_output format. The alerts are in some "kind of" JSON format, but Logstash is not able to decode the message right away.



Example JSON outputs in Kibana 4:

Windows alert: http://pastebin.com/2n4jsJYS

Linux alert: http://pastebin.com/UPAUq9pB

You need to look at the message that's arriving and configure Logstash to handle it.

To handle this in Logstash, you first need a filter to handle the syslog part of the message, then the json filter to handle the JSON.

The two samples you posted show OSSEC sending different formats, so I suspect the Linux one is wrong (note the ":" at the end of the hostname, and that the program name (ossec) is there with a ":" after it).

It looks like you need to configure the grok filters in Logstash correctly: first to parse out the syslog header info, then to parse the JSON message.
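As a rough illustration only (the grok pattern and field names below are guesses that assume a standard syslog header in front of the JSON payload, so adjust them to the messages you actually see), the filter chain could look something like:

filter {
  # peel off the syslog header and capture the remainder as raw JSON
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_host} %{PROG:syslog_program}:? %{GREEDYDATA:json_payload}" }
  }
  # decode the captured payload into top-level fields
  json {
    source => "json_payload"
  }
}

The point is that grok only has to strip the header; the json filter does the actual decoding, so you never have to build regexes for the Windows event fields themselves.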

By the way, you do know that you don't have to use Logstash to get logs into Elasticsearch, right? So if you are having trouble configuring it to handle your messages, you may want to look at using rsyslog to put the logs into Elasticsearch instead.

David Lang





Yes, I've tried all the recent grok filters to watch the alerts.log and ossec.log with Logstash directly, but as soon as I forward Windows event logs it is a pure nightmare to build proper regexes.



Therefore I really like the idea of forwarding them through the syslog_output JSON format and on the other side using Logstash's native UDP input, which is working perfectly fine!
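For reference, the input side is nothing special, just the stock UDP input listening on the port that OSSEC's syslog_output sends to (the port number and type tag here are only examples from my setup, nothing official):

input {
  udp {
    port => 9000
    type => "ossec"
  }
}

Getting the events in this way works; the problem is only with decoding the mixed syslog-plus-JSON message afterwards.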





I'm really surprised that I couldn't find any recent OSSEC configuration for the latest Logstash 1.5.0_1 release.





It would be an amazing help to have a permanent, working OSSEC syslog forwarding solution. I'm pretty sure a lot of people are looking for that in the wonderful new world of threat analytics with ELK ;-)





Thanks for any hints!



Kind Regards,

Gerald



