In a filter section, do:
grok {
  # SYSLOGBASE consumes the "timestamp host program:" header; GREEDYDATA
  # captures the rest of the line (DATA is non-greedy and would capture
  # nothing at the end of a pattern). Capturing into a separate field
  # avoids turning the existing "message" field into an array.
  match => { "message" => "%{SYSLOGBASE} %{GREEDYDATA:json_payload}" }
}
json {
  source => "json_payload"
}
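For illustration, assuming OSSEC's syslog_output delivers a line roughly like the following (the exact JSON payload depends on your ossec.conf and on the alert):

  May 26 20:06:01 ossec-server ossec: {"rule":{"level":7,"comment":"..."},"location":"/var/log/auth.log"}

the SYSLOGBASE pattern consumes the "timestamp host program:" header, the trailing capture takes the remaining JSON text, and the json filter then expands that text into separate event fields (e.g. [rule][level]).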
I'm not saying to go to rsyslog and then to logstash, I'm saying go from
rsyslog straight to ElasticSearch. There is no requirement to use logstash to
get the logs into ElasticSearch.
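As a minimal sketch of that route (assuming rsyslog v8 built with the omelasticsearch module; the server, port, and index name here are placeholders to adjust for your environment):

  module(load="omelasticsearch")

  # forward the raw payload of each message to ElasticSearch
  template(name="ossec-raw-json" type="string" string="%msg%")

  action(type="omelasticsearch"
         server="localhost"
         serverport="9200"
         searchIndex="ossec-alerts"
         template="ossec-raw-json"
         bulkmode="on")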
David Lang
On Wed, 27 May 2015, T-SOC Operations wrote:
Date: Wed, 27 May 2015 16:03:46 +0200
From: T-SOC Operations <[email protected]>
Reply-To: [email protected]
To: [email protected]
Subject: AW: [ossec-list] OSSEC 2.8.1 JSON Format and Logstash challenges
Thanks David.
I'd like to avoid rsyslog and write directly to logstash, especially since
OSSEC already supports JSON format.
Unfortunately, for the alert OSSEC is sending in JSON format and the t_source
table including the alert details, it is very hard to find a proper regex.
That's why I asked if someone is willing to share his configuration.
Anyway, before I waste too much time sorting out the proper regex for the
OSSEC JSON message, I will go for the old syslog way.
Rgds,
Gerald
-----Original Message-----
From: [email protected] [mailto:[email protected]] On
Behalf Of David Lang
Sent: Tuesday, 26 May 2015 20:06
To: [email protected]
Subject: Re: [ossec-list] OSSEC 2.8.1 JSON Format and Logstash challenges
On Tue, 26 May 2015, T-SOC Operations wrote:
hello ossec fellows,
I'm struggling with the JSON syslog_output filter. There is some "kind of"
JSON format, but logstash is not able to decode the message right away.
example json outputs in kibana4:
windows alert: http://pastebin.com/2n4jsJYS
linux alert: http://pastebin.com/UPAUq9pB
you need to look at the message that's arriving and configure logstash to
handle it
to handle this in logstash, you need to first invoke the syslog filter to
handle the first part of the message, then the json filter to handle the
json
the two samples you posted show ossec sending different formats, so I suspect
that the linux one is wrong (note the ':' at the end of the hostname, and
that the program name (ossec) is there, with a ':' after it).
It looks like you need to configure the grok filters in logstash correctly.
First to parse out the syslog header info, then to parse the json message.
by the way, you do know that you don't have to use logstash to get logs into
ElasticSearch, right? so if you are having trouble configuring it to handle
your messages, you may want to look at using rsyslog to put the logs into
logstash instead.
David Lang
yes, I've tried all the recent grok filters to watch the alerts.log and
ossec.log with logstash directly, but as soon as I forward Windows event
logs, it is a pure nightmare to build proper regexes.
Therefore I really like the idea of forwarding them through the
syslog_output JSON filter and, on the other side, using logstash's native
udp input - which is working perfectly fine!
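For reference, the receiving side can be a minimal sketch like the following (the port is an example and must match the port configured in ossec.conf's syslog_output):

  input {
    udp {
      port => 9000
    }
  }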
I'm really wondering why I couldn't find any recent ossec
configuration for the latest logstash 1.5.0_1 release.
It would be an amazing help to have a permanent, working ossec syslog
forwarding solution. I'm pretty
sure a lot of people are looking for that - in the wonderful new world
of threat analytics with ELK ;-)
Thanks for any hints!
Kind Regards,
Gerald