Correct, the grok parser pattern file is on HDFS.

I combined the parserConfigs and it's not working. The error has changed,
though, back to the timestamp.


java.lang.IllegalStateException: Grok parser Error: For input string: 
"2017-05-03T23:28:58849Z" on 
{"@timestamp":"2017-05-03T23:28:58.849Z","beat":{"hostname"

So I changed the parser config to this and it worked :)

{
  "parserClassName":"org.apache.metron.parsers.json.JSONMapParser",
  "sensorTopic":"winlogbeat",
  "parserConfig": {}
}
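
For anyone following along, I push the updated config into ZooKeeper with something
like this before redeploying the parser topology (paths assume the default 0.4.0
layout, and that the JSON above is saved as parsers/winlogbeat.json under the
zookeeper config directory, so adjust for your install):

# push the updated winlogbeat sensor config from disk into ZooKeeper
/usr/metron/$METRON_VERSION/bin/zk_load_configs.sh -m PUSH \
  -i /usr/metron/$METRON_VERSION/config/zookeeper \
  -z `hostname -f`:2181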


I had to modify the Elasticsearch index, but it seems to be in like Flynn.

I am missing some information that is being slurped by the JSON parser, but let
me noodle on it a bit before I come back.

Thanks for your help, this is better than any support call to any company.

Here is the script I use to delete the index and rebuild it.

curl -XDELETE 127.0.0.1:9200/winlogbeat*

curl -XPOST 127.0.0.1:9200/_template/winlogbeat_index -d '
{
  "template": "winlogbeat_index*",
  "mappings": {
    "winlogbeat_doc": {
      "_timestamp": {
        "enabled": true
      },
      "properties": {
        "@timestamp": {
          "type": "date",
          "format": "dateOptionalTime"
        },
        "timestamp": {
          "type": "date",
          "format": "epoch_millis"
        },
        "keywords": {
          "type": "string",
          "index": "not_analyzed"
        },
        "event_id": {
          "type": "string",
          "index": "not_analyzed"
        },
        "computer_name": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}'
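
A quick sanity check I run afterwards, using the standard Elasticsearch endpoints,
to confirm the template registered and the mapping gets applied once new documents
arrive:

# confirm the template is registered
curl -XGET '127.0.0.1:9200/_template/winlogbeat_index?pretty'

# confirm the mapping on the live index after data starts flowing again
curl -XGET '127.0.0.1:9200/winlogbeat*/_mapping?pretty'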


________________________________

From: Simon Elliston Ball <si...@simonellistonball.com>
Sent: Wednesday, May 3, 2017 8:33 PM
To: user@metron.apache.org
Subject: Re: Question on Windows event log ingest and parse

And just to check… you have the pattern definition you previously sent in 
/patterns/winlogbeat (file) on HDFS.

It looks like the most likely problem from your config is that you have two 
parserConfig elements. I suspect the second is overriding the first, and hence 
you are losing the grokPath config. If you move the dc2tz element into the 
first parserConfig, you should be good.
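
Just as a rough sketch of what I mean, with everything in a single parserConfig 
block (the patternLabel and dc2tz values below are only placeholders, keep 
whatever you already have; the point is that grokPath, the timestamp settings 
and dc2tz all live together):

{
  "parserClassName": "org.apache.metron.parsers.GrokParser",
  "sensorTopic": "winlogbeat",
  "parserConfig": {
    "grokPath": "/patterns/winlogbeat",
    "patternLabel": "WINLOGBEAT",
    "timestampField": "timestamp",
    "dc2tz": { "<your-dc>": "<your-timezone>" }
  }
}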

As an aside from a quick look at your pattern, it looks like it may be easier 
to use the JSONMapParser for this particular sensor.

Simon

On 4 May 2017, at 01:28, ed d <ragdel...@hotmail.com> wrote:

Correction: the command for deploying the Storm topology is this:


/usr/metron/$METRON_VERSION/bin/start_parser_topology.sh -z `hostname -f`:2181 
-k `hostname -f`:6667 -s winlogbeat
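
After it deploys, I check that it actually stays up with the standard Storm CLI 
(the path to the storm command may differ depending on your install):

# list running topologies; winlogbeat should show a status of ACTIVE
storm list

# if I need to redeploy, I kill the old one first and let it drain
storm kill winlogbeat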





________________________________
From: Simon Elliston Ball <si...@simonellistonball.com>
Sent: Wednesday, May 3, 2017 5:59 PM
To: user@metron.apache.org
Subject: Re: Question on Windows event log ingest and parse
Subject: Re: Question on Windows event log ingest and parse

Hi Ed,

Sounds like a really nice piece of work to get pushed into the core… how would 
you feel about taking that grok parser and formalising it into the core of 
Metron? (Happy to help there, by the way.)

On the actual issue, it sounds like it's likely to be something to do with 
conversion of the timestamp format to the unix time used in Metron. We can look 
at that. Did you see any log messages in the Storm logs from the topology that 
died?
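
If you're not sure where to look, on an Ambari-managed install the Storm worker 
logs usually end up under /var/log/storm (adjust if your storm.log.dir points 
elsewhere), so something like this should surface the stack trace:

# pull recent exceptions out of the Storm worker logs
grep -ri "exception" /var/log/storm/ --include="worker.log*" | tail -n 50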

Simon


On 3 May 2017, at 22:34, ed d <ragdel...@hotmail.com> wrote:


Metron version – 0.4.0
Single-node, bare-metal install
No significant changes to the base install besides putting the Elasticsearch 
mpack in maintenance mode and configuring it manually.

I have a Windows 2012 server running AD, AD LDS, DNS, and DHCP. I installed 
Winlogbeat 5.3.2 (64-bit) <https://www.elastic.co/downloads/beats/winlogbeat> onto 
the server. It was configured to push logs to Elasticsearch on my Metron 
install, and it works great. No issues.

I modified the Winlogbeat configuration to push logs directly to Kafka, as I 
want to enrich the logs. I followed this guide: 
<https://www.elastic.co/guide/en/beats/winlogbeat/master/kafka-output.html>.
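
Roughly speaking, the Kafka output section from that guide ends up looking like 
this in winlogbeat.yml (the broker host/port and topic here just mirror my 
single-node setup, so substitute your own):

output.kafka:
  # the Kafka broker on the Metron node and the topic the parser will read from
  hosts: ["metron-host.example.com:6667"]
  topic: "winlogbeat"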

I can see logs coming into the Kafka topic, so I built a Grok parser to slice 
and dice. It seems to work fine on Grok Constructor 
<http://grokconstructor.appspot.com/do/match> and Grok Debugger 
<https://grokdebug.herokuapp.com/>, but when I load it into Metron as a 
parser, it kills the Storm topology. It seems to be sticking on the timestamp, 
which is in ISO 8601 format <https://en.wikipedia.org/wiki/ISO_8601> 
(2017-05-03T21:04:33Z).

My question to the group, before I troubleshoot my install: has anyone else had 
success ingesting and parsing Windows event logs?

Does anyone pull Windows logs into Kafka, NiFi, or other tools with the intent 
to enrich the elements of the log? And if yes, what have you found to be most 
useful?

FYI here is my Grok parser for reference:

timestamp"\:"%{TIMESTAMP_ISO8601:timestamp}","beat"\:\{"hostname"\:%{QUOTEDSTRING:hostname},"name"\:%{QUOTEDSTRING:name},"version"\:%{QUOTEDSTRING:beat_version}\},"computer_name"\:%{QUOTEDSTRING:computer_name},"event_data"\:\{("AuthenticationPackageName"\:%{QUOTEDSTRING:AuthenticationPackageName},?)?("ImpersonationLevel"\:%{QUOTEDSTRING:ImpersonationLevel},?)?("FailureReason"\:%{QUOTEDSTRING:FailureReason},?)?("IpAddress"\:"%{IP:ip_src_addr}",?)?("IpPort"\:%{QUOTEDSTRING:IpPort},?)?("KeyLength"\:%{QUOTEDSTRING:KeyLength},?)?("LmPackageName"\:%{QUOTEDSTRING:LmPackageName},?)?("LogonGuid"\:%{QUOTEDSTRING:LogonGuid},?)?("LogonProcessName"\:%{QUOTEDSTRING:LogonProcessName},?)?("LogonType"\:%{QUOTEDSTRING:LogonType},?)?("PrivilegeList"\:%{QUOTEDSTRING:PrivilegeList},?)?("ProcessId"\:%{QUOTEDSTRING:ProcessId},?)?("ProcessName"\:%{QUOTEDSTRING:ProcessName},?)?("PackageName"\:%{QUOTEDSTRING:PackageName},?)?("Status"\:%{QUOTEDSTRING:Status},?)?("SubStatus"\:%{QUOTEDSTRING:SubStatus},?)?("SubjectDomainName"\:%{QUOTEDSTRING:SubjectDomainName},?)?("SubjectLogonId"\:%{QUOTEDSTRING:SubjectLogonId},?)?("SubjectUserName"\:%{QUOTEDSTRING:SubjectUserName},?)?("SubjectUserSid"\:%{QUOTEDSTRING:SubjectUserSid},?)?("TargetDomainName"\:%{QUOTEDSTRING:TargetDomainName},?)?("TargetLogonId"\:%{QUOTEDSTRING:TargetLogonId},?)?("TargetUserName"\:%{QUOTEDSTRING:TargetUserName},?)?("TargetUserSid"\:%{QUOTEDSTRING:TargetUserSid},?)?("TransmittedServices"\:%{QUOTEDSTRING:TransmittedServices},?)?("Workstation"\:%{QUOTEDSTRING:Workstation},?)?("WorkstationName"\:%{QUOTEDSTRING:WorkstationName},?)?\},"event_id"\:%{NUMBER:event_id},"keywords"\:\[%{QUOTEDSTRING:keywords}\],"level"\:%{QUOTEDSTRING:level},"log_name"\:%{QUOTEDSTRING:log_name},"message"\:%{QUOTEDSTRING:message},"opcode"\:%{QUOTEDSTRING:opcode},"process_id"\:%{NUMBER:process_id},"provider_guid"\:%{QUOTEDSTRING:provider_guid},"record_number"\:%{QUOTEDSTRING:record_number},"source_name"\:%{QUOTEDSTRING:source_name},"task"\:%{QUOTEDSTRING:task},"thread_id"\:%{NUMBER:thread_id},"type"\:%{QUOTEDSTRING:type},?("version"\:%{NUMBER:version},?)?\}
