I'm pushing NetFlow into Elasticsearch via ntopng, and a block of fields comes
in under a nested "json" object keyed by number (json.N). I've included a
sample document from Kibana. Two questions:
1) Is there a way I can push these docs into Redis instead, so that I can then
pull them out with Logstash and have the ability to mutate fields?
2) Is there a way to define these fields before ntopng ships them to
Elasticsearch?
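For (1), the Logstash side I have in mind would look roughly like the sketch
below. The Redis key name "ntopng" and the rename target are just placeholders
I made up; whatever writes the flows into Redis would need to push JSON onto
the same list key.

```
# Hypothetical Logstash pipeline: read flows from a Redis list,
# mutate fields, then index into Elasticsearch.
input {
  redis {
    host      => "127.0.0.1"
    data_type => "list"
    key       => "ntopng"   # placeholder -- must match the producer's key
    codec     => json
  }
}

filter {
  mutate {
    # Example mutation: rename a flow field to something friendlier.
    rename => { "IPV4_SRC_ADDR" => "src_ip" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ntopng-%{+YYYY.MM.dd}"
  }
}
```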
Here's the JSON:
{
  "_index": "ntopng-2015.12.10",
  "_type": "ntopng",
  "_id": "AVGNJSTcITc7jbmnrBAl",
  "_score": null,
  "_source": {
    "@timestamp": "2015-12-10T18:26:04.0Z",
    "type": "ntopng",
    "IPV4_SRC_ADDR": "192.168.118.16",
    "L4_SRC_PORT": 52009,
    "IPV4_DST_ADDR": "199.16.156.70",
    "L4_DST_PORT": 443,
    "PROTOCOL": 6,
    "L7_PROTO": 91,
    "L7_PROTO_NAME": "SSL",
    "TCP_FLAGS": 0,
    "IN_PKTS": 8,
    "IN_BYTES": 838,
    "OUT_PKTS": 0,
    "OUT_BYTES": 0,
    "FIRST_SWITCHED": 1449771964,
    "LAST_SWITCHED": 1449771964,
    ##HERE IS THE BLOCK OF WEIRD FIELDS##
    "json": {
      "5": "0",
      "9": "0",
      "10": "1",
      "13": "0",
      "14": "16",
      "15": "0.0.0.0",
      "16": "6522",
      "17": "13414",
      "42": "32102093"
    },
    ##END OF WEIRD FIELDS##
    "CLIENT_NW_LATENCY_MS": 0,
    "SERVER_NW_LATENCY_MS": 0,
    "SRC_IP_COUNTRY": "US",
    "SRC_IP_LOCATION": [
      -75.354698,
      40.590199
    ],
    "DST_IP_COUNTRY": "US",
    "DST_IP_LOCATION": [
      -122.393303,
      37.769699
    ],
    "PASS_VERDICT": true
  },
  "fields": {
    "@timestamp": [
      1449771964000
    ]
  },
  "sort": [
    1449771964000
  ]
}
--
Munroe Sollog
LTS - Network Analyst
x85002
_______________________________________________
Ntop mailing list
[email protected]
http://listgateway.unipi.it/mailman/listinfo/ntop