Wow, you are a glutton for punishment. Nice work getting that far. But, like I
said, metron-docker is not what you need right now. You should be able to run
"Full Dev" with a single command; it should not be painful for you at all. A
pre-built Metron dashboard gets loaded into Kibana during the deployment of
Full Dev.

On Wed, Jul 19, 2017 at 1:48 PM, Kashif Chowdhree <[email protected]> wrote:

> Yes, just playing around with it and seeing how the components fit together
> as a lightweight version of it with real traffic. I needed to make numerous
> docker-compose, Dockerfile and Ansible role modifications, but got there
> eventually, and I do have it all working now (enriched snort and bro data
> in ES); that final set of errors was due to an hbase-master startup timing
> dependency on ZooKeeper.
>
> There don't seem to be any preloaded Metron UI dashboards in Kibana, which
> is a shame, as I was interested in seeing what the visualisations looked
> like against my data (I suppose I will have to spin up the Full Dev VM
> after all). Next I'd like to create a Splunk + Kafka Connect equivalent
> container and strip out ES + Kibana.
>
> --
> Regards,
> Kashif Chowdhree
>
> On 19 July 2017 at 16:00, Nick Allen <[email protected]> wrote:
>
>> What are you trying to do? Are you just trying to experiment with Metron?
>> If so, I would suggest that you use the "Full Dev" VM environment for
>> this. [1] The Docker setup is only intended for Metron developers; it is
>> not as well tested as our "Full Dev" VM.
>>
>> [1] https://github.com/apache/metron/tree/master/metron-deployment/vagrant/full-dev-platform
>>
>> On Wed, Jul 19, 2017 at 8:04 AM, Kashif Chowdhree <[email protected]> wrote:
>>
>>> Hi,
>>>
>>> I've set up metron-docker and successfully have snort and bro logs
>>> streaming into their respective Kafka topics (I tweaked the
>>> docker-compose configs because I didn't want to use docker-machine, plus
>>> I have live bro and snort sensors running). The enrichment topology
>>> starts fine, and I can see enriched data if I consume the Kafka topic.
>>>
>>> The issue I have is that the indexing topology doesn't seem to generate
>>> anything into its Kafka topic, and there are no errors in the logs aside
>>> from those below. What is it that creates the Elasticsearch index and
>>> thus allows Kibana to search against it? No indexes ever get created,
>>> per http://elasticsearch:9200/_cat/indices?v
>>>
>>> health status index   pri rep docs.count docs.deleted store.size pri.store.size
>>> yellow open   .kibana   1   1          1            0      3.1kb          3.1kb
>>>
>>> Excerpt of errors from /usr/share/apache-storm/logs/workers-artifacts/indexing-4-1500464220/6703/worker.log:
>>>
>>> 2017-07-19 11:37:30.219 o.a.z.ClientCnxn [INFO] Socket connection established to elasticsearch/192.168.111.3:2181, initiating session
>>> 2017-07-19 11:37:30.217 o.a.c.f.r.c.TreeCache [ERROR] com.fasterxml.jackson.core.metron.elasticsearch.JsonParseException: Unrecognized token 'indexing': was expecting ('true', 'false' or 'null')
>>>  at [Source: java.io.ByteArrayInputStream@3c456c02; line: 1, column: 17]
>>>     at com.fasterxml.jackson.core.metron.elasticsearch.JsonParser._constructError(JsonParser.java:1581) ~[stormjar.jar:?]
>>>     at com.fasterxml.jackson.core.metron.elasticsearch.base.ParserMinimalBase._reportError(ParserMinimalBase.java:533) ~[stormjar.jar:?]
>>>     at com.fasterxml.jackson.core.metron.elasticsearch.json.UTF8StreamJsonParser._reportInvalidToken(UTF8StreamJsonParser.java:3451) ~[stormjar.jar:?]
>>>     at com.fasterxml.jackson.core.metron.elasticsearch.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2610) ~[stormjar.jar:?]
>>>     at com.fasterxml.jackson.core.metron.elasticsearch.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:841) ~[stormjar.jar:?]
>>>     at com.fasterxml.jackson.core.metron.elasticsearch.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:737) ~[stormjar.jar:?]
>>>     at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3847) ~[stormjar.jar:?]
>>>     at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3792) ~[stormjar.jar:?]
>>>     at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2874) ~[stormjar.jar:?]
>>>     at org.apache.metron.common.utils.JSONUtils.load(JSONUtils.java:41) ~[stormjar.jar:?]
>>>     at org.apache.metron.common.configuration.IndexingConfigurations.updateSensorIndexingConfig(IndexingConfigurations.java:52) ~[stormjar.jar:?]
>>>     at org.apache.metron.common.configuration.IndexingConfigurations.updateSensorIndexingConfig(IndexingConfigurations.java:48) ~[stormjar.jar:?]
>>>     at org.apache.metron.common.bolt.ConfiguredIndexingBolt.updateConfig(ConfiguredIndexingBolt.java:54) ~[stormjar.jar:?]
>>>     at org.apache.metron.common.bolt.ConfiguredBolt$1.childEvent(ConfiguredBolt.java:94) ~[stormjar.jar:?]
>>>     at org.apache.curator.framework.recipes.cache.TreeCache$2.apply(TreeCache.java:685) [stormjar.jar:?]
>>>     at org.apache.curator.framework.recipes.cache.TreeCache$2.apply(TreeCache.java:679) [stormjar.jar:?]
>>>     at org.apache.curator.framework.listen.ListenerContainer$1.run(ListenerContainer.java:92) [stormjar.jar:?]
>>>     at org.apache.metron.guava.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:297) [stormjar.jar:?]
>>>     at org.apache.curator.framework.listen.ListenerContainer.forEach(ListenerContainer.java:84) [stormjar.jar:?]
>>>     at org.apache.curator.framework.recipes.cache.TreeCache.callListeners(TreeCache.java:678) [stormjar.jar:?]
>>>     at org.apache.curator.framework.recipes.cache.TreeCache.access$1400(TreeCache.java:69) [stormjar.jar:?]
>>>     at org.apache.curator.framework.recipes.cache.TreeCache$4.run(TreeCache.java:790) [stormjar.jar:?]
>>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_101]
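
One aside on the TreeCache error in that worker log: the failing call is
JSONUtils.load on the bytes stored at the sensor's indexing configuration
znode, and Jackson rejects them because the znode apparently holds the bare
token 'indexing' rather than a JSON document. A quick way to confirm is to
pull the znode contents yourself and try to parse them. The sketch below is
only illustrative, not something from the Metron codebase: the ZooKeeper
connect string, the znode path (/metron/topology/indexing/snort) and the
sensor name are assumptions, so adjust them to whatever your metron-docker
compose file actually uses.

import com.fasterxml.jackson.databind.ObjectMapper;
import java.nio.charset.StandardCharsets;
import org.apache.curator.framework.CuratorFramework;
import org.apache.curator.framework.CuratorFrameworkFactory;
import org.apache.curator.retry.ExponentialBackoffRetry;

public class CheckIndexingConfig {
  public static void main(String[] args) throws Exception {
    // Assumptions for this sketch: adjust the quorum and the znode path to
    // match your metron-docker environment and sensor name.
    String zookeeperQuorum = "zookeeper:2181";
    String znodePath = "/metron/topology/indexing/snort";

    CuratorFramework client = CuratorFrameworkFactory.newClient(
        zookeeperQuorum, new ExponentialBackoffRetry(1000, 3));
    client.start();
    try {
      // Dump the raw bytes the indexing bolt will see for this sensor.
      byte[] data = client.getData().forPath(znodePath);
      System.out.println("Raw znode contents: "
          + new String(data, StandardCharsets.UTF_8));

      // The bolt ultimately hands these bytes to Jackson; a bare word such
      // as "indexing" instead of a JSON document fails with exactly
      // "Unrecognized token 'indexing': was expecting ('true', 'false' or 'null')".
      new ObjectMapper().readTree(data);
      System.out.println("Config parses as valid JSON");
    } finally {
      client.close();
    }
  }
}

If the raw contents turn out not to be valid JSON, pushing a proper sensor
indexing configuration back to ZooKeeper (Metron ships a zk_load_configs.sh
utility for this) should let ConfiguredIndexingBolt pick up the update, after
which the indexing topology can start writing and the Elasticsearch index
should appear.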
