I don't think so at present. You can have several Graylog2 servers, but only one ES cluster; you cannot search across more than one ES cluster.
I understand your strategy, disconnected indexing with distributed search, but Graylog2 cannot search in more than one ES cluster. I was thinking about using one ES cluster with two nodes, and two Graylog2 instances, each with its own index prefix in the same ES cluster. A Graylog2 search might then cover all graylog2_* indices and therefore search through them all. This is not a recommended strategy though, just a thought.

/Martin

On Wednesday, 18 June 2014 13:16:35 UTC+2, [email protected] wrote:
>
> I have two Graylog2 servers at two locations: Server1 and Server2.
>
> Server1 holds the Mongo database. Both servers use the Mongo database on
> Server1.
>
> However, both servers also store their data in Elasticsearch on Server1. If
> Server1 goes down, Server2 will stop receiving messages.
>
> Server1 should store its data in ES on Server1.
> Server2 should store its data in ES on Server2.
>
> So when Server1 goes down, Server2 should still be receiving messages.
>
> I would rather not replicate the ES indices or in some other way use
> double the disk space or cause massive network load.
>
> The main goal is to have a dedicated Graylog2+ES server at each location,
> receiving messages from hosts at their respective location, while being able to
> search both ES indices via one web interface.
>
> Is this possible? And if so, how?
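To make the thought concrete, a minimal sketch of what the two-instance setup might look like in each node's graylog2.conf. This assumes both instances join the same ES cluster; the cluster name and prefix values here are just examples, not tested settings:

    # graylog2.conf on the Location1 instance (example values)
    elasticsearch_cluster_name = graylog2-shared
    elasticsearch_index_prefix = graylog2_loc1

    # graylog2.conf on the Location2 instance (example values)
    elasticsearch_cluster_name = graylog2-shared
    elasticsearch_index_prefix = graylog2_loc2

You could sanity-check that a wildcard query reaches both sets of indices directly against ES, e.g. with curl against one of the nodes (assuming the default port): curl 'http://localhost:9200/graylog2_*/_search?q=source:somehost' — but note this does not help Server2 keep receiving messages when Server1 is down, since a single ES cluster split across two locations is still one failure domain.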
