Salutations Lennart,

Thanks for your reply! I believe you are correct, partner. I had an older installation on my system. Should I install 0.20.1 fresh, or can I recover what I have?
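In case it helps the diagnosis: if I'm reading the ClassCastException right, the date_histogram facet fails because the timestamp field in the old graylog2_0 index is mapped as a string, so Elasticsearch can't treat it as numeric field data. A quick sanity check might be to pull the mapping (e.g. curl http://localhost:9200/graylog2_0/_mapping - host, port, and index name assumed from the defaults, not confirmed) and look at the declared type of timestamp. Something like this little Python sketch is what I have in mind; the two sample responses are illustrative, not captured from my cluster:

```python
import json

def timestamp_type(mapping, index="graylog2_0"):
    """Given a parsed Elasticsearch 0.90-style /_mapping response
    ({index: {doc_type: {"properties": {...}}}}), return the declared
    type of the 'timestamp' field, or None if it is not mapped."""
    for doc_type in mapping.get(index, {}).values():
        field = doc_type.get("properties", {}).get("timestamp")
        if isinstance(field, dict):
            return field.get("type")
    return None

# Illustrative _mapping bodies (hypothetical, shaped like 0.90 output):
old_index = json.loads(
    '{"graylog2_0": {"message": {"properties": {"timestamp": {"type": "string"}}}}}')
new_index = json.loads(
    '{"graylog2_0": {"message": {"properties": {"timestamp": {"type": "date"}}}}}')

print(timestamp_type(old_index))  # a string-typed timestamp would explain the cast error
print(timestamp_type(new_index))  # what a freshly created 0.20.x index should report
```

If it comes back as "string", my guess is the recovery path is deleting (or reindexing) the old indices so 0.20.1 can recreate them with its own mapping, but I'd rather hear that from you first.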
Cheers!
Terron

On Thursday, March 13, 2014 6:14:41 PM UTC-4, lennart wrote:
> Hey Terron,
>
> before we dig deeper into the issue: Could it be that you upgraded an
> old Graylog2 installation to the 0.20 series?
>
> Thanks,
> Lennart
>
> On Wed, Mar 12, 2014 at 4:45 PM, Terron Williams <[email protected]> wrote:
> > Friends,
> >
> > I just installed graylog2-server-0.20.1 & graylog2-web-interface-0.20.1,
> > and when I perform a search from the Graylog web interface, I receive an
> > exception. Please see below. Any ideas? Please forgive if this issue is
> > already known.
> >
> > Thanks much in advance!
> >
> > Terron
> >
> > <Load line Up>
> > * graylog2-server-0.20.1
> > * graylog2-web-interface-0.20.1
> > * Linux version 2.6.32-431.5.1.el6.x86_64 ([email protected])
> >   (gcc version 4.4.7 20120313 (Red Hat 4.4.7-4) (GCC)) #1 SMP Wed Feb 12 00:41:43 UTC 2014
> > * java version "1.7.0_51"
> > * rpm -qpil elasticsearch-0.90.10.noarch.rpm
> > * MongoDB version: 2.4.9
> >
> > // graylog2-server-0.20.1 config
> >
> > grep -v "#" /etc/graylog2.conf | egrep -v "^[[:space:]]*$"
> >
> > * is_master = true
> > * node_id_file = /etc/graylog2-server-node-id
> > * password_secret = VrjK7vEqdABVMk93mDrW1WxcNzitWXOHzjJ3Mgzs6a3YnPqO5chfdn5xm7vtBAtxVl6jCilBXKLeIoV3rVNpNq7ZwTW0qjiY
> > * root_password_sha2 = d2c3c5a9fa646162d110cda388a251171d65b4ddb1d74443c62fa7da6b56d31b
> > * plugin_dir = plugin
> > * rest_listen_uri = http://127.0.0.1:12900/
> > * elasticsearch_max_docs_per_index = 20000000
> > * elasticsearch_max_number_of_indices = 20
> > * retention_strategy = delete
> > * elasticsearch_shards = 1
> > * elasticsearch_replicas = 0
> > * elasticsearch_index_prefix = graylog2
> > * allow_leading_wildcard_searches = false
> > * elasticsearch_cluster_name = elasticsearch
> > * elasticsearch_analyzer = standard
> > * output_batch_size = 5000
> > * processbuffer_processors = 5
> > * outputbuffer_processors = 5
> > * processor_wait_strategy = blocking
> > * ring_size = 1024
> > * dead_letters_enabled = false
> > * mongodb_useauth = false
> > * mongodb_host = 127.0.0.1
> > * mongodb_database = graylog2
> > * mongodb_port = 27017
> > * mongodb_max_connections = 100
> > * mongodb_threads_allowed_to_block_multiplier = 5
> > * transport_email_enabled = false
> > * transport_email_hostname = mail.example.com
> > * transport_email_port = 587
> > * transport_email_use_auth = true
> > * transport_email_use_tls = true
> > * transport_email_use_ssl = true
> > * transport_email_auth_username = [email protected]
> > * transport_email_auth_password = secret
> > * transport_email_subject_prefix = [graylog2]
> > * transport_email_from_email = [email protected]
> >
> > // graylog2-web-interface-0.20.1 config
> >
> > # grep -v "#" /root/Downloads/graylog2-web-interface-0.20.1/conf/graylog2-web-interface.conf | egrep -v "^[[:space:]]*$"
> >
> > * graylog2-server.uris="http://127.0.0.1:12900/"
> > * application.secret=gnBBVpVKWMoS2NlWBlQhcwPdeB3qJyK9f1axCLkYCOPAlDVV2ztkeNuOmfLxH2hziyBLwQbvLetZMM5LKTWhFRFM7CSzjNQE
> > * field_list_limit=0
> > * application.global=lib.Global
> >
> > // Starting graylog server
> >
> > tail -f graylog2-server.log
> >
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >     at java.lang.Thread.run(Thread.java:744)
> > 2014-03-11 14:21:54,191 INFO : org.graylog2.Main - Graylog2 0.20.1 starting up. (JRE: Oracle Corporation 1.7.0_51 on Linux 2.6.32-431.5.1.el6.x86_64)
> > 2014-03-11 14:21:54,662 INFO : org.graylog2.plugin.system.NodeId - Node ID: bcfdcd43-addc-451f-bca4-2d88852ccb09
> > 2014-03-11 14:21:54,664 INFO : org.graylog2.Core - No rest_transport_uri set. Falling back to [http://172.17.23.157:12900].
> > 2014-03-11 14:21:56,208 INFO : org.graylog2.buffers.ProcessBuffer - Initialized ProcessBuffer with ring size <1024> and wait strategy <BlockingWaitStrategy>.
> > 2014-03-11 14:21:56,228 INFO : org.graylog2.buffers.OutputBuffer - Initialized OutputBuffer with ring size <1024> and wait strategy <BlockingWaitStrategy>.
> > 2014-03-11 14:21:58,700 INFO : org.elasticsearch.node - [graylog2-server] version[0.90.10], pid[3171], build[0a5781f/2014-01-10T10:18:37Z]
> > 2014-03-11 14:21:58,700 INFO : org.elasticsearch.node - [graylog2-server] initializing ...
> > 2014-03-11 14:21:58,853 INFO : org.elasticsearch.plugins - [graylog2-server] loaded [], sites []
> > 2014-03-11 14:22:09,571 INFO : org.elasticsearch.node - [graylog2-server] initialized
> > 2014-03-11 14:22:09,571 INFO : org.elasticsearch.node - [graylog2-server] starting ...
> > 2014-03-11 14:22:09,711 INFO : org.elasticsearch.transport - [graylog2-server] bound_address {inet[/0:0:0:0:0:0:0:0:9350]}, publish_address {inet[/10.22.23.48:9350]}
> > 2014-03-11 14:22:12,734 WARN : org.elasticsearch.discovery - [graylog2-server] waited for 3s and no initial state was set by the discovery
> > 2014-03-11 14:22:12,734 INFO : org.elasticsearch.discovery - [graylog2-server] elasticsearch/BzASbvxgRS-xxP1bfMLxkA
> > 2014-03-11 14:22:12,735 INFO : org.elasticsearch.node - [graylog2-server] started
> > 2014-03-11 14:22:12,971 INFO : org.elasticsearch.cluster.service - [graylog2-server] detected_master [graylog2-server][VWdvN_zxRtiWuRFdwz0kcw][inet[/10.22.23.48:9300]], added {[graylog2-server][VWdvN_zxRtiWuRFdwz0kcw][inet[/10.22.23.48:9300]],}, reason: zen-disco-receive(from master [[graylog2-server][VWdvN_zxRtiWuRFdwz0kcw][inet[/10.22.23.48:9300]]])
> > 2014-03-11 14:22:13,912 INFO : org.graylog2.Core - Setting up deflector.
> > 2014-03-11 14:22:13,928 INFO : org.graylog2.indexer.Deflector - Found deflector alias <graylog2_deflector>. Using it.
> > 2014-03-11 14:22:13,956 INFO : org.graylog2.initializers.DroolsInitializer - Not using rules
> > 2014-03-11 14:22:13,957 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.DroolsInitializer>.
> > 2014-03-11 14:22:13,963 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.HostCounterCacheWriterInitializer>.
> > 2014-03-11 14:22:13,969 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.ThroughputCounterInitializer>.
> > 2014-03-11 14:22:13,971 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.NodePingInitializer>.
> > 2014-03-11 14:22:13,973 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.AlarmScannerInitializer>.
> > 2014-03-11 14:22:13,974 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.DeflectorThreadsInitializer>.
> > 2014-03-11 14:22:13,976 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.AnonymousInformationCollectorInitializer>.
> > 2014-03-11 14:22:13,986 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.IndexRetentionInitializer>.
> > 2014-03-11 14:22:13,992 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.MasterCacheWorkersInitializer>.
> > 2014-03-11 14:22:13,998 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.ClusterHealthCheckInitializer>.
> > 2014-03-11 14:22:14,005 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.StreamThroughputCounterInitializer>.
> > 2014-03-11 14:22:14,244 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.VersionCheckInitializer>.
> > 2014-03-11 14:22:14,281 INFO : org.graylog2.initializers.Initializers - Initialized initializer <org.graylog2.initializers.DeadLetterInitializer>.
> > 2014-03-11 14:22:14,281 INFO : org.graylog2.outputs.OutputRegistry - Initialized output <org.graylog2.outputs.ElasticSearchOutput>.
> > 2014-03-11 14:22:14,677 INFO : org.graylog2.inputs.InputRegistry - Starting [org.graylog2.inputs.syslog.udp.SyslogUDPInput] input with ID <531f55d00cf2d233ccd13197>
> > 2014-03-11 14:22:15,393 INFO : org.graylog2.inputs.syslog.udp.SyslogUDPInput - Started syslog UDP input server on /0.0.0.0:514
> > 2014-03-11 14:22:15,703 INFO : org.graylog2.inputs.InputRegistry - Completed starting [org.graylog2.inputs.syslog.udp.SyslogUDPInput] input with ID <531f55d00cf2d233ccd13197>
> >
> > // Starting web interface
> >
> > [root@debugger graylog2-web-interface-0.20.1]# bin/graylog2-web-interface
> > Play server process ID is 3380
> > [debug] application - Loading timeout value into cache from configuration for key DEFAULT: Not configured, falling back to default.
> > [debug] application - Loading timeout value into cache from configuration for key node_refresh: Not configured, falling back to default.
> > [info] play - Application started (Prod)
> > [info] play - Listening for HTTP on /0:0:0:0:0:0:0:0:9000
> >
> > // Logged into web interface - cool
> >
> > // Added syslog input attributes and confirmed event data is being received
> >
> > // Performed search query and exception occurs
> >
> > // Exception caught in graylog2-server.log when trying to search
> >
> > 2014-03-11 13:31:49,413 ERROR: org.graylog2.plugin.rest.AnyExceptionClassMapper - Unhandled exception in REST resource
> > org.elasticsearch.action.search.SearchPhaseExecutionException: Failed to execute phase [query_fetch], all shards failed; shardFailures {[P3vhBBg-RTOVoNl2prjj8Q][graylog2_0][0]: RemoteTransportException[[Bela][inet[/10.22.23.48:9301]][search/phase/query+fetch]]; nested: SearchParseException[[graylog2_0][0]: query[_all:tgb],from[-1],size[-1]: Parse Failure [Failed to parse source [{"query":{"query_string":{"query":"tgb","allow_leading_wildcard":false}},"facets":{"histogram":{"date_histogram":{"field":"timestamp","interval":"minute"},"facet_filter":{"bool":{"must":{"range":{"timestamp":{"from":"2014-03-11 18:26:49.407","to":"2014-03-11 18:31:49.407","include_lower":true,"include_upper":true}}}}}}}}]]]; nested: ClassCastException[org.elasticsearch.index.fielddata.plain.PagedBytesIndexFieldData cannot be cast to org.elasticsearch.index.fielddata.IndexNumericFieldData]; }
> >     at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:272)
> >     at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$3.onFailure(TransportSearchTypeAction.java:224)
> >     at org.elasticsearch.search.action.SearchServiceTransportAction$7.handleException(SearchServiceTransportAction.java:324)
> >     at org.elasticsearch.transport.netty.MessageChannelHandler.handleException(MessageChannelHandler.java:181)
> >     at org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(MessageChannelHandler.java:171)
> >     at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:123)
> >     at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
> >     at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
> >     at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
> >     at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
> >     at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
> >     at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
> >     at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
> >     at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
> >     at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
> >     at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
> >     at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
> >     at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
> >     at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
> >     at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
> >     at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
> >     at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
> >     at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
> >     at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
> >     at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >     at java.lang.Thread.run(Thread.java:744)
> >
> > // Web interface error reported:
> >
> > (You caused a lib.APIException. API call failed GET http://172.17.23.157:12900/search/universal/relative/histogram?interval=minute&query=tgb&range=300&range_type=relative&filter=* returned 500 Internal Server Error body: Failed to execute phase [query_fetch], all shards failed; shardFailures {[VWdvN_zxRtiWuRFdwz0kcw][graylog2_0][0]: RemoteTransportException[[graylog2-server][inet[/10.22.23.48:9300]][search/phase/query+fetch]]; nested: SearchParseException[[graylog2_0][0]: query[_all:tgb],from[-1],size[-1]: Parse Failure [Failed to parse source [{"query":{"query_string":{"query":"tgb","allow_leading_wildcard":false}},"facets":{"histogram":{"date_histogram":{"field":"timestamp","interval":"minute"},"facet_filter":{"bool":{"must":{"range":{"timestamp":{"from":"2014-03-11 19:22:10.221","to":"2014-03-11 19:27:10.222","include_lower":true,"include_upper":true}}}}}}}}]]]; nested: ClassCastException[org.elasticsearch.index.fielddata.plain.PagedBytesIndexFieldData cannot be cast to org.elasticsearch.index.fielddata.IndexNumericFieldData]; }
> > org.elasticsearch.action.search.SearchPhaseExecutionException: Failed to execute phase [query_fetch], all shards failed; shardFailures {[VWdvN_zxRtiWuRFdwz0kcw][graylog2_0][0]: RemoteTransportException[[graylog2-server][inet[/10.22.23.48:9300]][search/phase/query+fetch]]; nested: SearchParseException[[graylog2_0][0]: query[_all:tgb],from[-1],size[-1]: Parse Failure [Failed to parse source [{"query":{"query_string":{"query":"tgb","allow_leading_wildcard":false}},"facets":{"histogram":{"date_histogram":{"field":"timestamp","interval":"minute"},"facet_filter":{"bool":{"must":{"range":{"timestamp":{"from":"2014-03-11 19:22:10.221","to":"2014-03-11 19:27:10.222","include_lower":true,"include_upper":true}}}}}}}}]]]; nested: ClassCastException[org.elasticsearch.index.fielddata.plain.PagedBytesIndexFieldData cannot be cast to org.elasticsearch.index.fielddata.IndexNumericFieldData]; }
> >     at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:272)
> >     at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$3.onFailure(TransportSearchTypeAction.java:224)
> >     at org.elasticsearch.search.action.SearchServiceTransportAction$7.handleException(SearchServiceTransportAction.java:324)
> >     at org.elasticsearch.transport.netty.MessageChannelHandler.handleException(MessageChannelHandler.java:181)
> >     at org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(MessageChannelHandler.java:171)
> >     at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:123)
> >     at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
> >     at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
> >     at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
> >     at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
> >     at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
> >     at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
> >     at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
> >     at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
> >     at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
> >     at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
> >     at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
> >     at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
> >     at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
> >     at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
> >     at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
> >     at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
> >     at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
> >     at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
> >     at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >     at java.lang.Thread.run(Thread.java:744) )
> >
> > // Stack trace
> >
> > lib.ApiClientImpl$ApiRequestBuilder#execute (ApiClientImpl.java:372)
> > models.UniversalSearch#dateHistogram (UniversalSearch.java:148)
> > controllers.SearchController#index (SearchController.java:93)
> > Routes$$anonfun$routes$1$$anonfun$applyOrElse$7$$anonfun$apply$19#apply (routes_routing.scala:657)
> > Routes$$anonfun$routes$1$$anonfun$applyOrElse$7$$anonfun$apply$19#apply (routes_routing.scala:657)
> > play.core.Router$HandlerInvoker$$anon$7$$anon$2#invocation (Router.scala:183)
> > play.core.Router$Routes$$anon$1#invocation (Router.scala:377)
> > play.core.j.JavaAction$$anon$1#call (JavaAction.scala:56)
> > play.GlobalSettings$1#call (GlobalSettings.java:64)
> > play.mvc.Security$AuthenticatedAction#call (Security.java:45)
> > play.core.j.JavaAction$$anon$3#apply (JavaAction.scala:91)
> > play.core.j.JavaAction$$anon$3#apply (JavaAction.scala:90)
> > play.core.j.FPromiseHelper$$anonfun$flatMap$1#apply (FPromiseHelper.scala:82)
> > play.core.j.FPromiseHelper$$anonfun$flatMap$1#apply (FPromiseHelper.scala:82)
> > scala.concurrent.Future$$anonfun$flatMap$1#apply (Future.scala:251)
> > scala.concurrent.Future$$anonfun$flatMap$1#apply (Future.scala:249)
> > scala.concurrent.impl.CallbackRunnable#run (Promise.scala:32)
> > play.core.j.HttpExecutionContext$$anon$2#run (HttpExecutionContext.scala:37)
> > akka.dispatch.TaskInvocation#run (AbstractDispatcher.scala:42)
> > akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask#exec (AbstractDispatcher.scala:386)
> > scala.concurrent.forkjoin.ForkJoinTask#doExec (ForkJoinTask.java:260)
> > scala.concurrent.forkjoin.ForkJoinPool$WorkQueue#runTask (ForkJoinPool.java:1339)
> > scala.concurrent.forkjoin.ForkJoinPool#runWorker (ForkJoinPool.java:1979)
> > scala.concurrent.forkjoin.ForkJoinWorkerThread#run (ForkJoinWorkerThread.java:107)

--
You received this message because you are subscribed to the Google Groups "graylog2" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
For more options, visit https://groups.google.com/d/optout.
