OK, good to know. I would eventually like to create tests for these
scenarios, so we can automatically suggest fixes for them, like we do for
several others.
Still not sure how a timestamp can become a string when writing mappings,
but this seems to be a common theme when Elasticsearch runs out of memory.
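
For reference, the kind of automatic check I have in mind could look roughly like this. This is only a minimal sketch: the expected-type table and the function name are made up for illustration, not actual Graylog2 validation code. It scans an index mapping (the `properties` object returned by Elasticsearch's GET `/<index>/_mapping`) for fields that should be dates but that dynamic mapping has pegged as strings:

```python
# Hypothetical check: compare an index mapping against the field types
# Graylog2 expects. EXPECTED_TYPES is an assumption for illustration.
EXPECTED_TYPES = {"timestamp": "date"}

def find_mapping_mismatches(mapping):
    """Return {field: (expected_type, actual_type)} for every expected
    field whose mapped type differs from what we expect."""
    mismatches = {}
    props = mapping.get("properties", {})
    for field, expected in EXPECTED_TYPES.items():
        actual = props.get(field, {}).get("type")
        if actual is not None and actual != expected:
            mismatches[field] = (expected, actual)
    return mismatches

# A mapping like the corrupted graylog2_deflector one, where "timestamp"
# ended up mapped as a string:
bad_mapping = {"properties": {"timestamp": {"type": "string"},
                              "message": {"type": "string"}}}

# A healthy mapping, with timestamp correctly mapped as a date:
good_mapping = {"properties": {"timestamp": {
    "type": "date", "format": "yyyy-MM-dd HH:mm:ss.SSS"}}}

print(find_mapping_mismatches(bad_mapping))   # {'timestamp': ('date', 'string')}
print(find_mapping_mismatches(good_mapping))  # {}
```

A check like this could run against each index before the deflector alias is pointed at it, and a mismatch on `timestamp` would explain the `ClassCastException` below: a date_histogram needs numeric field data, which a string-mapped field cannot provide.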

Thanks,
Kay
On May 28, 2014 9:58 AM, "Martin René Mortensen" <
[email protected]> wrote:

> Oh yes, my graylog2_deflector index was totally out of it: it had
> timestamp pegged as a string. Probably caused by Elasticsearch running out
> of memory a while ago, and restarting seemed to fix it...
>
> Well, I deleted the whole index and now it's all better.
>
> /Martin
>
> On Wednesday, 28 May 2014 09:15:19 UTC+2, Martin René Mortensen wrote:
>>
>> This problem persists after downgrading to 0.20.1 - something must have
>> corrupted the data, I'm thinking, or the graylog2-web service is asking
>> for something wrong. Google results mention trying to create a histogram
>> on a field that is not a date, so I might just have corrupt data.
>>
>> On Tuesday, 27 May 2014 15:20:13 UTC+2, Martin René Mortensen wrote:
>>>
>>> After upgrading from 0.20.1 to 0.20.2 all my requests fail with HTTP
>>> 500 - something to do with a cast.
>>>
>>> Help!
>>>
>>> /Martin
>>>
>>>  Oh no, something went wrong!
>>>
>>> (You caused a lib.APIException. API call failed GET http://@syslog.adm.ku.dk:12900/search/universal/relative/histogram?interval=minute&query=*&range=300&range_type=relative&filter=* returned 500 Internal Server Error.
>>>
>>> Body: Failed to execute phase [query], all shards failed; shardFailures
>>> {[xspVvDzYT0Gy-e0nJucfxw][graylog2_120][2]: RemoteTransportException[[Demogoblin][inet[/130.225.127.90:9300]][search/phase/query]];
>>> nested: SearchParseException[[graylog2_120][2]: query[ConstantScore(*:*)],from[-1],size[-1]: Parse Failure [Failed to parse source
>>> [{"query":{"query_string":{"query":"*","allow_leading_wildcard":false}},"facets":{"histogram":{"date_histogram":{"field":"timestamp","interval":"minute"},"facet_filter":{"bool":{"must":{"range":{"timestamp":{"from":"2014-05-27 13:11:37.248","to":"2014-05-27 13:16:37.248","include_lower":true,"include_upper":true}}}}}}}}]]];
>>> nested: ClassCastException[org.elasticsearch.index.fielddata.plain.PagedBytesIndexFieldData cannot be cast to org.elasticsearch.index.fielddata.IndexNumericFieldData]; }
>>> [the identical RemoteTransportException / SearchParseException / ClassCastException is reported for shards [0], [1] and [3] of graylog2_120]
>>>
>>> org.elasticsearch.action.search.SearchPhaseExecutionException: Failed to execute phase [query], all shards failed; shardFailures [same four shard failures as above]
>>> at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:272)
>>> at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$3.onFailure(TransportSearchTypeAction.java:224)
>>> at org.elasticsearch.search.action.SearchServiceTransportAction$4.handleException(SearchServiceTransportAction.java:222)
>>> at org.elasticsearch.transport.netty.MessageChannelHandler.handleException(MessageChannelHandler.java:181)
>>> at org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(MessageChannelHandler.java:171)
>>> at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:123)
>>> at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
>>> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
>>> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
>>> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
>>> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
>>> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
>>> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
>>> at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
>>> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
>>> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
>>> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
>>> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
>>> at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
>>> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
>>> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
>>> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
>>> at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
>>> at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
>>> at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>> at java.lang.Thread.run(Thread.java:744) )
>>>
>>> *Reason:* There was a problem with your search. We expected HTTP 200,
>>> but got a HTTP 500.
>>>
>>>
>>> Stacktrace
>>>
>>>    - lib.ApiClientImpl$ApiRequestBuilder#execute (*ApiClientImpl.java:372*)
>>>    - models.UniversalSearch#dateHistogram (*UniversalSearch.java:156*)
>>>    - controllers.SearchController#index (*SearchController.java:109*)
>>>    - Routes$$anonfun$routes$1$$anonfun$applyOrElse$7$$anonfun$apply$19#apply (*routes_routing.scala:693*)
>>>    - Routes$$anonfun$routes$1$$anonfun$applyOrElse$7$$anonfun$apply$19#apply (*routes_routing.scala:693*)
>>>    - play.core.Router$HandlerInvoker$$anon$7$$anon$2#invocation (*Router.scala:183*)
>>>    - play.core.Router$Routes$$anon$1#invocation (*Router.scala:377*)
>>>    - play.core.j.JavaAction$$anon$1#call (*JavaAction.scala:56*)
>>>    - play.GlobalSettings$1#call (*GlobalSettings.java:64*)
>>>    - play.mvc.Security$AuthenticatedAction#call (*Security.java:45*)
>>>    - play.core.j.JavaAction$$anon$3#apply (*JavaAction.scala:91*)
>>>    - play.core.j.JavaAction$$anon$3#apply (*JavaAction.scala:90*)
>>>    - play.core.j.FPromiseHelper$$anonfun$flatMap$1#apply (*FPromiseHelper.scala:82*)
>>>    - play.core.j.FPromiseHelper$$anonfun$flatMap$1#apply (*FPromiseHelper.scala:82*)
>>>    - scala.concurrent.Future$$anonfun$flatMap$1#apply (*Future.scala:251*)
>>>    - scala.concurrent.Future$$anonfun$flatMap$1#apply (*Future.scala:249*)
>>>    - scala.concurrent.impl.CallbackRunnable#run (*Promise.scala:32*)
>>>    - play.core.j.HttpExecutionContext$$anon$2#run (*HttpExecutionContext.scala:37*)
>>>    - akka.dispatch.TaskInvocation#run (*AbstractDispatcher.scala:42*)
>>>    - akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask#exec (*AbstractDispatcher.scala:386*)
>>>    - scala.concurrent.forkjoin.ForkJoinTask#doExec (*ForkJoinTask.java:260*)
>>>    - scala.concurrent.forkjoin.ForkJoinPool$WorkQueue#runTask (*ForkJoinPool.java:1339*)
>>>    - scala.concurrent.forkjoin.ForkJoinPool#runWorker (*ForkJoinPool.java:1979*)
>>>    - scala.concurrent.forkjoin.ForkJoinWorkerThread#run (*ForkJoinWorkerThread.java:107*)
>>>
>>> Request information: Method: GET; Query:
>>>
>>>    - relative=300
>>>    - to=
>>>    - q=
>>>    - fields=
>>>    - from=
>>>    - rangetype=relative
>>>
>>> Headers:
>>>
>>>    - X-Forwarded-Host: syslog
>>>    - Referer: http://syslog/
>>>    - X-Forwarded-For: 10.76.7.39
>>>    - X-Forwarded-Server: syslog.adm.ku.dk
>>>    - Connection: Keep-Alive
>>>    - Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
>>>    - Accept-Language: en-US,en;q=0.8,da;q=0.6
>>>    - Accept-Encoding: gzip,deflate,sdch
>>>    - User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.137 Safari/537.36
>>>    - Host: 130.225.127.90:9000
>>>

-- 
You received this message because you are subscribed to the Google Groups 
"graylog2" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.
