More information: all 5 ES nodes are on 1.3.2 (checked with curl localhost:9200/) and running Java 1.7.0_65. The client machine is also on 1.3.2 with Java 1.7.0_65.
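
The same check can also be done from the Java side. Here is a rough sketch (it assumes "client" is an already-connected TransportClient, like the one in the search code quoted below):

    import org.elasticsearch.Version

    // Print the version reported by every node, plus the version of the client jar,
    // to confirm both sides really are on 1.3.2.
    val nodesInfo = client.admin().cluster().prepareNodesInfo().execute().actionGet()
    for (node <- nodesInfo.getNodes) {
      println(s"${node.getNode.getName}: ${node.getVersion}")
    }
    println(s"client: ${Version.CURRENT}")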


On Friday, September 19, 2014 6:21:06 PM UTC-3, Felipe Hummel wrote:
>
> I missed a part of the error message:
>
>> [WARN] 2014-09-19 20:29:13.176 o.e.t.netty - [Sigyn] Message not fully read (response) for [61] handler org.elasticsearch.action.TransportActionNodeProxy$1@2e6201d0, error [false], resetting
>
>
> On Friday, September 19, 2014 5:58:15 PM UTC-3, Felipe Hummel wrote:
>>
>> Hey guys, I'm getting a NullPointerException while using a *significant_terms* aggregation. It happens at this line:
>>
>>
>> org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams.read(SignificanceHeuristicStreams.java:38)
>>
>> The error happens during deserialization: *Failed to deserialize response of type [org.elasticsearch.action.search.SearchResponse]*
>> I'm using the Java API. When I printed the request and ran it manually through the REST API, everything worked fine; the error only happens when using the Java API.
>>
>> I'm using ES 1.3.2.
>>
>> *The printed search request:*
>>
>>> {
>>>   "from" : 0,
>>>   "size" : 6,
>>>   "timeout" : 30000,
>>>   "query" : {
>>>     "filtered" : {
>>>       "query" : {
>>>         "query_string" : {
>>>           "query" : "ayrton senna",
>>>           "fields" : [ "title^2.0", "description" ],
>>>           "default_operator" : "and"
>>>         }
>>>       },
>>>       "filter" : {
>>>         "bool" : {
>>>           "must" : [ {
>>>             "range" : {
>>>               "created_at" : {
>>>                 "from" : null,
>>>                 "to" : "2014-09-19T20:28:30.000Z",
>>>                 "include_lower" : true,
>>>                 "include_upper" : true
>>>               },
>>>               "_cache" : true
>>>             }
>>>           }, {
>>>             "range" : {
>>>               "published_at" : {
>>>                 "from" : null,
>>>                 "to" : "2014-09-19T20:28:30.000Z",
>>>                 "include_lower" : true,
>>>                 "include_upper" : true
>>>               },
>>>               "_cache" : true
>>>             }
>>>           }, {
>>>             "range" : {
>>>               "published_at" : {
>>>                 "from" : "2014-08-20T20:28:30.000Z",
>>>                 "to" : "2014-09-19T20:28:30.000Z",
>>>                 "include_lower" : true,
>>>                 "include_upper" : true
>>>               },
>>>               "_cache" : false
>>>             }
>>>           } ]
>>>         }
>>>       }
>>>     }
>>>   },
>>>   "fields" : [ ],
>>>   "aggregations" : {
>>>     "topics" : {
>>>       "significant_terms" : {
>>>         "field" : "topic_ids",
>>>         "size" : 20
>>>       }
>>>     }
>>>   }
>>> }
>>
>>
>> *The complete error stacktrace:*
>>
>> [ERROR] 2014-09-19 20:29:13.177 c.b.s.SearchServlet - org.elasticsearch.transport.TransportSerializationException: Failed to deserialize response of type [org.elasticsearch.action.search.SearchResponse]
>> org.elasticsearch.transport.TransportSerializationException: Failed to deserialize response of type [org.elasticsearch.action.search.SearchResponse]
>> at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:152)
>> at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:127)
>> at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
>> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
>> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
>> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
>> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
>> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
>> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
>> at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
>> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
>> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
>> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
>> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
>> at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
>> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
>> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
>> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
>> at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
>> at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
>> at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.NullPointerException
>> at org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams.read(SignificanceHeuristicStreams.java:38)
>> at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms.readFrom(SignificantLongTerms.java:126)
>> at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms$1.readResult(SignificantLongTerms.java:50)
>> at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms$1.readResult(SignificantLongTerms.java:46)
>> at org.elasticsearch.search.aggregations.InternalAggregations.readFrom(InternalAggregations.java:190)
>> at org.elasticsearch.search.aggregations.InternalAggregations.readAggregations(InternalAggregations.java:172)
>> at org.elasticsearch.search.internal.InternalSearchResponse.readFrom(InternalSearchResponse.java:116)
>> at org.elasticsearch.search.internal.InternalSearchResponse.readInternalSearchResponse(InternalSearchResponse.java:105)
>> at org.elasticsearch.action.search.SearchResponse.readFrom(SearchResponse.java:227)
>> at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:150)
>> ... 23 more
>>
>> *The (Scala) code I used to generate the request:*
>>
>>>    val request = ....
>>>    val topicsAggregation = significantTerms("topics").field("topic_ids").size(20)
>>>    request.addAggregation(topicsAggregation)
>>
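>> For completeness, a more self-contained sketch of the same setup (cluster name, host and index name are placeholders for the real ones; the query/filter part is omitted since it matches the printed request above):
>>
>>> import org.elasticsearch.client.transport.TransportClient
>>> import org.elasticsearch.common.settings.ImmutableSettings
>>> import org.elasticsearch.common.transport.InetSocketTransportAddress
>>> import org.elasticsearch.common.unit.TimeValue
>>> import org.elasticsearch.search.aggregations.AggregationBuilders.significantTerms
>>>
>>> // Placeholder cluster name and node address.
>>> val settings = ImmutableSettings.settingsBuilder()
>>>   .put("cluster.name", "my-cluster")
>>>   .build()
>>> val client = new TransportClient(settings)
>>>   .addTransportAddress(new InetSocketTransportAddress("es-node-1", 9300))
>>>
>>> // Same aggregation as above: significant terms on topic_ids, top 20 buckets.
>>> val topicsAggregation = significantTerms("topics").field("topic_ids").size(20)
>>>
>>> // "articles" is a placeholder index name; from/size/timeout match the printed request.
>>> val request = client.prepareSearch("articles")
>>>   .setFrom(0)
>>>   .setSize(6)
>>>   .setTimeout(TimeValue.timeValueMillis(30000))
>>>   .addAggregation(topicsAggregation)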
>>
>> *The code to retrieve the aggregation (although it seems execution never gets this far):*
>>
>>>    val terms: SignificantTerms = response.getAggregations.get("topics")
>>
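>> In fuller form, this is how the buckets would be read once the response comes back (continuing from the request sketched above):
>>
>>> import scala.collection.JavaConverters._
>>> import org.elasticsearch.search.aggregations.bucket.significant.SignificantTerms
>>>
>>> // Execute the search built above and read the significant_terms aggregation.
>>> val response = request.execute().actionGet()
>>> val terms: SignificantTerms = response.getAggregations.get("topics")
>>>
>>> // Walk the buckets: term key and document count.
>>> for (bucket <- terms.getBuckets.asScala) {
>>>   println(s"${bucket.getKey} -> ${bucket.getDocCount} docs")
>>> }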
>>
>>
>> Any ideas?
>>
>> Thanks!
>>
>>
>> Felipe Hummel
>>
>
