I'm facing the same problem... I tried the aggregation using the REST API 
and everything went fine.

It looks like the *registerStream* method in the 
*SignificanceHeuristicStreams* class never gets called, 
so the *STREAMS* list stays empty... 
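
For reference, here's roughly how I'm triggering it through the Java API. 
This is just a minimal sketch; the host, index and field names are 
placeholders for my setup:

   import org.elasticsearch.client.transport.TransportClient
   import org.elasticsearch.common.transport.InetSocketTransportAddress
   import org.elasticsearch.index.query.QueryBuilders
   import org.elasticsearch.search.aggregations.AggregationBuilders.significantTerms
   import org.elasticsearch.search.aggregations.bucket.significant.SignificantTerms

   // plain TransportClient pointed at the same cluster the REST request ran against
   val client = new TransportClient()
     .addTransportAddress(new InetSocketTransportAddress("localhost", 9300))

   // same shape as the request below: a query plus one significant_terms aggregation
   val response = client.prepareSearch("myindex")
     .setQuery(QueryBuilders.matchAllQuery())
     .setSize(6)
     .addAggregation(significantTerms("topics").field("topic_ids").size(20))
     .execute()
     .actionGet() // the TransportSerializationException is thrown here

   // never reached via the Java API, although the same request works over REST
   val terms: SignificantTerms = response.getAggregations.get("topics")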

Cheers, 
Michael

On Friday, September 19, 2014 at 22:58:15 UTC+2, Felipe Hummel wrote:
>
> Hey guys, I'm getting a NullPointerException while using a 
> *significant_terms* aggregation. It happens on this line: 
>
>
> org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams.read(SignificanceHeuristicStreams.java:38)
>
> The error occurs during deserialization: *Failed to deserialize response of 
> type [org.elasticsearch.action.search.SearchResponse]*
> I'm using the Java API. I printed the request and ran it manually through 
> the REST API, and everything went fine. The failure only occurs when using 
> the Java API.
>
> I'm using ES 1.3.2.
>
> *The printed search request:*
>
>> {
>>   "from" : 0,
>>   "size" : 6,
>>   "timeout" : 30000,
>>   "query" : {
>>     "filtered" : {
>>       "query" : {
>>         "query_string" : {
>>           "query" : "ayrton senna",
>>           "fields" : [ "title^2.0", "description" ],
>>           "default_operator" : "and"
>>         }
>>       },
>>       "filter" : {
>>         "bool" : {
>>           "must" : [ {
>>             "range" : {
>>               "created_at" : {
>>                 "from" : null,
>>                 "to" : "2014-09-19T20:28:30.000Z",
>>                 "include_lower" : true,
>>                 "include_upper" : true
>>               },
>>               "_cache" : true
>>             }
>>           }, {
>>             "range" : {
>>               "published_at" : {
>>                 "from" : null,
>>                 "to" : "2014-09-19T20:28:30.000Z",
>>                 "include_lower" : true,
>>                 "include_upper" : true
>>               },
>>               "_cache" : true
>>             }
>>           }, {
>>             "range" : {
>>               "published_at" : {
>>                 "from" : "2014-08-20T20:28:30.000Z",
>>                 "to" : "2014-09-19T20:28:30.000Z",
>>                 "include_lower" : true,
>>                 "include_upper" : true
>>               },
>>               "_cache" : false
>>             }
>>           } ]
>>         }
>>       }
>>     }
>>   },
>>   "fields" : [ ],
>>   "aggregations" : {
>>     "topics" : {
>>       "significant_terms" : {
>>         "field" : "topic_ids",
>>         "size" : 20
>>       }
>>     }
>>   }
>> }
>
>
> *The complete error stacktrace:*
>
> [ERROR] 2014-09-19 20:29:13.177 c.b.s.SearchServlet - org.elasticsearch.transport.TransportSerializationException: Failed to deserialize response of type [org.elasticsearch.action.search.SearchResponse]
> org.elasticsearch.transport.TransportSerializationException: Failed to deserialize response of type [org.elasticsearch.action.search.SearchResponse]
> at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:152)
> at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:127)
> at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
> at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
> at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
> at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
> at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
> at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
> at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
> at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
> at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
> at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.NullPointerException
> at org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams.read(SignificanceHeuristicStreams.java:38)
> at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms.readFrom(SignificantLongTerms.java:126)
> at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms$1.readResult(SignificantLongTerms.java:50)
> at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms$1.readResult(SignificantLongTerms.java:46)
> at org.elasticsearch.search.aggregations.InternalAggregations.readFrom(InternalAggregations.java:190)
> at org.elasticsearch.search.aggregations.InternalAggregations.readAggregations(InternalAggregations.java:172)
> at org.elasticsearch.search.internal.InternalSearchResponse.readFrom(InternalSearchResponse.java:116)
> at org.elasticsearch.search.internal.InternalSearchResponse.readInternalSearchResponse(InternalSearchResponse.java:105)
> at org.elasticsearch.action.search.SearchResponse.readFrom(SearchResponse.java:227)
> at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:150)
> ... 23 more
>
> *The (Scala) code I used to generate the request:*
>
>>    val request = ....
>>    val topicsAggregation = significantTerms("topics").field("topic_ids").size(20)
>>    request.addAggregation(topicsAggregation)
>
>
> *The code to retrieve the aggregation (although it seems execution never 
> gets this far):*
>
>>    val terms: SignificantTerms = response.getAggregations.get("topics")
>
>
>
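> *And, just as an illustrative sketch, roughly how I'd consume the buckets 
> once a response does come back:*
>
>>    import scala.collection.JavaConverters._
>>    for (bucket <- terms.getBuckets.asScala) {
>>      println(s"${bucket.getKey}: ${bucket.getDocCount} docs")
>>    }
>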
> Any ideas?
>
> Thanks!
>
>
> Felipe Hummel
>
