I believe I found it: I wasn't populating the docSet and docList. Thanks
again for all of the support.

On Tue, Aug 30, 2011 at 11:00 PM, Jamie Johnson <jej2...@gmail.com> wrote:
> Found the score, so this works for regular queries, but now I'm getting an
> exception when faceting.
>
> SEVERE: Exception during facet.field of type:java.lang.NullPointerException
>        at org.apache.solr.request.SimpleFacets.getFieldCacheCounts(SimpleFacets.java:451)
>        at org.apache.solr.request.SimpleFacets.getTermCounts(SimpleFacets.java:313)
>        at org.apache.solr.request.SimpleFacets.getFacetFieldCounts(SimpleFacets.java:357)
>        at org.apache.solr.request.SimpleFacets.getFacetCounts(SimpleFacets.java:191)
>        at org.apache.solr.handler.component.FacetComponent.process(FacetComponent.java:81)
>        at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:231)
>        at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
>        at org.apache.solr.core.SolrCore.execute(SolrCore.java:1290)
>        at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:353)
>        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:248)
>        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
>        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399)
>        at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>        at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>        at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>        at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>        at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>        at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>        at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>        at org.mortbay.jetty.Server.handle(Server.java:326)
>        at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>        at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>        at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>        at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>        at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>        at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
>        at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>
> Any insight into what would cause that?
>
> On Tue, Aug 30, 2011 at 10:13 PM, Jamie Johnson <jej2...@gmail.com> wrote:
>> So I looked at doing this, but I don't see a way to get the scores
>> from the docs as well.  Am I missing something in that regard?
>>
>> On Mon, Aug 29, 2011 at 8:53 PM, Jamie Johnson <jej2...@gmail.com> wrote:
>>> Thanks Hoss.  I am actually OK with that; I think capping at something
>>> like 50,000 results from each shard would be reasonable, since my
>>> check takes about 1s for 50,000 records.  I'll give this a whirl and
>>> see how it goes.
>>>
>>> On Mon, Aug 29, 2011 at 6:46 PM, Chris Hostetter
>>> <hossman_luc...@fucit.org> wrote:
>>>>
>>>> : Also I see that this is before sorting, is there a way to do something
>>>> : similar after sorting?  The reason is that I'm ok with the total
>>>> : result not being completely accurate so long as the first say 10 pages
>>>> : are accurate.  The results could get more accurate as you page through
>>>> : them though.  Does that make sense?
>>>>
>>>> munging results after sorting is dangerous in the general case, but if you
>>>> have a specific use case where you're okay with only guaranteeing accurate
>>>> results up to result #X, then you might be able to get away with something
>>>> like...
>>>>
>>>> * custom SearchComponent
>>>> * configure to run after QueryComponent
>>>> * in prepare, record the start & rows params, and replace them with 0 &
>>>> (MAX_PAGE_NUM * rows)
>>>> * in process, iterate over the DocList and build up your own new
>>>> DocSlice based on the docs that match your special criteria - then use the
>>>> original start/rows to generate a subset and return that
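The prepare/process recipe above can be sketched in plain Java. This is a hypothetical stand-in, not the Solr API: an int array of doc ids plays the role of the expanded DocList (fetched with start=0 and rows=MAX_PAGE_NUM * rows), the predicate plays the role of the custom per-document check, and the client's original start/rows window is applied only after filtering.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntPredicate;

public class PostFilterWindow {
    // 'expanded' stands in for the over-fetched DocList; 'passes' is the
    // custom acceptance check; start/rows are the client's original params,
    // recorded in prepare() before being overwritten.
    static List<Integer> window(int[] expanded, IntPredicate passes,
                                int start, int rows) {
        List<Integer> kept = new ArrayList<>();
        for (int doc : expanded) {            // filter in ranked order
            if (passes.test(doc)) {
                kept.add(doc);
            }
        }
        // Apply the original paging window to the filtered list,
        // clamped so a short result set doesn't throw.
        int from = Math.min(start, kept.size());
        int to = Math.min(start + rows, kept.size());
        return kept.subList(from, to);
    }

    public static void main(String[] args) {
        int[] expanded = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
        // keep only even doc ids; client originally asked for start=1, rows=2
        System.out.println(window(expanded, d -> d % 2 == 0, 1, 2)); // [4, 6]
    }
}
```

As Hoss notes, the filtered total is unknowable up front, which is exactly why this breaks down under distributed search: each shard's window would be cut from a differently-sized filtered list.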
>>>>
>>>> ...getting this to play nicely with stuff like faceting should be
>>>> possible with more work, and manipulation of the DocSet (assuming you're
>>>> okay with the facet counts only being as accurate as the DocList is --
>>>> filtered up to row X).
>>>>
>>>> it could fail miserably with distributed search since you have no idea
>>>> how many results will pass your filter.
>>>>
>>>> (note: this is all off the top of my head ... no idea if it would actually
>>>> work)
>>>>
>>>>
>>>>
>>>> -Hoss
>>>>
>>>
>>
>
