You are trying to use explain for more than it was designed for. Calling explain on the top hits is fine, but it seems that you need/want to do this for all matches. We don't have a solution for this.
Caching the scorer doesn't work since scorers can only be iterated once.

On Thu, Feb 22, 2018 at 12:11 PM, Vadim Gindin <[email protected]> wrote:

> I'd like to use the "explain" mechanism to output some additional match
> information: the scoring formula, detailed matching information and so on.
> But now it seems that "explain" works even slower than just logging the
> matching information to a file from the score() method.
>
> - What is the most effective way to do this? Is there a way to speed up
>   "explain", for example with scorer caching?
> - Lucene uses a single Scorer (for an entire segment) for calling the
>   score() method. What about explain()?
> - Are iterators really readable only once?
>
> Regards,
> Vadim Gindin
>
> On Thu, Feb 22, 2018 at 3:03 PM, Adrien Grand <[email protected]> wrote:
>
> > If you are talking about explanations, then yes, it's fine. Explain() is
> > used for debugging, so it is fine if it is slow. However, Lucene creates
> > only one Scorer for all documents of an entire segment when it comes to
> > actually running a query.
> >
> > On Thu, Feb 22, 2018 at 7:06 AM, Vadim Gindin <[email protected]> wrote:
> >
> > > Adrien, thanks a lot! It looks like a working solution for my bugs. I
> > > really appreciate it.
> > >
> > > I just want to ask: is creating a Scorer for every document really an
> > > efficient approach? Can we say that a Scorer is designed to be
> > > lightweight and fast enough for this?
> > >
> > > On Wed, Feb 21, 2018 at 6:42 PM, Adrien Grand <[email protected]> wrote:
> > >
> > > > This might not solve all problems, but you should stop caching the
> > > > weight in the query and stop caching the scorer in the weight: just
> > > > create a new scorer in calls to explain().
> > > >
> > > > On Wed, Feb 21, 2018 at 2:05 PM, Vadim Gindin <[email protected]> wrote:
> > > >
> > > > > The test gives the following error:
> > > > >
> > > > > java.lang.AssertionError: Docs enums are only supposed to be
> > > > > consumed in the thread in which they have been acquired. But was
> > > > > acquired in
> > > > > Thread[elasticsearch[node_s2][search][T#4],5,TGRP-CustomQueryParserIT]
> > > > > and consumed in
> > > > > Thread[elasticsearch[node_s2][search][T#2],5,TGRP-CustomQueryParserIT].
> > > > > at __randomizedtesting.SeedInfo.seed([935231818B6C9F26]:0)
> > > > > at org.apache.lucene.index.AssertingLeafReader.assertThread(AssertingLeafReader.java:42)
> > > > > at org.apache.lucene.index.AssertingLeafReader.access$000(AssertingLeafReader.java:36)
> > > > > at org.apache.lucene.index.AssertingLeafReader$AssertingPostingsEnum.advance(AssertingLeafReader.java:330)
> > > > > at org.apache.lucene.search.DisjunctionDISIApproximation.advance(DisjunctionDISIApproximation.java:66)
> > > > > at com.detectum.query.phrase.PrizeDisjunctionScorer.explain(PrizeDisjunctionScorer.java:220)
> > > > >
> > > > > from the explain() method.
> > > > >
> > > > > On Tue, Feb 20, 2018 at 8:03 PM, Vadim Gindin <[email protected]> wrote:
> > > > >
> > > > > > It is probably not possible to attach files to an email, so here
> > > > > > they are:
> > > > > >
> > > > > > ConstTermScorer.java
> > > > > > <http://lucene.472066.n3.nabble.com/file/t493564/ConstTermScorer.java>
> > > > > > PrizeDisjunctionScorer.java
> > > > > > <http://lucene.472066.n3.nabble.com/file/t493564/PrizeDisjunctionScorer.java>
> > > > > > PhraseQuery.java
> > > > > > <http://lucene.472066.n3.nabble.com/file/t493564/PhraseQuery.java>
> > > > > >
> > > > > > --
> > > > > > Sent from: http://lucene.472066.n3.nabble.com/Lucene-Java-Users-f532864.html
> > > > > >
> > > > > > ---------------------------------------------------------------------
> > > > > > To unsubscribe, e-mail: [email protected]
> > > > > > For additional commands, e-mail: [email protected]
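[Editor's note] The "readable-once" behavior discussed above is the crux of the thread: a Scorer's doc-ID iterator only moves forward, so a scorer cached during score() is already exhausted by the time explain() tries to reuse it. Below is a minimal self-contained sketch of that failure mode using a plain `java.util.Iterator` as a stand-in for a Scorer's iterator (`ReadOnceDemo` and `sumOnce` are illustrative names, not Lucene API):

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Models why a cached Scorer cannot be reused: like Lucene's
// DocIdSetIterator, a plain Iterator is forward-only and, once
// consumed, stays exhausted forever.
public class ReadOnceDemo {
    // Consume the iterator completely, like score() does with a Scorer.
    static int sumOnce(Iterator<Integer> docs) {
        int sum = 0;
        while (docs.hasNext()) {
            sum += docs.next();
        }
        return sum;
    }

    public static void main(String[] args) {
        List<Integer> docIds = Arrays.asList(1, 5, 9);
        Iterator<Integer> cached = docIds.iterator();

        // First pass works and consumes the iterator.
        System.out.println(sumOnce(cached)); // 15

        // A second pass over the *same* iterator sees nothing: this is
        // why a scorer cached for score() is useless inside explain().
        System.out.println(sumOnce(cached)); // 0

        // Obtaining a fresh iterator (a new Scorer per explain() call,
        // as advised above) iterates the matches again from the start.
        System.out.println(sumOnce(docIds.iterator())); // 15
    }
}
```

Hence the advice earlier in the thread: instead of caching, call scorer(context) again inside explain() so that each explanation gets its own fresh, unconsumed iterator.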
