Hello dev!
Users are interested in the meaning of the absolute value of the score, but we
always reply that it is only a relative value. The maximum score across the
matched docs is not an answer either.
Ultimately, we need to measure how much sense a query makes in the index. E.g.
the query [jet OR propulsion OR spider] should be measured as nonsense,
because the best-matching docs score much lower than a hypothetical (and
presumably absent) doc matching [jet AND propulsion AND spider].
Could there be a method that returns the maximum possible score if all query
terms matched? Something like stubbing postings for a virtual all_matching
doc with average stats (tf, field length) and kicking the scorers in? It
reminds me a little of probabilistic retrieval, but not much. Is there
anything like this already? A rough sketch of what I mean is below.
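
To make the idea concrete, here is a rough sketch, assuming Lucene's
IndexReader statistics and a classic BM25 formula as a stand-in for the
actual Similarity; idealScore is a made-up helper, not an existing API:

import java.io.IOException;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.Term;

public class IdealScore {
  static final double K1 = 1.2, B = 0.75;

  /** Classic-BM25 score of a virtual doc that matches every query term
   *  with average tf and average field length (so dl/avgdl == 1). */
  public static double idealScore(IndexReader reader, String field,
                                  String... terms) throws IOException {
    long docCount = reader.getDocCount(field);
    double total = 0;
    for (String text : terms) {
      Term term = new Term(field, text);
      int df = reader.docFreq(term);
      if (df == 0) continue;                      // absent term adds nothing
      double avgTf = (double) reader.totalTermFreq(term) / df;
      double idf = Math.log(1 + (docCount - df + 0.5) / (df + 0.5));
      double lengthNorm = K1 * (1 - B + B * 1.0); // dl == avgdl
      total += idf * (avgTf * (K1 + 1)) / (avgTf + lengthNorm);
    }
    return total;
  }
}

Then the "sense" of a query could be something like
maxActualScore / idealScore: close to 1 for a query whose terms really
co-occur, close to 0 for the spider query above.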

-- 
Sincerely yours
Mikhail Khludnev
