Sounds like an interesting improvement to propose.

The benefit will also depend on various factors, such as the number of unique terms in a field, the field type, etc.

Which field types are giving you the most trouble, and how many unique values do they have? And do you specify facet.method explicitly or just let it default?

What release of Solr are you on? Are you using "trie" for numeric fields? Are these mostly string fields? Any boolean fields?
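For example (the field name "category" here is just a placeholder), the method can be overridden per field right on the request, which is usually the cheapest thing to experiment with:

    facet=true
    &facet.field=category
    &f.category.facet.method=enum
    &facet.limit=100

The enum method tends to behave very differently from the default fc method on low-cardinality fields, so knowing which method each field is actually using matters for any timing comparison.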

-- Jack Krupansky

-----Original Message-----
From: Andrew Laird
Sent: Thursday, June 07, 2012 4:01 AM
To: solr-user@lucene.apache.org
Subject: How to cap facet counts beyond a specified limit

We have an index with ~100M documents, and I am looking for a simple way to speed up faceted searches. Is there a relatively straightforward way to stop counting the number of matching documents beyond some specifiable value? For our needs we don't really need to know that a particular facet value has exactly 14,203,527 matches - just knowing that there are "more than a million" is enough. If I could somehow cap the hit counts at a million (say), it seems like that could decrease the work required to compute the values (just stop counting after the limit is reached) and potentially improve faceted search time - especially when we have 20-30 fields to facet on. Has anyone else tried to do something like this?
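To make the shape of the request concrete (the field names here are just placeholders, not our real schema), each search ends up looking roughly like:

    q=*:*
    &facet=true
    &facet.field=brand
    &facet.field=color
    &facet.field=collection
    ... (and so on, for 20-30 facet.field parameters)
    &facet.limit=50
    &facet.mincount=1

As far as I understand, facet.limit and facet.mincount only bound how many constraints come back, not how high each count is allowed to climb, so they don't reduce the counting work itself.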

Many thanks for comments and info,

Sincerely,


andy laird | gettyimages | 206.925.6728
