Chris Hostetter wrote:

: Could you suggest a better configuration based on this?

If that's what your stats look like after a single request, then I would
guess you would need to make your cache size at least 1.6 million in order
for it to be of any use in improving your facet speed.

Would this have any significant impact on my system? Should I just set it to an even 2 million to allow for growth?
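
For context, the cache in question here would presumably be the filterCache in solrconfig.xml, since faceting on string fields populates one filter entry per unique term. A rough sketch of what a 2-million-entry configuration might look like; the size, initialSize, and autowarmCount values below are illustrative, not recommendations:

    <!-- solrconfig.xml: filterCache sized to hold roughly one entry per
         unique facet term; the numbers here are illustrative only -->
    <filterCache
        class="solr.LRUCache"
        size="2000000"
        initialSize="512000"
        autowarmCount="0"/>

The main cost of a larger filterCache is heap usage, since each cached entry holds a set of matching document ids, so it may be worth watching memory before committing to 2 million.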

: My data is 492,000 records of book data.  I am faceting on 4 fields:
: author, subject, language, format.
: Format and language are fairly simple as there are only a few unique
: terms.  Author and subject, however, are much different in that there are
: thousands of unique terms.

By the looks of it, you have a lot more than a few thousand unique terms
in those two fields ... are you tokenizing on these fields?  That's
probably not what you want for fields you're going to facet on.

All of these fields are set as "string" in my schema, so if I understand string fields correctly, they are not being tokenized. I also have a separate author field set as "text" for searching.
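
For reference, that string/text split would look roughly like the following in schema.xml; the author_search name and the copyField wiring are just one possible arrangement, not necessarily what the schema actually contains:

    <!-- schema.xml sketch: a "string" field keeps each value as a single
         untokenized term (suitable for faceting), while a "text" field is
         tokenized for full-text search; names below are illustrative -->
    <field name="author"        type="string" indexed="true" stored="true"  multiValued="true"/>
    <field name="author_search" type="text"   indexed="true" stored="false" multiValued="true"/>
    <copyField source="author" dest="author_search"/>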

Thanks
Andrew
