On Tue, Apr 2, 2013 at 5:33 PM, Toke Eskildsen <t...@statsbiblioteket.dk> wrote:
> Memory does not help you if you commit too frequently. If you commit
> each X seconds and warming takes X+Y seconds, then you will run out of
> memory at some point.
>

How might I time the warming? I've been googling warming since your
earlier message, but there does not seem to be any really good
documentation on the subject. If there is anything that you feel I
should be reading, I would appreciate a link or a keyword to search on.
I've read the Solr wiki pages on caching and performance, but other
than that I don't see the issue addressed.
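For anyone else following the thread: one place warmup times do show up is in the stats Solr exposes at /solr/<core>/admin/mbeans?stats=true&wt=json, where the searcher and each cache report a warmupTime in milliseconds (they are also written to the Solr log on each commit). A small sketch of pulling those numbers out of such a response; the JSON shape below is my assumption of the mbeans handler output and may differ between Solr versions:

```python
import json

# Assumed shape of /solr/core/admin/mbeans?stats=true&wt=json --
# verify against your Solr version before relying on it.
sample_response = json.dumps({
    "solr-mbeans": [
        "CORE", {
            "searcher": {"stats": {"warmupTime": 1850}},
        },
        "CACHE", {
            "filterCache": {"stats": {"warmupTime": 900}},
            "queryResultCache": {"stats": {"warmupTime": 650}},
        },
    ]
})

def warmup_times(mbeans_json):
    """Collect every warmupTime (ms), keyed by bean name."""
    data = json.loads(mbeans_json)
    beans = data["solr-mbeans"]
    times = {}
    # The solr-mbeans list alternates: category name, then a dict of beans.
    for category in beans[1::2]:
        for name, info in category.items():
            stats = info.get("stats", {})
            if "warmupTime" in stats:
                times[name] = stats["warmupTime"]
    return times

print(warmup_times(sample_response))
```

If the total of those times approaches your commit interval, that is the X+Y > X situation Toke describes.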


>> I have increased maxWarmingSearchers to 4, let's see how this goes.
>
> If you still get the error with 4 concurrent searchers, you will have to
> either speed up warmup time or commit less frequently. You should be
> able to reduce facet startup time by switching to segment based faceting
> (at the cost of worse search-time performance) or maybe by using
> DocValues. Some of the current threads on the solr-user list are about
> these topics.
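The segment-based faceting Toke mentions can be requested per field with facet.method=fcs (for single-valued fields). A hedged sketch of what such a query might look like, assuming a hypothetical facet field named `category`:

```python
from urllib.parse import urlencode

# "category" and the core name are placeholders for illustration.
params = {
    "q": "*:*",
    "facet": "true",
    "facet.field": "category",
    # fcs = per-segment faceting: cheaper warmup after a commit,
    # at the cost of somewhat slower query-time faceting.
    "f.category.facet.method": "fcs",
}
url = "http://127.0.0.1:8983/solr/core/select?" + urlencode(params)
print(url)
```

Per-field syntax (f.<field>.facet.method) lets you switch only the expensive fields over while leaving the rest on the default method.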
>
> How often do you commit and how many unique values does your facet
> fields have?
>

Batches of 20-50 documents are added to Solr a few times a minute, and
a commit is done after each batch since I'm calling Solr like this:
http://127.0.0.1:8983/solr/core/update/json?commit=true

Should I remove commit=true and run a cron job to commit once per minute?
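(A third option besides a cron job, sketched here as an assumption rather than something from this thread: replace commit=true with commitWithin, which tells Solr to fold the commit in itself within the given number of milliseconds, so several batches share one commit.)

```python
import json
from urllib.request import Request

# Hypothetical batch of documents, for illustration only.
docs = [{"id": "1", "title": "first"}, {"id": "2", "title": "second"}]

# Same update URL as above, but commitWithin=60000 asks Solr to
# commit at most once per 60 s instead of once per batch.
url = "http://127.0.0.1:8983/solr/core/update/json?commitWithin=60000"

req = Request(
    url,
    data=json.dumps(docs).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
print(req.full_url)  # the request is built but not sent here
```

With commitWithin (or a server-side autoCommit in solrconfig.xml) the client never has to coordinate commits at all.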

--
Dotan Cohen

http://gibberish.co.il
http://what-is-what.com
