Hi,

We recently began having trouble with our Solr 1.4 instance. We have about 850k documents in the index, which is about 1.2GB in size; the JVM that runs Tomcat/Solr (no other apps are deployed) has been given 2GB of heap.

We have a forum and run a process every minute that indexes the new messages. The number of messages updated averages between 0 and 20 per run. The commit takes about one or two minutes, but usually a few seconds after it finishes the next batch of documents is processed and the story starts again.
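To make the cycle concrete, a single per-minute run boils down to something like the sketch below. This is a hypothetical SolrJ example for illustration only; the URL, the field names and the use of SolrJ itself are assumptions, not our actual code:

import java.util.ArrayList;
import java.util.List;

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class ForumIndexRun {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL; adjust to the real Tomcat context path.
        SolrServer server = new CommonsHttpSolrServer("http://localhost:8080/solr");

        // Stand-in for the 0-20 new forum messages picked up in one run.
        List<SolrInputDocument> batch = new ArrayList<SolrInputDocument>();
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "message-12345");          // hypothetical field names
        doc.addField("body", "example message text");
        batch.add(doc);

        server.add(batch);

        // Explicit commit after the small batch; with waitSearcher=true the
        // call blocks until the new searcher is registered and warmed.
        server.commit(true, true);
    }
}

With waitFlush/waitSearcher set to true, the commit call only returns once the new searcher is warmed, so presumably most of the one to two minutes goes into flushing the new segment over NFS and warming that searcher.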

So in effect Solr is running commits all day long, and CPU usage ranges from 80% to 120%.

This continuous CPU usage caused ill effects on other services running on the same machine.

Our environment is provided by a company that uses VMware infrastructure exclusively; the Solr index itself sits on an NFS mount for which we get about 33MB/s throughput.

So an easy solution would be to just throw more resources at it, e.g. a separate machine. But before I make that decision I'd like to find out whether the app is behaving properly under these circumstances, or whether it's possible to shorten the commit time down to a few seconds so the CPU is not drained for that long.

Thanks for any pointers,

- Markus
