On 5/10/2013 1:06 PM, heaven wrote:
Again: I had just finished reindexing, server utilization was about 5-10%, and I
started index optimization. As a result I have now lost (again) the entire index
and got a lot of errors; they appear so fast and contain no useful information.

<http://lucene.472066.n3.nabble.com/file/n4062434/Screenshot_546.png>
<http://lucene.472066.n3.nabble.com/file/n4062434/Screenshot_547.png>

You can see that the server is not loaded at all, and the load was the same when
I started the optimization process.

BTW, it seems like an infinite loop: the picture does not change, the replica is
down, and the shard is stuck in recovery.

In the shard log I see:
org.apache.solr.common.SolrException: No registered leader was found,
collection:crm-prod slice:shard1

And the same in replica + sometimes:
Error getting leader from zk
org.apache.solr.common.SolrException: No registered leader was found,
collection:crm-test slice:shard1

Based on the screenshot of your processes, I don't think you have enough RAM for what this machine is doing. The performance issues are causing zookeeper communication problems, which results in SolrCloud going a little crazy because it can't tell what's really going on.

Each of your four Solr processes has a 23GB virtual memory size. Since you have said that your Solr JVMs have a max heap of 4GB, this suggests that each of those Solr processes has an index in the neighborhood of 19GB. That's 76GB of index.

You've got 32GB of RAM. Four Solr JVMs with a 4GB max heap each consume 16GB, leaving 16GB of free memory. With 76GB of index, you'll want between 40 and 80GB of free memory so that your index is well-cached. You're looking at needing a total memory size of between 64 and 96GB just for Solr.
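The arithmetic above can be sketched like this. It's a rough back-of-the-envelope estimate: the numbers come from this thread, and the 40-80GB cache range is the rule of thumb from the paragraph (half to all of the index held in the OS disk cache), not a measured requirement.

```python
# Rough sizing sketch for the setup described in this thread.
heap_gb = 4          # max heap per Solr JVM
instances = 4
index_gb = 76        # ~19GB per instance, from 23GB virt - 4GB heap

heap_total = heap_gb * instances        # 16 GB tied up in JVM heaps
cache_low, cache_high = 40, 80          # free RAM wanted for OS disk caching
total_low = heap_total + cache_low      # 56 GB
total_high = heap_total + cache_high    # 96 GB
print(f"Solr alone wants roughly {total_low}-{total_high} GB of RAM")
```

On a 32GB machine, only 16GB is left for caching a 76GB index, which is why performance falls off a cliff under indexing or optimization load.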

You also have a MongoDB process that is using 4GB of real memory, with a 56GB virtual memory size. That probably means that your MongoDB database is in the neighborhood of 50GB, though I could be off on that estimate. MongoDB uses free memory for caching in the same way that Solr does, so add on at least half the size of your MongoDB database to your memory requirement.
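To sanity-check those estimates on the machine itself, something like the following works. The paths are examples assuming a typical Solr layout; point them at your actual core data directories.

```shell
# Hypothetical paths -- adjust to where your Solr cores actually live.
du -sh /var/solr/core*/data/index   # on-disk size of each Solr index
free -g                             # total / used / free RAM, in GB
```

Comparing the sum of the index sizes against the "free" (cache-available) column is the quickest way to see whether the index can be cached at all.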

Basically, an ideal machine for what you're trying to do will have at least 128GB of RAM. With 96GB, you might still be OK.

The problem is likely further complicated by long GC pauses during heavy indexing. You'll want to tune your garbage collection.

http://wiki.apache.org/solr/ShawnHeisey#GC_Tuning
http://wiki.apache.org/solr/SolrPerformanceProblems
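As a starting point, a common CMS-based recipe from that era looks like the sketch below. These flags are an illustration, not necessarily what the wiki pages above recommend; treat the pages as authoritative and tune for your own heap size and load.

```shell
# Example only: low-pause CMS collector settings for a 4GB Solr heap.
java -Xms4g -Xmx4g \
  -XX:+UseConcMarkSweepGC \
  -XX:+UseParNewGC \
  -XX:CMSInitiatingOccupancyFraction=70 \
  -XX:+UseCMSInitiatingOccupancyOnly \
  -XX:+ParallelRefProcEnabled \
  -jar start.jar
```

Setting -Xms equal to -Xmx avoids heap-resize pauses, and starting CMS cycles early (at 70% occupancy) helps avoid full stop-the-world collections during heavy indexing.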

Thanks,
Shawn
