On 5/4/2015 12:07 AM, Zheng Lin Edwin Yeo wrote:
> I'd like to check: will this method of splitting the synonyms into
> multiple files use up a lot of memory?
> 
> I'm trying it with about 10 files, and that collection cannot be
> loaded due to insufficient memory.
> 
> Although my machine currently only has 4GB of memory, I only have
> 500,000 records indexed, so I'm not sure if there will be a
> significant impact in the future (even with more memory) as my index
> grows and other things like faceting, highlighting, and the Carrot2
> clustering tools are implemented.
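
For reference, a multi-file synonym setup like the one you describe is
normally configured by giving the filter a comma-separated list of
files; all of the listed files are parsed into a single synonym map
held in memory.  A rough sketch (the file names are just placeholders):

    <filter class="solr.SynonymFilterFactory"
            synonyms="synonyms-a.txt,synonyms-b.txt"
            ignoreCase="true" expand="true"/>

Splitting the entries across files is mainly a maintenance convenience;
the parsed map that ends up in memory is roughly the same size either
way.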

For Solr, depending on exactly how you use it, the number of docs, and
the nature of those docs, a 4GB machine is usually considered quite
small.  Solr needs a sizable chunk of RAM for its Java heap, and it
usually needs an even larger chunk of free RAM for the OS disk cache so
the index can be read quickly.
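
As a rough illustration only (the 2g figure below is an assumption, not
a recommendation for your data), the heap can be set explicitly when
starting Solr, and whatever physical RAM is left over is what the OS
can use to cache the index files:

    # Hypothetical sizing: give Solr a 2GB heap (sets -Xms and -Xmx),
    # leaving the rest of the machine's RAM for the OS disk cache.
    bin/solr start -m 2g

On a 4GB machine that leaves roughly 2GB for the OS, other processes,
and the disk cache, which is often not enough once an index grows.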

My Solr machines have 64GB (as much as the servers can hold) and I wish
they had two or four times as much, so I could get better performance.
My larger indexes (155 million docs, 103 million docs, and 18 million
docs, not using SolrCloud) are NOT considered very large by this
community -- we have users wrangling billions of docs with SolrCloud,
using hundreds of servers.

On this Wiki page, I have tried to outline how various aspects of memory
can affect Solr performance:

http://wiki.apache.org/solr/SolrPerformanceProblems

Thanks,
Shawn
