On 9/6/2013 2:54 AM, prabu palanisamy wrote:
> I am currently using Solr 3.5.0 and have indexed a Wikipedia dump (50 GB)
> with Java 1.6.
> I am searching Solr with text (actually Twitter tweets). Currently each
> query takes an average of 210 milliseconds, of which 200 milliseconds is
> consumed by the Solr server (QTime). I used the jconsole monitoring tool.

If the total on-disk size of your Solr indexes is near the 50GB size of
your Wikipedia dump, then for ideal performance you'll want 50GB of free
memory so the OS can cache the entire index.  You might be able to get by
with 25-30GB of free memory, depending on your index composition.
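As a back-of-the-envelope sketch of the heuristic above (the 50%-100%
ratio is taken from the numbers in this reply, not from any official Solr
formula; the function name is made up for illustration):

```python
# Rough sizing sketch: given an on-disk index size, estimate the free RAM
# you'd want for the OS disk cache, over and above the Solr JVM heap and
# any other processes on the box.

def cache_ram_estimate_gb(index_size_gb):
    """Return (minimum, ideal) free RAM in GB for the OS disk cache."""
    # Ideal case: enough free RAM to cache the whole index.
    ideal = index_size_gb
    # You might get by with roughly half, depending on index composition.
    minimum = round(index_size_gb * 0.5)
    return minimum, ideal

low, high = cache_ram_estimate_gb(50)
print(f"For a 50GB index: {low}-{high}GB of free RAM for the OS cache")
```

Treat the output as a starting point; watch actual cache hit behavior
under your real query load before settling on hardware.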

Note that this is memory over and above what you allocate to the Solr
JVM, and memory used by other processes on the machine.  If you do have
other services on the same machine, note that those programs might ALSO
require OS disk cache RAM.

http://wiki.apache.org/solr/SolrPerformanceProblems#OS_Disk_Cache

Thanks,
Shawn
