Ramprasad Padmanabhan [ramprasad...@gmail.com] wrote:
> I have a single machine 16GB Ram with 16 cpu cores

Ah! I thought you had more machines, each with 16 Solr cores.

This changes a lot. 400 Solr cores of ~200MB each ~= 80GB of data. You're aiming 
for 7 times that, so roughly 560GB of data. Running that on a single machine with 
16GB of RAM is not unrealistic, but it depends a lot on how often a search is 
issued and whether or not you can unload inactive cores and accept the startup 
penalty of loading a core the first time a user searches in it. Searches 
will be really slow if you are using a spinning drive.
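If you do unload cores, Solr's transient-core ("LotsOfCores") support is the 
usual mechanism. A rough sketch, assuming the Solr 4.x core discovery format 
(check the reference guide for your version; the core name is a placeholder and 
the cache size of 40 is just a starting point):

  # core.properties for each user core
  name=user_12345       # placeholder core name
  transient=true        # core may be unloaded when the cache is full
  loadOnStartup=false   # defer the load cost until the first search

  <!-- solr.xml: keep at most 40 transient cores loaded at once -->
  <solr>
    <int name="transientCacheSize">40</int>
  </solr>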

You might be interested in 
http://sbdevel.wordpress.com/2013/06/06/memory-is-overrated/

As for indexing, I can understand it if you run into problems with 400 
concurrent updates to your single-machine setup. You should limit the number of 
concurrent updates to a bit more than the number of CPU cores, so try 20 or 40.
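If your indexer is Java-based, the simplest way to enforce that cap is a 
fixed-size thread pool. A sketch, assuming each Runnable posts a batch of 
documents to one core (the jobs themselves stand in for your own update code):

  import java.util.List;
  import java.util.concurrent.ExecutorService;
  import java.util.concurrent.Executors;
  import java.util.concurrent.TimeUnit;

  public class ThrottledIndexing {
    // 2x the 16 CPU cores; tune between 20 and 40
    private static final int MAX_CONCURRENT_UPDATES = 32;

    public static void indexAll(List<Runnable> updateJobs)
        throws InterruptedException {
      ExecutorService pool =
          Executors.newFixedThreadPool(MAX_CONCURRENT_UPDATES);
      for (Runnable job : updateJobs) {
        pool.submit(job); // each job updates a single core
      }
      pool.shutdown();
      pool.awaitTermination(1, TimeUnit.HOURS);
    }
  }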

- Toke Eskildsen