On 10/13/2016 2:52 AM, Girish Chafle wrote:
> We are using Solr 5.2.1 with the SolrJ API. To minimize Solr heap
> utilization, we would like to explicitly unload Solr collections after
> completing the search queries. Is there an API to unload Solr collections
> in SolrCloud?
>
> The real issue we are trying to solve is Solr running into out-of-memory
> errors when searching a large amount of data with a given heap setting.
> Keeping a fixed heap size, we plan to load/unload collections so that we
> do not let Solr run out of memory. Any suggestions/help on this would be
> highly appreciated.

What you are describing sounds like the LotsOfCores functionality ...
but that feature does NOT work with SolrCloud.  SolrCloud currently
assumes that every collection in the ZooKeeper database is available
whenever the servers that host it are available.

https://wiki.apache.org/solr/LotsOfCores
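
For reference, here's roughly what the feature looks like in standalone
mode.  This is a minimal sketch; the core name and cache size are just
example values.  Each on-demand core is marked transient in its
core.properties:

  # core.properties for a core that should be loaded only on demand
  name=mycore
  transient=true
  loadOnStartup=false

and solr.xml caps how many transient cores stay loaded at once:

  <solr>
    <int name="transientCacheSize">4</int>
  </solr>

Once that cache is full, the least recently used transient core gets
unloaded to make room for the next one.  But again: none of this
applies once you are running SolrCloud.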

If you were to unload the *cores* that make up a collection, then the
collection would be down until you manually reloaded them, because
SolrCloud doesn't have the functionality you're after.  Using CoreAdmin
(which can unload and create cores) with SolrCloud is not recommended,
because it's very easy to do it incorrectly.
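
For completeness, this is roughly what a manual unload looks like with
SolrJ against a standalone (non-cloud) Solr.  It's a sketch, not a
recommendation; the base URL and core name are placeholder values:

  import java.io.IOException;

  import org.apache.solr.client.solrj.SolrServerException;
  import org.apache.solr.client.solrj.impl.HttpSolrClient;
  import org.apache.solr.client.solrj.request.CoreAdminRequest;

  public class UnloadCoreSketch {
    public static void main(String[] args)
        throws SolrServerException, IOException {
      // CoreAdmin is addressed at the node, not at a collection.
      try (HttpSolrClient client =
          new HttpSolrClient("http://localhost:8983/solr")) {
        // Unloads the core from the running Solr.  The index files
        // stay on disk unless the delete* overloads are used instead.
        CoreAdminRequest.unloadCore("mycore", client);
      }
    }
  }

If you run something like this against a SolrCloud node, the cluster
state in ZooKeeper will still say the replica exists, and the collection
will be broken until the core is recreated.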

SolrCloud could in theory be adjusted to make what you want possible,
but it would not be a trivial development effort.  Writing, testing, and
fixing the code could take weeks or months, particularly because Solr
does not have paid developers.  We are all volunteers.

Assuming that such a feature IS developed, or you figure out how to
unload and reload cores manually without problems, there is another
catch: if a collection is large enough to cause serious heap pressure,
reloading it after it has been unloaded would be a very time-consuming
process.  I'm not sure your users would appreciate the long search
delays while Solr spins the collection back up.

Your best bet right now is to add hardware, either additional servers or
more memory in the servers you have.

Thanks,
Shawn
