Hello Cezary,

I’d monitor your system's memory as well as the JVM’s memory usage 
(VisualVM is useful here: http://visualvm.java.net/). Solr’s documentation 
on OutOfMemoryErrors is here: 
http://wiki.apache.org/solr/SolrPerformanceFactors#OutOfMemoryErrors. 
That said, updating search.solr.jvm_options should alleviate most issues.
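
For reference, that setting lives in riak.conf. The flags below are a sketch, not your actual defaults — the -d64/-XX options shown are illustrative, so start from the line already in your riak.conf and only raise -Xms/-Xmx:

```
## riak.conf -- bump the Solr JVM heap.
## -Xms sets the initial heap size, -Xmx the maximum; keeping them
## equal avoids heap resizing pauses. Other flags shown are examples.
search.solr.jvm_options = -d64 -Xms2g -Xmx2g -XX:+UseCompressedOops
```

A node restart is needed for riak.conf changes to take effect, and make sure the host has enough physical RAM to cover the Solr heap plus Riak itself, or you’ll just trade OOMs for swapping.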

Zeeshan Lakhani
programmer | 
software engineer at @basho | 
org. member/founder of @papers_we_love | paperswelove.org
twitter => @zeeshanlakhani

> On Mar 10, 2015, at 4:32 PM, Cezary Kosko <[email protected]> wrote:
> 
> All,
> 
> I've been working on a Riak setup with Search (version 2.0.0) and 2 custom 
> schemas for 3 datatypes (maps of sets; I wanted to query against the values 
> of these sets. One of the schemas stores one of these sets, but there are at 
> most 2 values per record in that particular one).
> 
> Everything seemed to be running smoothly until one day, when all of a sudden 
> Solr started throwing OutOfMemoryErrors. I increased the max allocated 
> memory from the default 1g to 2g, then 3g, but that did not help.
> 
> Is there a routine to be done in such cases?
> 
> Kind regards,
> Cezary
> _______________________________________________
> riak-users mailing list
> [email protected]
> http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
