On 7/24/2014 7:53 AM, Ameya Aware wrote:
> I did not make any other change than this.. rest of the settings are
> default.
> 
> Do i need to set garbage collection strategy?

The collector chosen and its tuning params can have a massive impact
on performance, but they will make no difference at all if you are
getting OutOfMemoryError exceptions.  That error means the program is
trying to allocate more memory than it has been told it can allocate.
Changing the garbage collector will not change Java's response when
the program wants to allocate too much memory.
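
To illustrate the point, here is a tiny standalone sketch (nothing
Solr-specific, the class name is made up) that just keeps allocating
until it hits the heap limit:

  import java.util.ArrayList;
  import java.util.List;

  public class OomDemo {
      public static void main(String[] args) {
          // Keep references so nothing is eligible for collection.
          List<byte[]> chunks = new ArrayList<>();
          while (true) {
              // Allocate 10 MB at a time until the heap limit is hit.
              chunks.add(new byte[10 * 1024 * 1024]);
          }
      }
  }

Run it with a small heap and any collector you like, e.g.
"java -Xmx64m -XX:+UseG1GC OomDemo" or
"java -Xmx64m -XX:+UseConcMarkSweepGC OomDemo", and you get the same
"OutOfMemoryError: Java heap space" either way.  The only cures are a
bigger -Xmx or making the program need less memory.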

The odd placement of the commas at the start of this thread makes it
hard to tell exactly which numbers you meant, but I think you were
saying that you were trying to index 200000 documents and it died
after indexing 15000.

How big was the Solr index before you started indexing, both in number
of documents and disk space consumed?  How are you doing the indexing?
Is it being done with requests to the /update handler, or are you using
the dataimport handler to import from somewhere, like a database?
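
If you happen to be sending documents to the /update handler with
SolrJ (just a guess on my part -- the URL, field names, and batch size
below are invented for the example), the usual pattern looks roughly
like this:

  import org.apache.solr.client.solrj.impl.HttpSolrServer;
  import org.apache.solr.common.SolrInputDocument;

  import java.util.ArrayList;
  import java.util.List;

  public class IndexSketch {
      public static void main(String[] args) throws Exception {
          // Hypothetical core URL -- substitute your own.
          HttpSolrServer server =
              new HttpSolrServer("http://localhost:8983/solr/collection1");
          List<SolrInputDocument> batch = new ArrayList<>();
          for (int i = 0; i < 200000; i++) {
              SolrInputDocument doc = new SolrInputDocument();
              doc.addField("id", Integer.toString(i));
              doc.addField("title", "document " + i);
              batch.add(doc);
              if (batch.size() == 1000) {
                  server.add(batch);   // one /update request per batch
                  batch.clear();
              }
          }
          if (!batch.isEmpty()) {
              server.add(batch);
          }
          server.commit();
          server.shutdown();
      }
  }

Knowing whether you do something like this, post files directly to
/update, or use the dataimport handler tells us where the memory is
actually being used.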

Is it a single index, or distributed?  Are you running in "normal" mode
or SolrCloud?  Can you share your solrconfig.xml file so we can look for
possible problems?

I already gave you a wiki URL that gives possible reasons for needing a
very large heap, and some things you can do to reduce the requirements.
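
As one concrete example of the kind of thing covered there: the Solr
caches defined in solrconfig.xml live on the Java heap, so oversized
size/autowarmCount values are a common reason for needing a huge heap.
A trimmed-down fragment (the numbers are purely illustrative, not
recommendations):

  <query>
    <!-- Each cache entry lives on the Java heap; very large size or
         autowarmCount values can drive heap requirements way up. -->
    <filterCache class="solr.FastLRUCache" size="512"
                 initialSize="512" autowarmCount="0"/>
    <queryResultCache class="solr.LRUCache" size="512"
                      initialSize="512" autowarmCount="0"/>
    <documentCache class="solr.LRUCache" size="512" initialSize="512"/>
  </query>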

Thanks,
Shawn
