On Mon, 2018-09-17 at 17:52 +0200, Vincenzo D'Amore wrote:
> org.apache.solr.common.SolrException: Error while processing facet
> fields:
> java.lang.OutOfMemoryError: Java heap space
> 
> Here the complete stacktrace:
> https://gist.github.com/freedev/a14aa9e6ae33fc3ddb2f02d602b34e2b
> 
> I suppose these errors are generated by an increase of traffic coming
> from crawlers/spiders.

Solr does not place any effective limit on the number of parallel
requests it will try to process. I recommend that you cap that number
either in your frontend or in your middleware layer, if you use one.

This is sane practice regardless of your current problem: ensure that
your public-facing system rejects requests instead of crashing if it
is hammered for some reason.
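For illustration, here is what such a cap could look like at the frontend, assuming nginx is proxying to Solr (the upstream name `solr_backend` and the limit of 10 are made-up values to tune for your setup):

```nginx
# In the http {} context: track concurrent connections per client IP.
limit_conn_zone $binary_remote_addr zone=perip:10m;

server {
    location /solr/ {
        limit_conn perip 10;      # at most 10 concurrent requests per IP
        limit_conn_status 429;    # reject excess with 429 instead of queueing
        proxy_pass http://solr_backend;
    }
}
```

With this in place, a crawler hammering the facet endpoint gets fast 429 responses while Solr's heap stays bounded.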

If you want to, you can queue requests when all active slots are
taken, but that only helps with short bursts of excess traffic, not
with a sustained high traffic level.
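If the cap lives in a Java middleware layer instead, the slots-plus-short-queue idea can be sketched with a fair semaphore. This is a minimal illustration, not Solr code; the class name, slot count, and wait time are all assumptions to adjust:

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Hypothetical gatekeeper: at most MAX_ACTIVE requests run concurrently.
// A new request waits briefly for a free slot (absorbing short bursts),
// then gives up, so sustained overload is rejected rather than queued forever.
public class RequestGate {
    private static final int MAX_ACTIVE = 32;       // tune to your hardware
    private static final long QUEUE_WAIT_MS = 200;  // burst tolerance only
    private final Semaphore slots = new Semaphore(MAX_ACTIVE, true);

    /** Returns true if a slot was acquired; the caller must call release() when done. */
    public boolean tryEnter() throws InterruptedException {
        return slots.tryAcquire(QUEUE_WAIT_MS, TimeUnit.MILLISECONDS);
    }

    public void release() {
        slots.release();
    }
}
```

A filter or handler would call tryEnter(), forward the request to Solr on true, and return 429/503 on false.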


This advice is independent of Shawn's, BTW. You could increase your
server capabilities 10-fold and it would still apply.

- Toke Eskildsen, Royal Danish Library