Re: OOM Solr 4.8.1

2018-09-18 Thread Toke Eskildsen
On Mon, 2018-09-17 at 17:52 +0200, Vincenzo D'Amore wrote:
> org.apache.solr.common.SolrException: Error while processing facet
> fields:
> java.lang.OutOfMemoryError: Java heap space
> 
> Here is the complete stacktrace:
> https://gist.github.com/freedev/a14aa9e6ae33fc3ddb2f02d602b34e2b
> 
> I suppose these errors are generated by an increase of traffic coming
> from crawlers/spiders.

Solr does not have any effective limits on the number of parallel
requests it tries to process. I recommend that you limit that number
either in your frontend or in your middleware layer, if you use one.
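
For example, if your frontend happens to be nginx (just an assumption, the
thread doesn't say what sits in front of Tomcat, and the zone name and the
limit of 20 are placeholders), a hard cap on concurrent requests to Solr
could look like this:

# in the http {} block
limit_conn_zone $server_name zone=solr_conn:1m;

# in the location that proxies to Solr/Tomcat
location /solr/ {
    limit_conn solr_conn 20;   # at most 20 requests in flight
    limit_conn_status 503;     # reject the rest instead of piling them up
    proxy_pass http://127.0.0.1:8080;
}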

This is sane practice regardless of your current problem: ensure that
your public-facing system rejects requests instead of crashing if it is
hammered for some reason.

If you like, you can queue requests when all active slots are taken,
but that only helps with excessive traffic in short bursts, not with a
sustained high traffic level.
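
If Tomcat is hit directly, the same idea can live in the webapp itself. Below
is a minimal sketch of such a limiter as a servlet Filter; it is not something
from this thread: the class name, the 20-slot limit and the 100 ms wait are
made-up values to adjust for your hardware, and the filter would need to be
mapped in web.xml in front of the Solr servlet.

import java.io.IOException;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class RequestLimitFilter implements Filter {

    // Assumed limit: at most 20 requests processed at once.
    private final Semaphore slots = new Semaphore(20);

    @Override
    public void init(FilterConfig filterConfig) {
        // nothing to configure in this sketch
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response,
                         FilterChain chain) throws IOException, ServletException {
        boolean acquired = false;
        try {
            // Wait briefly for a free slot; this acts as a tiny queue for short bursts.
            acquired = slots.tryAcquire(100, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        if (!acquired) {
            // Reject instead of letting the backlog grow until the heap is exhausted.
            ((HttpServletResponse) response).sendError(
                HttpServletResponse.SC_SERVICE_UNAVAILABLE,
                "Too many concurrent requests");
            return;
        }
        try {
            chain.doFilter(request, response);
        } finally {
            slots.release();
        }
    }

    @Override
    public void destroy() {
        // nothing to clean up
    }
}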


This advice is independent of Shawn's, BTW. You could increase your
server capabilities 10-fold and it would still apply.

- Toke Eskildsen, Royal Danish Library


Re: OOM Solr 4.8.1

2018-09-17 Thread Shawn Heisey

On 9/17/2018 9:52 AM, Vincenzo D'Amore wrote:

recently I had a few Java OOM errors in my Solr 4.8.1 instance.

Here is the configuration I have.


The only part of your command-line options that matters for OOM is the
max heap, which is 16GB for your server.  Note that you should set the min
heap and the max heap to the same value.  Java will eventually allocate the
entire max heap it has been allowed, so it is better to do so right from the
start.
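
With the 16GB max heap from your startup options, that would mean starting
the JVM with, for example:

-Xms16g -Xmx16g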



This is the error:

org.apache.solr.common.SolrException: Error while processing facet fields:
java.lang.OutOfMemoryError: Java heap space


Your heap isn't big enough.  You have two choices: make the heap
bigger, or change something so Solr doesn't need as much heap memory.


https://wiki.apache.org/solr/SolrPerformanceProblems#Reducing_heap_requirements

If you enable docValues for fields that you use for faceting, much less 
heap memory will be required to get facet results.
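
As a rough sketch (the field names here are invented; use whatever fields you
actually facet on), that means adding docValues="true" to those field
definitions in schema.xml, for example:

<field name="category" type="string" indexed="true" stored="true" docValues="true"/>
<field name="price"    type="tfloat" indexed="true" stored="true" docValues="true"/>

Existing documents must be reindexed before the docValues are actually used.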


Sometimes the only real way to reduce how much memory is required is to
reduce the size of the index: put fewer documents into the index,
probably by spreading it across multiple servers (shards).


Thanks,
Shawn



OOM Solr 4.8.1

2018-09-17 Thread Vincenzo D'Amore
Hi there,

recently I had a few Java OOM errors in my Solr 4.8.1 instance.

Here is the configuration I have.

-Djava.util.logging.config.file=/opt/tomcat/conf/logging.properties
-Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
-Dsolr.log=/opt/tomcat/logs
-DzkHost=ep-1:2181,ep-2:2181,ep-3:2181
-Dsolr.solr.home=/store/solr
-Xms2g -Xmx16g
-server
-XX:+UseG1GC
-XX:+ParallelRefProcEnabled
-XX:G1HeapRegionSize=8m
-XX:MaxGCPauseMillis=400
-XX:+UseLargePages
-XX:+AggressiveOpts
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/opt/tomcat/dumpoom/dump.hprof
-Dcom.sun.management.jmxremote.port=1616
-Dcom.sun.management.jmxremote.ssl=false
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.rmi.port=1616
-Dcom.sun.management.jmxremote.local.only=false
-Djava.rmi.server.hostname=localhost
-Djava.endorsed.dirs=/opt/tomcat/endorsed

This is the error:

org.apache.solr.common.SolrException: Error while processing facet fields:
java.lang.OutOfMemoryError: Java heap space

Here is the complete stacktrace:
https://gist.github.com/freedev/a14aa9e6ae33fc3ddb2f02d602b34e2b

I suppose these errors are generated by an increase of traffic coming from
crawlers/spiders.
Given the sudden appearance of these errors, I've configured the JVM to take
a memory dump in case of OOM.

Analyzing the memory dump with the Eclipse Memory Analyzer and running the
usual "Leak Suspects Report" (argh... :) ), I've found that about 80% of the
memory was occupied by one instance of FieldCacheImpl:

One instance of "org.apache.lucene.search.FieldCacheImpl" loaded by
"org.apache.catalina.loader.WebappClassLoader @ 0x3c145b028" occupies
8,248,329,008 (79.69%) bytes. The memory is accumulated in one instance of
"java.util.WeakHashMap$Entry[]" loaded by "<system class loader>".

I was unable to understand which field it was; it seems to be a float.

Does anyone have any advice for me? For a long time this server has worked
well, without problems. Recently we have had huge traffic coming from
spiders/crawlers, but I don't understand how these requests can consume all
the available memory.

Best regards,
Vincenzo

-- 
Vincenzo D'Amore