Hello again,
After a heavy query on my index (returning 100K docs in a single query), my
JVM heap floods and I get a Java OOM exception, after which my GC cannot
collect anything ("GC overhead limit exceeded"), as these memory chunks are
not disposable.
I want to be able to afford queries like this.
One of my users requested it, and users are less aware of what's allowed; I
don't want to block them a priori for specific long requests (there are other
params that might end up OOMing me).
I thought of the timeAllowed restriction, but this solution also cannot
guarantee that during this delay I would not
Don't request 100K docs in a single query. Fetch them in smaller batches.
wunder
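A minimal sketch of the batching approach suggested above. The `query_fn(start, rows)` helper is hypothetical; it stands in for a Solr request issued with the corresponding `start` and `rows` parameters, and the simulated corpus is only there to make the loop self-contained:

```python
def fetch_in_batches(query_fn, total_needed, batch_size=500):
    """Page through results instead of requesting everything at once.

    query_fn(start, rows) is a hypothetical stand-in for a Solr query
    with matching 'start' and 'rows' parameters.
    """
    docs = []
    start = 0
    while start < total_needed:
        rows = min(batch_size, total_needed - start)
        batch = query_fn(start, rows)
        if not batch:
            break  # index returned fewer docs than requested; stop paging
        docs.extend(batch)
        start += len(batch)  # advance by what was actually returned
    return docs

# Simulated index of 2,000 documents, just to demonstrate the loop.
corpus = [{"id": i} for i in range(2000)]
page = lambda start, rows: corpus[start:start + rows]
result = fetch_in_batches(page, total_needed=2000, batch_size=500)
```

Advancing `start` by the length of the returned batch (rather than by `batch_size`) keeps the loop correct when the final page comes back short.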
On Jun 17, 2013, at 1:44 PM, Manuel Le Normand wrote:
Make them aware of what is required. Solr is not designed to return huge
result sets.
If you need to do this, you will need to run the JVM with a heap big enough to
build the response. You are getting OOM because the JVM does not have enough
memory to build a response with 100K documents.
wunder
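For reference, the heap ceiling is set with the JVM's `-Xmx` flag when launching Solr; the value below is purely illustrative and would need to be sized to the actual responses being built:

```shell
# Illustrative only: raise the maximum heap for the Solr JVM.
java -Xmx4g -jar start.jar
```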
Unfortunately my organisation is too big to control or teach every employee
what the limits are, and the limits can vary (many facets: how much is OK?
asking for too many fields combined with too many rows, etc.).
Don't you think it is preferable to commit the maxBufferSize in the JVM
heap for
There is a Java command line arg that lets you run a command on OOM - I'd
configure it to log and kill -9 Solr. Then use runit or something to supervise
Solr, so that if it's killed, it just restarts.
I think that is the best way to deal with OOMs. Other than that, you have to
write a middle
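The flag referred to above is HotSpot's `-XX:OnOutOfMemoryError`, where `%p` expands to the JVM's pid. The script path and log location below are hypothetical examples of the log-and-kill approach:

```shell
# Launch Solr so that an OOM triggers a log-and-kill script
# (paths are hypothetical; %p expands to the JVM pid):
#   java -XX:OnOutOfMemoryError="/opt/solr/bin/oom_killer.sh %p" -jar start.jar

# /opt/solr/bin/oom_killer.sh -- log the event, then force-kill the JVM
# so a supervisor (runit, etc.) can restart it cleanly.
echo "$(date): Solr hit OOM, killing pid $1" >> /var/log/solr_oom.log
kill -9 "$1"
```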
I think you can modify the response writer to stream results instead of
building them first and sending them in one go. I am using this technique
to dump millions of docs in json format - but in your case you may have to
figure out how to dump during streaming if you don't want to save data to