The char[] occupying 180MB has the following "path to root":

char[87690841] @ 0x7940ba658  <add><doc boost="1.0"><field 
name="_my_id">shopproducts#<CUT>...
|- <Java Local> java.lang.Thread @ 0x7321d9b80  SolrUtil executorService for 
core 'fust-1-fr_CH_1' -3-thread-1 Thread
|- value java.lang.String @ 0x79e804110  <add><doc boost="1.0"><field 
name="_my_id">shopproducts#<CUT>...
|  '- str org.apache.solr.common.util.ContentStreamBase$StringStream @ 
0x77fd84680
|     |- <Java Local> java.lang.Thread @ 0x7321d9b80  SolrUtil executorService 
for core 'fust-1-fr_CH_1' -3-thread-1
|     |- contentStream 
org.apache.solr.client.solrj.request.RequestWriter$LazyContentStream @ 
0x77fd846a0
|     |  |- <Java Local> java.lang.Thread @ 0x7321d9b80  SolrUtil 
executorService for core 'fust-1-fr_CH_1' -3-thread-1 Thread
|     |  |- [0] org.apache.solr.common.util.ContentStream[1] @ 0x79e802fb8
|     |  |  '- <Java Local> java.lang.Thread @ 0x7321d9b80  SolrUtil 
executorService for core 'fust-1-fr_CH_1' -3-thread-1 Thread

There is another byte[] occupying 260MB.

The indexing logic is roughly this:

SolrClient solrClient = new HttpSolrClient( coreUrl );
while ( got more elements to index )
{
  batch = create 100 SolrInputDocuments;
  solrClient.add( batch );
}
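For reference, the batching pattern above can be sketched as compilable Java. This is a minimal, self-contained sketch: the IndexClient interface and plain String documents stand in for SolrJ's SolrClient and SolrInputDocument so it runs without a Solr server, and names like indexAll and BATCH_SIZE are illustrative, not from SolrJ. The point is that the batch list is cleared after each add(), so at most one batch of documents stays referenced on the client side at any moment.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchIndexSketch {
    // Stand-in for SolrJ's SolrClient.add(Collection<SolrInputDocument>);
    // in real code this would be an HttpSolrClient pointed at the core URL.
    interface IndexClient {
        void add(List<String> batch);
    }

    static final int BATCH_SIZE = 100;

    // Sends documents in batches of BATCH_SIZE and returns the number of
    // batches sent. The batch list is cleared after each add(), so no more
    // than BATCH_SIZE documents remain reachable between calls.
    static int indexAll(Iterable<String> documents, IndexClient client) {
        List<String> batch = new ArrayList<>(BATCH_SIZE);
        int batches = 0;
        for (String doc : documents) {
            batch.add(doc);
            if (batch.size() == BATCH_SIZE) {
                client.add(new ArrayList<>(batch)); // hand off a copy
                batch.clear();                      // drop client-side refs
                batches++;
            }
        }
        if (!batch.isEmpty()) { // flush the final partial batch
            client.add(new ArrayList<>(batch));
            batches++;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<String> docs = new ArrayList<>();
        for (int i = 0; i < 250; i++) docs.add("doc-" + i);
        final int[] maxBatchSize = {0};
        int batches = indexAll(docs,
                b -> maxBatchSize[0] = Math.max(maxBatchSize[0], b.size()));
        System.out.println(batches + " batches, max size " + maxBatchSize[0]);
        // 250 documents -> 3 batches (100, 100, 50), max size 100
    }
}
```

If tens of thousands of SolrInputDocument instances show up in a heap dump despite this pattern, something is still holding references to past batches (or to the serialized request bodies), which is worth checking before blaming SolrJ itself.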


-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Friday, 19 February 2016 09:07
To: solr-user@lucene.apache.org
Subject: OutOfMemory when batchupdating from SolrJ

Environment: Solr 5.4.1

I am facing OOMs when batch-updating via SolrJ. I am seeing approx 30'000(!)
SolrInputDocument instances, although my batch size is 100, i.e. I call
solrClient.add( documents ) only once per 100 documents. So I'd expect to see
at most 100 SolrInputDocuments in memory at any moment, UNLESS
a) solrClient.add is "asynchronous" in nature, in which case the QueryResponse
would be an async result?
or
b) SolrJ is spooling the documents client-side.

What might be going wrong?

Thx for your advices
Clemens
