Clemens,

What I understand from your emails above is that you are creating
SolrInputDocuments in batches inside a loop, and those objects are created
on the heap. SolrJ/SolrClient has no control over removing those objects
from the heap; that is handled by garbage collection. So your program may
end up in a situation where no heap memory is left, or the GC cannot free
up enough memory and you hit an OOM, because the loop keeps running and
creating more and more objects. By default only a modest maximum heap is
allocated to a Java program unless you set -Xmx when you launch it.
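 
For illustration, here is a minimal sketch of such a batching loop that
lets the GC reclaim each batch once it has been sent. The core URL and
fetchNextBatch() are hypothetical placeholders for your own setup:

  import java.util.List;
  import org.apache.solr.client.solrj.SolrClient;
  import org.apache.solr.client.solrj.impl.HttpSolrClient;
  import org.apache.solr.common.SolrInputDocument;

  // inside your indexing method
  try (SolrClient solrClient = new HttpSolrClient("http://localhost:8983/solr/mycore")) {
      List<SolrInputDocument> batch;
      while ((batch = fetchNextBatch(100)) != null) {  // your own data source
          solrClient.add(batch);  // synchronous HTTP request for this batch
          // no reference to the batch survives this iteration, so its
          // documents become eligible for garbage collection
      }
  }

A plain HttpSolrClient's add() is a synchronous HTTP request and does not
retain the documents afterwards, so if heap usage still grows across
iterations, the references are most likely being held somewhere in your
own code.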

Hope that clarifies.

On Fri, Feb 19, 2016 at 12:11 PM, Clemens Wyss DEV <clemens...@mysign.ch>
wrote:

> Thanks Susheel,
> but I am having problems in, and am talking about, SolrJ, i.e. the
> "client side" of Solr ...
>
> -----Original Message-----
> From: Susheel Kumar [mailto:susheel2...@gmail.com]
> Sent: Friday, 19 February 2016 17:23
> To: solr-user@lucene.apache.org
> Subject: Re: OutOfMemory when batchupdating from SolrJ
>
> Clemens,
>
> First, allocating a higher (or simply the right) amount of heap memory is
> not a workaround; it becomes a requirement, depending on how much heap
> memory your Java program needs.
> Please read about why Solr needs heap memory at
> https://wiki.apache.org/solr/SolrPerformanceProblems
>
> Thanks,
> Susheel
>
>
>
> On Fri, Feb 19, 2016 at 9:17 AM, Clemens Wyss DEV <clemens...@mysign.ch>
> wrote:
>
> > > increase heap size
> > this is a "workaround"
> >
> > Doesn't SolrClient free part of its buffer? At least the documents it
> > has already sent to the Solr server?
> >
> > -----Original Message-----
> > From: Susheel Kumar [mailto:susheel2...@gmail.com]
> > Sent: Friday, 19 February 2016 14:42
> > To: solr-user@lucene.apache.org
> > Subject: Re: OutOfMemory when batchupdating from SolrJ
> >
> > When you run your SolrJ client indexing program, can you increase the
> > heap size as shown below? I guess it may be on your client side that you
> > are running into the OOM. Please share the exact error if the setting
> > below doesn't work / isn't the issue.
> >
> >  java -Xmx4096m ....
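> >
> > To verify the setting took effect, you can print the maximum heap from
> > inside the client program (standard JDK call):
> >
> >  // max heap available to this JVM, in megabytes
> >  System.out.println(Runtime.getRuntime().maxMemory() / (1024 * 1024));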
> >
> >
> > Thanks,
> >
> > Susheel
> >
> > On Fri, Feb 19, 2016 at 6:25 AM, Clemens Wyss DEV
> > <clemens...@mysign.ch>
> > wrote:
> >
> > > Guessing further ;) :
> > > must I commit after every "batch" in order to force a flushing of
> > > org.apache.solr.client.solrj.request.RequestWriter$LazyContentStream et al?
> > >
> > > OTOH it is advocated to NOT "commit" from a (SolrJ) client:
> > >
> > > https://lucidworks.com/blog/2013/08/23/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/
> > > 'Be very careful committing from the client! In fact, don’t do it'
> > >
> > > I would not want to commit "just to flush a client side buffer" ...
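> > >
> > > (As an aside: if it were only about making the documents visible, the
> > > two-argument add( docs, commitWithinMs ) overload lets Solr schedule
> > > the commit itself; the value below is purely illustrative:
> > >
> > >  solrClient.add( batch, 60000 );  // let Solr commit within ~60s
> > >
> > > But that says nothing about flushing client-side buffers either.)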
> > >
> > > -----Original Message-----
> > > From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
> > > Sent: Friday, 19 February 2016 11:09
> > > To: solr-user@lucene.apache.org
> > > Subject: Re: OutOfMemory when batchupdating from SolrJ
> > >
> > > The char[] which occupies 180MB has the following "path to root":
> > >
> > > char[87690841] @ 0x7940ba658  <add><doc boost="1.0"><field name="_my_id">shopproducts#<CUT>...
> > > |- <Java Local> java.lang.Thread @ 0x7321d9b80  SolrUtil executorService for core 'fust-1-fr_CH_1' -3-thread-1 Thread
> > > |- value java.lang.String @ 0x79e804110  <add><doc boost="1.0"><field name="_my_id">shopproducts#<CUT>...
> > > |  '- str org.apache.solr.common.util.ContentStreamBase$StringStream @ 0x77fd84680
> > > |     |- <Java Local> java.lang.Thread @ 0x7321d9b80  SolrUtil executorService for core 'fust-1-fr_CH_1' -3-thread-1
> > > |     |- contentStream org.apache.solr.client.solrj.request.RequestWriter$LazyContentStream @ 0x77fd846a0
> > > |     |  |- <Java Local> java.lang.Thread @ 0x7321d9b80  SolrUtil executorService for core 'fust-1-fr_CH_1' -3-thread-1 Thread
> > > |     |  |- [0] org.apache.solr.common.util.ContentStream[1] @ 0x79e802fb8
> > > |     |  |  '- <Java Local> java.lang.Thread @ 0x7321d9b80  SolrUtil executorService for core 'fust-1-fr_CH_1' -3-thread-1 Thread
> > >
> > > And there is another byte[] occupying 260MB.
> > >
> > > The logic is roughly this:
> > >
> > > SolrClient solrClient = new HttpSolrClient( coreUrl );
> > > while ( got more elements to index ) {
> > >   batch = create 100 SolrInputDocuments
> > >   solrClient.add( batch )
> > > }
> > >
> > >
> > > -----Original Message-----
> > > From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
> > > Sent: Friday, 19 February 2016 09:07
> > > To: solr-user@lucene.apache.org
> > > Subject: OutOfMemory when batchupdating from SolrJ
> > >
> > > Environment: Solr 5.4.1
> > >
> > > I am facing OOMs when batch-updating through SolrJ. I am seeing approx
> > > 30'000(!) SolrInputDocument instances, although my batch size is 100.
> > > I.e. I call solrClient.add( documents ) only once per 100 documents.
> > > So I'd expect to see at most 100 SolrInputDocuments in memory at
> > > any moment, UNLESS
> > > a) solrClient.add is "asynchronous" in nature. Then QueryResponse
> > > would be an async result?
> > > or
> > > b) SolrJ is spooling the documents on the client side
> > >
> > > What might be going wrong?
> > >
> > > Thx for your advice
> > > Clemens
> > >
> > >
> >
>
