Hi, 
You can also make use of Solr's autocommit feature.
You have two possibilities: either based on the maximum number of uncommitted
docs, or based on time.
See the <updateHandler> section of your solrconfig.xml.

Example:

<autoCommit>
   <!--
   <maxDocs>10000</maxDocs>
   -->

   <!-- maximum time (in ms) after adding a doc before an autocommit is
   triggered -->
   <maxTime>600000</maxTime>
</autoCommit>


Once you're done adding documents, run a final optimize/commit.
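That final commit/optimize can be sent to Solr's update handler as XML update messages. A minimal sketch in Python, assuming a default Solr instance at localhost:8983 with the standard /update handler (adjust the URL for your deployment); the send calls are left commented out since they need a running Solr:

```python
# Sketch: issue a final commit and optimize via Solr's XML update messages.
# The base URL below is an assumption (default localhost setup).
import urllib.request

SOLR_UPDATE = "http://localhost:8983/solr/update"

def update_request(message: str) -> urllib.request.Request:
    """Build a POST request carrying a Solr XML update message."""
    return urllib.request.Request(
        SOLR_UPDATE,
        data=message.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
    )

# After the last batch of documents has been added:
commit_req = update_request("<commit/>")
optimize_req = update_request("<optimize/>")
# urllib.request.urlopen(commit_req)    # send only against a running Solr
# urllib.request.urlopen(optimize_req)
```

Doing one commit and one optimize at the end of the batch, rather than per document, is what keeps the load fast.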

Regards, 
P.N.Raju, 




________________________________
From: Jagdish Vasani <jvasani1...@gmail.com>
To: solr-user@lucene.apache.org
Sent: Thu, February 18, 2010 3:12:15 PM
Subject: Re: optimize is taking too much time

Hi,

you should not optimize the index after each document insert. Instead, you
should optimize after inserting a good number of documents, because an
optimize merges all segments into one according to the Lucene index settings,
which is expensive on a large index.
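For reference, how quickly segments accumulate (and therefore how much work an optimize has to do) is governed by the merge settings in solrconfig.xml. A typical <indexDefaults> fragment looks like the following; the values are illustrative defaults, not a recommendation:

```xml
<indexDefaults>
  <!-- how many segments may accumulate before a background merge -->
  <mergeFactor>10</mergeFactor>
  <!-- how many docs are buffered in RAM before being flushed to a segment -->
  <maxBufferedDocs>1000</maxBufferedDocs>
</indexDefaults>
```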

thanks,
Jagdish
On Fri, Feb 12, 2010 at 4:01 PM, mklprasad <mklpra...@gmail.com> wrote:

>
> hi
> in my Solr I have 1,42,45,223 records, totalling some 50 GB.
> Now when I am loading a new record and it tries to optimize the docs, it is
> taking too much memory and time
>
>
> can anybody please tell me whether we have any property in Solr to get rid
> of this.
>
> Thanks in advance
>
> --
> View this message in context:
> http://old.nabble.com/optimize-is-taking-too-much-time-tp27561570p27561570.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>
