The benefit falls off fairly rapidly as the batch size increases. I did some 
crude benchmarking here: 
https://lucidworks.com/2015/10/05/really-batch-updates-solr-2/

If I were going to pick a number, I’d say 100 docs _per shard_. So if you have 
10 shards, batch 1,000 docs if possible.
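In SolrJ terms, that just means buffering SolrInputDocuments in a list and
calling add() once per batch rather than once per document. A minimal sketch
of the pattern, assuming a Solr instance at localhost:8983 and a collection
named "mycollection" (both placeholders), using the 10-shard batch size above:

import java.util.ArrayList;
import java.util.List;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class BatchedIndexer {
  public static void main(String[] args) throws Exception {
    try (SolrClient client = new HttpSolrClient.Builder(
        "http://localhost:8983/solr/mycollection").build()) {

      final int batchSize = 1000;  // e.g. 100 docs/shard * 10 shards
      List<SolrInputDocument> batch = new ArrayList<>(batchSize);

      for (int i = 0; i < 100_000; i++) {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", Integer.toString(i));
        doc.addField("title_txt", "document " + i);  // hypothetical field
        batch.add(doc);

        if (batch.size() >= batchSize) {
          client.add(batch);  // one HTTP request for the whole batch
          batch.clear();
        }
      }
      if (!batch.isEmpty()) {
        client.add(batch);    // flush the final partial batch
      }
      // Visibility/durability is usually better left to autoCommit and
      // autoSoftCommit in solrconfig.xml; one explicit commit at the end
      // is optional.
      client.commit();
    }
  }
}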

Note that the efficiencies here come almost entirely from amortizing the
overhead of establishing a connection to Solr and the like, rather than from
the actual indexing work Solr does.
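If you'd rather not manage the batches yourself, SolrJ's
ConcurrentUpdateSolrClient does roughly the same thing for you: it queues
documents client-side and streams them to Solr in batches from background
threads. Something like this (queue size and thread count are illustrative,
not tuned):

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrClient;

// Buffers adds client-side and flushes them from background threads,
// amortizing the per-request connection overhead automatically.
SolrClient client = new ConcurrentUpdateSolrClient.Builder(
        "http://localhost:8983/solr/mycollection")
    .withQueueSize(1000)
    .withThreadCount(4)
    .build();

One caveat: it reports indexing failures asynchronously, so watch the logs
or override its error handling rather than expecting add() to throw.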

Best,
Erick

> On Apr 25, 2019, at 3:13 PM, Markus Jelsma <markus.jel...@openindex.io> wrote:
> 
> Hello,
> 
> There is no definitive rule for this; it depends on your situation: the 
> size of your documents, resource constraints, a possibly heavy analysis 
> chain, and so on. And when (re)indexing a large amount of data, your 
> autocommit time/limit is probably more important.
> 
> In our case, some collections are fine with 5000+ batch sizes, but others are 
> happy with just a hundred. One has small documents and no text analysis, the 
> other quite the opposite.
> 
> Finding a sweet spot is trial and error.
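
That trial and error is easy to script, FWIW: index the same test corpus at a
few candidate batch sizes and time each run. A rough sketch, assuming SolrJ
and a throwaway test collection (URL, field names and doc counts are
placeholders):

import java.util.ArrayList;
import java.util.List;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class BatchSizeTrial {
  public static void main(String[] args) throws Exception {
    try (SolrClient client = new HttpSolrClient.Builder(
        "http://localhost:8983/solr/testcollection").build()) {
      final int totalDocs = 50_000;
      for (int batchSize : new int[] {100, 500, 1000, 5000}) {
        long start = System.nanoTime();
        List<SolrInputDocument> batch = new ArrayList<>(batchSize);
        for (int i = 0; i < totalDocs; i++) {
          SolrInputDocument doc = new SolrInputDocument();
          doc.addField("id", batchSize + "-" + i);
          batch.add(doc);
          if (batch.size() >= batchSize) {
            client.add(batch);
            batch.clear();
          }
        }
        if (!batch.isEmpty()) {
          client.add(batch);
        }
        client.commit();  // commit inside the timed region so runs compare
        double secs = (System.nanoTime() - start) / 1e9;
        System.out.printf("batchSize=%d: %.1f s%n", batchSize, secs);
      }
    }
  }
}

Back-to-back runs hit an increasingly warm index and OS cache, so repeat each
size a few times with documents representative of your real data before
trusting the numbers.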
> 
> Cheers,
> Markus
> 
> 
> 
> -----Original message-----
>> From: Lucky Sharma <goku0...@gmail.com>
>> Sent: Thursday 25th April 2019 21:48
>> To: solr-user@lucene.apache.org
>> Subject: Solr-Batch Update
>> 
>> Hi all,
>> While creating an update request to Solr, it is recommended to send
>> batched requests instead of many small updates. What is the optimum
>> batch size? Is there a number, or a computation, that can help us
>> determine it?
>> 
>> 
>> -- 
>> Warm Regards,
>> 
>> Lucky Sharma
>> Contact No: +91 9821559918
>> 
