Thanks Alex,
My usage is:
1. Execute a query and get the results, returning ids only.
2. Add a value to a dynamic field.
3. Save to Solr with a batch size of 1000.

I have defined 50 queries and run them in parallel. I have also disabled hard
commits, and I soft commit every 1000 docs.

I am wondering whether any configuration can speed it up.
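
In SolrJ terms, each worker does roughly the following (a simplified sketch;
the collection name, query, and dynamic field name below are placeholders,
not my real ones):

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.response.QueryResponse;
    import org.apache.solr.common.SolrDocument;
    import org.apache.solr.common.SolrInputDocument;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class TagByQuery {
        public static void main(String[] args) throws Exception {
            SolrClient client =
                new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build();

            SolrQuery q = new SolrQuery("category:foo"); // one of the 50 queries
            q.setFields("id");                           // 1. return id only
            q.setRows(100000);

            QueryResponse rsp = client.query(q);
            List<SolrInputDocument> batch = new ArrayList<>();
            for (SolrDocument d : rsp.getResults()) {
                SolrInputDocument upd = new SolrInputDocument();
                upd.addField("id", d.getFieldValue("id"));
                // 2. atomic "set" of a value on a dynamic field
                upd.addField("tag_s", Collections.singletonMap("set", "foo"));
                batch.add(upd);
                if (batch.size() == 1000) {              // 3. save with batch size 1000
                    client.add(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                client.add(batch);
            }
            client.commit();                             // explicit commit at the end
            client.close();
        }
    }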






On Sunday, June 30, 2019, 10:39 AM, Alexandre Rafalovitch <arafa...@gmail.com> 
wrote:

Indexing new documents is just adding additional segments.

Adding a new field to a document means:
1) Reading the existing document (may not always be possible, depending on
field configuration)
2) Marking the existing document as deleted
3) Creating a new document with the reconstructed fields plus the new ones
4) Possibly triggering a merge if a lot of documents have been updated
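
In other words, a partial update is roughly equivalent to doing the following
yourself on the client side (just a sketch; the collection, id and field names
are made up):

    SolrClient client =
        new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build();

    // 1) read the existing document (requires stored or docValues fields)
    SolrDocument existing = client.getById("doc-123");

    // 3) rebuild it: the reconstructed fields plus the new one
    SolrInputDocument rebuilt = new SolrInputDocument();
    for (String f : existing.getFieldNames()) {
        if (!"_version_".equals(f)) {   // copyField targets would also be skipped
            rebuilt.addField(f, existing.getFieldValue(f));
        }
    }
    rebuilt.addField("tag_s", "some value");

    // 2) re-adding marks the old version as deleted and writes the new one;
    // 4) enough of these and segment merges kick in
    client.add(rebuilt);
    client.commit();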

Perhaps the above is a contributing factor. But I also feel that maybe
there is some detail in your question I did not fully understand.

Regards,
  Alex.

On Sun, 30 Jun 2019 at 10:33, derrick cui
<derrickcui...@yahoo.ca.invalid> wrote:
>
> I have 400k documents. Indexing is pretty fast, taking only 10 minutes, but adding a
> dynamic field to all documents according to query results is very slow, taking
> about 1.5 hours.
> Does anyone know what could be the reason?
> Thanks


