Sorry. Thanks in advance!

From: LOPEZ-CORTES Mariano-ext
Sent: Monday, March 19, 2018 16:50
To: 'solr-user@lucene.apache.org'
Subject: RE: Solr list question

Hello

We have a Solr index with 3 nodes, 1 shard and 2 replicas.

Our goal is to index 42 million rows, and indexing time matters. The data
source is an Oracle database.

Our indexing strategy is:

·         Reading from Oracle into one big CSV file.

·         Reading from 4 files (the big file split into chunks) and injecting
via ConcurrentUpdateSolrClient.
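For reference, the chunking step might look like the following minimal Java sketch. It assumes a plain line-oriented CSV with a single header row; the file names and chunk count are illustrative, and at 42 million rows you would stream the file rather than read it fully into memory as done here.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.*;

public class CsvSplitter {
    // Split a CSV into `parts` chunk files, repeating the header row in
    // each chunk so every file can be ingested independently.
    // NOTE: readAllLines keeps the whole file in memory; fine for a
    // sketch, but a real 42M-row pipeline should stream line by line.
    public static List<Path> split(Path csv, Path outDir, int parts) throws IOException {
        List<String> lines = Files.readAllLines(csv);
        String header = lines.get(0);
        int rows = lines.size() - 1;
        int per = (rows + parts - 1) / parts; // ceiling division: rows per chunk
        List<Path> chunks = new ArrayList<>();
        for (int p = 0; p < parts; p++) {
            int from = 1 + p * per;
            if (from >= lines.size()) break;  // no rows left for this chunk
            int to = Math.min(from + per, lines.size());
            Path out = outDir.resolve("chunk-" + p + ".csv"); // illustrative name
            List<String> body = new ArrayList<>();
            body.add(header);
            body.addAll(lines.subList(from, to));
            Files.write(out, body);
            chunks.add(out);
        }
        return chunks;
    }

    public static void main(String[] args) throws IOException {
        // Small demo: 10 data rows split 4 ways.
        Path dir = Files.createTempDirectory("split");
        Path csv = dir.resolve("big.csv");
        List<String> demo = new ArrayList<>();
        demo.add("id,name");
        for (int i = 0; i < 10; i++) demo.add(i + ",row" + i);
        Files.write(csv, demo);
        List<Path> chunks = split(csv, dir, 4);
        System.out.println(chunks.size());                              // 4
        System.out.println(Files.readAllLines(chunks.get(0)).size());   // 4 (header + 3 rows)
    }
}
```

Repeating the header in every chunk keeps each file self-describing, so the four injector processes need no shared state.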

Is this the optimal way to inject such a mass of data into Solr?

For information, the estimated time for our current solution is 6 hours.
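The parallel-injection step could be sketched like this. To keep the example self-contained and runnable, the Solr call is replaced by a plain Consumer; in the real pipeline each row would be turned into a SolrInputDocument and handed to a shared ConcurrentUpdateSolrClient (which batches and queues updates internally), so the sink passed in must be thread-safe.

```java
import java.nio.file.*;
import java.util.*;
import java.util.concurrent.*;
import java.util.function.Consumer;

public class ParallelIngest {
    // Read several chunk files in parallel and hand each data row to a
    // sink, returning the total number of rows processed. The sink stands
    // in for the real Solr client call and must be thread-safe.
    public static long ingest(List<Path> chunks, Consumer<String> sink)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(chunks.size());
        List<Future<Long>> futures = new ArrayList<>();
        for (Path chunk : chunks) {
            futures.add(pool.submit(() -> {
                List<String> lines = Files.readAllLines(chunk);
                for (int i = 1; i < lines.size(); i++) { // i = 1 skips the header
                    sink.accept(lines.get(i));
                }
                return (long) (lines.size() - 1);
            }));
        }
        long total = 0;
        for (Future<Long> f : futures) total += f.get(); // propagate any failure
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        // Demo: two chunk files, three rows each.
        Path dir = Files.createTempDirectory("ingest");
        List<Path> chunks = new ArrayList<>();
        for (int c = 0; c < 2; c++) {
            Path p = dir.resolve("chunk-" + c + ".csv");
            Files.write(p, Arrays.asList("id,name", c + "0,a", c + "1,b", c + "2,c"));
            chunks.add(p);
        }
        long total = ingest(chunks, row -> { /* would call solrClient.add(...) here */ });
        System.out.println(total); // 6
    }
}
```

Whether four reader threads beat a single stream depends on where the bottleneck is; ConcurrentUpdateSolrClient already parallelizes the HTTP side, so it is worth measuring one reader against four before committing to the chunked layout.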
