I am doing some tests with index merging and have a performance question.
I am doing the merge in a simple way, something like:
FSDirectory[] indexes = new FSDirectory[indexList.size()];
for (int i = 0; i < indexList.size(); i++) {
    indexes[i] = FSDirectory.open(new File(indexList.get(i)));
}
w.addIndexesNoOptimize(indexes);
w.close();
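
For context, w is just an ordinary IndexWriter opened on the destination
directory, roughly like this (the path and analyzer are placeholders; I am
on the Lucene 3.0-style API):

import java.io.File;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.util.Version;

// destination index for the merge
FSDirectory dest = FSDirectory.open(new File("/path/to/merged-index"));
IndexWriter w = new IndexWriter(dest,
        new StandardAnalyzer(Version.LUCENE_30),
        true,                                   // create a new (empty) index
        IndexWriter.MaxFieldLength.UNLIMITED);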
indexList.size() is 50 and it contains the paths to the indexes. Each of
these 50 indexes contains 500,000 docs and is about 500 MB on disk. I have
realised that 50% of the time is spent in w.addIndexesNoOptimize(indexes)
and the other 50% in w.close() (I suppose because close commits and has to
wait for all the merges to complete).
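
To break those two costs down a bit further I was thinking of timing the
calls separately, something like this (just a sketch; I am assuming
waitForMerges() is available in my version):

long t0 = System.currentTimeMillis();
w.addIndexesNoOptimize(indexes);     // queues and runs the merges
long t1 = System.currentTimeMillis();

w.waitForMerges();                   // wait for background merges to finish
long t2 = System.currentTimeMillis();

w.close();                           // commit + close
long t3 = System.currentTimeMillis();

System.out.println("addIndexesNoOptimize: " + (t1 - t0) + " ms");
System.out.println("waitForMerges:        " + (t2 - t1) + " ms");
System.out.println("close:                " + (t3 - t2) + " ms");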
I am wondering if there is a way to do this faster. For example, merge the
50 indexes into 25, those 25 into 12 or 13, those into 6 or 7, and so on
until getting a single big index. Could this be faster?
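
What I have in mind is roughly the following (just a sketch with placeholder
temporary paths, using the same writer setup and imports as above):

import java.util.ArrayList;
import java.util.List;
import org.apache.lucene.store.Directory;

// Merge the source indexes pairwise in rounds until a single index remains.
List<String> current = new ArrayList<String>(indexList);   // the 50 source paths
int round = 0;
while (current.size() > 1) {
    List<String> next = new ArrayList<String>();
    for (int i = 0; i < current.size(); i += 2) {
        if (i + 1 == current.size()) {                      // odd one out: carry it forward
            next.add(current.get(i));
            break;
        }
        String outPath = "/tmp/merge-round" + round + "-" + (i / 2);   // placeholder path
        IndexWriter writer = new IndexWriter(
                FSDirectory.open(new File(outPath)),
                new StandardAnalyzer(Version.LUCENE_30),
                true,                                       // create the intermediate index
                IndexWriter.MaxFieldLength.UNLIMITED);
        writer.addIndexesNoOptimize(new Directory[] {
                FSDirectory.open(new File(current.get(i))),
                FSDirectory.open(new File(current.get(i + 1))) });
        writer.close();
        next.add(outPath);
    }
    current = next;
    round++;
}
// current.get(0) now holds the path of the single merged index

My doubt is that each round rewrites all the data, so I am not sure the
smaller individual merges would compensate for the extra disk I/O.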
Does anyone have experience with this? Any advice?
Thanks in advance