Hi,

We have a requirement to index almost 100,000 documents, each with at least 20 fields. No field is longer than 10 KB.
We also run searches against the same index in parallel. Currently, indexing the entire set takes almost 3 minutes. Our strategy so far: we commit after every 15,000 docs (posted as a single large XML doc), and the merge factor is 10 for now.

I am wondering whether increasing the merge factor to 25 or 50 would improve indexing performance. What about the RAM buffer size (the default is 32 MB)? Which other factors should we consider? When should we run optimize? Would any other deviation from the defaults help us reach our target?

We allocate a JVM max heap of 512 MB, and the concurrent mark-sweep collector is configured for garbage collection.

Thanks,
Naveen
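For reference, the setup described above would look roughly like this in solrconfig.xml. This is only a sketch: the element names follow the example solrconfig.xml shipped with Solr, and the values are the ones mentioned in this post.

```xml
<!-- solrconfig.xml (sketch of the settings described above) -->
<indexDefaults>
  <!-- current merge factor; considering raising it to 25 or 50 -->
  <mergeFactor>10</mergeFactor>
  <!-- default RAM buffer size in MB; a candidate for tuning -->
  <ramBufferSizeMB>32</ramBufferSizeMB>
</indexDefaults>
```

On the JVM side, the described setup corresponds to starting Solr with `-Xmx512m -XX:+UseConcMarkSweepGC`.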