We have a 125GB shard that we are attempting to split, but each attempt eventually runs out of memory (java.lang.OutOfMemoryError: GC overhead limit exceeded). We have tried the following heap sizes on the shard leader: 4GB, 6GB, 12GB, and 24GB. Even if a larger heap would eventually get it through, should a split require increasing the heap size at all? Has anyone successfully split a shard of this size with SolrCloud 4.6.0?
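
For reference, this is essentially how we are kicking off the split: just the standard Collections API SPLITSHARD call sent to one of the nodes, with the heap set on the leader's JVM via -Xmx (e.g. -Xmx24g for the largest attempt). The sketch below is a minimal plain-HTTP version; the host, collection, and shard names are placeholders, not our real ones.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class SplitShardCall {
        public static void main(String[] args) throws Exception {
            // Standard Collections API SPLITSHARD request; "mycollection" and
            // "shard1" are placeholders for our real collection and shard.
            URL url = new URL("http://localhost:8983/solr/admin/collections"
                    + "?action=SPLITSHARD&collection=mycollection&shard=shard1");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            // Read back and print Solr's XML status response.
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
            in.close();
        }
    }
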
Thanks, Will