Because of a configuration requirement unique to our setup, we move our
crawl directories off the node that generates them and onto the nodes
that serve them.
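For context, the move itself is just a one-way copy of the finished crawl
output, staged and then swapped in so the searcher never sees a
half-copied directory. A minimal local sketch, assuming the usual Nutch
crawl layout (crawldb/, linkdb/, segments/, index/) and hypothetical
paths; in production the copy would go over the network (e.g. rsync/scp)
rather than cp:

```shell
#!/bin/sh
# Illustration only: SRC stands in for the crawl node's output and DEST
# for the serving node's copy. All paths here are hypothetical.
set -e

SRC=$(mktemp -d)/crawl
DEST=$(mktemp -d)/crawl

# Mock the usual Nutch crawl layout on the "crawl node".
mkdir -p "$SRC/crawldb" "$SRC/linkdb" "$SRC/segments" "$SRC/index"

# Stage the copy first, then rename it into place, so the searcher
# never reads a partially transferred directory.
cp -a "$SRC" "$DEST.staging"
mv "$DEST.staging" "$DEST"
```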

What is the minimum amount of data that the searcher needs to function
correctly?  We keep separate crawls for 14 different sites, and we're
beginning to run out of space, so we're looking for ways to reduce the
size of the crawldb!

Thanks!
