I am going to crawl a small set of sites, and I never want the crawler to go off-site; I also want to strictly control my link depth.
I set up crawls for each site using the crawl command, then manually move the segments folder into my "master" directory and re-index. (This can all be scripted.) This gives me the flexibility to QA each individual crawl. Am I jumping through unnecessary hoops here, or does this sound like a reasonable plan?
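
For reference, here's a rough sketch of what the scripted version might look like. This assumes Nutch 1.x-style commands; the site names, directory layout, and depth/topN values are all placeholders, and your exact merge/re-index steps may differ:

    #!/bin/bash
    # Hypothetical sketch of the per-site crawl + promote-to-master workflow.
    # Assumes Nutch 1.x CLI; off-site control is assumed to be handled in conf
    # (e.g. URL filters in regex-urlfilter.txt or db.ignore.external.links).
    MASTER=crawl-master
    DEPTH=2          # strict link-depth control per site

    for SITE in site1 site2 site3; do
        # One seed-list dir per site keeps each crawl isolated for QA
        bin/nutch crawl urls/$SITE -dir crawl-$SITE -depth $DEPTH -topN 1000

        # QA the individual crawl here before promoting it, e.g.:
        # bin/nutch readdb crawl-$SITE/crawldb -stats

        # Promote the QA'd segments into the master directory
        cp -r crawl-$SITE/segments/* $MASTER/segments/
    done

    # Re-index the combined segments (you may also need to merge the
    # per-site crawldbs into the master one, e.g. with bin/nutch mergedb)
    bin/nutch invertlinks $MASTER/linkdb -dir $MASTER/segments
    bin/nutch index $MASTER/indexes $MASTER/crawldb $MASTER/linkdb $MASTER/segments/*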
