Hi, I have a use case where I need to define schedules for crawling certain domains with Nutch. I'm having a hard time wrapping my head around how this would be set up. It looks to me that Nutch is designed to run as a single instance that can itself handle a huge number of hosts.
So let's say I have three organizations whose sites I will be crawling. Each organization will have its own set of seeds, its own configuration, and its own start and stop times for active crawling. Conceivably, each of these three organizations would have its own crawl jobs that get fired up based on that organization's defined schedule. Therefore, it is possible that two or more jobs will be running at the same time. Is this something that can be set up? Thank you,
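To make the setup concrete, here is a sketch of the kind of scheduling I have in mind, written as cron entries. This assumes one seed directory, configuration directory, and crawl directory per organization; the paths are illustrative, and the exact `bin/crawl` arguments may vary by Nutch version:

```
# Hypothetical crontab: one crawl job per organization, each with its own
# seed list, configuration directory, and schedule. All paths are made up.

# Org A: crawl nightly at 01:00, 3 generate/fetch/parse/update rounds
0 1 * * *  NUTCH_CONF_DIR=/data/orgA/conf /opt/nutch/bin/crawl -s /data/orgA/seeds /data/orgA/crawl 3

# Org B: crawl weekly on Sundays at 02:00
0 2 * * 0  NUTCH_CONF_DIR=/data/orgB/conf /opt/nutch/bin/crawl -s /data/orgB/seeds /data/orgB/crawl 3

# Org C: crawl on the 1st of each month at 03:00 (may overlap with A or B)
0 3 1 * *  NUTCH_CONF_DIR=/data/orgC/conf /opt/nutch/bin/crawl -s /data/orgC/seeds /data/orgC/crawl 3
```

The question, then, is whether running separate crawl invocations like this, each against its own crawldb, is a supported and sensible way to use Nutch, or whether there is a better-established pattern for per-tenant scheduling.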

