Hi!

Is there a way to update a previously generated crawl?
When I run the crawl command and a crawl directory already exists,
it fails with an error saying that a thread is already using it.
So I have to delete that directory before I can run crawl again.

Does this mean I can't update my crawled data? Is there a way to
update it without losing my previous crawl?
Also, how can I have the system update my crawl data automatically,
say once a week or once a fortnight?
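(In case it helps frame the question: one workaround I was considering is rotating the old crawl directory aside before re-crawling, and scheduling that with cron. This is only a sketch under assumptions: the directory name "crawl", the commented-out crawl invocation, and the script path in the cron line are all hypothetical, not taken from any tool's documentation.)

```shell
#!/bin/sh
# Sketch: keep the previous crawl by renaming its directory with a
# timestamp before starting a new crawl, instead of deleting it.
# The directory name and the crawl invocation below are assumptions.
CRAWL_DIR="crawl"

if [ -d "$CRAWL_DIR" ]; then
  # Move the old crawl aside, e.g. crawl.20240101120000
  mv "$CRAWL_DIR" "${CRAWL_DIR}.$(date +%Y%m%d%H%M%S)"
fi

# Hypothetical re-crawl command (uncomment and adapt):
# crawl urls "$CRAWL_DIR"

# To run this automatically once a week, a crontab entry could be
# used (standard crontab syntax; path is hypothetical):
# 0 3 * * 0  /path/to/recrawl.sh   # every Sunday at 03:00
```

The rename step means each run starts with a clean directory while older crawls stay on disk, though it does not merge new results into the old data.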

Thanks and regards,
Mayank.
-- 
Mayank Kamthan
