My Nutch crawl has just stopped. The process is still running, but it doesn't respond to a "kill -TERM" or a "kill -HUP", and it hasn't written anything to the log file in the last 40 minutes. The last thing it logged was some calls to my custom URL filter. Nothing has been written to the hadoop directory, the crawldir/crawldb, or the segments directory in that time.
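Since Nutch runs in a JVM, would a thread dump show where it's stuck? This is roughly what I'm considering trying (the "nutch" pgrep pattern is a guess at how the process shows up in the process list):

```shell
# Find the PID of the (presumed) Nutch JVM -- the "nutch" pattern is a guess
PID=$(pgrep -f nutch | head -n 1)

# SIGQUIT makes a HotSpot JVM print a full thread dump to its stdout/stderr
# without terminating it; the dump usually lands in the console log.
kill -QUIT "$PID"

# With a JDK installed, jstack prints the same dump directly to the terminal:
jstack "$PID"

# strace shows whether the process is blocked in a system call
# (e.g. a hung socket read during a fetch or a DNS lookup):
strace -p "$PID"
```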
How can I tell what's going on and why it's stopped?

--
http://www.linkedin.com/in/paultomblin
http://careers.stackoverflow.com/ptomblin
