Hi,

One possibility is that the URLs are being filtered out by the URL filters.
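
If you are on the default configuration, the filter rules live in
conf/regex-urlfilter.txt (the one-step crawl tool typically reads
conf/crawl-urlfilter.txt instead). The first matching +/- rule wins, so
there has to be an accept rule covering your seeds before the catch-all
reject. A minimal sketch, with example.com standing in for your actual
seed domain:

    # skip file:, ftp:, and mailto: urls
    -^(file|ftp|mailto):
    # accept anything in the seed domain (replace example.com with yours)
    +^http://([a-z0-9]*\.)*example.com/
    # reject everything else
    -.

Also note that your output shows "Generator: filtering: false", so the
generate step itself filtered nothing; if the filters are the culprit,
they dropped the URLs earlier, at inject time. You can check whether the
seeds made it into the crawldb at all with

    bin/nutch readdb /user/root/crawld -stats

and if the db turns out to be empty, re-run inject and watch its log for
rejected URLs.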


jibjoice wrote:
When I run the command "bin/nutch generate /user/root/crawld
/user/root/crawld/segments",

the output is:

Generator: Selecting best-scoring urls due for fetch.
Generator: starting
Generator: segment: crawl/segments/20070419134155
Generator: filtering: false
Generator: topN: 50
Generator: 0 records selected for fetching, exiting ...
Stopping at depth=0 - no more URLs to fetch.
No URLs to fetch - check your seed list and URL filters.

Why do I get the message "Generator: 0 records selected for fetching,
exiting ...", and how can I solve it?


