When I run this command:

bin/nutch generate /user/root/crawld /user/root/crawld/segments

the output is:

Generator: Selecting best-scoring urls due for fetch.
Generator:  starting
Generator: segment: crawl/segments/20070419134155
Generator:  filtering: false
Generator: topN: 50
Generator: 0 records selected for  fetching, exiting ...
Stopping at depth=0 - no more URLs to fetch.
No URLs  to fetch - check your seed list and URL filters.

Why do I get the message "Generator: 0 records selected for fetching, exiting ...", and how can I solve it?
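For context, here is what I would expect the workflow to look like before `generate` has anything to select. This is a sketch, not exactly what I ran: the `urls/seed.txt` path and its contents are placeholder examples, not my actual setup.

```shell
# Placeholder seed directory and file -- substitute your real paths.
# The seed list must contain at least one URL:
cat urls/seed.txt
# e.g. http://example.com/

# Inject the seeds into the crawldb before generating a segment;
# if inject was never run (or all URLs were rejected by the filters
# in conf/regex-urlfilter.txt), generate finds 0 records:
bin/nutch inject /user/root/crawld urls

# Then generate should be able to select records:
bin/nutch generate /user/root/crawld /user/root/crawld/segments
```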
-- 
View this message in context: 
http://www.nabble.com/Generator%3A-0-records-selected-for--fetching%2C-exiting-...-tf4848287.html#a13871726
Sent from the Hadoop Users mailing list archive at Nabble.com.
