Oh.  It seems like you're looking for a parser to collect all links from a
given web page.  I doubt Nutch comes with a mechanism for doing this
directly, but it is a solved problem.  I'm sure Google could find you some
examples.
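For what it's worth, here is a minimal sketch of that kind of link collector using only Python's standard-library HTML parser. This is not a Nutch feature, just an illustration of the general technique; the sample HTML is made up for the demo:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href attribute of every <a> tag encountered while parsing."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical sample page for illustration
sample = '<html><body><a href="/1.html">one</a> <a href="/2.html">two</a></body></html>'
collector = LinkCollector()
collector.feed(sample)
print(collector.links)  # ['/1.html', '/2.html']
```

In practice you would fetch the page first (e.g. with urllib) and feed the response body to the parser, then repeat for each discovered link if you want to walk the whole site.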

-- Jim


On 9/12/06, Jin Yang <[EMAIL PROTECTED]> wrote:

How do you generate the list of URLs for a website? Should we add them
one by one, like this?

www.apache.org/1.html
www.apache.org/2.html
www.apache.org/3.html

Is there any tool or command that can do this?
