Hi

Try Nutch [ http://www.nutch.org/docs/en/about.html ]; underneath, it uses
Lucene. :)
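
If you'd rather roll something small yourself before adopting Nutch, here is a
minimal sketch in plain Java of fetching one page, saving a local copy, and
listing its hyperlinks. The class name, output directory, and link regex are
illustrative assumptions (not Nutch APIs), and a real crawler like Nutch also
handles robots.txt, link queues, politeness, and re-fetching:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class SimpleFetcher {
        public static void main(String[] args) throws IOException {
            // Page to fetch; pass a different URL as the first argument.
            String pageUrl = args.length > 0 ? args[0] : "http://www.nutch.org/";

            // Download the raw page bytes.
            byte[] body;
            try (InputStream in = new URL(pageUrl).openStream()) {
                body = in.readAllBytes();
            }

            // Store the copy locally, one file per page (directory name is arbitrary).
            Path outDir = Paths.get("crawl-output");
            Files.createDirectories(outDir);
            Path localCopy = outDir.resolve(pageUrl.replaceAll("[^A-Za-z0-9.-]", "_") + ".html");
            Files.write(localCopy, body);
            System.out.println("Saved " + pageUrl + " to " + localCopy);

            // List hyperlinks with a crude regex; an HTML parser is more robust.
            String html = new String(body, StandardCharsets.UTF_8);
            Matcher m = Pattern.compile("href=[\"']([^\"'#]+)[\"']",
                    Pattern.CASE_INSENSITIVE).matcher(html);
            while (m.find()) {
                System.out.println("  link -> " + m.group(1));
            }
        }
    }

That said, Nutch already does the hard parts (fetching at scale, link handling,
indexing into Lucene) for you, so I'd still start there.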

-----Original Message-----
From: Luciano Barbosa [mailto:[EMAIL PROTECTED]
Sent: Wednesday, October 20, 2004 3:06 AM
To: [EMAIL PROTECTED]
Subject: Downloading Full Copies of Web Pages


Hi folks,
I want to download full copies of web pages and store them locally, along
with their hyperlink structures, as local directories. I tried to use
Lucene, but I've realized that it doesn't have a crawler.
Does anyone know of software that does this?
Thanks,


