Hi
Try Nutch [ http://www.nutch.org/docs/en/about.html ]. Underneath, it uses
Lucene :)
-----Original Message-----
From: Luciano Barbosa [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, October 20, 2004 3:06 AM
To: [EMAIL PROTECTED]
Subject: Downloading Full Copies of Web Pages
Hi
wget does this. Little point in reinventing the wheel.
Luciano Barbosa wrote:
Hi folks,
I want to download full copies of web pages and store them locally, as
well as the hyperlink structures as local directories. I tried to use
Lucene, but I've realized that it doesn't have a crawler.
Does anyone know of software that does this?
Hi folks,
I want to download full copies of web pages and store them locally, as
well as the hyperlink structures as local directories. I tried to use
Lucene, but I've realized that it doesn't have a crawler.
Does anyone know of software that does this?
Thanks,
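For the curious, here is a rough sketch of the two pieces the question asks
for, saving a page under a directory tree that mirrors its URL path, and
extracting its outgoing links, using only the Python standard library. The
names (local_path, LinkExtractor) are illustrative, not from any of the tools
mentioned; a real crawler such as wget or Nutch also handles redirects,
robots.txt, encodings, and politeness.

```python
import os
from html.parser import HTMLParser
from urllib.parse import urlparse, urljoin

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags as the page is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def local_path(url, root="mirror"):
    """Map a URL to a local file path mirroring its host and path."""
    parts = urlparse(url)
    path = parts.path.lstrip("/") or "index.html"
    if path.endswith("/"):
        path += "index.html"
    return os.path.join(root, parts.netloc, path)

def save_page(url, html, root="mirror"):
    """Write the page's HTML to its mirrored local location."""
    dest = local_path(url, root)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    with open(dest, "w", encoding="utf-8") as f:
        f.write(html)
    return dest

# Demo on an inline page (no network fetch, to keep the sketch offline;
# fetching would use urllib.request.urlopen on each resolved link).
page = '<html><body><a href="/docs/a.html">A</a> <a href="b.html">B</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
base = "http://example.com/docs/"
resolved = [urljoin(base, href) for href in parser.links]
print(resolved)
print(local_path("http://example.com/docs/a.html"))
```

A crawler would loop: fetch a URL, call save_page, resolve the extracted
links against the page's URL, and enqueue any not yet visited.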