Hi folks,
I want to download full copies of web pages and store them locally, mirroring the hyperlink structure as local directories. I tried to use Lucene, but I've realized that it doesn't include a crawler.
Does anyone know of software that can do this?
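
To make the idea concrete, here is a minimal sketch of the kind of crawler I have in mind, written in plain Java with no external libraries. The start URL, output directory, depth limit, and the regex-based link extraction are just placeholders for illustration, not a real implementation:

    import java.io.IOException;
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.HashSet;
    import java.util.Set;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Naive single-host crawler: fetches pages and saves them under a local
    // directory tree that mirrors the URL paths.
    public class MiniMirror {
        private static final Pattern HREF = Pattern.compile("href\\s*=\\s*\"([^\"]+)\"");
        private final HttpClient client = HttpClient.newHttpClient();
        private final Set<String> visited = new HashSet<>();
        private final Path outputRoot;
        private final String host;

        MiniMirror(Path outputRoot, String host) {
            this.outputRoot = outputRoot;
            this.host = host;
        }

        void crawl(URI url, int depth) throws IOException, InterruptedException {
            // Stop at the depth limit, stay on the same host, skip visited pages.
            if (depth < 0 || !host.equals(url.getHost()) || !visited.add(url.toString())) {
                return;
            }
            HttpResponse<String> response = client.send(
                    HttpRequest.newBuilder(url).GET().build(),
                    HttpResponse.BodyHandlers.ofString());
            if (response.statusCode() != 200) {
                return;
            }

            // Map the URL path onto the local directory tree, e.g. /docs/page.html
            // becomes <outputRoot>/docs/page.html.
            String p = url.getPath();
            if (p.isEmpty() || p.endsWith("/")) {
                p = p + "index.html";
            }
            Path file = outputRoot.resolve(p.startsWith("/") ? p.substring(1) : p);
            Files.createDirectories(file.getParent());
            Files.writeString(file, response.body());

            // Follow hyperlinks found in the page (very rough HTML parsing).
            Matcher m = HREF.matcher(response.body());
            while (m.find()) {
                try {
                    crawl(url.resolve(m.group(1)), depth - 1);
                } catch (IllegalArgumentException ignored) {
                    // Skip malformed links.
                }
            }
        }

        public static void main(String[] args) throws Exception {
            URI start = URI.create("http://example.com/");  // placeholder start page
            new MiniMirror(Path.of("mirror"), start.getHost()).crawl(start, 2);
        }
    }

I'm looking for existing software that does this properly (link rewriting, politeness, robots.txt, etc.) rather than rolling my own.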
Thanks,

