Hello. I would like wget to maintain a database of what it has
downloaded earlier. With it, wget could download only new and
changed files, and could continue an earlier download without
keeping the previously downloaded files on my disk.
The database would also be accessed by other programs.
E.g., new downloads could later be merged with the earlier
downloads by another program.
E.g., the database would remind me when I'm trying to download
something I already have.
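To make the idea concrete, here is a minimal sketch of such a shared download database, assuming SQLite as the storage (an SQLite file could also be read by the other programs mentioned above). All names here (`open_db`, `is_new_or_changed`, `record_download`, the table layout) are hypothetical illustrations, not existing wget code:

```python
import sqlite3

def open_db(path=":memory:"):
    """Open (or create) the download database: one row per URL,
    with the checksum and timestamp seen at download time."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS downloads (
                      url        TEXT PRIMARY KEY,
                      checksum   TEXT NOT NULL,
                      fetched_at TEXT NOT NULL)""")
    return db

def is_new_or_changed(db, url, checksum):
    """True if the URL was never fetched, or its content changed."""
    row = db.execute("SELECT checksum FROM downloads WHERE url = ?",
                     (url,)).fetchone()
    return row is None or row[0] != checksum

def record_download(db, url, checksum):
    """Remember that this URL/checksum pair has been downloaded."""
    db.execute("INSERT OR REPLACE INTO downloads VALUES "
               "(?, ?, datetime('now'))", (url, checksum))
    db.commit()
```

With this, a downloader would call `is_new_or_changed()` before fetching and skip the URL when it returns False; checking against a stored HTTP Last-Modified or ETag value would work the same way.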
Do we have a downloader with the requested features available
already? Could somebody install and test the Nedlib Harvester?
NH was used to download all webpages in Finland (totalling 400 GB).
I don't know whether NH has all the features I need. E.g., I would
like to associate include and exclude rules with individual sites
and webpage structures, so that the next time I update my copy,
the downloader would apply the given includes and excludes.
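Such per-site include/exclude rules might look like the following sketch, using shell-style path patterns. The `RULES` table, the host name, and `should_fetch` are all invented for illustration; a real implementation would presumably store the rules in the same database:

```python
from fnmatch import fnmatch
from urllib.parse import urlparse

# Hypothetical per-site rules: each host maps to include and
# exclude path patterns, to be reapplied on the next update run.
RULES = {
    "www.example.fi": {
        "include": ["/docs/*"],
        "exclude": ["/docs/drafts/*"],
    },
}

def should_fetch(url, rules=RULES):
    """Apply a site's include/exclude patterns to a URL's path.
    Excludes win over includes; hosts without rules are fetched."""
    parts = urlparse(url)
    site = rules.get(parts.netloc)
    if site is None:
        return True
    if any(fnmatch(parts.path, pat) for pat in site["exclude"]):
        return False
    return any(fnmatch(parts.path, pat) for pat in site["include"])
```

On an update run, the downloader would filter every candidate URL through `should_fetch()` before consulting the database of earlier downloads.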