Bruce wrote, regarding Re: [netconnect] Downloading Entire Websites:
> > > Does anybody know of a program that would let me download entire
> > > websites? I've tried Tcpdl, but when it's filled the memory, it bombs
> > > the machine. Something like this should be on NC3.
> > i.e. tcpdl (and others like it) download the *entire* site, links, support
> > files and all!
>
> But this is what I want (I've got unmetered calls at the weekend). I
> have a big collection of favourite sites, mainly NASA's.
> What are the other programs like it? Please tell me : )
Hello,
I know it's been a while, but I've just come across a nice little util
called GetAllHTML (very original name :) on Aminet (.de).
It's actually an ARexx script but works very well. Just give it the
base URL and an empty download dir, and set it going.
You can specify whether you want just HTML, or pics as well, or
archives, or any combination, and you can use wildcards to filter
filenames.
You can limit it to a certain depth (say, only 3 dirs down) or let it
go all the way (up to a max of 43 subdirs).
When done, all the files are arranged in the same structure as on the
original server, just like in AF.
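
For the curious, stripped of the ARexx specifics the technique boils
down to a depth-limited crawl that mirrors the server's paths. Here's
a rough Python sketch of the idea; the names and the filename filter
are my own illustration, not GetAllHTML's actual options:

import os
import re
import urllib.parse
import urllib.request

MAX_DEPTH = 3   # e.g. "only 3 dirs down"
# filename filter: just HTML, or pics as well, etc.
WANTED = re.compile(r"\.(html?|gif|jpe?g)$", re.IGNORECASE)

def mirror(url, dest_root, depth=0, seen=None):
    seen = set() if seen is None else seen
    if depth > MAX_DEPTH or url in seen:
        return
    seen.add(url)
    data = urllib.request.urlopen(url).read()
    # lay the file out under dest_root with the same path
    # structure as on the original server
    path = urllib.parse.urlparse(url).path.lstrip("/")
    if not path or path.endswith("/"):
        path += "index.html"
    local = os.path.join(dest_root, *path.split("/"))
    os.makedirs(os.path.dirname(local) or ".", exist_ok=True)
    with open(local, "wb") as f:
        f.write(data)
    # crude link extraction; a real tool parses the HTML properly
    for href in re.findall(rb'href="([^"]+)"', data):
        link = urllib.parse.urljoin(url, href.decode("ascii", "ignore"))
        if WANTED.search(urllib.parse.urlparse(link).path):
            mirror(link, dest_root, depth + 1, seen)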
GetAllHTML also requires HTTPResume, which is actually quite good
because you can resume downloading at a later date (which I suspect
will be useful for downloading the NASA site!)
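
Resuming, which HTTPResume handles on the Amiga side, comes down to
the standard HTTP Range header: ask the server for just the bytes you
don't already have. A minimal sketch of that idea (again in Python,
again with made-up names):

import os
import urllib.request

def resume(url, local):
    # ask only for the bytes we don't have yet
    have = os.path.getsize(local) if os.path.exists(local) else 0
    req = urllib.request.Request(url, headers={"Range": "bytes=%d-" % have})
    with urllib.request.urlopen(req) as resp, open(local, "ab") as f:
        # a real tool checks for a 206 (Partial Content) reply first;
        # a plain 200 means the server ignored the Range request
        f.write(resp.read())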
Hope that's of some help.
> Bruce
Paul C.
--
.. Apparently, Johnson & Johnson have released Y2K Jelly........
it allows you to use 4 digits where previously only 2 were used...