I'm not exactly sure what you're trying to do, but when I want to get a local copy of a website I do this:

nohup wget -m http://www.someUrL.org &
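The -m (mirror) flag turns on recursion and timestamping so wget walks the whole site, and nohup keeps it running after you log out. If you want the saved pages to be browsable offline, it's probably also worth adding -k to rewrite links to point at your local copies and -p to pull in images and stylesheets, along the lines of (the URL is just a placeholder):

nohup wget -m -k -p http://www.someUrL.org &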

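As for your question below about turning the bookmark file into a plain list of URLs, you shouldn't need to learn much sed. A Netscape bookmarks.html normally keeps one bookmark per line, each with an HREF="..." attribute, so something roughly like this (an untested sketch; the file names are just examples) should produce a list that wget can read back with -i:

sed -n 's/.*HREF="\([^"]*\)".*/\1/p' bookmarks.html > urls.txt
wget -k -p -i urls.txt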
Shawn

On 12/2/05, Robert Persson <[EMAIL PROTECTED]> wrote:
I have been trying all afternoon to make local copies of web pages from a
Netscape bookmark file. I have been wrestling with httrack (through
khttrack), pavuk and wget, but none of them work. httrack and pavuk claim
they can do the job, but they can't, or at least not in any way an
ordinary mortal could be expected to work out. They do things like pretending
to download hundreds of files without actually saving them to disk, crashing
suddenly and frequently, and popping up messages saying that I haven't
contributed enough code to their project to expect the thing to work
properly. I don't want to do anything hideously complicated. I just want to
make local copies of some bookmarked pages. What tools should I be using?

I would be happy to use a Windows tool in Wine if it worked. I would be happy
to reboot into Windows if I could get this job done.

One option would be to feed wget a list of URLs. The trouble is I don't know
how to turn an HTML bookmark file into a simple list of URLs. I imagine I
could do it in sed if I spent enough time to learn sed, but my afternoon is
gone now and I don't have the time.

Many thanks
Robert
--
Robert Persson

"Don't use nuclear weapons to troubleshoot faults."
(US Air Force Instruction 91-111, 1 Oct 1997)

--
gentoo-user@gentoo.org mailing list

--
Shawn Singh