Hello everyone,
Jan Prikryl wrote:
Quoting Ivan D Nestlerode ([EMAIL PROTECTED]):
[...] It compiled the stuff in src, and then tried
to make the documents in doc.
This is where the trouble started.
Dear Ivan,
this is an error in the 1.7 makefiles. It has been fixed in CVS - look at
This patch fixes -rpath handling under GNU/Linux and a problem when building
outside the source directory.
*** configure.in.orig	Mon May 28 17:02:47 2001
--- configure.in	Wed Oct 31 10:36:24 2001
***************
*** 205,211 ****
  AC_MSG_CHECKING(for runtime libraries flag)
  case
Title: PRB: download a website and flatten the pages to single directory
Hi,
I want to download a website (with all the linked files and graphics) into a single directory. I know I can do this using the -r and -nd options. But I find that the URLs in the HTML are not converted to reflect the new flattened layout.
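For reference, a combination along these lines should get close to what you describe (a sketch, not a guaranteed recipe: example.com/site/ is a placeholder URL, and some older wget releases reportedly do not rewrite links correctly when -nd is in effect, which may be exactly the problem you are seeing):

```shell
# Recursive download, flattened into the current directory:
#   -r   recurse into linked pages and inline objects
#   -nd  do not recreate the remote directory hierarchy locally
#   -k   rewrite links in the saved HTML to point at the local copies
wget -r -nd -k http://example.com/site/
```

If the rewriting still points at the old paths after this, it is worth testing with a current wget build before filing a bug, since the interaction of -k with -nd has changed between versions.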
I'm trying to do something similar to a mirror using the following:
wget -nc --cache=off -k -t999 -nH -q -r -l0 -np -Dhost http://host/D1 http://host/D2
Essentially, I want to recursively pull in everything on host below the directories D1 and D2, while converting all links to relative as we go.
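One way to restrict the recursion to just those two subtrees is wget's -I (--include-directories) option rather than -D, which filters by domain, not by path. A sketch, assuming host, D1 and D2 stand in for your real names:

```shell
# Mirror only the /D1 and /D2 subtrees on host:
#   -I /D1,/D2  limit recursion to these directory prefixes
#   -np         never ascend to the parent directory
#   -nH         do not create a host-named top-level directory
#   -k          convert links in saved HTML for local viewing
#   -r -l 0     recurse with unlimited depth
wget -r -l 0 -np -nH -k -t 999 -I /D1,/D2 http://host/D1/ http://host/D2/
```

Note that -np applies per starting URL, so giving both URLs on one command line keeps each subtree self-contained.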