On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
> I'd like to download the web page of bugs by maintainer,
> http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED], and all
> the bug reports linked to on that page, so that I can refer to them offline.
> But, wget doesn't work, I think because they are not static pages, they are
> created on the fly by scripts. Before I spend time figuring this out, does
> anyone know of an already existing means to do this?
I think it is to do with robots.txt. Try:

  wget -r -l 1 http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]

It nearly does what you want.

 - Craig

-- 
Craig Small VK2XLZ   GnuPG:1C1B D893 1418 2AF4 45EE 95CB C76C E5AC 12CA DFA5
Eye-Net Consulting   http://www.enc.com.au/   <[EMAIL PROTECTED]>
MIEEE                <[EMAIL PROTECTED]>
Debian developer     <[EMAIL PROTECTED]>
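A fuller invocation may get closer, since wget honours robots.txt by default and the BTS pages link to each other with relative CGI URLs. This is only a sketch: the maintainer address in the URL is a placeholder for the redacted one above, and ignoring robots.txt should be done considerately against a public server.

```shell
# Sketch of an offline mirror of a maintainer's bug page and the
# reports it links to. "someone@example.org" is a placeholder address.
#
#   -e robots=off   ignore robots.txt (wget obeys it by default)
#   -r -l 1         recurse, but only one level deep (the linked reports)
#   -k              rewrite links in the saved pages for offline browsing
#   -np             do not ascend to parent directories
#   -w 1            wait a second between requests, to be polite
wget -e robots=off -r -l 1 -k -np -w 1 \
     "http://bugs.debian.org/cgi-bin/pkgreport.cgi?maint=someone@example.org"
```

The `-k` option is what makes the dynamically generated pages usable offline, since the on-the-fly CGI links are rewritten to point at the local copies.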

