Hi,

On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
> I'd like to download the web page of bugs by maintainer,
> http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED], and all
> the bug reports linked to on that page, so that I can refer to them offline.
> But, wget doesn't work, I think because they are not static pages, they are
> created on the fly by scripts.  Before I spend time figuring this out, does
> anyone know of an already existing means to do this?

I don't know how to do it with wget, but with httrack (a program I develop), you could use 
something like:

httrack --assume standard 'http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]' '-*' '+bugs.debian.org/cgi-bin/bugreport.cgi?bug=*'

or, if you don't want to grab the mbox/original versions too:

httrack --assume standard 'http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]' '-*' '+bugs.debian.org/cgi-bin/bugreport.cgi?bug=*[0-9]'

(The URLs are quoted so the shell doesn't expand the ?, * and [] characters.)

(You can also "flatten" the directory structure using -N3, depending on your needs.)
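If you'd still rather stay with wget, one workaround for the dynamic pages is a two-step approach: save the maintainer page once, pull the bugreport.cgi links out of it, and feed that list back to wget. Here is an untested sketch; the bug numbers in the sample HTML are made up for illustration, and the real page would first be fetched with wget (commented out below since it needs network access):

```shell
# wget -O index.html 'http://bugs.debian.org/cgi-bin/[EMAIL PROTECTED]'
# Stand-in for the downloaded maintainer page (sample bug numbers only):
cat > index.html <<'EOF'
<a href="/cgi-bin/bugreport.cgi?bug=195058">#195058</a>
<a href="/cgi-bin/bugreport.cgi?bug=182912">#182912</a>
EOF

# Extract the bug-report links and turn them into absolute URLs:
grep -o 'bugreport\.cgi?bug=[0-9]*' index.html | sort -u \
  | sed 's|^|http://bugs.debian.org/cgi-bin/|' > urls.txt
cat urls.txt

# Then fetch each report for offline reading (needs network):
# wget -i urls.txt
```

This only goes one level deep, which matches what the original question asked for (the index page plus the reports it links to).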
