This is not supported by the HTTP protocol.
If you access the site via ftp://..., then you can use wildcards such as *.pdf
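For instance, wget itself will expand shell-style globs in FTP URLs, by fetching the directory listing and matching the pattern against it. A sketch, with a placeholder host and path (quote the URL so the local shell does not try to expand the * first):

```shell
# Fetch every PDF in one FTP directory. wget retrieves the
# remote directory listing and matches *.pdf against it.
# "ftp.example.com" and "/pub/reports/" are placeholders.
wget "ftp://ftp.example.com/pub/reports/*.pdf"
```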
-----Original Message-----
From: R Kimber [mailto:[EMAIL PROTECTED]]
Sent: Saturday, May 12, 2007 06:43
To: wget@sunsite.dk
Subject: Re: simple wget question
Sorry, I didn't see that Steven had already answered the question.
-----Original Message-----
From: Steven M. Schweda [mailto:[EMAIL PROTECTED]]
Sent: Saturday, May 12, 2007 10:05
To: WGET@sunsite.dk
Cc: [EMAIL PROTECTED]
Subject: Re: simple wget question
On Thu, 10 May 2007 16:04:41 -0500 (CDT)
Steven M. Schweda wrote:
From: R Kimber
Yes, there's a web page. I usually know what I want.
There's a difference between knowing what you want and being able
to describe what you want so that it makes sense to someone who does
not know what you want.
From: R Kimber
What I'm trying to download is what I might express as:
http://www.stirling.gov.uk/*.pdf
At last.
but I guess that's not possible.
In general, it's not. FTP servers often support wildcards. HTTP
servers do not. Generally, an HTTP server will not give you a list of
the files it serves, so a wildcard has nothing to match against.
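The usual workaround is to start from a page that links to the files and let wget crawl it, keeping only names that match an accept pattern. A sketch, assuming the PDFs are linked from pages on the same site (the depth is a guess to adjust):

```shell
# Recursive retrieval restricted to PDFs:
#   -r          follow links recursively
#   -l 2        descend at most two levels (adjust as needed)
#   -np         never ascend to the parent directory
#   -A '*.pdf'  keep only files matching the pattern; HTML pages
#               fetched along the way are deleted after parsing
wget -r -l 2 -np -A '*.pdf' http://www.stirling.gov.uk/
```

Note this only finds PDFs that are actually linked from the crawled pages; files nothing links to stay invisible to wget.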
On Sun, 6 May 2007 21:44:16 -0500 (CDT)
Steven M. Schweda wrote:
From: R Kimber
If I have a series of files such as
http://www.stirling.gov.uk/elections07abcd.pdf
http://www.stirling.gov.uk/elections07efg.pdf
http://www.stirling.gov.uk/elections07gfead.pdf
etc
is there a single wget command that would download them all, or would I
need to do each one separately?
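When the exact file names are known, one way to avoid running wget once per file is to list the URLs in a file and pass it with -i, which downloads them all in a single invocation (the file name urls.txt is arbitrary):

```shell
# Put one URL per line into a file...
cat > urls.txt <<'EOF'
http://www.stirling.gov.uk/elections07abcd.pdf
http://www.stirling.gov.uk/elections07efg.pdf
http://www.stirling.gov.uk/elections07gfead.pdf
EOF

# ...and let wget work through the list.
wget -i urls.txt
```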