In the docs I've seen on wget, I can use wildcards to download multiple files from FTP sites, so *.pdf would get me all the PDFs in a directory. It seems that this isn't possible with HTTP sites, though. For work I often have to download lots of PDFs when there's new info I need.
Hi Heiko!
Until now, I linked to your main page.
Would you mind if people short-cut this?
Linking to the directory is bad since people would download
Sorry, I meant linking directly to the latest zip.
However, I personally prefer to read what the provider (in this case you) has to say.
Hi Ron!
If I understand you correctly, you could probably use the
-A acclist
--accept acclist
accept = acclist
option.
So, probably (depending on your site), the syntax should be something like:
wget -r -A '*.pdf' URL
wget -r -A '*.pdf' -np URL
(quote the pattern so your shell doesn't expand the glob before wget sees it)
or, if you have to recurse through multiple HTML pages, set the recursion depth with -l; see the sketch below.
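A minimal sketch of that, with a hypothetical URL and depth (both are placeholders, not from the original post):

# recurse two levels deep, stay below the starting directory (-np),
# and keep only PDFs; the quotes stop the shell from expanding the glob
wget -r -l 2 -np -A '*.pdf' http://example.com/docs/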
problem: wget can't download with the command below
wget --bind-address=Your.External.Ip.Address -d -c -v -b -a logo.txt ftp://anonymous:[EMAIL PROTECTED]/incoming/Xenos/knigi/Programming/LinuxUnix/SHELL/PICTURES/LOGO.GIF
logo.txt contains:
--
DEBUG output created by Wget 1.9.1 on freebsd4.5.
--11:36:30--