On 27/01/14 13:59, Jānis wrote:

Hi!

I do not know whether this is a bug or not, but with a command such as:

wget -r -l1 -A *pdf* -nd "http://server/script.asp?id=xxx"

if the working directory contains .pdf files, their names are treated as if they were URLs, and wget fails to download the pdf expected from the server (fetching other junk instead).

If the pdf files are removed, the command works as expected.

Janis
Not a bug (and not wget's fault), albeit surprising for new users.
When you write *pdf* on the command line, your shell expands it to the names of any files in the current directory containing «pdf». You can see this behavior by running
 echo *pdf*

So when you run
 wget -r -l1 -A *pdf* …

wget really receives a command like
  wget -r -l1 -A paper.pdf resume.pdf -nd "http://server/script.asp?id=xxx"

So wget uses only the first expanded file name (paper.pdf) as its -A accept list, which most likely matches nothing on the server, and treats the remaining names as additional URLs to download from.
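To make the expansion concrete, here is a small sketch (the file names paper.pdf and resume.pdf and the scratch directory are invented for illustration):

```shell
tmp=$(mktemp -d)          # scratch directory so the demo is reproducible
cd "$tmp"
touch paper.pdf resume.pdf

unquoted=$(echo *pdf*)    # the shell expands the pattern before echo runs
quoted=$(echo '*pdf*')    # quoting keeps the pattern literal

echo "$unquoted"   # paper.pdf resume.pdf
echo "$quoted"     # *pdf*
```

The same substitution happens to wget's arguments: the command being run never sees *pdf* at all unless you quote it.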

The solution is to quote or escape the special characters; all of the following work:
 wget -r -l1 -A '*pdf*' -nd "http://server/script.asp?id=xxx"
 wget -r -l1 -A "*pdf*" -nd "http://server/script.asp?id=xxx"
 wget -r -l1 -A \*pdf\* -nd "http://server/script.asp?id=xxx"

(I recommend the first one, single quotes, since other characters like $ are still special inside double quotes.)
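A quick illustration of that last point (the variable name id is invented for the example): $ still triggers expansion inside double quotes, but not inside single quotes.

```shell
id=123

double=$(echo "id=$id")    # double quotes: $id is expanded by the shell
single=$(echo 'id=$id')    # single quotes: $id stays literal

echo "$double"   # id=123
echo "$single"   # id=$id
```

This matters for URLs like the one above, which often contain query strings with characters the shell treats specially.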

Best regards

