Hi,

  I have found a problem with wget --spider.

  It works fine for plain files over HTTP or FTP, but under either of
  these two conditions wget starts downloading the file anyway:

  1. linked files (I'm not 100% sure about this one)
  2. download scripts (e.g. http://www.nothing.com/download.php?file=12345)

  Here is one link that starts downloading even with the --spider
  option:

http://club.aopen.com.tw/downloads/Download.asp?RecNo=3587&Section=5&Product=Motherboards&Model=AX59%20Pro&Type=Manual&DownSize=8388
  (a motherboard BIOS file)

  So this actually starts a download:

  $ wget --spider 
'http://club.aopen.com.tw/downloads/Download.asp?RecNo=3587&Section=5&Product=Motherboards&Model=AX59%20Pro&Type=Manual&DownSize=8388'

  If there is no solution to this problem with wget, can anyone
  recommend another "link verifier"? What I want to do is check the
  existence of some 200k links stored in a database. So far I have been
  calling "/usr/bin/wget --spider \'" . $link . "\' 2>&1 | tail -2 | head -1"
  from a simple PHP script.
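  One alternative I have been considering: issue an HTTP HEAD request
  instead of spidering with wget, so the server is never asked for the
  body at all. A minimal sketch in Python (the helper names check_link
  and link_exists are just illustrative, not part of my script):

```python
# Check a link with a HEAD request so the file body is never downloaded.
# This is a sketch, not a drop-in replacement for the wget pipeline.
import urllib.request
import urllib.error

def check_link(url, timeout=10):
    """Return the HTTP status code for url using a HEAD request,
    or None if the request fails entirely (DNS error, timeout, ...)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code          # e.g. 404 for a dead link
    except (urllib.error.URLError, OSError):
        return None            # unreachable host, refused connection, etc.

def link_exists(url):
    """Treat a link as alive if the server answers below 400
    (redirects are followed automatically by urlopen)."""
    status = check_link(url)
    return status is not None and status < 400
```

  Note that some download scripts mishandle HEAD requests, so a fallback
  GET that reads nothing from the body might still be needed for hosts
  like the one above.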

  Thanks for any help!


----- 
Best Regards,

Andreas Belitz
CIO

TCTK - Database Solutions
Nordanlage 3
35390 Giessen
Germany

Phone    : +49 (0) 641 3019 446
Fax      : +49 (0) 641 3019 535
Mobile   : +49 (0) 176 700 16161

E-mail   : mailto:[EMAIL PROTECTED]
Internet : http://www.tctk.de
