Thank you.
I have noticed that wget makes errors while downloading.
I didn't keep statistics on the errors (which servers, which files, when).
Since I had no checksums, I had to download each file twice
and compare the two copies.
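The compare step can be sketched as a shell snippet. The wget lines are commented out and the filenames are stand-ins, since the real downloads need the network; this is just the shape of the workflow, not a literal transcript:

```shell
# Sketch: verify a download with no published checksum by fetching it
# twice and comparing byte-for-byte.
# wget -O copy1.iso "$url"   # first download (network step, commented out)
# wget -O copy2.iso "$url"   # second download
printf 'data' > copy1.iso    # stand-ins for the two downloaded copies
printf 'data' > copy2.iso
if cmp -s copy1.iso copy2.iso; then
  echo "copies match"
else
  echo "copies differ"       # at least one copy is corrupted
fi
```

Of course, if both downloads break in the same place, `cmp` will not catch it, which is why a server-side checksum is still better.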
Does the newer wget use a different algorithm for continuing downloads (the -c option)?
I downloaded a lot of files from sourceforge.net, so I used wget.
Wget can fetch the direct download link from sourceforge.net,
while other downloaders end up with an HTML page containing the link.

I also got errors when downloading from other places.
For example:
http://cdimage.debian.org/cdimage/
ftp://ftp.yandex.ru/
http://files.uk.pfsense.org/mirror/downloads/
ftp://ftp.heanet.ie/pub/arklinux/
ftp://ftp.cc.uoc.gr/mirrors/
ftp://ftp.linux.kiev.ua/
http://ftp.cc.uoc.gr/mirrors/
ftp://ftp.uni-ulm.de/mirrors/
http://distro.ibiblio.org/pub/

My guess is that wget has a flawed algorithm for continuing downloads.
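For context, a hedged sketch of what -c does conceptually (the filename is hypothetical): the resume offset is simply the size of the local partial file, sent to the server as an HTTP Range request. The bytes already on disk are not re-checked, so a corrupted partial download stays corrupted after the resume:

```shell
# Conceptual sketch of wget -c: resume point = current local file size.
printf '12345' > partial.bin                 # pretend 5 bytes arrived so far
offset=$(( $(wc -c < partial.bin) ))         # arithmetic strips whitespace
echo "would send: Range: bytes=${offset}-"   # prints: would send: Range: bytes=5-
```

So if a transfer error corrupted bytes that are already on disk, -c cannot notice or repair them; only a full re-download or a checksum comparison can.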
