On Mon, 2006-03-27 at 22:33 +0200, Simon Kellett wrote:
> Iain Buchanan <[EMAIL PROTECTED]> writes:
> 
> > So the question is: why does wget try to download it from the
> > beginning?  wget shouldn't do this even if the server can't resume;
> > wget should just die.
> 
> man wget :-)

ok:

  Beginning with Wget 1.7, if you use -c on a non-empty file, and it
  turns out that the server does not support continued downloading,
  Wget will refuse to start the download from scratch, which would
  effectively ruin existing contents.  If you really want the
  download to start from scratch, remove the file.

and yet the file did start from scratch...

> "Note that -c only works with FTP servers and with HTTP servers that
> support the "Range" header."
> 
> I am guessing his server does not!

actually, it does, as on another file:

$ wget -c "http://68.106.74.139/mp3/Mike_Baas/4/mp3/Mike_Baas_-_4_-_03_-_Wave.mp3"
--09:56:35--  http://68.106.74.139/mp3/Mike_Baas/4/mp3/Mike_Baas_-_4_-_03_-_Wave.mp3
           => `Mike_Baas_-_4_-_03_-_Wave.mp3'
Connecting to 68.106.74.139:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 13,985,672 (13M), 234,426 (229K) remaining [audio/mpeg]

100%[++++++++++++++++++++++++++++++++++++>] 13,985,672     8.97K/s    ETA 00:00

09:56:57 (11.18 KB/s) - `Mike_Baas_-_4_-_03_-_Wave.mp3' saved [13985672/13985672]

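For what it's worth, a quick way to check whether a server honours Range
requests (assuming you have curl handy) is to ask for a single byte and look
at the status code: 206 means ranges are supported, a plain 200 means the
header was ignored.

$ curl -s -o /dev/null -w "%{http_code}\n" \
       -H "Range: bytes=0-0" \
       "http://68.106.74.139/mp3/Mike_Baas/4/mp3/Mike_Baas_-_4_-_03_-_Wave.mp3"

The 206 Partial Content in the wget output above came from exactly that kind
of ranged request, so this server clearly does support it.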

But even if it didn't, that still wouldn't explain why the existing file was
deleted and then downloaded again from scratch.  -c shouldn't do that.

thanks,
-- 
Iain Buchanan <iaindb at netspace dot net dot au>

If you want to know what god thinks of money, just look at the people he gave
it to.
                -- Dorothy Parker
