-c option

2008-10-15 Thread Thomas Wolff
Hi, I've just come across the following remark in the wget manual page (1.10.2), about the -c option:
> Wget has no way of verifying that the local file is really a valid prefix of the remote file.
This is not quite true. It could at least check the remote and local file time stamps ...
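A rough sketch of the kind of timestamp comparison suggested here, assuming an HTTP server that sends Last-Modified; the URL and filename are only illustrative:

    # fetch just the headers and look at the server's Last-Modified
    curl -sI http://example.com/big-file.iso | grep -i '^last-modified'
    # modification time of the partial local copy (GNU coreutils date)
    date -r big-file.iso
    # wget's own timestamping mode (-N) performs a comparable check
    wget -N http://example.com/big-file.iso

If the remote copy turns out to be newer than the local partial file, resuming with -c would most likely splice two different files together, which is the failure mode the manual warns about.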

wget-1.10.2 -c option causes duplicate entries in .listing

2005-10-19 Thread Karsten Hopp
Hello, current wget versions seem to have broken -c:
wget --cut-dirs=1 -m -nH ftp://ftp.gnu.org/gnu/wget/
wget --cut-dirs=1 -m -nH -c ftp://ftp.gnu.org/gnu/wget/
The second .listing contains duplicate entries and wget tries to download each file twice. It is smart enough to detect that the file ...
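One way to make the reported duplication visible; the commands are the ones from the report, and since -m keeps FTP directory listings, the .listing file can be inspected after the second run:

    wget --cut-dirs=1 -m -nH ftp://ftp.gnu.org/gnu/wget/
    wget --cut-dirs=1 -m -nH -c ftp://ftp.gnu.org/gnu/wget/
    # any output here means the directory listing contains duplicate entries
    sort .listing | uniq -d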

Re: Incorrect numbers with -c option

2001-11-11 Thread Hack Kampbjørn
Lukasz Bolikowski wrote:
> Hello! I think the -c option in wget results in misleading output. I have been downloading
> ftp://ftp.kernel.org/pub/dist/superrescue/v2/superrescue-2.0.0a.iso.gz
> which is 514596847 bytes long. I aborted the downloading ...

Incorrect numbers with -c option

2001-11-10 Thread Lukasz Bolikowski
Hello! I think the -c option in wget results in misleading output. I have been downloading
ftp://ftp.kernel.org/pub/dist/superrescue/v2/superrescue-2.0.0a.iso.gz
which is 514596847 bytes long. I aborted the download after 246345728 bytes and then ran:
wget -c --passive-ftp -o super-log ...
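The sequence described here, spelled out; the commands and URL are taken from the message, and the interrupt point is arbitrary:

    wget --passive-ftp ftp://ftp.kernel.org/pub/dist/superrescue/v2/superrescue-2.0.0a.iso.gz
    # interrupt with Ctrl-C partway through, then resume into a log file:
    wget -c --passive-ftp -o super-log \
        ftp://ftp.kernel.org/pub/dist/superrescue/v2/superrescue-2.0.0a.iso.gz
    # the progress figures in super-log now refer to the resumed transfer,
    # which appears to be the output the poster describes as misleading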

Re: -c option, again

2001-05-26 Thread Hrvoje Niksic
Henrik van Ginhoven <[EMAIL PROTECTED]> writes:
> On Sat, May 26, 2001 at 01:56:41PM +0200, Hrvoje Niksic wrote:
> > Sp00l <[EMAIL PROTECTED]> (d'uh, damn mail) writes:
> > > Aah.. Yes, you are right as always. However, once wget determines the (index.html) file to be completely downloaded, ...

Re: -c option, again

2001-05-26 Thread Henrik van Ginhoven
On Sat, May 26, 2001 at 01:56:41PM +0200, Hrvoje Niksic wrote:
> Sp00l <[EMAIL PROTECTED]> (d'uh, damn mail) writes:
> > Aah.. Yes, you are right as always. However, once wget determines the (index.html) file to be completely downloaded, shouldn't it start going through the file for links ...

Re: -c option, again

2001-05-26 Thread Sp00l
> > The server does not support continued downloads, which conflicts with `-c'.
> But that's true, isn't it? The server didn't respond with a `Range' header, hence "continued download" doesn't work. Since you specified that the download should be continued, Wget refuses to truncate your file ...
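A quick manual check of the Range behaviour the quoted explanation hinges on; the hostname is the one from the thread, the byte offset is arbitrary, and curl is used only for illustration:

    curl -s -o /dev/null -D - -H 'Range: bytes=1000-' http://ftp.sunet.se/pub/gnu/wget/
    # a "206 Partial Content" status (or a Content-Range header) means the
    # server honours byte ranges; a plain "200 OK" means it ignored the Range
    # request, which is the case Wget reports as conflicting with -c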

Re: -c option, again

2001-05-26 Thread Hrvoje Niksic
Henrik van Ginhoven <[EMAIL PROTECTED]> writes:
> $ src/wget -d -c -r -np http://ftp.sunet.se/pub/gnu/wget/
I assume that ftp.sunet.se/pub/gnu/wget/index.html already exists at this point.
> /.../
> ---request begin---
> GET /pub/gnu/wget/ HTTP/1.0
> User-Agent: Wget/1.7-pre1
> Host: ftp.sunet.se ...
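For reference, this kind of trace can be reproduced with the command quoted above; wget writes its debug output to stderr, so it can be captured like this:

    src/wget -d -c -r -np http://ftp.sunet.se/pub/gnu/wget/ 2>&1 | tee wget-debug.log
    # the headers after "---request begin---" show whether a Range header was
    # sent, and the following response headers show how the server answered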

Re: -c option, again

2001-05-26 Thread Henrik van Ginhoven
On Sat, May 26, 2001 at 01:07:46PM +0200, Hrvoje Niksic wrote:
>
> But Wget *thinks* that the server doesn't support it. Sending a debug log of (the relevant part of) the Wget run would probably help in determining what went wrong.
Oops, forgot that (actually, I was afraid I had missed some n...

Re: -c option, again

2001-05-26 Thread Hrvoje Niksic
Henrik van Ginhoven <[EMAIL PROTECTED]> writes:
> Neither English nor networking are my native languages, but with ``continued downloads'' I take it wget means ``continue on a file where you left off'', which in this case would be untrue, because sunet.se does support it.
But Wget *thinks* ...

-c option, again

2001-05-26 Thread Henrik van Ginhoven
Just tried out the latest and greatest CVS version of wget, and I'm a bit confused about the way the --continue option works, or doesn't work, now. After doing a mirror with -np -r on http://ftp.sunet.se/pub/gnu/wget/, I run the same command again, this time with -c, and I get: ... @verdandi
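The two runs being described, assuming a fresh working directory; the URL and options are the ones from the report:

    wget -np -r http://ftp.sunet.se/pub/gnu/wget/
    # second pass over the same tree, this time asking for continuation:
    wget -c -np -r http://ftp.sunet.se/pub/gnu/wget/
    # the already-complete index.html is what triggers the
    # "conflicts with `-c'" message discussed in the replies above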