On Feb 19, 2008 11:25 PM, Steven M. Schweda <[EMAIL PROTECTED]> wrote:
> From: Charles
>
> > In wget 1.10, [...]
>
>    Have you tried this in something like a current release (1.11, or
> even 1.10.2)?

My wget version is 1.10.2. It isn't really a problem for me; I just
want to know whether this is a known problem and, if it is not,
whether it could be considered a bug or an enhancement request.

>       http://ftp.gnu.org/gnu/wget/
>
> > [...] but for some reason (buggy server), [...]
>
>    How should wget know that it's getting a bogus error from your "buggy
> server", and not getting a valid error from a working server?

The problem is that the server does not return an error. A normal web
server like Apache returns a 416 (Requested Range Not Satisfiable)
error if we request the range 10000- for a file whose size is 10000
bytes, but this web server returns HTTP 200 OK, and wget happily
redownloads the whole file. I think in this case, if the -c switch is
given on the command line, the web server returns HTTP 200 OK with a
Content-Length header of n bytes, and a file of n bytes already
exists on disk, then wget should not redownload the file. I know the
problem is with the web server and not on wget's side, but sometimes
we are dealing with a buggy web server that we have no control over.
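
For what it's worth, the check I have in mind would look roughly like
the sketch below. This is only an illustration, not actual wget code;
the function and parameter names are made up, and it assumes the
response status and the Content-Length header have already been
parsed, and that the local file size is known.

    /* Sketch only -- not taken from the wget source. */
    #include <stdbool.h>
    #include <sys/types.h>

    static bool
    should_skip_redownload (bool continue_flag,    /* -c was given */
                            int http_status,       /* e.g. 200 */
                            off_t content_length,  /* Content-Length */
                            off_t local_size)      /* file on disk */
    {
      /* The server ignored our Range request and answered 200 OK,
       * but the advertised length matches what we already have:
       * treat the file as complete instead of fetching it again. */
      return continue_flag
             && http_status == 200
             && content_length > 0
             && content_length == local_size;
    }

If that condition held, wget could print the usual "file already
fully retrieved; nothing to do" message instead of starting the
transfer over.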
