On Mon, Dec 27, 2010 at 02:02:08PM +1030, Karl Goetz wrote:
> I've just noticed dget doesn't resume downloads. On packages like linux or
> openoffice this can be quite a problem if you drop out halfway through.
> It would be great if it could download using --continue for wget, or
> whatever the curl equivalent is.
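For what it's worth, curl's equivalent of wget's --continue is -C with an
offset of "-", which auto-detects the resume point from the size of the
existing local file. A sketch (the URL is just a placeholder):

```shell
# Resume a partial download with wget (placeholder URL):
wget --continue https://example.org/pool/main/f/foo/foo_1.0.orig.tar.gz

# curl equivalent: -C - resumes from the size of the existing local file
curl -C - -O https://example.org/pool/main/f/foo/foo_1.0.orig.tar.gz
```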

This does sound useful.  As described below, though, dget removes/backs
up an existing file if its hash sum isn't what it expected (which would
be the case for a partial download).

When given a URL, the corresponding on-disk file is always
removed/backed up and re-downloaded.  If that file is a .dsc/.changes,
the hash sums of the files listed within are compared against those of
the corresponding on-disk files; any files that don't match are
removed/backed up and re-downloaded.
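The check described above can be sketched in shell, assuming a .dsc with
a Checksums-Sha256 section (the file names here are made up; real .dsc
files list "<hash> <size> <name>" on each continuation line):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Fabricate an on-disk file and a minimal .dsc that references it:
printf 'hello\n' > foo_1.0.orig.tar.gz
cat > foo_1.0.dsc <<EOF
Checksums-Sha256:
 $(sha256sum foo_1.0.orig.tar.gz | cut -d' ' -f1) 6 foo_1.0.orig.tar.gz
EOF

# Pull "<hash>  <name>" pairs out of the Checksums-Sha256 section and
# let sha256sum -c compare them against the on-disk files:
awk '/^Checksums-Sha256:/{f=1;next} /^[^ ]/{f=0} f{print $1"  "$3}' \
    foo_1.0.dsc | sha256sum -c -
```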

Maybe it would be good to consolidate the "mismatched hash" behavior
into an option where one can choose "remove & re-download", "backup &
re-download", or "continue download"?

My concern with that is "continue download" doesn't ensure the existing
file really contains the correct data.  It just continues downloading
from an offset based on the existing file's size.  If the existing
contents are wrong somehow, then the entire file would need to be
re-downloaded anyway.
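That is, a "continue" mode would still need a verify step afterwards,
falling back to a full re-download on mismatch. A sketch of that logic
(the function name and file names are hypothetical; in dget the expected
hash would come from the .dsc/.changes and the download itself would use
wget --continue or curl -C -):

```shell
# Decide whether a resumed download checks out or must be refetched:
verify_or_flag_redownload() {
    file=$1
    expected=$2
    actual=$(sha256sum "$file" | cut -d' ' -f1)
    if [ "$actual" = "$expected" ]; then
        echo "ok"            # resumed file matches the expected hash
    else
        echo "redownload"    # contents wrong; must refetch from byte 0
    fi
}

# Illustration with a local file standing in for a finished download:
tmpf=$(mktemp)
printf 'data\n' > "$tmpf"
good=$(sha256sum "$tmpf" | cut -d' ' -f1)
verify_or_flag_redownload "$tmpf" "$good"          # prints "ok"
verify_or_flag_redownload "$tmpf" 0000deadbeef     # prints "redownload"
```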

-- 
James
GPG Key: 1024D/61326D40 2003-09-02 James Vega <[email protected]>
