On Thu, 04 Aug 2016 12:08:50 +0200 Tim Ruehsen <[email protected]> wrote:
> On Wednesday, August 3, 2016 2:40:14 PM CEST Matthew White wrote:
> > On Wed, 03 Aug 2016 14:05:06 +0200
> > Tim Ruehsen <[email protected]> wrote:
> > > On Tuesday, August 2, 2016 11:27:28 AM CEST Matthew White wrote:
> > > > On Mon, 01 Aug 2016 16:32:30 +0200
> > > > Tim Ruehsen <[email protected]> wrote:
> > > > > On Saturday, July 30, 2016 9:41:48 PM CEST Matthew White wrote:
> > > > > > Hello!
> > > > > >
> > > > > > I think that sometimes it could help to keep downloaded Metalink's
> > > > > > files which have a bad hash.
> > > > > >
> > > > > > The default wget behaviour is to delete such files.
> > > > > >
> > > > > > This patch provides a way to keep files which have a bad hash
> > > > > > through the option --keep-badhash. It appends the suffix .badhash
> > > > > > to the file name, except without overwriting existing files. In
> > > > > > the latter case, a unique suffix is appended after .badhash.
> > > > > >
> > > > > > I made this patch working on the following branch:
> > > > > > master (latest 20cac2c5ab3d63aacfba35fb10878a2d490e2377)
> > > > > > git://git.savannah.gnu.org/wget.git
> > > > > >
> > > > > > What do you think?
> > > > >
> > > > > Hi Matthew,
> > > > >
> > > > > good work !
> > > > >
> > > > > While your FSF assignment is underway (PM), we can continue polishing.
> > > > >
> > > > > Didn't test your code yet, but from what I see, there are just these
> > > > > peanuts:
> > > > >
> > > > > 1.
> > > > > + bhash = malloc (strlen (name) + strlen (".badhash") + 1);
> > > > > + strcat (strcpy (bhash, name), ".badhash");
> > > > >
> > > > > Please consider using concat_strings() from util.h.
> > > > > And please leave an empty line between var declaration and code -
> > > > > just for readability.
> > > > >
> > > > > 2.
> > > > > Please add 'link' and 'unlink' to bootstrap.conf. Gnulib will
> > > > > emulate these on platforms where they are not available (or have a
> > > > > slightly different behavior). I guess we simply forgot 'unlink'
> > > > > when we switched to gnulib.
> > > > >
> > > > > 3.
> > > > > The (GNU) commit messages should ideally look like:
> > > > >
> > > > > one line of short description
> > > > > <empty line>
> > > > > file changes
> > > > > <empty line>
> > > > > long description
> > > > >
> > > > > For example:
> > > > > Add new option --keep-badhash
> > > > >
> > > > > * src/init.c: Add keepbadhash
> > > > > * src/main.c: Add keep-badhash
> > > > > * src/options.h: Add keep_badhash
> > > > > * doc/wget.texi: Add docs for --keep-badhash
> > > > > * src/metalink.h: Add prototypes badhash_suffix(), badhash_or_remove()
> > > > > * src/metalink.c: New functions badhash_suffix(), badhash_or_remove().
> > > > > (retrieve_from_metalink): Call append .badhash()
> > > > >
> > > > > With --keep-badhash, append .badhash to Metalink's files with
> > > > > checksum mismatch, except without overwriting existing files.
> > > > >
> > > > > Without --keep-badhash, remove downloaded files with checksum
> > > > > mismatch (this conforms to the old behaviour).
> > > > >
> > > > > [This also applies to your other patches]
> > > > >
> > > > > 4.
> > > > > Please not only write 'keepbadhash', but also what you did (e.g.
> > > > > remove, rename, add, ...), see my example above.
> > > > >
> > > > > Those ChangeLog entries should allow finding changes / faults /
> > > > > regressions etc. even when a versioning system is not at hand, e.g.
> > > > > when unpacking the sources from a tarball. (Not debatable that
> > > > > consulting commit messages directly is much more powerful.)
> > > > >
> > > > > 5.
> > > > > You write "except without overwriting existing files", maybe you
> > > > > should mention appending a counter instead of overwriting existing
> > > > > files !?
> > > > >
> > > > > Regards, Tim
> > > >
> > > > Thanks Tim!
> > > >
> > > > I really needed your guidance. I sent the modified patches to
> > > > [email protected].
> > > >
> > > > I believe there are more things to fix.
> > > >
> > > > Consider the following after applying the attached patch:
> > > > * src/metalink.c (retrieve_from_metalink): line 124: If the download
> > > > is interrupted (output_stream isn't NULL), and there are more urls
> > > > for the same file (we are still in the download loop), switching to
> > > > the next url should resume the download (instead of starting it over).
> > > >
> > > > In this patch I added a fix to rename/remove the fully downloaded
> > > > file when output_stream is NULL but there was an error (probably
> > > > checksum) and we are still in the download loop (more urls for the
> > > > same file). But as said before, switching to the next url should
> > > > continue the download:
> > > > * src/metalink.c (retrieve_from_metalink): line 131: Rename/remove
> > > > fully downloaded file on error
> > > >
> > > > I still have to investigate the problem...
> > >
> > > So, I wait with 0001-... !?
> > >
> > > 0002-... has been pushed (extended with 'symlink').
> > >
> > > Thanks for your contribution.
> > >
> > > Tim
> >
> > Ok for 0002. Thanks.
> >
> > There's no problem applying the 0001.
>
> Applied and pushed !
>
> > The "if the stream got interrupted, then restart the download with the
> > next url" (output_stream isn't NULL) was already there before the patch
> > 0001.
> >
> > [debug is required to know when output_stream isn't NULL]
> >
> > commit 7fad76db4cdb7a0fe7e5aa0dd88f5faaf8f4cdc8
> > * src/metalink.c (retrieve_from_metalink): line 124: 'if (output_stream)'
> > remove file and restart download with next url
> >
> > With the 0001, if --keep-badhash is used the file is renamed instead of
> > removed, even when "output_stream is NULL".
> >
> > I have to look through the "stream interrupted" situation.
> >
> > I'm guessing that the download is not resumed.
> >
> > What do you suggest?
>
> We need a document where we define wanted behavior, create tests and amend
> the code.
>
> Regards, Tim

Ok, I just documented a test. See the attached tarball.

Also there is a patch attached (there's a copy in the tarball too), providing
the suggested bugfix.

Let me know,
Matthew

--
Matthew White <[email protected]>
From 80c1b02a08c20d946f0dfd8848c27250edfde34a Mon Sep 17 00:00:00 2001
From: Matthew White <[email protected]>
Date: Thu, 4 Aug 2016 11:35:42 +0200
Subject: [PATCH] Bugfix: Continue download when retrying with the next metalink:url

* src/metalink.c (retrieve_from_metalink): If output_stream isn't NULL,
continue download with the next mres->url

Bug:
* src/metalink.c (retrieve_from_metalink): If output_stream isn't NULL,
restart download with the next mres->url

Keep the download progress while iterating metalink:url. In such
scenario, closing output_stream means to lose the download progress.
---
 src/metalink.c | 50 ++++++++++++++++++++++----------------------------
 1 file changed, 22 insertions(+), 28 deletions(-)

diff --git a/src/metalink.c b/src/metalink.c
index 1799e3a..2e296f9 100644
--- a/src/metalink.c
+++ b/src/metalink.c
@@ -119,24 +119,6 @@ retrieve_from_metalink (const metalink_t* metalink)
           retr_err = METALINK_RETR_ERROR;
 
-          /* If output_stream is not NULL, then we have failed on
-             previous resource and are retrying.  Thus, rename/remove
-             the file.  */
-          if (output_stream)
-            {
-              fclose (output_stream);
-              output_stream = NULL;
-              badhash_or_remove (filename);
-              xfree (filename);
-            }
-          else if (filename)
-            {
-              /* Rename/remove the file downloaded previously before
-                 downloading it again.  */
-              badhash_or_remove (filename);
-              xfree (filename);
-            }
-
           /* Parse our resource URL.  */
           iri = iri_new ();
           set_uri_encoding (iri, opt.locale, true);
@@ -156,17 +138,29 @@ retrieve_from_metalink (const metalink_t* metalink)
               /* Avoid recursive Metalink from HTTP headers.  */
               bool _metalink_http = opt.metalink_over_http;
 
-              /* Assure proper local file name regardless of the URL
-                 of particular Metalink resource.
-                 To do that we create the local file here and put
-                 it as output_stream.  We restore the original configuration
-                 after we are finished with the file.  */
-              if (opt.always_rest)
-                /* continue previous download */
-                output_stream = fopen (mfile->name, "ab");
+              /* If output_stream is not NULL, then we have failed on
+                 previous resource and are retrying.  Thus, continue
+                 with the next resource.  Do not close output_stream
+                 while iterating over the resources, or the download
+                 progress will be lost.  */
+              if (output_stream)
+                {
+                  DEBUGP (("Previous resource failed, continue with next resource.\n"));
+                }
               else
-                /* create a file with an unique name */
-                output_stream = unique_create (mfile->name, true, &filename);
+                {
+                  /* Assure proper local file name regardless of the URL
+                     of particular Metalink resource.
+                     To do that we create the local file here and put
+                     it as output_stream.  We restore the original configuration
+                     after we are finished with the file.  */
+                  if (opt.always_rest)
+                    /* continue previous download */
+                    output_stream = fopen (mfile->name, "ab");
+                  else
+                    /* create a file with an unique name */
+                    output_stream = unique_create (mfile->name, true, &filename);
+                }
               output_stream_regular = true;
-- 
2.7.3
Attachment: Bugfix-Continue-download-when-retrying-with-the-next.d.tar.xz (application/xz)
