This is probably technically not a "bug". However, it does leave partial files.
I know that 1.9.1 is rather old; if this subject has come up 9000 times already, please ignore. --help tells me to write here. I am not interested in your mailing list, so if you have something to say, please cc me.

The ease of -nc -r may tempt a user to abort a wget run (to kick a frozen TCP connection, or to get some sleep) and resume it later with the same command after removing the top-level target. The machinery then works great, skipping over everything we already have ... including the partial file that was ^C'ed.

Ideas to improve:

-1. Make -nc -r ignore the existence of the top-level target, or at least add an option to do so.

0. At the very least, document this behavior somewhere.

1. The vanilla Unix method: write to a temporary file, then rename it once wget has all of it. Problem: this may leave bodies around, unless you create the temp file in $TMP and copy it into place (yuck). It also means you have to re-download those first 200 megs of the file.

2. Same as 1, but name the temp file something like filename.WGETPART and resume it gracefully. Much better, but now we have in-band filenames.

Regards
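Idea 1 above (write to a temp file, then rename) can be sketched roughly as follows. This is not wget's code; the function name is made up, and it assumes the temp file is created in the destination's own directory so the final rename is atomic on POSIX, which sidesteps the $TMP-then-copy problem mentioned above:

```python
import os
import tempfile
import urllib.request

def fetch_atomically(url, dest):
    """Download url to dest without ever leaving a partial dest behind.

    The temp file lives in dest's directory, so os.replace() is an
    atomic rename (no cross-filesystem copy from $TMP needed).  A ^C
    mid-download leaves only a .tmp file, never a truncated dest that
    a later -nc-style existence check would wrongly skip.
    """
    dest_dir = os.path.dirname(os.path.abspath(dest)) or "."
    fd, tmp_path = tempfile.mkstemp(dir=dest_dir, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as tmp, urllib.request.urlopen(url) as resp:
            while True:
                chunk = resp.read(64 * 1024)
                if not chunk:
                    break
                tmp.write(chunk)
        # dest appears fully written, or not at all
        os.replace(tmp_path, dest)
    except BaseException:
        os.unlink(tmp_path)  # clean up the partial temp file
        raise
```

The re-download cost noted in idea 1 still applies here; idea 2 would keep the temp file under a predictable name (e.g. filename.WGETPART) and continue it with an HTTP Range request instead of unlinking it.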
