If anyone creates a patch for rollback, I'll be your first tester ;)
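
A rollback done right would have to re-check an overlap window rather than
blindly appending at the current file size.  Below is a minimal sketch of that
idea in Python, assuming a plain FTP server that supports REST.  The names
(OVERLAP, fetch_range, resume_with_rollback) are mine, and this is not how
lftp or wget actually implement resume -- just an illustration of what a
rollback patch might do.

#!/usr/bin/env python3
# Sketch of resume-with-rollback: before appending, re-fetch the last
# OVERLAP bytes from the server and compare them with the tail of the
# local file.  If they differ, drop the suspect tail and resume from the
# last offset that could be verified instead.
import os
from ftplib import FTP

OVERLAP = 64 * 1024   # arbitrary size of the tail window to re-verify

def fetch_range(ftp, remote_path, offset, length):
    """Fetch `length` bytes of `remote_path` starting at `offset`."""
    ftp.voidcmd("TYPE I")                       # binary mode
    conn = ftp.transfercmd(f"RETR {remote_path}", rest=offset)
    data = b""
    while len(data) < length:
        chunk = conn.recv(min(8192, length - len(data)))
        if not chunk:
            break
        data += chunk
    conn.close()                                # cut the transfer short
    try:
        ftp.voidresp()                          # drain the 226/426 reply
    except Exception:
        pass
    return data

def resume_with_rollback(ftp, remote_path, local_path):
    size = os.path.getsize(local_path)
    overlap = min(OVERLAP, size)
    check_from = size - overlap

    with open(local_path, "rb") as f:
        f.seek(check_from)
        local_tail = f.read(overlap)

    if fetch_range(ftp, remote_path, check_from, overlap) == local_tail:
        resume_at = size            # tail verified, append as usual
    else:
        resume_at = check_from      # throw away the unverified tail
        with open(local_path, "r+b") as f:
            f.truncate(resume_at)

    with open(local_path, "ab") as f:
        ftp.retrbinary(f"RETR {remote_path}", f.write, rest=resume_at)

A real patch would presumably verify again after truncating (or fall back to
a checksum of the whole file), since only the tail window is checked here.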

Justin Piszcz wrote:

> I was curious if lftp or wget will ever support a rollback feature and
> somehow verify that the bytes are correct where the file has been
> resumed.
>
> Why?  This is stated below:
>
> LFTP VS WGET EXPERIMENT:
>
> PROBLEM: With lftp, many of my downloads get corrupted.
>          This is because the connection between my satellite link and my
>          ISP gets severed, therefore causing the FTP transfer to resume.
>          Regular connection breaks are normal on a satellite connection;
>          they may only last 1ms or less, but they cause the FTP transfer
>          to resume, thus causing corruption.
>
> QUESTION: However, does lftp corrupt files more often than, say, wget?
>
> TEST LFTP: With each 700MB pull with lftp, about 6% of the files are bad.
>            This means 3 to 4 re-downloads.  I've downloaded over 1
>            terabyte of data and the average seems to be about 6%; the
>            more resumes, the greater the percentage of bad files.
>
> TEST WGET: First 700MB file transfer: 0.00% corruption.
>            Second 700MB file transfer: 0.00% corruption.
>            Third 700MB file transfer: 5.00% corruption.
>            After several more 700MB pulls, it is about the same.
>
> POINT: It is a single-character error.  I've done multiple splits and
>        diffs, and found that only 1 character differs from the original.
>        This is, however, catastrophic for binary files.
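
Incidentally, instead of splitting and diffing, the offset of that one bad
byte can be found with a straight byte-by-byte comparison of the good and bad
copies.  A small sketch (file names are placeholders, this is just the
comparison step, not anything lftp or wget provide):

#!/usr/bin/env python3
# Print the offset and values of every byte that differs between two files.
import sys

def report_differences(path_a, path_b, blocksize=1 << 20):
    offset = 0
    with open(path_a, "rb") as a, open(path_b, "rb") as b:
        while True:
            block_a = a.read(blocksize)
            block_b = b.read(blocksize)
            if not block_a and not block_b:
                return
            for i, (x, y) in enumerate(zip(block_a, block_b)):
                if x != y:
                    print(f"offset {offset + i}: {x:#04x} != {y:#04x}")
            if len(block_a) != len(block_b):
                print("files differ in length")
                return
            offset += len(block_a)

if __name__ == "__main__":
    report_differences(sys.argv[1], sys.argv[2])

Run against the original and the corrupted copy, it prints the offset of each
differing byte, which makes it easy to see whether the damage lines up with a
resume point.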
