On Mon, 20 Jun 2005, Michael Petuschak wrote:

> Hi David,  
> 
>  >  >>> It offers me all the functions and flexibility I need, although
>  >  >>> there is one thing it lacks to be complete - the ability to resume
>  >  >>> partially downloaded files.
>  >  >> This already exists! However, whether or not it works I think
>  >  >> depends on your server. Either use get -c filename or reget
>  >  >> filename; both do the same thing.
>  >> 
>  >> What about pget?
>  >> Can I resume a file that I started downloading with pget?
>  > 
>  > The short answer is, "no." You can get help on any command within the
>  > lftp shell. E.g. "help pget". This information is, of course, also
>  > contained in lftp's unusually good man page, in a highly searchable
>  > format.
> 
> Unfortunately "help pget" gives no answer on whether it's possible to use
> "get -c" on a file which was partially downloaded with "pget".
> I'm afraid a damaged file would be the result, though I'm not sure...
> 
I believe Alexander explained some time ago that resuming a parallel get
would be too unreliable a feature and could easily cause file data
corruption. E.g., how would you handle the fact that simultaneous
connections can transfer data at different speeds? And you would have to
remember the number of connections that were open in order to resume,
otherwise data would be rewritten in the wrong places. You would probably
need some special "state of download" file to implement this, yet that
file could be accidentally deleted, for example. A simple reget doesn't
have these complications.
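To make the point concrete, here is a minimal sketch (all names and the file
format are illustrative, not lftp's actual implementation) of the "state of
download" file idea: a segmented download can only be resumed safely if each
segment's byte range and progress are persisted, because every connection
writes to a different region of the output file. The "download" is simulated
locally rather than over FTP.

```python
import json, os, tempfile

SEGMENTS = 3  # number of simultaneous connections (hypothetical)

def plan_segments(total_size, n=SEGMENTS):
    """Split [0, total_size) into n contiguous byte ranges."""
    step = total_size // n
    ranges = [[i * step, (i + 1) * step] for i in range(n)]
    ranges[-1][1] = total_size  # last segment absorbs the remainder
    return [{"start": s, "end": e, "done": 0} for s, e in ranges]

def save_state(path, state):
    with open(path, "w") as f:
        json.dump(state, f)

def load_state(path):
    with open(path) as f:
        return json.load(f)

# Simulate: a 10-byte "remote" file, fetched in 3 segments.
data = b"0123456789"
state = plan_segments(len(data))

# First run: each connection transfers only part of its range
# (they run at different speeds), then the download is interrupted.
out = bytearray(len(data))
for seg in state:
    got = (seg["end"] - seg["start"]) // 2  # partial transfer
    out[seg["start"]:seg["start"] + got] = data[seg["start"]:seg["start"] + got]
    seg["done"] = got

statefile = os.path.join(tempfile.mkdtemp(), "download.state")
save_state(statefile, state)

# Resume: reload the per-segment offsets and finish each segment from
# exactly where it stopped. Without the state file there is no way to
# know these offsets, which is the corruption risk described above.
state = load_state(statefile)
for seg in state:
    off = seg["start"] + seg["done"]
    out[off:seg["end"]] = data[off:seg["end"]]
    seg["done"] = seg["end"] - seg["start"]

assert bytes(out) == data  # file reassembled correctly
```

A plain reget avoids all of this because a single connection only ever
appends at the end of the file: the local file size itself is the resume
offset, so no extra state needs to survive between runs.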
Dmitri
