Here is what I did when I was fighting battles with an unreliable connection
a few months ago.  (It doesn't solve the cost of downloading, but it does
solve the lost-connection problem.)  Use

wget -c url-of-the-wanted-file

This command-line program allows you to resume the download again and
again when the inevitable happens and the connection is lost.  If you find
that downloads are more reliable when you limit the bandwidth, there is
another option to set a rate limit:

wget --limit-rate=20k  -c url-of-the-wanted-file

(the rate is given in bytes per second by default; the k suffix means
kilobytes, so the example above limits the download to roughly 20 kilobytes
per second).
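
If the connection drops so often that restarting wget by hand gets tedious,
you can wrap it in a small shell loop so it resumes on its own.  This is
just a sketch for a POSIX shell (the 10-second pause is arbitrary):

until wget -c --limit-rate=20k url-of-the-wanted-file; do
    # wget exited with an error (e.g. the connection dropped);
    # wait a moment, then resume from where it left off
    sleep 10
done

wget also has built-in retry options (--tries and --waitretry) that do much
the same thing, but the loop makes the behavior explicit.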

Living at the far end of DSL's reach, I find wget to be my favorite program!

Regards,

Carol Lerche

On Mon, Jun 9, 2008 at 12:29 AM, James Cameron <[EMAIL PROTECTED]> wrote:

> No, FTP is not more reliable; if you have a problem with HTTP downloads
> of the image, you will have the same problems with FTP.
>
> Yes, it can be made available in smaller chunks, but it is best to
> download it with a restartable downloader, which recovers from
> interruptions.  wget on Linux can do this, as can rsync.  There are
> restartable downloaders available for other operating systems.  The
> ability to restart at the last known point after an interruption is
> inherent in the HTTP, FTP, and rsync protocols, but many HTTP clients do
> not use this feature.
>
> Yes, there is a better way of making new builds and updates available,
> and that is the olpc-update mechanism.  It restarts from where it was
> disconnected as well, since it uses rsync.
>
> --
> James Cameron    mailto:[EMAIL PROTECTED]     http://quozl.netrek.org/



-- 
"Always do right," said Mark Twain. "This will gratify some people and
astonish the rest."
_______________________________________________
Devel mailing list
[email protected]
http://lists.laptop.org/listinfo/devel
