On Tue, Oct 13, 2009 at 10:52 AM, Rich Shepard <[email protected]> wrote:
> On Tue, 13 Oct 2009, wes wrote:
>
>> Yes, the clinic is this Sunday, but FreeGeek's pipe isn't all that big. A
>> typical home cable or DSL connection will bring the file to you in a few
>> hours. Wget even supports resuming interrupted transfers.
>
> wes,
>
>   I thought of sucking it down here on the DSL line, but that slows
> everything else down while it's working. Perhaps I'll start it from a shell
> overnight.
>
>

 Manual page wget(1) line 394

       --limit-rate=amount
           Limit the download speed to amount bytes per second.  Amount may be
           expressed in bytes, kilobytes with the k suffix, or megabytes with
           the m suffix.  For example, --limit-rate=20k will limit the
           retrieval rate to 20KB/s.  This is useful when, for whatever
           reason, you don't want Wget to consume the entire available
           bandwidth.


Start wget, get a feel for how fast it can go, then cut that in half
or so.  Having your pipe saturated makes everything unbearable.
Having it under heavy use makes things slower, but I find it very
tolerable.  Drop it to 1/4 or so and you hardly notice - it just kinda
blends in with the normal erratic performance of the net.
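Putting the two points in this thread together - rate limiting plus wget's
resume support - an overnight run might look like this (the URL is a
placeholder; substitute the actual file you're fetching, and pick a rate
that's roughly a quarter of what your DSL line can do):

```shell
# -c / --continue: resume a partially downloaded file instead of restarting
# --limit-rate=50k: cap the transfer at 50 KB/s so the line stays usable
# (example.com URL is hypothetical - use the real download link)
wget -c --limit-rate=50k http://example.com/path/to/big-file.iso
```

If the transfer dies overnight, rerunning the same command picks up where
it left off, as long as the partial file is still in the working directory.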

-- 
Carl K
_______________________________________________
PLUG mailing list
[email protected]
http://lists.pdxlinux.org/mailman/listinfo/plug