I think there is still a case for attempting percent limiting.  I agree
with your point that we cannot discover the full bandwidth of the
link and adjust to that.  The approach instead discovers the currently
available bandwidth and adjusts to that.  Its usefulness is in trying
to be unobtrusive to other users of the link.
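
To make the idea concrete, here is a rough C sketch of the mechanism I
have in mind (illustration only, not wget code; limit_percent,
update_estimate and throttle_delay are made-up names): periodically let
the transfer run unthrottled for a short probe interval, treat the rate
observed there as the currently available bandwidth, and then pace the
transfer at the requested fraction of that estimate.

    /* Fraction of the observed available bandwidth to consume,
       e.g. a hypothetical --limit-percent=50.  */
    static double limit_percent = 0.5;

    /* Latest estimate of available bandwidth (bytes/second),
       refreshed from short unthrottled probe intervals.  */
    static double estimated_bw;

    /* After each probe interval, treat the unthrottled throughput
       just observed as the currently available bandwidth.  */
    void
    update_estimate (double probe_bytes, double probe_seconds)
    {
      if (probe_seconds > 0)
        estimated_bw = probe_bytes / probe_seconds;
    }

    /* After reading CHUNK_BYTES in ELAPSED seconds, return how long
       to sleep so the average rate stays near
       limit_percent * estimated_bw.  */
    double
    throttle_delay (double chunk_bytes, double elapsed)
    {
      double target = limit_percent * estimated_bw;
      double expected;

      if (target <= 0)
        return 0;                       /* no estimate yet; don't throttle */
      expected = chunk_bytes / target;  /* time the chunk "should" take */
      return expected > elapsed ? expected - elapsed : 0;
    }

Re-probing every few seconds is what would let the cap track changing
load from the other users, rather than being pinned to whatever speed
the download happened to start at.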

On Wed, 10 Oct 2007, Hrvoje Niksic wrote:

> Jim Wright <[EMAIL PROTECTED]> writes:
> 
> >> - --limit-rate will find your version handy, but I want to hear from
> >> them. :)
> >
> > I would appreciate and have use for such an option.  We often access
> > instruments in remote locations (think a tiny island in the Aleutians)
> > where we share bandwidth with other organizations.
> 
> A limitation in percentage doesn't make sense if you don't know
> exactly how much bandwidth is available.  Trying to determine full
> bandwidth and backing off from there is IMHO doomed to failure because
> the initial speed Wget gets can be quite different from the actual
> link bandwidth, at least in a shared link scenario.  A --limit-percent
> implemented as proposed here would only limit the retrieval speed to
> the specified fraction of the speed Wget happened to get at the
> beginning of the download.  That is not only incorrect, but also quite
> non-deterministic.
> 
> If there were a way to query the network for the connection speed, I
> would support the "limit-percent" idea.  But since that's not
> possible, I think it's better to stick with the current --limit-rate,
> where we give the user an option to simply tell Wget how much
> bandwidth to consume.
> 
