Hrvoje Niksic wrote:
> Jim Wright <[EMAIL PROTECTED]> writes:
>> I think there is still a case for attempting percent limiting.  I
>> agree with your point that we cannot discover the full bandwidth of
>> the link and adjust to that.  The approach discovers the currently
>> available bandwidth and adjusts to that.  The usefulness is in
>> trying to be unobtrusive to other users.
> The problem is that Wget simply doesn't have enough information to be
> unobtrusive.  Currently available bandwidth can and does change as new
> downloads are initiated and old ones are turned off.  Measuring
> initial bandwidth is simply insufficient to decide what bandwidth is
> really appropriate for Wget; only the user can know that, and that's
> what --limit-rate does.

So far, I'm inclined to agree.

For instance, if one just sticks "limit_percent = 25" in their wgetrc,
then on some occasions, Wget will limit to far too _low_ a rate, when
most of the available bandwidth is already being consumed by other things.

Regardless of what we decide on this, though, I like Tony L's suggestion
of some summary data at completion. He had already suggested something
similar to this for a proposed interactive prompt at interrupt.

I'm thinking that there are a lot of other little "nice-to-haves"
related to such a feature, too: someone might want Wget to save previous
download rates to a file, average them across invocations, and base a
percentage limit on that average when asked to. Basically, it smells
like one of many possible hacks to perform on --limit-rate, which makes
it a good candidate for a plugin (once we have that infrastructure in
place), rather than Wget proper.
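Just to make the plugin idea concrete, here's a rough sketch of what
"average saved rates and derive a limit from a percentage" might look
like. Everything here is hypothetical -- the history-file format, the
function names, and the way limit_percent is interpreted are my
assumptions, not anything Wget actually does:

```python
# Hypothetical sketch of the rate-history idea discussed above.
# The history file format (one bytes/sec integer per line) and all
# function names are assumptions for illustration, not Wget behavior.

def record_rate(history_path, bytes_per_sec):
    """Append the rate observed by a finished download to the history file."""
    with open(history_path, "a") as f:
        f.write("%d\n" % bytes_per_sec)

def average_rate(history_path):
    """Average all recorded rates; return None if there is no history yet."""
    try:
        with open(history_path) as f:
            rates = [int(line) for line in f if line.strip()]
    except FileNotFoundError:
        return None
    return sum(rates) / len(rates) if rates else None

def percent_limit(history_path, limit_percent):
    """Turn a percentage into a concrete --limit-rate value (bytes/sec),
    based on the average of past observed rates."""
    avg = average_rate(history_path)
    if avg is None:
        return None  # no history yet: caller falls back to no limit
    return int(avg * limit_percent / 100)
```

So with, say, past downloads averaging 200 KB/s recorded in the history
file, limit_percent = 25 would translate to a 50 KB/s cap on the next
invocation. Which of course inherits the same objection Hrvoje raised:
past averages say nothing about what's competing for the link right now.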

--
Micah J. Cowan
Programmer, musician, typesetting enthusiast, gamer...
