New to the list.

Wrote a patch that Works For Me to limit to a "percent of measured
bandwidth".  This is useful, like --limit-rate, in cases where an
upstream switch is poorly made and interactive users get locked out
when a single box does a wget, but limit-pct is more "automatic" in
the sense that you don't have to know ahead of time how big your
downstream pipe is.

I.e., what I used to do was (1) wget and watch the bandwidth I was
getting, then (2) Ctrl-C, (3) Ctrl-P, and edit the line to add -c
--limit-rate nnK (where nn is a bit less than I was getting).

Now I can wget --limit-pct 50 and it will go full-speed for a bit and
then back off until the average speed is 50% of what we saw during
that time.

The heuristic I'm using is to download full-speed for 15 seconds and
then back off; that seems to work on my connection (too much less and
the measured rate is erratic, too much more and the defective upstream
switch locks interactive folks out long enough that they notice and
complain).  Does that seem reasonable to folks, or should it be
parameterized?  I'm not sure I can spend much time on complex
parameter handling etc. right now.

Anyhow, does this seem like something others of you could use?  Should
I submit the patch to the submit list, or should I post it here first
for people to hash out any parameterization niceties etc.?

Best Regards.
Please keep in touch.
