Don't know if anyone else has this problem, but sometimes when I'm
downloading large files (>500MB; most recently 4.2GB), I want to change the
limit-rate without terminating the download and restarting.

For example, during the day when I'm at my computer, I might not want it
to use more than ~1/3rd of my bandwidth (~100KB/s), but if I'm going out to
lunch or going to bed for the night, I might want to give it the full
bandwidth.

Even then, say I can't sleep (a "too frequent" occurrence) and I want
to get back on and read some website or other -- I'd like to be
able to reduce its bandwidth again.

I'm not sure what the best way is to "relay" the desired speed to the
currently running program -- I can think of more than one method, but
none of them 'grab' me as elegant...

Uh...well there is one, but it's pretty ambitious, perhaps dovetailing
into my other "RFE - multi-threaded downloading".

Suppose wget were "multi-threaded".  I say multithreaded out
of some ignorance -- it might be more efficient to simply make
it "multi-streamed", and use "poll(2)" to wait on the multiple
streams and process them as they become readable.  Either way,
I believe the idea of multiple downloads going at the same time is
conceptually the same.

Say one runs the first "wget".  Let's say it is a simple 1-DVD download.
Then you start a 2nd download of another DVD.  Instead of 2 copies
of wget running and competing with each other, what if the 2nd copy
"told" the 1st copy about the 2nd download, and the 2nd download
was 'enqueued' in a 'line' behind the 1st?

If both downloads are from the same site, there isn't much to do, and
the most efficient approach would be (I think?) to finish the first,
then do the 2nd.  Alternatively, if the size of both downloads were known
in advance (as is true for http, though not for ftp), it _could_
download whichever file had less data left to download.

If the downloads are from different sites, then it depends on the
download speed of each file -- if a specific site is "slow" (say
your max rate is 300KB/s, but the download is only averaging
150KB/s), then it would probably be "safe" to start another
download (from a different site) and intersperse the writes...

There are so many possible priority or scheduling algorithms,
I can't possibly think of them all.

To complete the idea -- if wget has multiple files it needs
to download (after parsing an HTML file for page-requisites,
or recursing deeper if enabled), it could also use the
multi-threaded download feature to fetch those in parallel.

A recursive download can quickly 'blossom' as each level of
a directory tree is downloaded and expanded.  Many of these
files may be short.  A significant delay or wait time
is incurred whenever a download has to 'pause' and wait for
the server to process the request for another file and start
sending it.  Being able to run more than one thread means
your network can still be running "full throttle" downloading
on an alternate stream while another thread (or stream) is
waiting for the server to respond.

Anyway...that's about it.  I don't know how difficult it would
be to add.  It seems the 'download' speed change could be
'simply' implemented via some sort of message-passing
mechanism, using a 2nd invocation of wget to send the message
to the first.  But that's still somewhat vague (pass the
message how, exactly?).

Comments?  Good ideas? Bad ideas?  etc...
