Hi all!

My situation: a modem on unlimited night dialup access.
About 4 people edit their url.txt files to list the files they want.
Then I use cron to divide the 9.5 hours of online time among
downloading all these files.

I've run into some problems and would like to ask for some features:
If there's already a copy of wget running and you try to exec a second
one with exactly the same options as the first, the second copy should
just exit with an error message. Here in Russia we call this
"stupid user protection".

It seems that if wget is killed with "kill -KILL", the tail of the
downloaded file gets filled with trash (I'll test it more precisely
today). This could be solved in several ways:
- terminate cleanly on SIGTERM (right now I use KILL because there's
  no reaction to a plain `kill PID`);
- crop the tail of the file if wget was started with -c and the file
  isn't finished yet.
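
Until wget crops the tail itself, cron can fake it: shave a fixed
margin off the end of the partial file before resuming, so `-c`
re-fetches whatever might be trash. A rough sketch (the 8 KB margin
and file name are arbitrary, and `truncate` is GNU coreutils):

```shell
# Before re-running `wget -c`, drop the last 8 KB of the partial file
# so the possibly-corrupt tail gets downloaded again.
f=partial.bin
dd if=/dev/zero of="$f" bs=1024 count=100 2>/dev/null  # stand-in 100 KB partial file
chop=8192
size=$(wc -c < "$f")
if [ "$size" -gt "$chop" ]; then
  truncate -s $((size - chop)) "$f"
fi
wc -c < "$f"   # 94208 bytes left; now safe to resume with `wget -c URL`
```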

It would also be nice to see a couple of these features:
- reading command-line options from a specified file. Right now I use
  a special script that generates really long command lines, and I can
  see almost nothing in `ps aux`. With this feature it would be enough
  to run just `wget -C /path/user1.conf`; wgetrc is a very different
  thing.
- temporarily renaming files that aren't finished yet. GetRight for
  Windows works this way: it adds a .GetRight extension to such files,
  so you can always see what's ready.
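
Both of these can be approximated with a tiny wrapper today (a sketch
only; `fetch`, the .part suffix, and the conf-file layout are names I
made up, and the `-C` flag itself doesn't exist yet):

```shell
# fetch CONF URL OUT: read wget options from CONF, download to
# OUT.part, and rename to OUT only when wget exits successfully --
# so `ls *.part` always shows what's unfinished, GetRight-style.
fetch() {
  conf=$1; url=$2; out=$3
  opts=$(cat "$conf")            # the long option list lives in a file
  if wget $opts -O "$out.part" "$url"; then
    mv "$out.part" "$out"
  fi
}
```

Usage: put something like `-q --tries=3` in /path/user1.conf and run
`fetch /path/user1.conf http://host/file.iso file.iso`; `ps aux` then
shows the short `fetch` line instead of the whole option list.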
  
Don't say I'm asking for too much. I could actually do almost all of
this with sh scripts and Perl, but implementing features is not
something a user should have to do.

I also program some C, but maybe there are some people who are already
inside the code and don't need time to look around in the sources?
-- 
Best regards,
 dEth
