Frank McCown wrote:
It would be great if wget had a way of limiting the amount of time it takes to run, so it won't accidentally hammer on someone's web server for an indefinite amount of time. I often need to let a crawler run for a while on an unknown site, and I have to manually kill wget after a few hours if it hasn't finished yet. It would be nice if I could do:

wget --limit-time=120 ...

to make it stop itself after 120 minutes.

Please cc me on any replies.

I don't think we need to add this feature to wget, as it can be achieved with a shell script that launches wget in the background, sleeps for the given amount of time, and then kills the wget process.

However, if there is a general consensus about adding this feature to wget, I might consider changing my mind.
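The wrapper described above can be sketched roughly as follows. This is a minimal, hypothetical example: the time limit is taken in seconds, the wget arguments are placeholders passed through from the command line, and the watchdog is a background subshell that kills wget when the limit expires.

```shell
#!/bin/sh
# Hypothetical wrapper: run wget in the background and kill it
# if it is still running after $LIMIT seconds (default: 2 hours).
LIMIT=${LIMIT:-7200}

wget "$@" &
WGET_PID=$!

# Watchdog: sleep for the limit, then kill wget if still alive.
( sleep "$LIMIT" && kill "$WGET_PID" 2>/dev/null ) &
WATCHDOG_PID=$!

wait "$WGET_PID"
STATUS=$?

# Clean up the watchdog so it doesn't linger after a normal exit.
kill "$WATCHDOG_PID" 2>/dev/null
exit "$STATUS"
```

Invoked as e.g. `LIMIT=7200 ./wget-limit.sh -r http://example.com/`, this stops the crawl after two hours, which matches the effect of the proposed --limit-time=120 option.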

--
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi                          http://www.tortonesi.com

University of Ferrara - Dept. of Eng.    http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linux            http://www.deepspace6.net
Ferrara Linux User Group                 http://www.ferrara.linux.it
