Is there a way to at least tell Wget to convert the links of whatever it has 
already downloaded and then exit? The process has been in memory since the 
beginning of this month. I would like to just stop it, but I am afraid of 
ending up with HTML containing URLs that do not work locally.
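As far as I know, --convert-links (-k) performs its rewriting pass only after the whole retrieval finishes, so killing a running wget skips it. One workaround is to bound the crawl up front so wget can actually reach that final conversion step. A sketch, with flag names taken from the wget manual and illustrative values (the URL and directory are hypothetical):

```shell
# Hypothetical mirror job.
# --convert-links rewrites URLs for local viewing, but only once the
# retrieval completes, so the crawl is bounded with --level and --quota
# to make sure wget terminates and runs the conversion pass.
wget --recursive --level=3 --quota=500m --convert-links \
     --directory-prefix=mirror http://www.example.com/
```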

________________________________

From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]]
Sent: Mon 9/12/2005 4:32 PM
To: Youssef Eldakar
Cc: [email protected]
Subject: Re: Sleeping Processes



At 23:25 on Thursday, 1 September 2005, Youssef Eldakar wrote:
> I start a bunch of recursive Wget processes in the background with -b. I
> notice that some of the processes remain for a long time in a sleeping
> state. They seem to be doing almost nothing: the total number of files
> downloaded and the total size increase very slowly - a whole minute can
> pass without an additional byte being written to disk. Tailing the log
> file, it appears frozen at an incomplete line like this one:
>
> Connecting to 216.55.161.137:80... fai
>
> The processes do complete after a very long time. Is there a way to make
> this work better?

It seems that wget freezes while writing an error message to the console.
Which operating system are you using?
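Whatever the cause of the stall turns out to be, wget's timeout options can keep a single unresponsive host from blocking a recursive job for minutes at a time. A sketch, with flag names from the wget manual and illustrative values (the URL is hypothetical):

```shell
# Bound how long wget waits on any single host before giving up.
# --timeout sets the DNS, connect, and read timeouts in one go;
# --tries and --waitretry limit how long a flaky host is retried.
wget --background --recursive \
     --timeout=30 --tries=3 --waitretry=5 \
     http://www.example.com/
```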

--
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi                          http://www.tortonesi.com

University of Ferrara - Dept. of Eng.    http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linux            http://www.deepspace6.net
Ferrara Linux User Group                 http://www.ferrara.linux.it

