Hi,
doing the following:
# /tmp/wget-1.9-beta3/src/wget -r --timeout=5 --tries=1 http://weather.cod.edu/digatmos/syn/
--11:33:16--  http://weather.cod.edu/digatmos/syn/
           => `weather.cod.edu/digatmos/syn/index.html'
Resolving weather.cod.edu... 192.203.136.228
Connecting to
Payal Rathod [EMAIL PROTECTED] writes:
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
The way to do it with Wget would be something like:
wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
But if I run it through crontab, where will it store the
This problem is not specific to timeouts, but to recursive download (-r).
When downloading recursively, Wget expects some of the specified
downloads to fail and does not propagate that failure to the code that
sets the exit status. This unfortunately includes the first download,
which should
OK, I see.
But I do not agree.
And I don't think it is a good idea to treat the first download specially.
In my opinion, exit status 0 means everything during the whole
retrieval went OK.
My preferred solution would be to set the final exit status to the highest
exit status of all individual downloads.
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, October 01, 2003 9:20 PM
Tony Lewis [EMAIL PROTECTED] writes:
Would something like the following be what you had in mind?
301 http://www.mysite.com/
200 http://www.mysite.com/index.html
200
The home page is back, but it says that the TP Robot is dead. I've
contacted Martin Loewis, perhaps he'll be able to provide more info.
On Thu, Oct 02, 2003 at 12:03:34PM +0200, Hrvoje Niksic wrote:
Payal Rathod [EMAIL PROTECTED] writes:
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
The way to do it with Wget would be something like:
wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
Payal Rathod [EMAIL PROTECTED] writes:
On Thu, Oct 02, 2003 at 12:03:34PM +0200, Hrvoje Niksic wrote:
Payal Rathod [EMAIL PROTECTED] writes:
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
The way to do it with Wget would be something like:
wget --mirror
I've patched utils.c to make run_with_timeout() work on
Windows (better than it does with alarm()!).
In short it creates and starts a thread, then loops querying
the thread's exit code, breaking if it's != STILL_ACTIVE, else sleeping
for 0.1 sec. Uses a wget_timer too for added accuracy.
Tested with
Forgot this in src/Changelog:
2003-10-02 Gisle Vanem [EMAIL PROTECTED]
* utils.c (run_with_timeout): For Windows: Run the 'fun' in
a thread via a helper function. Continually query the
thread's exit-code until finished or timed out.
P.S.:
+static DWORD
Gisle Vanem [EMAIL PROTECTED] writes:
I've patched utils.c to make run_with_timeout() work on Windows
(better than it does with alarm()!).
Cool, thanks! Note that, to save the honor of Unix, I've added
support for setitimer on systems that support it (virtually everything
these days), so
I've committed this patch, with minor changes, such as moving the code
to mswindows.c. Since I don't have MSVC, someone else will need to
check that the code compiles. Please let me know how it goes.
Hrvoje Niksic [EMAIL PROTECTED] said:
I've committed this patch, with minor changes, such as moving the code
to mswindows.c. Since I don't have MSVC, someone else will need to
check that the code compiles. Please let me know how it goes.
It compiled with MSVC okay, but crashed somewhere