Herold, I will try using -v first and then -d if -v doesn't show anything. 
Thanks for the tips. :)
-- 
"Ants can carry twenty times their weight, which is useful information
   if you're moving out and you need help getting a potato chip across
                                                   town." --Ron Darian
  /\___/\
 / /\ /\ \       Phillip Pi (Ant) @ The Ant Farm: http://antfarm.ma.cx
| |o   o| |       E-mail: [EMAIL PROTECTED] or [EMAIL PROTECTED]
   \ _ /        Be sure you removed ANT from e-mail address if you get
    ( )                                             a returned e-mail.

On Tue, 11 May 2004, Herold Heiko wrote:

> Did you ever run the download with -v ?
> What did the log say when wget seemed to hang, or regarding the missing or
> corrupt files, or regarding the parsing of the directory index (or whatever
> it was) linking to those files?
> If nothing useful is logged, try again with -d (but be prepared: a huge
> amount of information will be logged, so better redirect it to a file with
> -o log.txt or -a log.txt).
> If still nothing comes up, take a look at the server logs, if you can.
> 
> Heiko
> 
> -- 
> -- PREVINET S.p.A. www.previnet.it
> -- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
> -- +39-041-5907073 ph
> -- +39-041-5907472 fax
> 
> > -----Original Message-----
> > From: Phillip Pi [mailto:[EMAIL PROTECTED]
> > Sent: Monday, May 10, 2004 10:40 PM
> > To: [EMAIL PROTECTED]
> > Subject: wget hangs or downloads end up incomplete in Windows 
> > 2000 & XP.
> > 
> > 
> > Hello. 
> > 
> > I downloaded wget v1.9.1 (complete) from
> > http://xoomer.virgilio.it/hherold/. I am having problems downloading
> > almost a GB of files (over 2600 files and 250 folders). Randomly, the
> > download either stalls completely and never resumes, OR it completes
> > but is corrupted (sometimes missing files and subfolders). I also had
> > this problem with the old v1.8.2. The only fix is to pause one second
> > between each retrieval, but this takes too long (almost an hour) with
> > so many files and folders! Here's what I am using in the batch file
> > (download.bat -- changed URLs, account, and passwords for the sample):
> > 
> > @call wget -c -l0 -r -nH -w0 ftp://domain\username:[EMAIL PROTECTED]/Unreleasedbuilds/blah/blah/1/setups/SUBSETUP/*
> > @call wget -c -l0 -r -nH -w0 ftp://domain\username:[EMAIL PROTECTED]/Unreleasedbuilds/blah/blah/1/setups/SUBSETUP/*
> > 
> > I used a Pentium 3 1 GHz system (512 MB of RAM) with Windows 2000 SP4
> > (all updates) and a Pentium 4 3 GHz (HT enabled) with Windows XP Home
> > SP1 (all updates). Each computer uses a 100 Mbps network connection.
> > 
> > Copying through a network share in Windows Explorer has NO problems. I
> > assume that method transfers more slowly than the wget command, but I'd
> > like to use a script so I don't have to do this manually. ;)
> > 
> > Thank you in advance. :)
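
[For readers of the archive: Heiko's suggestion applied to the batch file
above might look like the sketch below. This is not from the thread; it
combines the original download.bat command with -d debug logging and the
one-second pause the original post mentions as a workaround. The host,
account, and path placeholders are the redacted values from the original
message, not real ones.]

```shell
@rem Sketch only: URL, account, and SUBSETUP path are placeholders from
@rem the original (redacted) post.
@rem -d   writes full debug output (very verbose, as Heiko warns)
@rem -a   appends the log to log.txt so both runs land in one file
@rem -w1  waits one second between retrievals (the reported workaround)
@call wget -c -l0 -r -nH -w1 -d -a log.txt ftp://domain\username:[EMAIL PROTECTED]/Unreleasedbuilds/blah/blah/1/setups/SUBSETUP/*
```

Searching log.txt afterwards for the last file requested before a stall
should show whether wget was waiting on the FTP control or data connection.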
