Re: Large file problem

2005-04-11 Thread Hrvoje Niksic
Martin Grönemeyer [EMAIL PROTECTED] writes:

 I found a problem while downloading a large file via http. If I disable
 verbose output, it works fine.

Versions of Wget released so far don't support large files.  Even
without verbose output, writing the file would probably fail with an
error once the 2GB limit is reached.

This is fixed in current CVS and will work in the soon-to-be-released
version 1.10.  You can grab the CVS sources (see
http://wget.sunsite.dk for instructions) and give them a try.
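
For reference, the 2GB limit comes from a 32-bit off_t in the C
library.  A minimal sketch, assuming a POSIX-style system (the file
name lfs_check.c is made up; -D_FILE_OFFSET_BITS=64 is the usual
large-file switch on Linux and Solaris), to check whether a build
gets a 64-bit off_t:

    /* lfs_check.c -- sketch: does this build have a 64-bit off_t?
       Try compiling with and without -D_FILE_OFFSET_BITS=64. */
    #include <stdio.h>
    #include <sys/types.h>

    int main(void)
    {
        printf("sizeof(off_t) = %lu bytes\n",
               (unsigned long) sizeof(off_t));
        if (sizeof(off_t) < 8)
            printf("no large-file support: writes past 2GB-1 fail\n");
        else
            printf("large-file support available\n");
        return 0;
    }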


RE: Large file problem

2005-03-02 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Gisle Vanem [EMAIL PROTECTED] writes:
 
  It doesn't seem the patches to support 2GB files work on
  Windows. Wget hangs indefinitely at the end of transfer.  E.g.
 [...]
 
 I seem to be unable to repeat this.

Me too. I successfully transferred several 3GB files from Linux and
Solaris/SPARC servers to NT4 via FTP and HTTP (on a LAN though, with no
restarts).
No help here, sorry.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


Re: Large file problem

2005-03-01 Thread Hrvoje Niksic
Gisle Vanem [EMAIL PROTECTED] writes:

 It doesn't seem the patches to support 2GB files work on
 Windows. Wget hangs indefinitely at the end of transfer.  E.g.
[...]

I seem to be unable to repeat this.

Does this happen only with large files, or with all files on the
large-file-enabled version of Wget?  Does it happen only for FTP
downloads, or for HTTP as well?


Re: Large file problem

2005-02-27 Thread Gisle Vanem
Hrvoje Niksic wrote:

 Gisle Vanem [EMAIL PROTECTED] writes:

  It doesn't seem the patches to support 2GB files work on
  Windows. Wget hangs indefinitely at the end of transfer.

 Is there a way to trace what syscall Wget is stuck at?  Under Cygwin I
 can try to use strace, but I'm not sure if I'll be able to repeat the
 bug.

There is strace for Win-NT too. But I dare not install it to find out.

PS. It is quite annoying to get 2 copies of every message. Also,
there should be a Reply-To: header so the replies go to the list.
Just my 0.02 €.

--gv



Re: Large file problem

2005-02-27 Thread Hrvoje Niksic
Steve Thompson [EMAIL PROTECTED] writes:

 I have found elsewhere that the Windows C run-time library can't
 handle files larger than 2GB when using fopen, etc.  The size of
 off_t is 4 bytes on IA32.

I know that, but stdio is not necessarily tied to off_t -- except for
fseek/ftell, which had to be changed to fseeko/ftello (except on
Windows) anyway.

According to Google, others have used stdio for large files with some
success.
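
To make the fseeko/ftello point concrete, here is a minimal sketch of
large-file-aware stdio.  It assumes a POSIX build with
-D_FILE_OFFSET_BITS=64 so that off_t is 64 bits (the function name
seek_past_2g is made up); on Windows the MSVC run-time provides
_fseeki64/_ftelli64 instead:

    /* Sketch: positioning past the 2GB mark with stdio.  Plain fseek
       takes a long (32 bits on IA32) and cannot express this offset;
       fseeko takes an off_t. */
    #include <stdio.h>
    #include <sys/types.h>

    int seek_past_2g(const char *path)
    {
        FILE *fp = fopen(path, "rb");
        if (!fp)
            return -1;

        off_t target = (off_t) 2147483647 + 1;  /* 2^31: beyond LONG_MAX on IA32 */

        if (fseeko(fp, target, SEEK_SET) != 0) {
            fclose(fp);
            return -1;
        }
        printf("now at offset %lld\n", (long long) ftello(fp));
        fclose(fp);
        return 0;
    }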


Re: Large file problem

2005-02-27 Thread Hrvoje Niksic
Gisle Vanem [EMAIL PROTECTED] writes:

 There is strace for Win-NT too. But I dare not install it to find
 out.

Hmm, OK.

 PS. It is quite annoying to get 2 copies of every message.

I'll try to remember to edit the headers to leave your private address
out.

 Also, there should be a Reply-To: header so the replies go to
 the list.  Just my 0.02 €.

There shouldn't be one, IMHO, because then replies intended to be
private end up on the list, which is really, really bad.


Re: Large file problem

2005-02-02 Thread Ulf Härnhammar
Quoting Rainer Zocholl [EMAIL PROTECTED]:

 But today I have to report the first problem:
 the attempt to download a 3.3GB file (SuSE Linux image)
 failed in several ways:

 wget seems to get an integer wrap-around;
 see -931,424,256 as size...

It is a Frequently Asked Question, with the answer that people are working on
it.

// Ulf
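
The negative size quoted above is the usual 32-bit wraparound: a
length of 3,363,543,040 bytes (about 3.3GB; the exact value is
back-computed from the reported figure) does not fit in a signed
32-bit int, and 3,363,543,040 - 2^32 = -931,424,256.  A minimal
sketch of the arithmetic, assuming two's-complement wrapping on the
out-of-range cast, as on IA32:

    /* Sketch: why a ~3.3GB file shows up as -931,424,256. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint64_t real_size = 3363543040ULL;        /* back-computed size */
        int32_t  wrapped   = (int32_t) real_size;  /* implementation-defined;
                                                      wraps on two's complement */
        printf("real size: %llu\n", (unsigned long long) real_size);
        printf("wrapped:   %d\n", (int) wrapped);  /* prints -931424256 */
        return 0;
    }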



Re: Large file problem

2005-02-02 Thread Rainer Zocholl
On Wed, Feb 02, 2005 at 11:13:09AM +0100, Rainer Zocholl wrote:
 
 1) Wget fails on resume
 2) Wget terminates with an assertion
(BTW: squid has problems too)
 
 Assertion failed: bytes >= 0, file retr.c, line 292

Meanwhile I found the FAQ (by manually searching the mailing list!)... ;-)
http://www.gnu.org/software/wget/faq.html#3.1
  Does GNU wget support files larger than 2GB?


Maybe the FAQ should explicitly include these error messages, and
wget should not just issue a misleading error message.

I searched for the message, but Google shows only two old hits with
that error message.

Also, the statement "It downloads [only] a portion of the file" is,
at least for files smaller than 4GiB, not true for the Windows NT
version and some servers: wget downloaded the 3.3GB file intact.