On 10 Jan 2002 at 17:09, Matt Butt wrote:

> I've just tried to download a 3Gb+ file (over a network using HTTP) with
> WGet and it died at exactly 2Gb.  Can this limitation be removed?

In principle, changes could be made to allow wget to be configured for large file support, by using the appropriate data types (i.e. 'off_t' instead of 'long').
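
Something along these lines, as an illustration of the general approach on a typical 32-bit Unix (not a patch against wget's sources):

  /* Ask the C library for a 64-bit off_t on 32-bit platforms that
     support the large file interface (illustration only).  These
     macros must be defined before any system header is included. */
  #define _FILE_OFFSET_BITS 64
  #define _LARGEFILE_SOURCE

  #include <sys/types.h>

  /* Byte counts would then be declared as off_t rather than long: */
  off_t bytes_downloaded;    /* instead of: long bytes_downloaded; */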

The logging code would be more complicated, as there is no portable way to pass that data type to a printf-style function; the off_t values would have to be converted to strings by a bespoke routine and the converted strings passed to the printf-style function. This would also slow down the operation of wget a little bit.
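
A rough sketch of the kind of bespoke routine I mean, assuming a non-negative off_t and a caller-supplied buffer (this is not code from wget itself):

  #include <stddef.h>
  #include <stdio.h>
  #include <sys/types.h>

  /* Convert a non-negative off_t to a decimal string so it can be
     logged with "%s" instead of a non-portable length modifier.
     Returns a pointer into 'buf'; 24 bytes is enough for 64 bits. */
  static const char *
  offt_to_string (char *buf, size_t bufsize, off_t number)
  {
    char *p = buf + bufsize;

    *--p = '\0';
    do
      {
        *--p = (char) ('0' + number % 10);
        number /= 10;
      }
    while (number > 0 && p > buf);
    return p;
  }

  /* Example use in a logging call: */
  void
  report_progress (off_t total_downloaded)
  {
    char buf[24];
    printf ("%s bytes downloaded\n",
            offt_to_string (buf, sizeof buf, total_downloaded));
  }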

A version of wget configured for large file support would also be slower in general than a version not configured for large file support - at least on a 32-bit machine, where 64-bit file offsets require multi-word arithmetic.

Large file support should probably be added to the TODO list at least. Quite a few people use wget to download .iso images of CD-ROMs at the moment; in the future, those same people are likely to want to use wget to download DVD-ROM images!
