From: Jonathan Bazemore

> I've repeatedly tried [...]

   If it's still true that you're "using wget 1.9", you can probably try
until doomsday with little chance of success.  Wget 1.9 does not support
large files.  Wget 1.10.2 does support large files.
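
   If you're not sure which version you're running, wget will tell you:

      wget --version

   The first line of the output gives the version number.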

>    Try the current version of wget, 1.10.2, which offers large-file
> support on many systems, possibly including your unspecified one.

   Still my advice.

   In the future, it might help if you would supply some useful
information, like the wget version you're using and the system type
you're using it on.  Also, the actual commands used and the actual
output that results would be more useful than vague descriptions like
"consistently breaking" and "will not resume".

> I've used a file splitting program to break the
> partially downloaded database file into smaller parts
> of differing size.  Here are my results: [...]

   So, what, you're messing with the partially downloaded file, and you
expect wget to figure out what to do?  Good luck.
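
   For what it's worth, the way resumption is meant to work is that you
leave the partial file alone and re-run wget with -c (--continue) in
the same directory; the URL below is only a placeholder:

      wget -c http://example.org/big-database-dump.gz

   Wget needs the intact partial file to know how much it already has.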

> [...] wget (to my knowledge) doesn't do error checking
> in the file itself, it just checks remote and local
> file sizes and does a difference comparison,
> downloading the remainder if the file size is smaller
> on the client side.

   Only if it can cope with a number as big as the size of the file. 
Wget 1.9 uses 32-bit integers for file size, and that's not enough bits
for numbers over 4G.  And if you start breaking up the partially
downloaded file, what's it supposed to use for the size of the data
already downloaded?
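
   If you want to see the wraparound for yourself, the arithmetic is
easy to reproduce in a shell (the masking below only simulates what a
32-bit size variable would hold; modern shells do the math in 64 bits):

      $ echo $(( 5 * 1024 * 1024 * 1024 ))                 # a 5G file
      5368709120
      $ echo $(( (5 * 1024 * 1024 * 1024) & 0xFFFFFFFF ))  # in 32 bits
      1073741824

   To code keeping sizes in 32-bit variables, a 5G file looks like a 1G
file, so the size comparison goes wrong before any data get moved.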

> Wikipedia doesn't have tech support, [...]

   Perhaps because they'd get questions like this one too many times.

------------------------------------------------------------------------

   Steven M. Schweda               [EMAIL PROTECTED]
   382 South Warwick Street        (+1) 651-699-9818
   Saint Paul  MN  55105-2547
