Re: wget does not handle sizes over 2GB

2005-01-12 Thread Mauro Tortonesi
At 14:33 on Wednesday, 12 January 2005, you wrote: > On Wed, 12 Jan 2005, Wincent Colaiuta wrote: > > Daniel really needs to do one of two things: > > Thanks for telling me what to do. > > Your listing wasn't 100% accurate though. Am I not allowed to discuss technical solutions for wget if that ...

Re: wget does not handle sizes over 2GB

2005-01-12 Thread Mauro Tortonesi
At 14:06 on Wednesday, 12 January 2005, Wincent Colaiuta wrote: > On 11/01/2005, at 17:28, Daniel Stenberg wrote: > > On Tue, 11 Jan 2005, Leonid wrote: > >> curl does not survive losing the connection. Since the probability of losing the connection when you download 2GB+ files is very high ...

Re: wget does not handle sizes over 2GB

2005-01-12 Thread Wincent Colaiuta
On 12/01/2005, at 14:33, Daniel Stenberg wrote: On Wed, 12 Jan 2005, Wincent Colaiuta wrote: Daniel really needs to do one of two things: Thanks for telling me what to do. I was just pointing out your hypocrisy because I found it offensive. When you told Leonid to shut up, did he write back ...

Re: wget does not handle sizes over 2GB

2005-01-12 Thread Daniel Stenberg
On Wed, 12 Jan 2005, Wincent Colaiuta wrote: Daniel really needs to do one of two things: Thanks for telling me what to do. Your listing wasn't 100% accurate though. Am I not allowed to discuss technical solutions for wget if that involves a term from a different Free Software project I am involved ...

Re: wget does not handle sizes over 2GB

2005-01-12 Thread Wincent Colaiuta
On 11/01/2005, at 17:28, Daniel Stenberg wrote: On Tue, 11 Jan 2005, Leonid wrote: curl does not survive losing the connection. Since the probability of losing the connection when you download 2GB+ files is very high even if you have a fast connection, ... This mailing list is for wget, not curl. We can ...

Re: wget does not handle sizes over 2GB

2005-01-11 Thread Mauro Tortonesi
At 21:47 on Tuesday, 11 January 2005, Daniel Stenberg wrote: > On Tue, 11 Jan 2005, Mauro Tortonesi wrote: > > oh, come on. let's not fall into the "my software is better than yours" childish attitude. > I'm sorry if it came out that way, it was not my intention. I just wanted to address ...

Re: wget does not handle sizes over 2GB

2005-01-11 Thread Daniel Stenberg
On Tue, 11 Jan 2005, Mauro Tortonesi wrote: oh, come on. let's not fall into the "my software is better than yours" childish attitude. I'm sorry if it came out that way, it was not my intention. I just wanted to address the misinformation posted here. I have not said and do not think that X is better ...

Re: wget does not handle sizes over 2GB

2005-01-11 Thread Mauro Tortonesi
At 17:28 on Tuesday, 11 January 2005, you wrote: > On Tue, 11 Jan 2005, Leonid wrote: > > curl does not survive losing the connection. Since the probability of losing the connection when you download 2GB+ files is very high even if you have a fast connection, > This mailing list is for wget, not curl. ...

Re: wget does not handle sizes over 2GB

2005-01-11 Thread Leonid
Daniel, I apologize if I hurt your feelings about curl. Last summer I had to download several 10 GB+ files and I tried to use curl and ncftp. After a day or so of work, curl would stop and freeze forever, and I was unable to force it to retry and resume. Maybe I misused curl, or did not understand ...

Re: wget does not handle sizes over 2GB

2005-01-11 Thread Daniel Stenberg
On Tue, 11 Jan 2005, Leonid wrote: curl does not survive losing the connection. Since the probability of losing the connection when you download 2GB+ files is very high even if you have a fast connection, ... This mailing list is for wget, not curl. We can talk about what curl does and does not do on the curl ...

wget does not handle sizes over 2GB

2005-01-11 Thread Leonid
Denis, curl does not survive losing the connection. Since the probability of losing the connection when you download 2GB+ files is very high even if you have a fast connection, and since I had to download 10+ GB datasets (not DVDs: data) routinely, nothing remained but to patch wget. You can get the patch ...

Re: wget does not handle sizes over 2GB

2005-01-09 Thread Daniel Stenberg
On Sun, 9 Jan 2005, Denis Doroshenko wrote: *size = strtol (respline + 4, NULL, 0); where size is defined as "long int *" in the function's declaration. BTW, why is the base given to strtol "0" and not "10"? Isn't that too flexible for a defined protocol? Yes it is; SIZE returns a base-10 number.
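
As a side note on the base argument: with base 0, strtol also honors octal ("0...") and hexadecimal ("0x...") prefixes, which is more permissive than the plain decimal number an FTP SIZE reply carries. A minimal standalone sketch of the difference (illustrative only, not code from the wget sources):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
      /* With base 0, a leading "0" switches strtol to octal and a
         leading "0x" to hexadecimal; base 10 always parses decimal. */
      const char *reply = "0777";

      long flexible = strtol(reply, NULL, 0);   /* parsed as octal: 511 */
      long decimal  = strtol(reply, NULL, 10);  /* parsed as decimal: 777 */

      printf("base 0:  %ld\n", flexible);
      printf("base 10: %ld\n", decimal);
      return 0;
    }

A server that padded the size with a leading zero would therefore be misread with base 0, which is presumably why an explicit base of 10 was suggested for a reply defined to be decimal.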

wget does not handle sizes over 2GB

2005-01-09 Thread Denis Doroshenko
Hello. For something meant to be such a powerful download tool, this seems at least confusing: in file src/ftp-basic.c (function ftp_size, 1.9+cvs-dev, line 1153): *size = strtol (respline + 4, NULL, 0); where size is defined as "long int *" in the function's declaration. BTW, why is the base given to strtol "0" and not "10"? Isn't that too flexible for a defined protocol? ...
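
To see why that line cannot represent sizes over 2GB: on platforms where long is 32 bits, strtol clamps the result to LONG_MAX (2147483647) and sets errno to ERANGE, so any larger SIZE reply is silently lost. A minimal sketch of the failure, and of one possible remedy (parsing into a 64-bit type with strtoll, base 10 only), offered as an illustration of the problem rather than the actual wget patch:

    #include <errno.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
      /* A 3 GB size as it might appear after the "213 " code of a SIZE reply. */
      const char *respline = "213 3221225472";

      /* The problematic pattern: with a 32-bit long this clamps to
         LONG_MAX (2147483647) and sets errno to ERANGE. */
      errno = 0;
      long narrow = strtol(respline + 4, NULL, 0);
      printf("strtol:  %ld%s\n", narrow, errno == ERANGE ? " (overflow)" : "");

      /* One possible remedy: a 64-bit parse, decimal only. */
      errno = 0;
      long long wide = strtoll(respline + 4, NULL, 10);
      printf("strtoll: %lld%s\n", wide, errno == ERANGE ? " (overflow)" : "");
      return 0;
    }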