[Fwd: wget 1.9.1 feature suggestions]

2005-02-19 Thread Mat Garrett
-- Mat Garrett [EMAIL PROTECTED] ---BeginMessage--- I have some suggestions for small modifications to enhance future versions of wget. I understand the need to first retrieve a .listing when using --timestamping with an ftp:// URL, but shouldn't that step be skipped when downloading the

Re: Wget - Interpret Debug Output

2005-02-19 Thread Jonathan Share
Sorry for the dual post, Steven; just realised I hadn't sent it to the list. On Sat, 19 Feb 2005 11:26:16 +, Jonathan Share [EMAIL PROTECTED] wrote: On Fri, 18 Feb 2005 22:43:50 -0600 (CST), Steven M. Schweda [EMAIL PROTECTED] wrote: In case it might be useful, I've included the -d

Re: Large file support

2005-02-19 Thread Hrvoje Niksic
[EMAIL PROTECTED] (Steven M. Schweda) writes: 1. I'd say that code like if ( sizeof(number) == 8 ) should have been a compile-time #ifdef rather than a run-time decision. Where do you see such code? grep 'if.*sizeof' *.c doesn't seem to show such examples. 2. Multiple functions like

GNU Wget 1.9.1 Issues with large files (> 2 GB)

2005-02-19 Thread Erik Ohrnberger
Dear GNU, I was trying to download SuSE version 9.2 from the local mirror site, thinking that I could get the entire package as a single DVD image (> 2 GB). So I did the wget command with the appropriate FTP arguments, and ran it in the background. The first clue that this was going to have

Gmane

2005-02-19 Thread Hrvoje Niksic
I propose to make this list available via gmane, www.gmane.com. It buys us good archiving, as well as NNTP access. Is there anyone who would object to that?

Re: wget support for large files

2005-02-19 Thread Chris Ross
On Feb 18, 2005, at 9:16 PM, Mauro Tortonesi wrote: On Saturday 12 February 2005 10:29 am, Chris Ross wrote: The wget web page at www.gnu.org has a link for the mailing list that doesn't work. So I'm emailing here. which link? could you please tell me so that i can fix it? Under Request an

Re: Large file support

2005-02-19 Thread Hrvoje Niksic
Roman Bednarek [EMAIL PROTECTED] writes: The Info-ZIP code uses one function with a ring of string buffers to ease the load on the programmer. That makes sense. I assume the print function also receives an integer argument specifying the ring position? The function can have a circular

Re: GNU Wget 1.9.1 Issues with large files (> 2 GB)

2005-02-19 Thread Hrvoje Niksic
Erik Ohrnberger [EMAIL PROTECTED] writes: I was trying to download SuSE version 9.2 from the local mirror site, thinking that I could get the entire package as a single DVD image (> 2 GB). So I did the wget command with the appropriate FTP arguments, and ran it in the background.

Re: Interpret Debug Output

2005-02-19 Thread Jonathan Share
On Sat, 19 Feb 2005 15:54:42 -0500, Post, Mark K [EMAIL PROTECTED] wrote: That does seem a bit odd. I did a wget www.exeter-airport.co.uk command using 1.9.1, and got this result: wget www.exeter-airport.co.uk --15:52:05-- http://www.exeter-airport.co.uk/ => `index.html'

Re: Interpret Debug Output

2005-02-19 Thread Jonathan Share
On Sat, 19 Feb 2005 18:06:37 -0500, Post, Mark K [EMAIL PROTECTED] wrote: I meant your users' problem seemed a bit odd, not the fact that my attempt worked. Sorry, it's gettin late over here, I misread your message. It really is time I went to bed. Thanks again anyway. Jon Mark Post