Re: size over 2GB

2005-02-14 Thread Ulf Härnhammar
Quoting [EMAIL PROTECTED]:

> When downloading files larger than 2GB, the displayed file size is wrong.

This is a Frequently Asked Question: http://www.gnu.org/software/wget/faq.html#3.1

// Ulf

Re: Bug: really large files cause problems with status text

2005-02-02 Thread Ulf Härnhammar
Quoting Alan Robinson <[EMAIL PROTECTED]>:

> Downloading a 4.2 gig file (such as from
> ftp://movies06.archive.org/2/movies/abe_lincoln_of_the_4th_ave/abe_lincoln_of_the_4th_ave.mpeg )
> causes the status text (i.e.
> 100%[+===>] 38,641,328 213.92K/s ETA [...]
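The truncated report is consistent with a 32-bit byte counter wrapping: if the file's true size were 4,333,608,624 bytes (about 4.2 GB; an assumed value, chosen here only because it matches the quoted figure), truncating it to 32 bits gives exactly the 38,641,328 shown in the status line. A minimal sketch:

/* Hedged sketch: a >4 GiB byte count truncated to 32 bits produces
 * the small number quoted in the report. The "real" size below is an
 * illustrative assumption, not the actual size of the .mpeg file. */
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    uint64_t real_size = 4333608624ULL;        /* ~4.2 GB, hypothetical */
    uint32_t shown     = (uint32_t) real_size; /* what a 32-bit counter keeps */

    printf("real size: %" PRIu64 "\n", real_size);
    printf("displayed: %" PRIu32 "\n", shown); /* prints 38641328 */
    return 0;
}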

Re: Large file problem

2005-02-02 Thread Ulf Härnhammar
Quoting Rainer Zocholl <[EMAIL PROTECTED]>:

> But today I have to report the first problem:
> the attempt to download a 3.3GB file (SuSE Linux image)
> failed in several ways:
> wget seems to get an integer wraparound,
> see "-931,424,256" as size...

It is a Frequently Asked Question, with the answer that people are working on it.

// Ulf
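The negative number fits a signed 32-bit wraparound: a size of 3,363,543,040 bytes (about 3.3 GB; this value is inferred from the quoted figure, not stated in the report) converted to a signed 32-bit integer comes out as -931,424,256. A companion sketch to the one above, this time with a signed type:

/* Hedged sketch: a ~3.3 GB size forced into a signed 32-bit integer
 * wraps to the negative value quoted in the report. Converting an
 * out-of-range value to a signed type is implementation-defined in C,
 * but wraps this way on common two's-complement platforms. */
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    uint64_t real_size = 3363543040ULL;       /* inferred, ~3.3 GB */
    int32_t  wrapped   = (int32_t) real_size; /* 32-bit signed size field */

    printf("real size: %" PRIu64 "\n", real_size);
    printf("wget saw:  %" PRId32 "\n", wrapped); /* prints -931424256 */
    return 0;
}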

Re: downloading files > 2GB

2005-01-31 Thread Ulf Härnhammar
Quoting Cezary Sliwa <[EMAIL PROTECTED]>:

> What about downloading files over 2GB on 32-bit platforms?

It is a Frequently Asked Question, with the answer that people are working on it.

// Ulf
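The usual fix on 32-bit platforms, and presumably the direction of the work referred to here (an assumption on my part), is Large File Support: defining _FILE_OFFSET_BITS=64 before any system header, so that off_t and the stdio offset functions become 64-bit. A minimal check:

/* Hedged sketch: with _FILE_OFFSET_BITS=64 defined before the first
 * include, off_t is 8 bytes even on a 32-bit platform, and
 * fseeko()/ftello() can address files beyond 2GB. */
#define _FILE_OFFSET_BITS 64
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    printf("sizeof(off_t) = %zu bytes\n", sizeof(off_t)); /* 8 with LFS */
    return 0;
}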

Re: Possible bug when downloading gzipped content

2005-01-01 Thread Ulf Härnhammar
Quoting Christoph Anton Mitterer <[EMAIL PROTECTED]>:

> It seems that the joecartoon.com server sends the gzip file
> intentionally with an appended 0xA (perhaps it is even an error).

Can you check if the additional 0xA byte is included in the Content-Length or not? Does it increase the C-L by one?
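One way to answer that question is to save the response body (for example with wget, noting the Content-Length that --server-response prints) and then compare the body's size against the header value. A hedged sketch; both command-line arguments are hypothetical inputs:

/* Hedged sketch: given the saved body and the Content-Length the
 * server sent, report whether a trailing 0x0A is counted in it. */
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s BODY_FILE CONTENT_LENGTH\n", argv[0]);
        return 1;
    }

    FILE *fp = fopen(argv[1], "rb");
    if (fp == NULL) { perror("fopen"); return 1; }

    fseek(fp, -1, SEEK_END);      /* fine for a small .gz test file */
    int  last = fgetc(fp);        /* final byte of the body */
    long size = ftell(fp);        /* body size in bytes */
    fclose(fp);

    long expected = atol(argv[2]);
    printf("body: %ld bytes, Content-Length: %ld, last byte: 0x%02X\n",
           size, expected, last);

    if (last == 0x0A)
        printf("trailing 0x0A is %s in the Content-Length\n",
               size == expected ? "included" : "NOT included");
    return 0;
}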

Re: wget: Arbitrary file overwriting/appending/creating and other vulnerabilities

2004-12-10 Thread Ulf Härnhammar
Quoting Jan Minar <[EMAIL PROTECTED]>:

> (2) Use alternative retrieval programs, such as pavuk, axel, or
> ncftpget.

FWIW, pavuk is much worse security-wise than wget. I've been working on patching pavuk for a few months, and it has lots of strcpy() and sprintf() calls that lead to buffer overflows.
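To illustrate the bug class being described (the function and buffer names here are made up, not taken from pavuk's source): an unbounded strcpy() or sprintf() writes past a fixed-size buffer whenever attacker-controlled input is long enough, and a patch replaces them with length-bounded calls.

/* Hedged illustration of the pattern: unbounded string calls into a
 * fixed buffer, and the bounded equivalent a patch substitutes.
 * All names are hypothetical. */
#include <stdio.h>
#include <string.h>

static void build_request_unsafe(char *out, const char *url)
{
    /* out is assumed to be 64 bytes: overflows when url is long */
    strcpy(out, "GET ");
    strcat(out, url);
}

static void build_request_safe(char *out, size_t outlen, const char *url)
{
    /* snprintf() never writes more than outlen bytes, NUL included */
    snprintf(out, outlen, "GET %s", url);
}

int main(void)
{
    char buf[64];
    build_request_safe(buf, sizeof buf, "http://example.org/");
    puts(buf);
    (void) build_request_unsafe; /* shown for contrast, never called */
    return 0;
}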

[PATCH] Crashes when parsing malformed directory listings from FTP server

2004-11-05 Thread Ulf Härnhammar
Hello,

I have found that it's possible for a malicious FTP server to crash GNU Wget by sending malformed directory listings. Wget will parse them without checking if they are written in the proper format. It will do a fixed number of strtok() calls and then atoi() calls, and with the wrong format, strtok() can return NULL, which the later calls then operate on, crashing the program.
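A minimal sketch of that failure mode (this imitates the bug class, not wget's actual ftp-ls.c parser): strtok() returns NULL once the line runs out of fields, and an unchecked atoi() on that result dereferences a NULL pointer.

/* Hedged sketch of the crash: a fixed sequence of strtok() calls on a
 * malformed listing line yields NULL, and atoi(NULL) then crashes.
 * The listing line and field layout below are hypothetical. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char line[] = "drwxr-xr-x"; /* malformed: the other fields are missing */

    char *perms = strtok(line, " ");
    char *links = strtok(NULL, " ");   /* NULL: no second field */

    /* the unchecked version would now do atoi(links) and crash */
    if (perms == NULL || links == NULL) {
        fprintf(stderr, "malformed listing line, skipping\n");
        return 1;
    }

    printf("%s has %d links\n", perms, atoi(links));
    return 0;
}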