bug in wget windows
done.  ==> PORT ... done.    ==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done.

    [ <=> ] -673,009,664  113,23K/s

Assertion failed: bytes >= 0, file retr.c, line 292

This application has requested the Runtime to terminate it in an unusual
way.  Please contact the application's support team for more information.
Re: bug in wget windows
Tobias Koeck wrote:

> done.  ==> PORT ... done.    ==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done.
> [ <=> ] -673,009,664  113,23K/s
> Assertion failed: bytes >= 0, file retr.c, line 292
> This application has requested the Runtime to terminate it in an unusual
> way.  Please contact the application's support team for more information.

you are probably using an older version of wget, without large file
support. please upgrade to wget 1.10.2.

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi                          http://www.tortonesi.com
University of Ferrara - Dept. of Eng.    http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linux            http://www.deepspace6.net
Ferrara Linux User Group                 http://www.ferrara.linux.it
brief report on NTLM buffer overflow
i am not going to publish a complete security advisory on this topic, but
i think wget users deserve a little bit more information about the
security vulnerability that was fixed yesterday, october 13th 2005.

yesterday i was notified by iDEFENSE of a remotely exploitable buffer
overflow in the NTLM authentication code. this vulnerability could allow a
malicious website to run arbitrary code on the machine running the wget
client.

the only two versions of wget vulnerable to this flaw are 1.10 and 1.10.1
with NTLM authentication support enabled. wget binaries compiled without
NTLM support are not vulnerable. in addition, NTLM support requires
OpenSSL, so wget binaries built without SSL support are not affected by
the vulnerability either.

the same vulnerability applies to cURL and libcURL, as the NTLM code in
wget was donated by Daniel Stenberg, (lib)cURL's maintainer. Daniel sent
me a fix for the flaw, which was included in wget 1.10.2, released
immediately after i received the vulnerability report and the fix.

although there is no known exploit at the time of this writing, i strongly
recommend anyone using a wget 1.10 or 1.10.1 binary with NTLM
authentication enabled to upgrade to wget 1.10.2 or to recompile their
binary without NTLM support.

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi                          http://www.tortonesi.com
University of Ferrara - Dept. of Eng.    http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linux            http://www.deepspace6.net
Ferrara Linux User Group                 http://www.ferrara.linux.it
Re: brief report on NTLM buffer overflow
On Fri, 14 Oct 2005, Noèl Köthe wrote:

> The last paragraph says something like: Notable is the fast reaction
> time of the Open Source developers: two days ago the problem was
> reported, yesterday corrected packages were produced and details of the
> vulnerability were published.
>
> Just want to give you very positive feedback and say thanks. :)

I mailed Hrvoje the patch roughly 50 minutes after the notification (and
he forwarded it to Mauro). Wget 1.10.2 was released less than 14 hours
after the first mail brought our attention to this problem.

Both Hrvoje and Mauro acted swiftly and promptly. Nice going, guys!

(The plan was originally to coordinate the security fix release with
vendors and between the curl and wget projects, but due to mistakes the
notification accidentally became public immediately and we had to work
really fast to reduce the impact.)

-- 
 -=- Daniel Stenberg -=- http://daniel.haxx.se -=-
  ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol
A bug or suggestion
I saw that the option "-k, --convert-links" makes the links point to the
root directory, not to the directory where you downloaded the pages. For
example: if I download a page whose URL is www.pageexample.com, the pages
I download go in there. But if I use that option, the links in the pages
will point to the root directory. For example: if I download into /home
and there is a link to www.pageexample.com/test/index.htm, the link should
point to /home/www.pageexample.com/test/index.htm, but it points to
/www.pageexample.com/test/index.htm. I haven't tested it on Linux yet, but
this problem occurs on cygwin (where the root directory becomes the
partition where the program is installed, like C:).

Thank you for your attention,
Conrado

The difficult gets done right away... the impossible is just a matter of
time. Practice makes perfect, except in Russian roulette.