WGET SSL/TLS bug not fixed?

2005-11-15 Thread Schatzman, James (Mission Systems)
According to the wget release notes for 1.10: "*** Talking to SSL/TLS servers over proxies now actually works. Previous versions of Wget erroneously sent GET requests for https URLs. Wget 1.10 utilizes the CONNECT method designed for this purpose." However, I have tried versions 1.10, 1.10.1, and

Re: FW: WGET SSL/TLS bug not fixed?

2005-11-15 Thread Hrvoje Niksic
. Any suggestions? The bug referred to in the release notes manifested itself differently: Wget would connect to the proxy server, and request the https URL using GET. The proxies (correctly) refused to obey this order, as it would pretty much defeat the purpose of using SSL.
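For illustration, the two behaviours differ roughly as follows (request lines only; the host name is a made-up placeholder, not one from the thread):

    # Wget before 1.10, erroneous: asks the proxy to fetch the https URL itself
    GET https://secure.example.com/ HTTP/1.0

    # Wget 1.10, correct: asks the proxy for an opaque tunnel, then speaks TLS through it
    CONNECT secure.example.com:443 HTTP/1.0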

RE: FW: WGET SSL/TLS bug not fixed?

2005-11-15 Thread Schatzman, James (Mission Systems)
This is indeed the solution. I have double checked the wget documentation. There is no mention of the https_proxy parameter. The manual and sample wgetrc that are provided list http_proxy and ftp_proxy - that is all. Apparently, the bug is with the documentation, not the application itself
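For reference, the setting the thread converged on can be given either in wgetrc or in the environment (the proxy address below is a placeholder):

    # in ~/.wgetrc
    https_proxy = http://proxy.example.com:8080/

    # or, in a Bourne-style shell
    export https_proxy=http://proxy.example.com:8080/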

Re: FW: WGET SSL/TLS bug not fixed?

2005-11-15 Thread Hrvoje Niksic
Schatzman, James (Mission Systems) [EMAIL PROTECTED] writes: I have double checked the wget documentation. There is no mention of the https_proxy parameter. The manual and sample wgetrc that are provided list http_proxy and ftp_proxy - that is all. Apparently, the bug

Re: bug retrieving embedded images with --page-requisites

2005-11-09 Thread Jean-Marc MOLINA
Tony Lewis wrote: The --convert-links option changes the website path to a local file system path. That is, it changes the directory, not the file name. Thanks, I didn't understand it that way. IMO, your suggestion has merit, but it would require wget to maintain a list of MIME types and

bug in wget windows

2005-10-14 Thread Tobias Koeck
done. ==> PORT ... done. ==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done. [ <=> ] -673,009,664 113,23K/s Assertion failed: bytes >= 0, file retr.c, line 292 This application has requested the Runtime to terminate it in an unusual way. Please contact the

Re: bug in wget windows

2005-10-14 Thread Mauro Tortonesi
Tobias Koeck wrote: done. ==> PORT ... done. ==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done. [ <=> ] -673,009,664 113,23K/s Assertion failed: bytes >= 0, file retr.c, line 292 This application has requested the Runtime to terminate it in an unusual way.

A bug or suggestion

2005-10-14 Thread Conrado Miranda
I saw that the option "-k, --convert-links" makes the links point to the root directory, not to the directory where you downloaded the pages. For example: if I download a page whose URL is www.pageexample.com, the pages I download go in there. But if I use that option, in the pages the links will link to the

a bug about wget

2005-10-04 Thread baidu baidu
That is, there is HTML like this: <p>Click the following to go to the <a href="http://www.something.com/junk.asp?thepageIwant=2">next page</a>.</p> What I need is for wget to understand that stuff following a ? in a URL indicates that it's a distinctly different page, and it should go recursively

wget bug

2005-10-03 Thread Michael C. Haller
Begin forwarded message: From: [EMAIL PROTECTED] Date: October 4, 2005 4:36:09 AM GMT+02:00 To: [EMAIL PROTECTED] Subject: failure notice Hi. This is the qmail-send program at sunsite.dk. I'm afraid I wasn't able to deliver your message to the following addresses. This is a permanent

Re: Bug rpt

2005-09-20 Thread Hrvoje Niksic
HonzaCh [EMAIL PROTECTED] writes: My localeconv()->thousands_sep (as well as many other struct members) reveals to be the empty string ("") (MSVC 6.0). How do you know? I mean, what program did you use to check this? My quick'n'dirty one. See the source below. Your source neglects to

Re: Bug rpt

2005-09-19 Thread Hrvoje Niksic
HonzaCh [EMAIL PROTECTED] writes: Latest version (1.10.1) turns up a UI bug: the thousands separator (a space according to my locale settings) displays as á (character code 0xA0, see attachment). Although it does not affect the primary function of WGET, it looks quite ugly. Env.: Win2k Pro/Czech

Re: openssl server renogiation bug in wget

2005-08-26 Thread Hrvoje Niksic
Thanks for the report; I've applied this patch: 2005-08-26 Jeremy Shapiro [EMAIL PROTECTED] * openssl.c (ssl_init): Set SSL_MODE_AUTO_RETRY. Index: openssl.c === --- openssl.c (revision 2063) +++ openssl.c (working

openssl server renogiation bug in wget

2005-08-18 Thread Jeremy Shapiro
I believe I've encountered a bug in wget. When using https, if the server does a renegotiation handshake wget fails trying to peek for the application data. This occurs because wget does not set the openssl context mode SSL_MODE_AUTO_RETRY. When I added the line: SSL_CTX_set_mode (ssl_ctx
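A minimal sketch of the fix described above, using the stock OpenSSL client-setup calls (the ssl_ctx name is taken from the quoted fragment; the surrounding setup is assumed, not wget's actual source):

    #include <openssl/ssl.h>

    SSL_CTX *ssl_ctx = SSL_CTX_new(SSLv23_client_method());
    /* Let OpenSSL transparently complete a server-initiated
       renegotiation handshake and retry the read, instead of
       failing when the caller peeks for application data. */
    SSL_CTX_set_mode(ssl_ctx, SSL_MODE_AUTO_RETRY);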

Malformed Command Line or Bug?

2005-08-03 Thread Jens
Hi wget list! Is it intended that "wget -Pd:\goog http://www.google.com/" works, whereas "wget -Pd:\goog\ http://www.google.com/" gives the error message "wget: missing URL"? Running wget 1.10 on Windows XP. Cheers Jens

[Fwd: Bug#319088: wget: don't rely on exactly one blank char between size and month]

2005-07-20 Thread Noèl Köthe
Hello, giuseppe wrote a patch for 1.10.1.beta1. Full report can be viewed here: http://bugs.debian.org/319088 -------- Forwarded message -------- From: giuseppe bonacci [EMAIL PROTECTED] Reply-To: giuseppe bonacci [EMAIL PROTECTED], [EMAIL PROTECTED] To: Debian Bug Tracking System

Bug? gettin file 2 GB fails

2005-07-07 Thread Jogchum Reitsma
Hello, I'm not sure it's a bug, but the behaviour described below seems strange to me, so I thought it was wise to report it: I'm trying to get a Suse 9.3 ISO from sunsite.informatik.rwth-aachen.de, a file that is 4383158 KB according to the FTP listing. wget gets about 2.4 GB, then quits

Re: Mingw bug ?

2005-07-02 Thread A . Carkaci
Hrvoje Niksic hniksic at xemacs.org writes: A. Carkaci carkaci at spk.gov.tr writes: ---request begin--- GET /images/spk.ico HTTP/1.0 Referer: http://www.spk.gov.tr/ User-Agent: Wget/1.10 Accept: */* Host: www.spk.gov.tr Connection: Keep-Alive ---request end--- HTTP request

RE: Mingw bug ?

2005-07-02 Thread Abdurrahman ÇARKACIOĞLU
, assuming HTTP/0.9 Length: unspecified -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Saturday, July 02, 2005 1:04 AM To: Abdurrahman ÇARKACIOĞLU Cc: wget@sunsite.dk Subject: Re: Mingw bug ? A. Carkaci [EMAIL PROTECTED] writes: ---request begin--- GET /images

Re: Mingw bug ?

2005-07-02 Thread Hrvoje Niksic
Abdurrahman ÇARKACIOĞLU [EMAIL PROTECTED] writes: Here are the results.. ---request begin--- GET /images/spk.ico HTTP/1.0 Referer: http://www.spk.gov.tr/ User-Agent: Wget/1.10 Accept: */* Host: www.spk.gov.tr Connection: Keep-Alive ---request end--- HTTP request sent, awaiting

Re: Mingw bug ?

2005-07-02 Thread Hrvoje Niksic
I believe this patch should fix the problem. Could you apply it and let me know if it fixes things for you? 2005-07-02 Hrvoje Niksic [EMAIL PROTECTED] * http.c (gethttp): Except for head_only, use skip_short_body to skip the non-20x error message before leaving gethttp.

YNT: Mingw bug ?

2005-07-02 Thread Abdurrahman ÇARKACIOĞLU
Now, it works. Thanks a lot. But I want to understand what is going on? Was it a bug? Will you consider the patch for a future release of Wget? -----Original Message----- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]] Sent: Sat 02.07.2005 14:06 To

Re: YNT: Mingw bug ?

2005-07-02 Thread Hrvoje Niksic
Abdurrahman ÇARKACIOĞLU [EMAIL PROTECTED] writes: Now, it works. Thanks a lot. But I want to understand what is going on? Was it a bug? It was a combination of two Wget bugs, one in the actual code and the other in the MinGW configuration. Wget 1.9.1 and earlier used to close connections to the server

YNT: YNT: Mingw bug ?

2005-07-02 Thread Abdurrahman ÇARKACIOĞLU
-----Original Message----- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]] Sent: Sat 02.07.2005 16:00 To: Abdurrahman ÇARKACIOĞLU Cc: wget@sunsite.dk Subject: Re: YNT: Mingw bug ? Will you consider the patch for a future release of Wget? It's already

Re: YNT: YNT: Mingw bug ?

2005-07-02 Thread Hrvoje Niksic
Abdurrahman ÇARKACIOĞLU [EMAIL PROTECTED] writes: It's already in the repository. I think you forgot to put the -DHAVE_SELECT statement into makefile.src.mingw at http://svn.dotsrc.org/repo/wget/branches/1.10/windows/. Am I right? That was published in a separate patch -- specifically,

Mingw bug ?

2005-07-01 Thread Abdurrahman ÇARKACIOĞLU
I successfully compiled Wget 1.10 using mingw. Although Heiko Herold's wget 1.10 (the original wget.exe, I mean) (from http://space.tin.it/computer/hherold/) successfully downloads the following site, my compiled wget (produced by mingw32-make) hangs immediately, forever. Any idea? wget www.spk.gov.tr

Re: Mingw bug ?

2005-07-01 Thread Hrvoje Niksic
Abdurrahman ÇARKACIOĞLU [EMAIL PROTECTED] writes: I successfully compiled Wget 1.10 using mingw. Although Heiko Herold's wget 1.10 (the original wget.exe, I mean) (from http://space.tin.it/computer/hherold/) successfully downloads the following site, my compiled wget (produced by mingw32-make) hangs

Re: Mingw bug ?

2005-07-01 Thread A . Carkaci
Abdurrahman ÇARKACIOĞLU abdurrahman.carkacioglu at spk.gov.tr writes: I successfully compiled Wget 1.10 using mingw. Although Heiko Herold's wget 1.10 (the original wget.exe, I mean) (from http://space.tin.it/computer/hherold/) successfully downloads the following site, my compiled wget (produced

Re: Mingw bug ?

2005-07-01 Thread Hrvoje Niksic
A. Carkaci [EMAIL PROTECTED] writes: ---request begin--- GET /images/spk.ico HTTP/1.0 Referer: http://www.spk.gov.tr/ User-Agent: Wget/1.10 Accept: */* Host: www.spk.gov.tr Connection: Keep-Alive ---request end--- HTTP request sent, awaiting response... ---response begin---

RE: ftp bug in 1.10

2005-06-27 Thread Herold Heiko
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] the 64-bit download sum, doesn't work for you. What does this program print? #include <stdio.h> int main (void) { __int64 n = 10000000000I64; // ten billion, doesn't fit in 32 bits printf("%I64\n", n); return 0; } It should print a

Re: ftp bug in 1.10

2005-06-25 Thread Hrvoje Niksic
Herold Heiko [EMAIL PROTECTED] writes: Downloaded: bytes in 2 files Note missing number of bytes. This would indicate that the %I64 format, which Wget uses to print the 64-bit download sum, doesn't work for you. What does this program print? #include <stdio.h> int main (void) { __int64 n =

Re: ftp bug in 1.10

2005-06-25 Thread Hrvoje Niksic
Hrvoje Niksic [EMAIL PROTECTED] writes: This would indicate that the %I64 format, which Wget uses to print the 64-bit download sum, doesn't work for you. For what it's worth, MSDN documents it: http://tinyurl.com/ysrh/. Could you be compiling Wget with an older C runtime that doesn't support

Re: ftp bug in 1.10

2005-06-25 Thread Gisle Vanem
Hrvoje Niksic [EMAIL PROTECTED] wrote: It should print a line containing 10000000000. If it does, it means we're applying the wrong format. If it doesn't, then we must find another way of printing LARGE_INT quantities on Windows. I don't know what compiler OP used, but Wget only uses %I64

Re: ftp bug in 1.10

2005-06-25 Thread David Fritz
I64 is a size prefix akin to ll. One still needs to specify the argument type as in %I64d as with %lld.
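Putting the thread together, the corrected test program would read roughly as follows (a sketch; the I64 literal suffix and format prefix are MSVC-specific, as discussed above):

    #include <stdio.h>

    int main (void)
    {
      __int64 n = 10000000000I64;  /* ten billion, doesn't fit in 32 bits */
      printf ("%I64d\n", n);       /* I64 is the size prefix, d the conversion */
      return 0;
    }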

Re: ftp bug in 1.10

2005-06-25 Thread Hrvoje Niksic
David Fritz [EMAIL PROTECTED] writes: I64 is a size prefix akin to ll. One still needs to specify the argument type as in %I64d as with %lld. That makes sense, thanks for the explanation!

Bug handling session cookies

2005-06-24 Thread Mark Street
Hello folks, I'm running wget v1.10 compiled from source (tested on HP-UX and Linux). I am having problems handling session cookies. The idea is to request a web page which returns an ID number in a session cookie. All subsequent requests from the site must contain this session cookie. I'm

Re: Bug handling session cookies

2005-06-24 Thread Hrvoje Niksic
to cookie code; * Removing the special logic from path_match. With that change your test case seems to work, and so do all the other tests I could think of. Please let me know if it works for you, and thanks for the detailed bug report. 2005-06-24 Hrvoje Niksic [EMAIL PROTECTED

Re: Bug handling session cookies

2005-06-24 Thread Mark Street
Hrvoje, Many thanks for the explanation and the patch. Yes, this patch successfully resolves the problem for my particular test case. Best regards, Mark Street.

Re: Bug handling session cookies

2005-06-24 Thread Hrvoje Niksic
Mark Street [EMAIL PROTECTED] writes: Many thanks for the explanation and the patch. Yes, this patch successfully resolves the problem for my particular test case. Thanks for testing it. It has been applied to the code and will be in Wget 1.10.1 and later.

BUG? using -O effectively disables -N

2005-06-21 Thread Dennis Kaarsemaker
-to-date wget will not re-download the page. Because this behaviour is unexpected and undocumented, I consider it a bug. -- Sincerely, Dennis Kaarsemaker signature.asc Description: This is a digitally signed message part

Re: Bug: wget cannot handle quote

2005-06-21 Thread Hrvoje Niksic
Will Kuhn [EMAIL PROTECTED] writes: Apparently wget does not handle single quote or double quote very well. wget with the following arguments gives an error. wget --user-agent='Mozilla/5.0' --cookies=off --header 'Cookie: testbounce=testing;

Re: Small bug in Wget manual page

2005-06-18 Thread Mauro Tortonesi
On Wednesday 15 June 2005 04:57 pm, Ulf Harnhammar wrote: On Wed, Jun 15, 2005 at 03:53:40PM -0500, Mauro Tortonesi wrote: the web pages (including the documentation) on gnu.org have just been updated. Nice! I have found some broken links and strange grammar, though: * index.html: There

Re: Small bug in Wget manual page

2005-06-18 Thread Mauro Tortonesi
On Wednesday 15 June 2005 05:14 pm, Ulf Harnhammar wrote: On Wed, Jun 15, 2005 at 11:57:42PM +0200, Ulf Harnhammar wrote: * faq.html ** 3.1 [..] Yes, starting from version 1.10, GNU Wget support files larger than 2GB. (should be supports) ** 2.0 How I compile GNU Wget? (should be How

ftp bug in 1.10

2005-06-15 Thread Herold Heiko
I have a reproducible report (thanks Igor Andreev) about a little verbose-log problem with ftp with my Windows binary; is this reproducible on other platforms, too? wget -v ftp://garbo.uwasa.fi/pc/batchutil/buf01.zip ftp://garbo.uwasa.fi/pc/batchutil/rbatch15.zip (seems to happen with any

Re: ftp bug in 1.10

2005-06-15 Thread Jochen Roderburg
Herold Heiko wrote: I have a reproducible report (thanks Igor Andreev) about a little verbose-log problem with ftp with my Windows binary; is this reproducible on other platforms, too? wget -v ftp://garbo.uwasa.fi/pc/batchutil/buf01.zip ftp://garbo.uwasa.fi/pc/batchutil/rbatch15.zip

Re: Small bug in Wget manual page

2005-06-15 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes: this seems to be already fixed in the 1.10 documentation. Now that 1.10 is released, we should probably update the on-site documentation.

Re: Small bug in Wget manual page

2005-06-15 Thread Mauro Tortonesi
On Wednesday 15 June 2005 02:05 pm, Hrvoje Niksic wrote: Mauro Tortonesi [EMAIL PROTECTED] writes: this seems to be already fixed in the 1.10 documentation. Now that 1.10 is released, we should probably update the on-site documentation. i am doing it right now. -- Aequam memento rebus in

Re: Small bug in Wget manual page

2005-06-15 Thread Mauro Tortonesi
On Wednesday 15 June 2005 02:16 pm, Mauro Tortonesi wrote: On Wednesday 15 June 2005 02:05 pm, Hrvoje Niksic wrote: Mauro Tortonesi [EMAIL PROTECTED] writes: this seems to be already fixed in the 1.10 documentation. Now that 1.10 is released, we should probably update the on-site

Re: Small bug in Wget manual page

2005-06-15 Thread Ulf Harnhammar
On Wed, Jun 15, 2005 at 03:53:40PM -0500, Mauro Tortonesi wrote: the web pages (including the documentation) on gnu.org have just been updated. Nice! I have found some broken links and strange grammar, though: * index.html: There are archives of the main GNU Wget list at ** fly.cc.fer.hr **

Re: Small bug in Wget manual page

2005-06-15 Thread Ulf Harnhammar
On Wed, Jun 15, 2005 at 11:57:42PM +0200, Ulf Harnhammar wrote: * faq.html ** 3.1 [..] Yes, starting from version 1.10, GNU Wget support files larger than 2GB. (should be supports) ** 2.0 How I compile GNU Wget? (should be How do I) // Ulf

wget bug report

2005-06-13 Thread A.Jones
Sorry for the crosspost, but the wget Web site is a little confusing on the point of where to send bug reports/patches. Just installed wget 1.10 on Friday. Over the weekend, my scripts failed with the following error (once for each wget run): Assertion failed: wget_cookie_jar != NULL, file

Re: Small bug in Wget manual page

2005-06-07 Thread Mauro Tortonesi
On Thursday 02 June 2005 09:33 am, Herb Schilling wrote: Hi, On http://www.gnu.org/software/wget/manual/wget.html, the section on protocol-directories has a paragraph that is a duplicate of the section on no-host-directories. Other than that, the manual is terrific! Wget is wonderful also.

Small bug in Wget manual page

2005-06-02 Thread Herb Schilling
Hi, On http://www.gnu.org/software/wget/manual/wget.html, the section on protocol-directories has a paragraph that is a duplicate of the section on no-host-directories. Other than that, the manual is terrific! Wget is wonderful also. I don't know what I

Re: Serious retrieval bug in wget 1.9.1 and newer

2005-05-30 Thread Werner LEMBERG
Wget doesn't recognize the image tag, Aah, thanks. Should Wget support it to be compatible? IMHO yes. Thanks for your help. Werner

Serious retrieval bug in wget 1.9.1 and newer

2005-05-29 Thread Werner LEMBERG
simply doesn't download -- no error message, no warning. My Mozilla browser displays the page just fine. Since wget downloads the first thumbnail picture `../image/ft2-nautilus-thumb.png' without problems I suspect a serious bug in wget. I'm running wget on a GNU/Linux box. BTW

Re: Serious retrieval bug in wget 1.9.1 and newer

2005-05-29 Thread Hrvoje Niksic
. Since wget downloads the first thumbnail picture `../image/ft2-nautilus-thumb.png' without problems I suspect a serious bug in wget. ft2-nautilus-thumb.png is referenced using the regular img tag. BTW, it is not possible for CVS wget to have builddir != srcdir (after creating the configure

RE: bug with password containing @

2005-05-26 Thread Andrew Gargan
Hi, wget ftp://someuser:[EMAIL PROTECTED]@www.somedomain.com/some_file.tgz is splitting on the first @, not the second. Is this a problem with the URL standard or a wget issue? Regards Andrew Gargan

Re: bug with password containing @

2005-05-26 Thread Hrvoje Niksic
Andrew Gargan [EMAIL PROTECTED] writes: wget ftp://someuser:[EMAIL PROTECTED]@www.somedomain.com/some_file.tgz is splitting using on the first @ not the second. Encode the '@' as %40 and this will work. For example: wget ftp://someuser:[EMAIL PROTECTED]/some_file.tgz Is this a problem
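Concretely, with a hypothetical password pa@ss (a placeholder, not the poster's), percent-encoding the reserved character inside the userinfo part gives:

    # broken: the URL parser splits on the first @, ending the userinfo too early
    wget ftp://someuser:pa@ss@www.somedomain.com/some_file.tgz

    # working: the @ inside the password encoded as %40
    wget ftp://someuser:pa%40ss@www.somedomain.com/some_file.tgz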

wget ftp mirror sym links bug ?

2005-05-24 Thread Anton Kaifel
on a Solaris 8 box. Is this a bug, or is my command invoking wget just wrong or something missing? I couldn't find any other options within the help. Thank you very much in advance. Anton

bug in static build of wget with socks

2005-05-16 Thread Seemant Kulleen
Hi, I wanted to alert you all to a bug in wget, reported by one of our (gentoo) users at: https://bugs.gentoo.org/show_bug.cgi?id=69827 I am the maintainer for the Gentoo ebuild for wget. If someone would be willing to look at and help us with that bug, it'd be much appreciated. Thanks

Re: bug in static build of wget with socks

2005-05-16 Thread Hrvoje Niksic
Seemant Kulleen [EMAIL PROTECTED] writes: I wanted to alert you all to a bug in wget, reported by one of our (gentoo) users at: https://bugs.gentoo.org/show_bug.cgi?id=69827 I am the maintainer for the Gentoo ebuild for wget. If someone would be willing to look at and help us

Re: bug in static build of wget with socks

2005-05-16 Thread Hrvoje Niksic
if that really worked. I don't even know if this is a bug in Wget or in the way that the build is attempted by the Gentoo package mechanism. Providing the actual build output might shed some light on this. if use static; then emake LDFLAGS=--static || die I now tried `LDFLAGS=--static

wget -k -K -E -m bug

2005-05-13 Thread Nicolas Mizel
The following command wget --convert-links --backup-converted --html-extension --mirror http://localhost/index.php downloads index.php.html and backs it up as index.php.html.orig before converting the links. When re-mirroring, wget looks for index.php.orig which doesn't exist and thus re-download

wget html-extension bug

2005-05-13 Thread Nicolas Mizel
The following command wget --convert-links --backup-converted --html-extension --mirror http://localhost/index.php downloads index.php.html and backs it up as index.php.html.orig before converting the links. When re-mirroring, wget looks for local index.php.orig which doesn't exist and thus

wget -k -K -E -m bug

2005-05-13 Thread Nicolas Mizel
The following command wget --convert-links --backup-converted --html-extension --mirror http://localhost/index.php downloads index.php.html and backs it up as index.php.html.orig before converting the links. When re-mirroring, wget looks for local index.php.orig which doesn't exist and thus

Is this a bug in wget ? I need an urgent help!

2005-05-06 Thread Will Kuhn
I try to do something like wget http://website.com/ ... login=username&domain=hotmail%2ecom&_lang=EN But when wget sends the URL out, the hotmail%2ecom becomes hotmail.com !!! Is this the supposed behaviour? I saw this on the sniffer. I suppose the translation of %2e to . is done by wget. Because

Re: Is this a bug in wget ? I need an urgent help!

2005-05-06 Thread Hrvoje Niksic
Will Kuhn [EMAIL PROTECTED] writes: I try to do something like wget http://website.com/ ... login=username&domain=hotmail%2ecom&_lang=EN But when wget sends the URL out, the hotmail%2ecom becomes hotmail.com !!! Is this the supposed behaviour? Yes. I saw this on the sniffer. I suppose

Re: Is this a bug in wget ? I need an urgent help!

2005-05-06 Thread Hrvoje Niksic
Hrvoje Niksic [EMAIL PROTECTED] writes: Can I have it not do the translation ??! Unfortunately, only by changing the source code as described in the previous mail. BTW I've just changed the CVS code to not decode the % sequences. Wget 1.10 will contain the fix.

Re: Bug when downloading large files (over 2 gigs) from proftpd server.

2005-04-27 Thread Hrvoje Niksic
This problem has been fixed for the upcoming 1.10 release. If you want to try it, it's available at ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha2.tar.bz2

Re: Wget Bug

2005-04-26 Thread Hrvoje Niksic
Arndt Humpert [EMAIL PROTECTED] writes: wget, win32 rel. crashes with huge files. Thanks for the report. This problem has been fixed in the latest version, available at http://xoomer.virgilio.it/hherold/ .

Wget Bug

2005-04-26 Thread Arndt Humpert
Hello, wget, win32 rel. crashes with huge files. regards [EMAIL PROTECTED] == Command Line wget -m

Bug when downloading large files (over 2 gigs) from proftpd server.

2005-04-26 Thread Bijan Soleymani
Hi, When using wget (version 1.9.1 running on Debian Sarge) to download files over 2 gigs from an ftp server (proftpd), wget reports a negative length and keeps downloading, but once the file is successfully downloaded it crashes (and therefore doesn't download the rest of the files). Here is

WGET Bug?

2005-04-04 Thread Nijs, J. de
# C:\Grabtest\wget.exe -r --tries=3 http://www.xs4all.nl/~npo/ -o C:/Grabtest/Results/log # --16:23:02-- http://www.xs4all.nl/%7Enpo

Re: de Hacene : raport un bug

2005-03-26 Thread Hrvoje Niksic
Jens Rösner [EMAIL PROTECTED] writes: C:\wget>wget --proxy=on -x -r -l 2 -k -x -limit-rate=50k --tries=45 --directory-prefix=AsptDD As Jens said, Wget 1.5.3 did not yet support bandwidth throttling. Also please note that the option is named --limit-rate, not -limit-rate.

de Hacene : raport un bug

2005-03-25 Thread djdl hassene
Hello, I don't have long experience with wget (I must say I have been using it for 1 hour) but it seemed to me that there is a problem with its "command interpreter". So I submitted the command. Here is the command: ** //bug or not a bug //under Windows XP

de Hacene : raport un bug 2

2005-03-25 Thread djdl hassene
Hello, I sent the wrong file. My problem is that wget does not recognize the option `--limit-rate=50k'. Here is the file: G:\Documents and Settings\Hacene\Bureau\wget>wget --proxy=on -x -r -l 2 -k --limit-rate=50k --tries=45 --directory-prefix=AsptDD http://www.gnu.org/software/wget/manual/

Re: de Hacene : raport un bug

2005-03-25 Thread Jens Rösner
Hello! I don't speak French (or almost not at all)... C:\wget>wget --proxy=on -x -r -l 2 -k -x -l imit-rate=50k --tries=45 --directory-prefix=AsptDD I think it should be: C:\wget>wget --proxy=on -x -r -l 2 -k -x -limit-rate=50k --tries=45 --directory-prefix=AsptDD on one line of

Re: Bug

2005-03-20 Thread Jens Rösner
Hi Jorge! Current wget versions do not support large files > 2GB. However, the CVS version does, and the fix will be introduced into the normal wget source. Jens (just another user) When downloading a file of 2GB and more, the counter goes crazy; probably it should have a long instead of an int
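A minimal illustration of the suspected cause (an assumption; the thread does not show wget's actual code): a byte counter held in a signed 32-bit int wraps negative once the download passes 2 GiB.

    #include <stdio.h>

    int main (void)
    {
      int counter = 2147483647;  /* INT_MAX, the 2 GiB boundary */
      counter += 1;              /* overflows; on common platforms wraps negative */
      printf ("%d\n", counter);  /* typically prints -2147483648 */
      return 0;
    }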

bug-wget still useful

2005-03-15 Thread Dan Jacobson
Is it still useful to mail to [EMAIL PROTECTED] I don't think anybody's home. Shall the address be closed?

RE: bug-wget still useful

2005-03-15 Thread Post, Mark K
I don't know why you say that. I see bug reports and discussion of fixes flowing through here on a fairly regular basis. Mark Post -Original Message- From: Dan Jacobson [mailto:[EMAIL PROTECTED] Sent: Tuesday, March 15, 2005 3:04 PM To: [EMAIL PROTECTED] Subject: bug-wget still

Re: bug-wget still useful

2005-03-15 Thread Hrvoje Niksic
Dan Jacobson [EMAIL PROTECTED] writes: Is it still useful to mail to [EMAIL PROTECTED] I don't think anybody's home. Shall the address be closed? If you're referring to Mauro being busy, I don't see it as a reason to close the bug reporting address.

Re: bug-wget still useful

2005-03-15 Thread Dan Jacobson
P> I don't know why you say that. I see bug reports and discussion of fixes P> flowing through here on a fairly regular basis. All I know is my reports for the last few months didn't get the usual (any!) cheery replies. However, I saw them on Gmane, yes.

RE: one bug?

2005-03-04 Thread Tony Lewis
Jesus Legido wrote: I'm getting a file from https://mfi-assets.ecb.int/dla/EA/ea_all_050303.txt: The problem is not with wget. The file on the server starts with 0xFF 0xFE (a UTF-16 little-endian byte-order mark). Put the following into an HTML file (say temp.html) on your hard drive, open it in your web browser, right click on

Wget bug

2005-02-02 Thread Vitor Almeida
OS = Solaris 8 Platform = Sparc Test command = /usr/local/bin/wget -r -t0 -m ftp://root:[EMAIL PROTECTED]/usr/openv/var The directory contains some sub-directories and files to synchronize. Example: # ls -la /usr/openv/ total 68462 drwxr-xr-x 14 root bin 512 set 1 17:52

Bug: really large files cause problems with status text

2005-02-02 Thread Alan Robinson
Downloading a 4.2 gig file (such as from ftp://movies06.archive.org/2/movies/abe_lincoln_of_the_4th_ave/abe_lincoln_of_the_4th_ave.mpeg ) causes the status text (i.e. 100%[+===] 38,641,328 213.92K/s ETA 00:00) to print invalid things (in this case, that

Re: Bug: really large files cause problems with status text

2005-02-02 Thread Ulf Härnhammar
Quoting Alan Robinson [EMAIL PROTECTED]: Downloading a 4.2 gig file (such as from ftp://movies06.archive.org/2/movies/abe_lincoln_of_the_4th_ave/abe_lincoln_of_the_4th_ave.mpeg ) causes the status text (i.e. 100%[+===] 38,641,328 213.92K/s ETA 00:00)

Bug in SSL Client Cert handling?

2005-01-28 Thread Steven Enderle
Hello List, we are currently setting up an SSL-secured domain with SSL on both sides (client and server). It works fine with any browser, but I do have problems with wget. The login/auth seems to work and Apache reports a 200 code with a correct filesize, but wget says Read error (Unknown

BUG: wget trying to mirror an ftp server via an ftp proxy fails

2005-01-25 Thread Graham Leggett
Hi, When trying to mirror an ftp server via an ftp proxy (set in the ftp_proxy environment variable), recursion breaks. The following command should in theory download an index.html from the ftp proxy and parse it, downloading all links. In reality only index.html is downloaded. wget -r -N

Re: Bug (wget 1.8.2): Wget downloads files rejected with -R.

2005-01-22 Thread jens . roesner
Hi Jason! If I understood you correctly, this quote from the manual should help you: *** Note that these two options [accept and reject based on filenames] do not affect the downloading of HTML files; Wget must load all the HTMLs to know where to go at all--recursive retrieval would make no

Bug (wget 1.8.2): Wget downloads files rejected with -R.

2005-01-21 Thread jason cipriani
When the -R option is specified to reject files by name in recursive mode, wget downloads them anyway then deletes them after downloading. This is a problem when you are trying to be picky about the files you are downloading to save bandwidth. Since wget appears to know the name of the file it

wget manual page bug

2005-01-20 Thread Folkert van Heusden
Hi, The wget manual page lacks documentation on what return code is returned by wget in which situation. Folkert van Heusden

Not sure if bug, trouble with output from wget.

2005-01-20 Thread tjp77
I don't really know if this is a bug or something I am doing wrong; if it's not a bug then don't bother getting too involved, just point me to where I should be going. Anyway, the pages I retrieve using wget are not showing me the related pictures for the page even though

Bug retrieving files from an AS/400 via FTP (wget 1.8.2)

2005-01-19 Thread Weiss, Benjamin
Hello! I've been trying to get wget to retrieve a file off of our AS/400 using ftp. I'm using wget 1.8.2 on a RedHat Enterprise Linux 3.0 box. The debug output is below. I don't know if you're familiar with AS/400's. They use libraries instead of a directory structure. All libraries are

bug?: wget php

2005-01-18 Thread b_b_g
Hi, When I try (command on one line, of course): wget -rH -Dvirtualdub.org --exclude-domains forums.virtualdub.org http://www.virtualdub.org/ wget still gets other sites, i.e. mikecrash.wz.cz, sourceforge.net, www.google.com etc. Is it a bug? Does wget work badly with php? Yours Sincerely, Greg

Re: wget bug: spaces in directories mapped to %20

2005-01-17 Thread Jochen Roderburg
Quoting Tony O'Hagan [EMAIL PROTECTED]: Original path: abc def/xyz pqr.gif After wget mirroring: abc%20def/xyz pqr.gif (broken link) wget --version is GNU Wget 1.8.2 This was a well-known error in the 1.8 versions of wget, which is already corrected in the 1.9

wget bug: spaces in directories mapped to %20

2005-01-16 Thread Tony O'Hagan
Recently I used the following wget command under a hosted linux account: $ wget --mirror url -o mirror.log The web site contained files and virtual directories that contained spaces in the names. URL encoding translated these spaces to %20. wget correctly URL-decoded the file names (creating

wget bug

2005-01-15 Thread Matthew F. Dennis
a negative number so it exits. Of course, this is all speculation on my part about what the code looks like but none the less, the bug does exist on both linux and cygwin. Thanks, Matt --- BTW: great job, really... on wget and all the GNU software in general... THANKS

what causes bug in 1.8?

2005-01-10 Thread Claus Atzenbeck
Hi, I have seen a strange bug: echo test > test wget -O - http://www.w3c.org >> test Actually, wget should append to test, right? Well, it does in version 1.9, but it does not do that in 1.8 (tested with bash 2.x and 3.0). In version 1.8 it overwrites (!) the file. OK, I see

Re: Possible bug when downloading gzipped content

2005-01-01 Thread Ulf Härnhammar
Quoting Christoph Anton Mitterer [EMAIL PROTECTED]: It seems that the joecartoon.com server sends the gzip file intentionally with an appended 0xA (perhaps is even an error). Can you check if the additional 0xA byte is included in the Content-Length or not? Does it increase the C-L by one or

Re: Possible bug when downloading gzipped content

2005-01-01 Thread Christoph Anton Mitterer
something like gzip --decompress --force --stdout joebutton.swf > decompressed.swf it works. I also noticed that gzip --decompress --force joebutton.swf.gz (same thing without writing to stdout but directly to a file) does not work. Very strange imho. So my solution to the bug is: A very big
