Re: WGET bug...

2008-07-11 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 HARPREET SAWHNEY wrote: Hi, I am getting a strange bug when I use wget to download a binary file from a URL versus when I manually download. The attached ZIP file contains two files: 05.upc --- manually downloaded dum.upc---

Re: WGET bug...

2008-07-11 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 HARPREET SAWHNEY wrote: Hi, Thanks for the prompt response. I am using GNU Wget 1.10.2 I tried a few things on your suggestion but the problem remains. 1. I exported the cookies file in Internet Explorer and specified that in the Wget

[fwd] Wget Bug: recursive get from ftp with a port in the url fails

2007-09-17 Thread Hrvoje Niksic
---BeginMessage--- Hi, I am using wget 1.10.2 on Windows 2003, and have the same problem as Cantara. The file system is NTFS. Well, I find my problem is, I wrote the command in scheduled tasks like this: wget -N -i D:\virus.update\scripts\kavurl.txt -r -nH -P d:\virus.update\kaspersky well, after

Re: [fwd] Wget Bug: recursive get from ftp with a port in the url fails

2007-09-17 Thread Micah Cowan
Hrvoje Niksic wrote: Subject: Re: Wget Bug: recursive get from ftp with a port in the url fails From: baalchina [EMAIL PROTECTED] Date: Mon, 17 Sep 2007 19:56:20 +0800 To: [EMAIL PROTECTED] To: [EMAIL PROTECTED] Message-ID: [EMAIL PROTECTED] MIME-Version: 1.0 Content-Type

Re: wget bug?

2007-07-09 Thread Mauro Tortonesi
On Mon, 9 Jul 2007 15:06:52 +1200 [EMAIL PROTECTED] wrote: wget under win2000/win XP I get "No such file or directory" error messages when using the following command line. wget -s --save-headers http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc; %1 = 212BI Any ideas? hi

Re: wget bug?

2007-07-09 Thread Matthias Vill
Mauro Tortonesi wrote: On Mon, 9 Jul 2007 15:06:52 +1200 [EMAIL PROTECTED] wrote: wget under win2000/win XP I get "No such file or directory" error messages when using the following command line. wget -s --save-headers http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc; %1 = 212BI

wget bug?

2007-07-08 Thread Nikolaus_Hermanspahn
wget under win2000/win XP I get "No such file or directory" error messages when using the following command line. wget -s --save-headers http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc; %1 = 212BI Any ideas? thank you Dr Nikolaus Hermanspahn Advisor (Science) National Radiation

RE: wget bug

2007-05-24 Thread Tony Lewis
Highlord Ares wrote: it tries to download web pages named similar to http://site.com?variable=yes&mode=awesome Since & is a reserved character in many command shells, you need to quote the URL on the command line: wget

wget bug

2007-05-23 Thread Highlord Ares
when I run wget on certain sites, it tries to download web pages named similar to http://site.com?variable=yes&mode=awesome. However, wget isn't saving any of these files, no doubt because of some file naming issue? this problem exists in both the Windows and unix versions. hope this helps
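The shell-quoting point made in the reply above can be sketched as follows (a minimal bash example; the URL is the hypothetical one from the report, not a live site):

```shell
# Unquoted, the shell treats '&' as a control operator: it backgrounds
# everything before it, so wget never receives the 'mode' parameter.
# Single quotes pass the URL through to the command intact.
url='http://site.com?variable=yes&mode=awesome'
echo "$url"
```

The same applies to `;`, `?`, and `*` in URLs given on the command line: quote the whole URL rather than escaping characters one by one.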

RE: wget bug

2007-05-23 Thread Willener, Pat
PROTECTED] On Behalf Of Highlord Ares Sent: Thursday, May 24, 2007 11:41 To: [EMAIL PROTECTED] Subject: wget bug when I run wget on certain sites, it tries to download web pages named similar to http://site.com?variable=yes&mode=awesome. However, wget isn't saving any of these files, no doubt

WGet Bug: Local URLs containing colons do not work

2006-12-10 Thread Peter Fletcher
Hi, I am trying to download a Wiki category for off-line browsing, and am using a command-line like this: wget http://wiki/Category:Fish -r -l 1 -k Wiki categories contain colons in their filenames, for example: Category:Fish If I request that wget convert absolute paths to relative links,
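Since colons are reserved in Windows paths (drive letters, NTFS alternate data streams), filenames like Category:Fish cannot be created there as-is. A hypothetical post-download rename, not a wget feature described in this report, would be:

```shell
# Hypothetical workaround: rewrite ':' (reserved on Windows/NTFS)
# in a saved filename; '_' is an arbitrary replacement choice.
name='Category:Fish'
safe="${name//:/_}"
echo "$safe"
```

Note that any such rename also requires fixing up the links that -k rewrote to point at the original name.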

Re: wget bug in finding files after disconnect

2006-11-18 Thread Georg Schulte Althoff
Paul Bickerstaff [EMAIL PROTECTED] wrote in news:[EMAIL PROTECTED]: I'm using wget version GNU Wget 1.10.2 (Red Hat modified) on a fedora core5 x86_64 system (standard wget rpm). I'm also using version 1.10.2b on a WinXP laptop. Both display the same faulty behaviour which I don't believe

wget bug

2006-11-01 Thread lord maximus
well this really isn't a bug per se... but whenever you set -q for no output, it still makes a wget log file on the desktop.

Re: new wget bug when doing incremental backup of very large site

2006-10-21 Thread Steven M. Schweda
From dev: I checked and the .wgetrc file has continue=on. Is there any way to suppress the sending of byte-range requests? I will read through the email and see if I can gather some more information that may be needed. Remove continue=on from .wgetrc? Consider: -N, --timestamping
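The advice above translates into a wgetrc fragment like this (a sketch; both directives are standard wgetrc commands, and -N on the command line is equivalent to the timestamping line):

```
# ~/.wgetrc -- mirroring setup suggested in this thread:
# do not force a byte-range resume request on every file
continue = off
# let timestamping (-N) decide which files to re-fetch
timestamping = on
```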

new wget bug when doing incremental backup of very large site

2006-10-15 Thread dev
I was running wget to test mirroring an internal development site, and using large database dumps (binary format) as part of the content to provide me with a large number of binary files for the test. For the test I wanted to see if wget would run and download a quantity of 500K files with

Re: new wget bug when doing incremental backup of very large site

2006-10-15 Thread Steven M. Schweda
1. It would help to know the wget version (wget -V). 2. It might help to see some output when you add -d to the wget command line. (One existing file should be enough.) It's not immediately clear whose fault the 416 error is. It might also help to know which Web server is running on the

[WGET BUG] - Can not retreive image from cacti

2006-06-19 Thread Thomas GRIMONET
Hello, We are using version 1.10.2 of wget under Ubuntu and Debian. We have many scripts that get some images from a cacti site. These scripts ran perfectly with version 1.9 of wget, but they cannot get images with version 1.10.2. Here you can find an example of our

Re: Wget Bug: recursive get from ftp with a port in the url fails

2006-04-13 Thread Hrvoje Niksic
Jesse Cantara [EMAIL PROTECTED] writes: A quick resolution to the problem is to use the -nH command line argument, so that wget doesn't attempt to create that particular directory. It appears as if the problem is with the creation of a directory with a ':' in the name, which I cannot do

Wget Bug: recursive get from ftp with a port in the url fails

2006-04-12 Thread Jesse Cantara
I've encountered a bug when trying to do a recursive get from an ftp site with a non-standard port defined in the url, such as ftp.somesite.com:1234. An example of the command I am typing is: wget -r ftp://user:[EMAIL PROTECTED]:4321/Directory/* Where Directory contains multiple subdirectories, all

wget bug: doesn't CWD after ftp failure

2006-03-05 Thread Nate Eldredge
Hi folks, I think I have found a bug in wget where it fails to change the working directory when retrying a failed ftp transaction. This is wget 1.10.2 on FreeBSD-6.0/amd64. I was trying to use wget to get files from a broken ftp server which occasionally sends garbled responses, causing

Re: wget BUG: ftp file retrieval

2005-11-26 Thread Hrvoje Niksic
[EMAIL PROTECTED] (Steven M. Schweda) writes: and adding it fixed many problems with FTP servers that log you in a non-/ working directory. Which of those problems would _not_ be fixed by my two-step CWD for a relative path? That is: [...] That should work too. On Unix-like FTP servers,

Re: wget BUG: ftp file retrieval

2005-11-26 Thread Steven M. Schweda
From: Hrvoje Niksic [...] On Unix-like FTP servers, the two methods would be equivalent. Right. So I resisted temptation, and kept the two-step CWD method in my code for only a VMS FTP server. My hope was that some one would look at the method, say That's a good idea, and change the if

wget BUG: ftp file retrieval

2005-11-25 Thread Arne Caspari
Hello, current wget seems to have the following bug in the ftp retrieval code: When called like: wget user:[EMAIL PROTECTED]/foo/bar/file.tgz and foo or bar is a read/execute protected directory while file.tgz is user-readable, wget fails to retrieve the file because it tries to CWD into the

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Arne Caspari [EMAIL PROTECTED] writes: When called like: wget user:[EMAIL PROTECTED]/foo/bar/file.tgz and foo or bar is a read/execute protected directory while file.tgz is user-readable, wget fails to retrieve the file because it tries to CWD into the directory first. I think the correct

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Mauro Tortonesi
Hrvoje Niksic wrote: Arne Caspari [EMAIL PROTECTED] writes: I believe that CWD is mandated by the FTP specification, but you're also right that Wget should try both variants. i agree. perhaps when retrieving file A/B/F.X we should try to use: GET A/B/F.X first, then: CWD A/B GET F.X if

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Arne Caspari
Thank you all for your very fast response. As a further note: when this error occurs, wget bails out with the error message "No such directory foo/bar". I think it should instead be "Could not access foo/bar: Permission denied" or similar in such a situation. /Arne Mauro Tortonesi

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes: Hrvoje Niksic wrote: Arne Caspari [EMAIL PROTECTED] writes: I believe that CWD is mandated by the FTP specification, but you're also right that Wget should try both variants. i agree. perhaps when retrieving file A/B/F.X we should try to use: GET

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Hrvoje Niksic [EMAIL PROTECTED] writes: That might work. Also don't prepend the necessary prepending of $CWD to those paths. Oops, I meant don't forget to prepend

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Steven M. Schweda
From: Hrvoje Niksic Also don't [forget to] prepend the necessary [...] $CWD to those paths. Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those paths. As you might recall from my changes for VMS FTP servers (if you had ever looked at them), this scheme causes no end

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Daniel Stenberg
On Fri, 25 Nov 2005, Steven M. Schweda wrote: Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those paths. I agree. What good would prepending do? It will most definitely add problems such as those Steven describes. -- -=- Daniel Stenberg -=-

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Steven M. Schweda
From: Hrvoje Niksic Prepending is already there, Yes, it certainly is, which is why I had to disable it in my code for VMS FTP servers. and adding it fixed many problems with FTP servers that log you in a non-/ working directory. Which of those problems would _not_ be fixed by my

wget bug

2005-10-03 Thread Michael C. Haller
Begin forwarded message: From: [EMAIL PROTECTED] Date: October 4, 2005 4:36:09 AM GMT+02:00 To: [EMAIL PROTECTED] Subject: failure notice Hi. This is the qmail-send program at sunsite.dk. I'm afraid I wasn't able to deliver your message to the following addresses. This is a permanent

wget bug report

2005-06-13 Thread A.Jones
Sorry for the crosspost, but the wget Web site is a little confusing on the point of where to send bug reports/patches. Just installed wget 1.10 on Friday. Over the weekend, my scripts failed with the following error (once for each wget run): Assertion failed: wget_cookie_jar != NULL, file

Re: Wget Bug

2005-04-26 Thread Hrvoje Niksic
Arndt Humpert [EMAIL PROTECTED] writes: wget, win32 rel. crashes with huge files. Thanks for the report. This problem has been fixed in the latest version, available at http://xoomer.virgilio.it/hherold/ .

Wget Bug

2005-04-26 Thread Arndt Humpert
Hello, wget, win32 rel. crashes with huge files. regards [EMAIL PROTECTED] ___ Sent from Yahoo! Mail - now with 250MB of free storage - sign up here: http://mail.yahoo.de == Command Line wget -m

WGET Bug?

2005-04-04 Thread Nijs, J. de
# C:\Grabtest\wget.exe -r --tries=3 http://www.xs4all.nl/~npo/ -o C:/Grabtest/Results/log # --16:23:02-- http://www.xs4all.nl/%7Enpo

Wget bug

2005-02-02 Thread Vitor Almeida
OS = Solaris 8 Platform = Sparc Test command = /usr/local/bin/wget -r -t0 -m ftp://root:[EMAIL PROTECTED]/usr/openv/var The directory contains some sub-directories and files to synchronize. Example: # ls -la /usr/openv/ total 68462 drwxr-xr-x 14 root bin 512 set 1 17:52

Re: wget bug: spaces in directories mapped to %20

2005-01-17 Thread Jochen Roderburg
Quoting Tony O'Hagan [EMAIL PROTECTED]: Original path: abc def/xyz pqr.gif After wget mirroring: abc%20def/xyz pqr.gif (broken link) wget --version is GNU Wget 1.8.2 This was a well-known error in the 1.8 versions of wget, which is already corrected in the 1.9

wget bug: spaces in directories mapped to %20

2005-01-16 Thread Tony O'Hagan
Recently I used the following wget command under a hosted linux account: $ wget --mirror url -o mirror.log The web site contained files and virtual directories that contained spaces in the names. URL encoding translated these spaces to %20. wget correctly URL decoded the file names (creating
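The decoding step this report describes — turning %XX escapes back into literal characters when saving local names — can be sketched in a few lines of bash (an illustration of the transformation, not wget's actual implementation; relies on bash's printf %b handling of \xHH):

```shell
# Decode %XX escapes in a saved path, e.g. %20 -> space, %7E -> '~'.
# Assumes well-formed %XX sequences; '+' is also mapped to space.
urldecode() {
  local s=${1//+/ }         # form-encoding convention for spaces
  printf '%b' "${s//%/\\x}" # rewrite %HH as \xHH and expand it
}
urldecode 'abc%20def/xyz%20pqr.gif'   # the path from this report
```

The bug here was that wget 1.8 applied this decoding to file names but not to the hrefs it rewrote, leaving %20 in the converted links.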

wget bug

2005-01-15 Thread Matthew F. Dennis
It seems that wget uses a signed 32 bit value for the content-length in HTTP. I haven't looked at the code, but it appears that this is what is happening. The problem is that when a file larger than about 2GB is downloaded, wget reports negative numbers for its size and quits the download

wget bug with large files

2004-12-10 Thread Roberto Sebastiano
I got a crash in wget downloading a large iso file (2.4 GB) newdeal:/pub/isos# wget -c ftp://ftp.belnet.be/linux/fedora/linux/core/3/i386/iso/FC3-i386-DVD.iso --09:22:17-- ftp://ftp.belnet.be/linux/fedora/linux/core/3/i386/iso/FC3-i386-DVD.iso = `FC3-i386-DVD.iso' Resolving

I want to report a wget bug

2004-11-24 Thread jiaming
Hello! I am very pleased to use wget to crawl pages. It is an excellent tool. Recently I found a bug in using wget, although I am not sure whether it's a bug or incorrect usage. I just want to report it here. When I use wget to mirror or recursively download a web site with the -O option, I

wget -- bug / feature request (not sure)

2004-09-04 Thread Vlad Kudelin
Hello, Probably I am just too lazy, haven't spent enough time to read the man, and wget can actually do exactly what I want. If so -- I do apologize for taking your time. Otherwise: THANKS for your time!..:-). My problem is: redirects. I am trying to catch them by using, say, netcat

Re: wget bug with ftp/passive

2004-08-12 Thread Jeff Connelly
On Wed, 21 Jan 2004 23:07:30 -0800, you wrote: Hello, I think I've come across a little bug in wget when using it to get a file via ftp. I did not specify the passive option, yet it appears to have been used anyway Here's a short transcript: Passive FTP can be specified in /etc/wgetrc or
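As the reply notes, passive mode may already be switched on system-wide; the relevant wgetrc directive (a standard wgetrc command) looks like this:

```
# /etc/wgetrc (system-wide) or ~/.wgetrc (per-user):
# use passive FTP data connections by default
passive_ftp = on
```

A system-wide default here would explain passive transfers happening without the option appearing on the command line.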

wget bug: directory overwrite

2004-04-05 Thread Juhana Sadeharju
Hello. Problem: When downloading all in http://udn.epicgames.com/Technical/MyFirstHUD wget overwrites the downloaded MyFirstHUD file with MyFirstHUD directory (which comes later). GNU Wget 1.9.1 wget -k --proxy=off -e robots=off --passive-ftp -q -r -l 0 -np -U Mozilla $@ Solution: Use of -E

wget bug report

2004-03-26 Thread Corey Henderson
I sent this message to [EMAIL PROTECTED] as directed in the wget man page, but it bounced and said to try this email address. This bug report is for GNU Wget 1.8.2, tested on both RedHat Linux 7.3 and 9. rpm -q wget wget-1.8.2-9 When I use wget with -S to show the http headers, and I use

wget bug in retrieving large files 2 gig

2004-03-09 Thread Eduard Boer
Hi, While downloading a file of about 3,234,550,172 bytes with wget http://foo/foo.mpg; I get an error: HTTP request sent, awaiting response... 200 OK Length: unspecified [video/mpeg] [ = ] -1,060,417,124 13.10M/s
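The garbage length in this report is consistent with a signed 32-bit byte counter wrapping past 2 GiB. A small bash sketch of that wraparound (the file size is the one from the report; the 32-bit masking simulates what I assume the C int arithmetic does):

```shell
# Simulate a signed 32-bit counter overflowing on a >2 GiB file.
bytes=3234550172                    # file size reported above
wrapped=$(( bytes & 0xFFFFFFFF ))   # keep the low 32 bits
if [ "$wrapped" -ge $(( 1 << 31 )) ]; then
  wrapped=$(( wrapped - (1 << 32) ))  # reinterpret as signed
fi
echo "$wrapped"                     # prints -1060417124
```

That matches the -1,060,417,124 shown in the progress output above.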

Re: wget bug with ftp/passive

2004-01-22 Thread Hrvoje Niksic
don [EMAIL PROTECTED] writes: I did not specify the passive option, yet it appears to have been used anyway Here's a short transcript: [EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.zip --21:05:21-- ftp://musicm.mcgill.ca/sim390/sim390dm.zip =

Re: wget bug

2004-01-12 Thread Hrvoje Niksic
Kairos [EMAIL PROTECTED] writes: $ cat wget.exe.stackdump [...] What were you doing with Wget when it crashed? Which version of Wget are you running? Was it compiled for Cygwin or natively for Windows?

wget bug

2004-01-06 Thread Kairos
$ cat wget.exe.stackdump Exception: STATUS_ACCESS_VIOLATION at eip=77F51BAA eax= ebx= ecx=0700 edx=610CFE18 esi=610CFE08 edi= ebp=0022F7C0 esp=0022F74C program=C:\nonspc\cygwin\bin\wget.exe cs=001B ds=0023 es=0023 fs=0038 gs= ss=0023 Stack trace: Frame Function

Wget Bug

2003-11-10 Thread Kempston
Here is debug output :/FTPD# wget ftp://ftp.dcn-asu.ru/pub/windows/update/winxp/xpsp2-1224.exe -d DEBUG output created by Wget 1.8.1 on linux-gnu. --13:25:55--

Re: Wget Bug

2003-11-10 Thread Hrvoje Niksic
The problem is that the server replies with "login incorrect", which normally means that authorization has failed and that further retries would be pointless. Other than having a natural language parser built-in, Wget cannot know that the authorization is in fact correct, but that the server

Re: Wget Bug

2003-11-10 Thread Hrvoje Niksic
Kempston [EMAIL PROTECTED] writes: Yeah, I understand that, but lftp handles it fine even without specifying any additional option ;) But then lftp is hammering servers when a real unauthorized entry occurs, no? I'm sure you can work something out Well, I'm satisfied with what Wget does now.

Re: dificulty with Debian wget bug 137989 patch

2003-09-30 Thread Hrvoje Niksic
jayme [EMAIL PROTECTED] writes: [...] Before anything else, note that the patch originally written for 1.8.2 will need change for 1.9. The change is not hard to make, but it's still needed. The patch didn't make it to canonical sources because it assumes `long long', which is not available on

dificulty with Debian wget bug 137989 patch

2003-09-29 Thread jayme
I tried the patch from Debian bug report 137989 and it didn't work. Can anybody explain: 1 - why I have to make two directories for the patch to work: one wget-1.8.2.orig and one wget-1.8.2? 2 - why after compilation wget still can't download files over 2GB? note: I cut the patch for debian use (the first

wget bug

2003-09-26 Thread Jack Pavlovsky
It's probably a bug: when downloading wget --mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg -- The human knowledge belongs to the world

Re: wget bug

2003-09-26 Thread DervishD
Hi Jack :) * Jack Pavlovsky [EMAIL PROTECTED] dixit: It's probably a bug: when downloading wget --mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg

Re: wget bug

2003-09-26 Thread Hrvoje Niksic
Jack Pavlovsky [EMAIL PROTECTED] writes: It's probably a bug: when downloading wget --mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg Thanks for the report.

wget bug

2002-11-05 Thread Jing Ping Ye
Dear Sir: I tried to use "wget" to download data from an ftp site but got an error message as follows: > wget ftp://ftp.ngdc.noaa.gov/pub/incoming/RGON/anc_1m.OCT Screen shows:

wget bug (?): --page-requisites should supercede robots.txt

2002-09-22 Thread Jamie Flournoy
Using wget 1.8.2: $ wget --page-requisites http://news.com.com ...fails to retrieve most of the files that are required to properly render the HTML document, because they are forbidden by http://news.com.com/robots.txt . I think that use of --page-requisites implies that wget is being used
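wget does offer an explicit opt-out from robots.txt, which would cover this page-requisites case without changing the default; the wgetrc directive (equivalent to passing -e robots=off on the command line) is:

```
# ~/.wgetrc -- let recursive/--page-requisites fetches
# ignore robots.txt and nofollow (use responsibly)
robots = off
```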

Wget Bug: Re: not downloading everything with --mirror

2002-08-15 Thread Max Bowsher
Funk Gabor wrote: HTTP does not provide a dirlist command, so wget parses html to find other files it should download. Note: HTML not XML. I suspect that is the problem. If wget wouldn't download the rest, I'd say that too. But first the dir gets created, the xml is downloaded (in some other

Wget bug: 32 bit int for bytes downloaded.

2002-08-04 Thread Rogier Wolff
It seems wget uses a 32 bit integer for the bytes downloaded: [...] FINISHED --17:11:26-- Downloaded: 1,047,520,341 bytes in 5830 files cave /home/suse8.0# du -s 5230588 . cave /home/suse8.0# As it's a once per download variable I'd say it's not that performance critical...

WGET BUG

2002-07-07 Thread Kempston
Hi, I have a problem and would really like you to help me. I'm using wget to download a list of file urls via an http proxy. When the proxy server goes offline, wget doesn't retry downloading of files. Can you fix that, or can you tell me how I can fix it?

WGET BUG

2002-07-07 Thread Kempston
:15003/Dragon = `dragon.004 Connecting to 195.108.41.140:3128... failed: Connection refused. FINISHED --01:19:23-- Downloaded: 150,000,000 bytes in 10 files - Original Message - From: Kempston To: [EMAIL PROTECTED] Sent: Monday, July 08, 2002 12:50 AM Subject: WGET BUG

Re: wget bug (overflow)

2002-04-15 Thread Hrvoje Niksic
I'm afraid that downloading files larger than 2G is not supported by Wget at the moment.

wget bug (overflow)

2002-02-26 Thread Vasil Dimov
fbsd1 --- http wget eshop.tar (3.3G) --- fbsd2 command was: # wget http://kamenica/eshop.tar at the second G i got the following: 2097050K .. .. .. .. .. 431.03 KB/s 2097100K .. .. .. .. ..8.14 MB/s 2097150K

Re: wget bug?!

2002-02-19 Thread TD - Sales International Holland B.V.
On Monday 18 February 2002 17:52, you wrote: That would be great. The prob is that I'm using it to retrieve files mostly on servers that are having too much users. No I don't want to hammer the server but I do want to keep on trying with reasonable intervals until I get the file. I think the

Re: wget bug?!

2002-02-18 Thread Ian Abbott
[The message I'm replying to was sent to [EMAIL PROTECTED]. I'm continuing the thread on [EMAIL PROTECTED] as there is no bug and I'm turning it into a discussion about features.] On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote: I've tried -w 30 --waitretry=30 --wait=30

Re: [Wget]: Bug submission

2001-12-29 Thread Hrvoje Niksic
[ Please mail bug reports to [EMAIL PROTECTED], not to me directly. ] Nuno Ponte [EMAIL PROTECTED] writes: I get a segmentation fault when invoking: wget -r http://java.sun.com/docs/books/performance/1st_edition/html/JPTOC.fm.html My Wget version is 1.7-3, the one which is

wget bug

2001-10-10 Thread Muthu Swamy
HI, When I try to send a page to a Nextel mobile using the following command from a unix box, "wget http://www.nextel.com/cgi-bin/sendPage.cgi?to01=4157160856%26message=hellothere%26action=send" The wget returns the following message but the page is not reaching the phone. "--15:59:16--

wget bug

2001-10-08 Thread Dmitry . Karpov
Dear sir. When I put into my browser (NN 3) the line http://find.infoart.ru/cgi-bin/yhs.pl?hidden=http%3A%2F%2F194.67.26.82&word=FreeBSD it works correctly. When I put this line to wget, wget changes the line: the argument hidden becomes http:/194.67.26.82word, and the argument word is empty. Where am I wrong?

Re: maybe wget bug

2001-04-23 Thread Hrvoje Niksic
Hack Kampbjørn [EMAIL PROTECTED] writes: You have hit one of Wget features, it is overzealous in converting URLs into canonical form. As you have discovered Wget first converts all encoded characters back to their real value and then encodes all those that are unsafe sending in URLs. It's a

maybe wget bug

2001-04-04 Thread David Christopher Asher
Hello, I am using wget to invoke a CGI script call, while passing it several variables. For example: wget -O myfile.txt "http://user:[EMAIL PROTECTED]/myscript.cgi?COLOR=blueSHAPE=circle" where myscript.cgi say, makes an image based on the parameters "COLOR" and "SHAPE". The problem I am

wget bug - after closing control connection

2001-03-08 Thread Cezary Sobaniec
Hello, I've found a (less important) bug in wget. I've been dowloading a file from FTP server and the control connection of the FTP service was closed by the server. After that wget started to print incorrectly progress information (beyond 100%). The log follows:

Re: wget bug - after closing control connection

2001-03-08 Thread csaba . raduly
Which version of wget do you use? Are you aware that wget 1.6 has been released and 1.7 is in development (and they contain a workaround for the "Lying FTP server syndrome" you are seeing)? -- Csaba Ráduly, Software Engineer Sophos Anti-Virus email: [EMAIL PROTECTED]