bug in wget 1.8.1/1.8.2

2003-09-16 Thread Dieter Drossmann
Hello, I use an extra file with a long list of HTTP entries. I included this file with the -i option. After 154 downloads I got an error message: Segmentation fault. With wget 1.7.1 everything works well. Is there a new limit on the number of lines? Regards, Dieter Drossmann

Re: bug in wget 1.8.1/1.8.2

2003-09-16 Thread Hrvoje Niksic
There is no built-in line limit; what you're seeing is a bug. I cannot see anything wrong inspecting the code, so you'll have to help by providing a gdb backtrace. You can get it by doing this: * Compile Wget with `-g' by running `make CFLAGS=-g' in its source directory (after configure, of course.) * Go

Re: possible bug in exit status codes

2003-09-15 Thread Aaron S. Hawley
I can verify this in the CVS version. It appears to be isolated to the recursive behavior. /a On Mon, 15 Sep 2003, Dawid Michalczyk wrote: Hello, I'm having problems getting the exit status code to work correctly in the following scenario. The exit code should be 1, yet it is 0

possible bug in exit status codes

2003-09-14 Thread Dawid Michalczyk
Hello, I'm having problems getting the exit status code to work correctly in the following scenario. The exit code should be 1 yet it is 0 [EMAIL PROTECTED]:~$ wget -d -t2 -r -l1 -T120 -nd -nH -R gif,zip,txt,exe,wmv,htmll,*[1-99] www.cnn.com/foo.html DEBUG output created by Wget 1.8.2 on

bug in wget - wget break on time msec=0

2003-09-13 Thread Boehn, Gunnar von
Hello, I think I found a bug in wget. My GNU wget version is 1.8.2. My system is GNU/Debian unstable. I use wget to replay our Apache logfiles to a test webserver to try different tuning parameters. Wget fails to run through the logfile and gives out the error message that the assertion msecs >= 0 failed

Re: bug in wget - wget break on time msec=0

2003-09-13 Thread Hrvoje Niksic
Boehn, Gunnar von [EMAIL PROTECTED] writes: I think I found a bug in wget. You did. But I believe your subject line is slightly incorrect. Wget handles 0 length time intervals (see the assert message), but what it doesn't handle are negative amounts. And indeed: gettimeofday({1063461157
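The negative interval described in this reply can be guarded against by clamping. A hedged sketch, not wget's actual code: the function name and signature below are illustrative, showing how a gettimeofday() that steps backwards would otherwise yield a negative millisecond count.

```c
#include <assert.h>     /* for the usage check below */
#include <sys/time.h>

/* Elapsed wall-clock milliseconds between two timevals, clamped so a
   clock that stepped backwards reports 0 instead of a negative value. */
long
elapsed_msec_clamped (const struct timeval *start, const struct timeval *now)
{
  long msecs = (now->tv_sec - start->tv_sec) * 1000L
               + (now->tv_usec - start->tv_usec) / 1000L;
  return msecs < 0 ? 0 : msecs;   /* clock went backwards: zero-length */
}
```

With such a clamp, the `msecs >= 0` assertion can never fire, at the cost of reporting an instantaneous (zero-length) interval for the affected download.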

Maybe a bug in wget?

2003-09-09 Thread n_fujikawa
Dear Sir; We are using wget-1.8.2 and it's very convenient for our routine program. By the way, now we have trouble with the return code from wget when trying to use it with the -r option: when wget with -r fails on an FTP connection, wget returns code 0. If no -r option, it

*** Workaround found ! *** (was: Hostname bug in wget ...)

2003-09-05 Thread webmaster
! Regards Klaus --- Forwarded message follows --- From: [EMAIL PROTECTED] To: [EMAIL PROTECTED] Date sent: Thu, 4 Sep 2003 12:53:39 +0200 Subject:Hostname bug in wget ... Priority: normal ... or a silly sleepless

Re: *** Workaround found ! *** (was: Hostname bug in wget ...)

2003-09-05 Thread Hrvoje Niksic
[EMAIL PROTECTED] writes: I found a workaround for the problem described below. Using option -nh does the job for me. As the subdomains mentioned below are on the same IP as the main domain, wget seems to compare not their names but only the IP. I believe newer versions of Wget don't do

Hostname bug in wget ...

2003-09-04 Thread webmaster
... or a silly sleepless webmaster!? Hi, Version: I use GNU wget version 1.7, which is found on the OpenBSD Release 3.3 CD. I use it on the i386 architecture. How to reproduce: wget -r coolibri.com (adding the span hosts option did not improve things). Problem category =

recursive no-parent bug in 1.8.2

2003-09-01 Thread John Wilkes
I recently upgraded to wget 1.8.2 from an unknown earlier version. In doing recursive HTTP retrievals, I have noticed inconsistent behavior. If I specify a directory without the trailing slash in the URL, the --no-parent option is ignored, but if the trailing slash is present, it works as

RE: Bug in total byte count for large downloads

2003-08-26 Thread Herold Heiko
Recksiegel [mailto:[EMAIL PROTECTED] Sent: Monday, August 25, 2003 6:49 PM To: [EMAIL PROTECTED] Subject: Bug in total byte count for large downloads Hi, this may be known, but [EMAIL PROTECTED]:/scratch/suse82 wget --help GNU Wget 1.5.3, a non-interactive network retriever. gave me

Bug, feature or my fault?

2003-08-14 Thread DervishD
Hi all :)) After asking on the wget list (with no success), and after having a look at the sources (a *little* look), I think that this is a bug, so I've decided to report it here. Let's get to the matter: when I download, through FTP, some hierarchy, the spaces are translated as '%20

bug: no check accept domain when server redirect

2003-08-14 Thread
I use wget 1.8.2: -r -nH -P /usr/file/somehost.com somehost.com http://somehost.com Bug description: If some script http://somehost.com/cgi-bin/rd.cgi returns an HTTP header with status 302 and redirects to http://anotherhost.com, then the first page, http://anotherhost.com/index.html, is accepted

bug in --spider option

2003-08-11 Thread dEth
Hi everyone! I'm using wget to check if some files are downloadable; I also use it to determine the size of the file. Yesterday I noticed that wget ignores the --spider option for ftp addresses. It should have shown me the filesize and other parameters, but it began to download the file :( That's too bad. Can

Wget 1.8.2 timestamping bug

2003-08-10 Thread Angelo Archie Amoruso
Hi All, I'm using Wget 1.8.2 on a Redhat 9.0 box equipped with an Athlon 550 MHz CPU and 128 MB RAM. I've encountered a strange issue, which really seems to be a bug, using the timestamping option. I'm trying to retrieve the http://www.nic.it/index.html page. The HEAD HTTP method returns that the page is 2474

RE: Wget 1.8.2 timestamping bug

2003-08-06 Thread Post, Mark K
1.8.2 timestamping bug Hi All, I'm using Wget 1.8.2 on a Redhat 9.0 box equipped with an Athlon 550 MHz CPU and 128 MB RAM. I've encountered a strange issue, which really seems to be a bug, using the timestamping option. I'm trying to retrieve the http://www.nic.it/index.html page. The HEAD HTTP method

wget for win32; small bug

2003-06-21 Thread Mark
Although this is a Windows bug, it affects this program. When fetching files with a reserved name like prn or com1, e.g. prn.html, wget will freeze up because Windows will not allow it to save a file with that name. A possible solution: saving the file as prn_.html. Just a suggestion. -pionig
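The workaround suggested above can be sketched as a check on the part of the basename before the extension. This is an illustrative, hypothetical helper, not wget code, and the reserved-name list is abbreviated (Windows also reserves LPT1..LPT9, COM2..COM9, etc.):

```c
#include <assert.h>    /* for the usage check below */
#include <string.h>
#include <strings.h>   /* strcasecmp (POSIX) */

/* Is the part of `name` before the first '.' a reserved DOS device name?
   If so, the caller could rename e.g. prn.html to prn_.html. */
int
is_reserved_name (const char *name)
{
  static const char *reserved[] = { "PRN", "CON", "AUX", "NUL", "COM1" };
  char stem[16];
  size_t len = strcspn (name, ".");   /* name part before the extension */
  if (len == 0 || len >= sizeof stem)
    return 0;
  memcpy (stem, name, len);
  stem[len] = '\0';
  for (size_t i = 0; i < sizeof reserved / sizeof *reserved; i++)
    if (strcasecmp (stem, reserved[i]) == 0)
      return 1;
  return 0;
}
```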

Re: trailing '/' of include-directories removed bug

2003-06-16 Thread Aaron S. Hawley
You're right, the include-directories option operates in much the same way (my guess: in the interest of speed) as the rest of the accept/reject options, which (others have also noticed) is a little flaky. /a On Fri, 13 Jun 2003, wei ye wrote: Did you test your patch? I patched it on my source

Re: trailing '/' of include-directories removed bug

2003-06-13 Thread Aaron S. Hawley
no, i think your original idea of getting rid of the code that removes the trailing slash is a better idea. i think this would fix it but keep the degenerate case of root directory (whatever that's about): Index: src/init.c === RCS

Re: trailing '/' of include-directories removed bug

2003-06-13 Thread wei ye
Did you test your patch? I patched it on my source code and it doesn't work. There are lot of files under http://biz.yahoo.com/edu/, but the patched code only downloaded the index.html. [EMAIL PROTECTED] src]$ ./wget -r --domains=biz.yahoo.com -I /edu/ http://biz.yahoo.com/edu/ [EMAIL

Re: trailing '/' of include-directories removed bug

2003-06-12 Thread Aaron S. Hawley
'/' of include-directories '/r/'. It's a minor bug, but I hope it can be fixed in the next version. Thanks! static int cmd_directory_vector(...) { ... if (len > 1) { if ((*t)[len - 1] == '/') (*t)[len - 1] = '\0'; } ... } = Wei

Re: trailing '/' of include-directories removed bug

2003-06-12 Thread Aaron S. Hawley
oh, i understand your problem. your request seems reasonable. i was trying to see if anyone had an idea why it seemed to be more of a feature than a bug. On Thu, 12 Jun 2003, wei ye wrote: Please take a look this example: $ \rm -rf biz.yahoo.com $ ls biz.yahoo.com $ wget -r --domains

Re: trailing '/' of include-directories removed bug

2003-06-12 Thread wei ye
/. Is it an expected result or a bug? Thanks alot! --- Aaron S. Hawley [EMAIL PROTECTED] wrote: above the code segment you submitted (line 765 of init.c) the comment: /* Strip the trailing slashes from directories. */ here are the manual notes on this option: (from Recursive Accept/Reject Options

Re: trailing '/' of include-directories removed bug

2003-06-12 Thread wei ye
of a feature than a bug. On Thu, 12 Jun 2003, wei ye wrote: Please take a look this example: $ \rm -rf biz.yahoo.com $ ls biz.yahoo.com $ wget -r --domains=biz.yahoo.com -I /r/ 'http://biz.yahoo.com/r/' $ ls biz.yahoo.com/ r/ reports/research/ $ I want only '/r

Bug, feature or my fault?

2003-06-11 Thread DervishD
Hi all :)) This is my first message on this list and as usual is a call for help ;) Well, the question is that I don't know if this is a bug (haven't look at the sources yet) and I can't find nothing in the documentation. So, prior to send a bug report, I want to make sure

Re: trailing '/' of include-directories removed bug

2003-06-11 Thread wei ye
'/' of include-directories '/r/'. It's a minor bug, but I hope it can be fixed in the next version. Thanks! static int cmd_directory_vector(...) { ... if (len > 1) { if ((*t)[len - 1] == '/') (*t)[len - 1] = '\0

trailing '/' of include-directories removed bug

2003-06-11 Thread wei ye
I'm trying to crawl a URL with the --include-directories='/r/' parameter. I expect to crawl '/r/*', but wget gives me '/r*'. By reading the code, it turns out that cmd_directory_vector() removed the trailing '/' of include-directories '/r/'. It's a minor bug, but I hope it can be fixed in the next
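The over-matching described in this thread can be seen with a plain prefix comparison, which is roughly what the directory filter amounts to once the slash is stripped. An illustrative helper, not wget's actual matching code:

```c
#include <assert.h>   /* for the usage check below */
#include <string.h>

/* Does `dir` prefix-match `path`?  With the trailing '/' stripped,
   -I /r/ becomes "/r", which also prefix-matches "/reports/...". */
int
prefix_matches (const char *dir, const char *path)
{
  return strncmp (path, dir, strlen (dir)) == 0;
}
```

This is why the user got /reports/ and /research/ alongside /r/: "/r" prefix-matches all three, while "/r/" would match only the intended directory.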

[bug] wget always spans hosts for https:// links

2003-05-31 Thread Toby Corkindale
I appear to have found a bug in wget 1.8.2, and I couldn't find any references to it via Google. Is this a real bug? I have trouble believing it can't have been hit before; but on the other hand, I can't figure out any reason why it should be occurring to me. If I use wget -r http://myhost.com

bug report : 302 server response forces host spanning even without-H

2003-04-02 Thread Yaniv Azriel
://original/ WGET still browses the redirect site. And by the way, multiple dependency files are downloaded from the redirect site, so this is a major bug I think

not a bug but I have...

2003-03-30 Thread Nick Shafff
Hello. Sorry, but I didn't find other mails. I have the wget man page translated into Russian (I only have to do a spell check). --- Nick Shafff mailto:[EMAIL PROTECTED]

wget future (was Re: Not 100% RFC 1738 compliance for FTP URLs => bug

2003-03-17 Thread Aaron S. Hawley
On Thu, 13 Mar 2003, Max Bowsher wrote: David Balazic wrote: So it is do it yourself , huh ? :-) More to the point, *no one* is available who has cvs write access. what if for the time being the task of keeping track of submissions for wget was done with its debian package?

Not 100% RFC 1738 compliance for FTP URLs => bug

2003-03-13 Thread David Balazic
As I got no response on [EMAIL PROTECTED], I am resending my report here. -- Hi! I noticed that wget ( 1.8.2 ) does not conform 100% to RFC 1738 when handling FTP URLs : wget ftp://user1:[EMAIL PROTECTED]/x/y/foo does this : USER user1 PASS secret1 SYST PWD ( let's say this returns

Re: Not 100% RFC 1738 compliance for FTP URLs => bug

2003-03-13 Thread Max Bowsher
David Balazic wrote: As I got no response on [EMAIL PROTECTED], I am resending my report here. One forwards to the other. The problem is that the wget maintainer is absent, and likely to continue to be so for several more months. As a result, wget development is effectively stalled. Max.

Re: Not 100% RFC 1738 compliance for FTP URLs => bug

2003-03-13 Thread Max Bowsher
David Balazic wrote: Max Bowsher wrote: David Balazic wrote: As I got no response on [EMAIL PROTECTED], I am resending my report here. One forwards to the other. The problem is that the wget maintainer is absent, and likely to continue to be so for several more months. As a result,

I don't know what that is... it may be a bug

2003-03-10 Thread Slawek
Hi, I have wget 1.8 and everything was OK, but today when I wanted to download a file from an FTP server wget showed an error. I did: wget 'ftp://user:[EMAIL PROTECTED]:port/directory/file with space.extension' (the port number was 1001) And wget displayed this: Connecting to 68.65.247.59:1001... connected.

A small bug in wget

2003-02-28 Thread Håvar Valeur
The bug appears if you use another output file and try to convert the URLs at the same time. If you try to execute the following: wget -k -O myFile http://www.stud.ntnu.no/index.html the file will not be converted, because wget does not locate the file index.html, since the output file is not index.html

Bug in wget version 1.8.1

2003-02-24 Thread Micha Byrecki
Hello. In wget version 1.8.1 I got a segfault after executing: $wget -c -r -k http://www.repairfaq.orghttp://www.repairfaq.org The bug is probably with the two http's on the command line. I've attached strace output, but there's rather nothing useful. I have no source code for this version of wget, so I'm

Bug in wget version 1.8.1

2003-02-24 Thread Micha Byrecki
Hello again. Regarding wget version 1.8.1: I downloaded the source code of wget 1.8.1, so I can now tell you more about this bug :) Here's more data: (gdb) set args -c -r -k http://www.repairfaq.orghttp://www.repairfaq.org (gdb) run Starting program: /home/byrek/testy/wget-1.8.1/src/wget -c -r

bug report

2003-02-22 Thread Jirka Klaue
1/ (serious) #include <config.h> needs to be replaced by #include "config.h" in several source files. The same applies to strings.h. 2/ #ifdef WINDOWS should be replaced by #ifdef _WIN32. With these two changes it is even possible to compile wget with MSVC[++] and Intel C[++]. :-) Jirka

recursive retrieval bug/inconsistency

2003-02-18 Thread Pete Stevens
Hi, I'm trying to recursively retrieve the contents of a few subdirectories, however I've discovered that wget -r --directory-prefix=/files --no-host-directories --no-parent http://myserver.com/files works fine, but wget -r --directory-prefix=/files --no-host-directories --no-parent

RE: recursive retrieval bug/inconsistency

2003-02-18 Thread Herold Heiko
Mogliano V.to (TV) fax x39-041-5907472 -- ITALY -Original Message- From: Pete Stevens [mailto:[EMAIL PROTECTED]] Sent: Tuesday, February 18, 2003 2:53 PM To: [EMAIL PROTECTED] Subject: recursive retrieval bug/inconsistency Hi, I'm trying to recursively retrieve the contents

Bug or Feature

2003-02-16 Thread Dileep M. Kumar
Hello Friends, I use wget 1.8.2 and 1.5.3. When I use wget 1.8.2 I get this output wget -v -c --no-host-directories ftp://user:[EMAIL PROTECTED]/OmniTracker/file.txt --10:16:18-- ftp://user:*password*@194.153.x.x/OmniTracker/file.txt = `file.txt' Connecting to

possible bug in wget?

2003-02-08 Thread unicorn76
Error description: wget aborts with a segmentation violation while I try to get some files recursively. wget -r -l1 http://somewhere/somewhat.htm (gdb) where #0 0x080532a2 in fnmatch () #1 0x08065788 in fnmatch () #2 0x0805e523 in fnmatch () #3 0x08060da7 in fnmatch () #4

Bug with specials characters : can't write output file

2003-02-08 Thread Yves Pratter
Hello, I have found the following bug with wget 1.8.1 (Windows): I try to download a picture of a CD audio from this URL: wget could get this picture from the web server, but can't write the output file: - http://www.aligastore.com/query.dll/img?gcdFab=8811803124&type=0 => `img

Re: Bug with specials characters : can't write output file

2003-02-08 Thread Kalin KOZHUHAROV
Hello! I have found the following bug with wget 1.8.1 (Windows): I try to download a picture of a CD audio from this URL: wget could get this picture from the web server, but can't write the output file: - http://www.aligastore.com/query.dll/img?gcdFab=8811803124&type=0

bug report about running wget in BSDI 3.1

2003-02-05 Thread julian yin
Hello, I've downloaded wget-1.5.3 from http://ftp.gnu.org/gnu/wget onto our BSDI version 3.1 OS and used the following commands: % gunzip wget-1.5.3.tar.gz % tar -xvf wget-1.5.3.tar % cd wget-1.5.3 % ./configure % ./make -f Makefile % ./make install But the following error message was displayed:

Bug report / feature request

2003-01-28 Thread Stas Ukolov
Hi! Wget 1.5.3 uses /robots.txt to skip some parts of a web site. But it doesn't use the <META NAME="ROBOTS" CONTENT="NOFOLLOW"> tag, which serves the same purpose. I believe that Wget must also parse and use <META NAME="ROBOTS" ...> tags WBR Stas mailto:[EMAIL PROTECTED]

Re: Bug in relative URL handling

2003-01-26 Thread Kalin KOZHUHAROV
Gary Hargrave wrote: --- Kalin KOZHUHAROV [EMAIL PROTECTED] wrote: Well, I am sure it is wrong URL, but took some time till I pinpoint it in RFC1808. Otherwise it would be very difficult to code URL parser. Ooops :-) It seems I was wrong... BTW, did you try to click in your browser on that

Bug when recursively downloading: https urls span sites

2003-01-24 Thread root
Hi! Problem: ssl-linked wget spans hosts even when it shouldn't when encountering an https:// link: - Deciding whether to enqueue http://www.egalwashierstehterversuchtesnichtzuladen.de/;. This is not the same hostname as

Re: Bug in relative URL handling

2003-01-24 Thread Gary Hargrave
--- Kalin KOZHUHAROV [EMAIL PROTECTED] wrote: Well, I am sure it is wrong URL, but took some time till I pinpoint it in RFC1808. Otherwise it would be very difficult to code URL parser. What you actually try to convince us is that you can omit the net-location (i.e. usually comes in the

Bug in relative URL handling

2003-01-23 Thread Gary Hargrave
wget does not seem to handle relative links in web pages of the form http:page3.html. According to my understanding of RFC 1808 this is a valid URL. When recursively retrieving HTML pages wget ignores these links without displaying an error or warning. Gary

Re: Bug in relative URL handling

2003-01-23 Thread Kalin KOZHUHAROV
I just realized, I didn't send this and some other post to the list, but directly to the replier... Gary Hargrave wrote: wget does not seem to handle relative links in web pages of the form http:page3.html According to my understanding of rfc1808 this is a valid URL. When recursively

BUG on multiprocessor systems

2003-01-09 Thread Grzegorz Dzigielewski
Hello! While wget is used on a dual-CPU machine, the assert (msecs >= 0) in calc_rate() breaks program execution with this: wget: retr.c:262: calc_rate: Condition `msecs >= 0' was not met. (translated from the Polish locale) We think the bug is in the wtimer_elapsed() function. Probably it's a problem

Re: bug or limitation of wget used to access VMS servers

2003-01-08 Thread Max Bowsher
- Original Message - From: Ken Senior [EMAIL PROTECTED] There does not seem to be support to change disks when accessing a VMS server via wget. Is this a bug or just a limitation? Wget does plain old HTTP and FTP. I know nothing about VMS. Does it have some strange syntax for discs

Bug continuing download in combination with -O option

2002-12-24 Thread Heiner Steven
I downloaded a file using wget -O tmp.out http://host/input If I now try to resume the download using wget -c -O tmp.out http://host/input I get an error message. What should have happened: wget should get the size of tmp.out, and then retrieve the file input starting with

Bug with CSS

2002-12-15 Thread arthur.chereau
Hi, It seems that wget is not aware of CSS called by @import. Just an example: wget --page-requisites --span-hosts --html-extension --convert-links --backup-converted http://linuxfr.org/2002/12/09/10606.html will lose all the page formatting. Has it been fixed ?

Bug

2002-12-12 Thread Razvan Petrescu
Some display errors (see picture). I also noticed a bug using the 'c' flag (continue) in conjunction with the 'O' flag (output file): it doesn't resume, it starts from the beginning. Otherwise, a great and needed tool. Thanx. Razvan Petrescu

Re: [RHSA-2002:229-10] Updated wget packages fix directorytraversal bug (fwd)

2002-12-11 Thread Noèl Köthe
On Wed, 2002-12-11 at 08:26, Daniel Stenberg wrote: I find it mildly annoying that I have not seen this discussed or even mentioned in here. Or am I just ignorant? No, you aren't. See http://archives.neohapsis.com/archives/vulnwatch/2002-q4/0102.html ... wget (CVE: CAN-2002-1344)

Bug? Timestamping directories in --recursive wget does not work?

2002-12-11 Thread Kalin KOZHUHAROV
Hi all! I am not 100% sure why this is so, but it is reproducible on several of my Linux systems. So: 1. Create a new directory and cd to it (mkdir /tmp/mydir && cd /tmp/mydir) 2. Run wget with an ftp site to get a dir (wget --recursive ftp://ftp.gnu.org/pub/gnu/xinfo*) for example 3. See the time of

bug ???

2002-11-15 Thread Vikul Gupta
Hi there, I am using wget 1.7 on Linux 2.4.X. Whenever I download a page from a website I don't see the Elapsed time for the download. Do I need to set something for this? I had a previous binary for Solaris, version 1.4.5, which by default showed Elapsed time. rgds Vikul

Bug? -p Option doesn't work with redirect

2002-11-06 Thread Achim Dumberger
Hi I don't know, if this is a bug, but when i use wget with the -p option I do not get the content of files (images ...) when the page is redirected Example: try wget -p http://www.linuxtoday.com and wget -p http://linuxtoday.com In the first case I do not get any of the images, in the second

wget bug

2002-11-05 Thread Jing Ping Ye
Dear Sir: I tried to use "wget" to download data from an FTP site but got an error message, as follows: > wget ftp://ftp.ngdc.noaa.gov/pub/incoming/RGON/anc_1m.OCT Screen show:

Pathname bug?

2002-10-24 Thread Andy Arbon
Hello, I am running into something in wget that, if not a bug, may possibly be an oversight. If this is covered anywhere in the documentation I apologise, but I have been through it and I can't find any mention of this behaviour. Trying to connect

meta crash bug

2002-10-19 Thread Ivan A. Bolsunov
version: 1.8.1 in file: html-url.c in function: tag_handle_meta() { ... skipped ... char *p, *refresh = find_attr (tag, "content", attrind); int timeout = 0; for (p = refresh; ISDIGIT (*p); p++) ... skipped ... } BUG description: find_attr() MAY return NULL
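The missing check can be sketched in isolation. The helper below is illustrative (not wget's tag_handle_meta itself): it shows the digit loop from the report guarded so that a NULL from find_attr(), i.e. a meta refresh tag with no content attribute, cannot be dereferenced.

```c
#include <assert.h>   /* for the usage check below */
#include <ctype.h>
#include <stddef.h>

/* Parse the leading timeout digits of a meta-refresh value,
   tolerating a missing (NULL) content attribute. */
int
parse_refresh_timeout (const char *refresh)
{
  int timeout = 0;
  if (!refresh)                 /* the fix: bail out instead of crashing */
    return -1;
  for (; isdigit ((unsigned char) *refresh); refresh++)
    timeout = 10 * timeout + (*refresh - '0');
  return timeout;
}
```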

Re: meta crash bug

2002-10-19 Thread Max Bowsher
(*p); p++) ... skipped ... } BUG description: find_attr() MAY return NULL, but this is NOT checked in the code listed above, JUST USING POINTERS WITHOUT NULL CHECKING, do you understand me??? :) For example: Wget CRASHes when trying to grab a URL from this MALFORMED BUT POSSIBLE tag: meta http

wget 1.8.2 bug

2002-10-19 Thread Curtis H. Wilbar Jr.
I have found that -k option does not work on downloaded ftp files. The key problem seems to be that register_download is never called on ftp files downloaded as local_file is never set for calls to ftp_loop like they are on calls to http_loop. So, I added local_file as a parameter to ftp_loop

wget file size bug

2002-09-27 Thread Andrew Marlow
I believe I have found a bug in wget. The file size seems to be limited to MAXINT. Has this already been reported? Is this already being worked on? Do you want me to look into supplying a fix? Regards, Andrew Marlow.

Possible bug : hosts spanned by default

2002-09-27 Thread Andre Majorel
I've just had a recursive wget do something unexpected : it spanned hosts even though I didn't give the -H option. The command was : wget -r -l20 http://www.modcan.com/page2.html http://www.modcan.com/pg2_main.html contains a link to www.paypal.com, and that link was followed. That was Wget

wget bug (?): --page-requisites should supercede robots.txt

2002-09-22 Thread Jamie Flournoy
Using wget 1.8.2: $ wget --page-requisites http://news.com.com ...fails to retrieve most of the files that are required to properly render the HTML document, because they are forbidden by http://news.com.com/robots.txt . I think that use of --page-requisites implies that wget is being used

Bug/feature request

2002-09-17 Thread mjbauer
Greetings: There does not appear to be a way to refuse redirects in wget. This is a problem because certain sites use local click-count CGIs which return redirects to advertisers. A common form is http://desired.web.site/clickcount.cgi?http://undesired.advertiser.site/, which produces a

Bug with user:pass in URL

2002-09-16 Thread Nikolay Kuzmin
There is a bug in wget 1.8.2 when the username or password contains the symbol '@'. I think you should change the code in file src/url.c from: int url_skip_uname (const char *url) { const char *p; /* Look for '@' that comes before '/' or '?'. */ p = (const char *)strpbrk (url, "@/?"); if (!p || *p

Re: Bug with user:pass in URL

2002-09-16 Thread Daniel Stenberg
On Tue, 17 Sep 2002, Nikolay Kuzmin wrote: There is a bug in wget 1.8.2 when the username or password contains the symbol '@'. I think you should change the code in file src/url.c from I disagree. The name and password fields must never contain a '@' letter, as it is a reserved letter in URL strings. If your
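Per the reply above, a literal '@' in the user or password field is reserved and should be percent-encoded as %40 before the URL is assembled, rather than special-cased in the parser. A hypothetical helper sketching that encoding, not part of wget:

```c
#include <assert.h>   /* for the usage check below */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Percent-encode every '@' in a user or password field as %40.
   Caller frees the returned string. */
char *
encode_at_signs (const char *field)
{
  char *out = malloc (3 * strlen (field) + 1);  /* worst case: all '@' */
  char *p = out;
  for (; *field; field++)
    if (*field == '@')
      p += sprintf (p, "%%40");
    else
      *p++ = *field;
  *p = '\0';
  return out;
}
```

With the field encoded this way, url_skip_uname's simple scan for the first '@' works unchanged.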

bug?

2002-09-11 Thread Mats Andrén
I found this problem when fetching files recursively: what if the filenames of files linked from a www-page contain the []-characters? They are treated as some kind of pattern, and not just as the characters they are. Clearly not desirable! Since wget just fetches the filenames from the www-page,

Re: bug?

2002-09-11 Thread Thomas Lussnig
Mats Andrén wrote: I found this problem when fetching files recursively: What if the filenames of linked files from a www-page contains the []-characters? They are treated as some kind of patterns, and not just the way they are. Clearly not desirable! Since wget just fetches the

bug: expanding of %0A %0D in urlencoded strings

2002-09-04 Thread Adrian Dabrowski
I'm using GNU Wget 1.5.3 - there seems to be a bug when I use a path with urlencoded parameters in it. wget will expand \n and \r (%0A and %0D) from the urlencoded string and send a wrong request to the server. bye, adrian dabrowski

Re: GNU wget - SSL accepts bad certs, similiar to IE bug

2002-08-20 Thread Thomas Lussnig
If OpenSSL is broken, e.g. no certs installed, this will cause wget not to work. I do not know what version, but my version worked without installed certs. Also, before my patch there was not even any cert routine, only SSL encapsulation. I know it's not perfect, but I worked on request on an

Re: GNU wget - SSL accepts bad certs, similiar to IE bug

2002-08-18 Thread tz
IE had a bug reported: http://online.securityfocus.com/archive/1/286895/2002-08-08/2002-08-14/1 http://www.theregister.co.uk/content/4/26620.html The problem exists in wget. Openssl doesn't install the certs in the proper directory by default. Use openssl ca to find the directory - the path

Wget Bug: Re: not downloading everything with --mirror

2002-08-15 Thread Max Bowsher
it is either a missing feature (shall I say, a bug as wget can't do the mirror which it could've) or I was unable to find some switch which makes it happen at once. Hmm, now I see. The vast majority of websites are configured to deny directory viewing. That is probably why wget doesn't bother to try

BUG ???????

2002-08-13 Thread Thushi
Hello, I have a problem downloading this link: http://linuxland.itam.nsc.ru/cgi-bin/download/c.cgi?eng/ps/RedHatBible.pdf.gz But the browser works well. My wget version is 1.7. Regards, Thushi.

Re: [BUG] assert test msecs

2002-08-04 Thread Colin 't Hart
I have run across this problem too. It is because with Linux 2.4.18 (and other versions??) in certain circumstances, gettimeofday() is broken and will jump backwards. See http://kt.zork.net/kernel-traffic/kt20020708_174.html#1. Is there any particular reason for this assert? If there is,

Wget bug: 32 bit int for bytes downloaded.

2002-08-04 Thread Rogier Wolff
It seems wget uses a 32 bit integer for the bytes downloaded: [...] FINISHED --17:11:26-- Downloaded: 1,047,520,341 bytes in 5830 files cave /home/suse8.0# du -s 5230588 . cave /home/suse8.0# As it's a once per download variable I'd say it's not that performance critical...
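The report above (1,047,520,341 bytes reported against a ~5 GB download) is consistent with a 32-bit wraparound. A sketch of the fix, assuming C99: accumulate in a 64-bit integer so totals past 2 GiB do not wrap. Function name and types are illustrative, not wget's.

```c
#include <assert.h>   /* for the usage check below */

/* Sum per-file download sizes into a 64-bit total. */
long long
total_bytes_64 (const long *sizes, int n)
{
  long long total = 0;          /* 64-bit accumulator avoids the wrap */
  for (int i = 0; i < n; i++)
    total += sizes[i];
  return total;
}
```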

Re: [BUG] assert test msecs

2002-08-01 Thread Max Bowsher
Hartwig, Thomas wrote: I got an assert exit of wget in retr.c in the function calc_rate because msecs is 0 or less than 0 (in rare cases). I don't know how, perhaps because I have a big line to the server or the wrong OS. To get this working I patched retr.c, setting msecs = 1 if equal

exit status bug

2002-07-29 Thread Edward V. Hynan
Hello, I am sure I have found a bug in wget 1.8.2 and earlier. Symptom: exit status is 0 when recursive ftp fails due to failed login. An ftp server may refuse login for temporary reasons, such as maximum logins, too busy, etc.. I don't mean that wget will detect such reasons, only

WGET BUG

2002-07-07 Thread Kempston
Hi, I have a problem and would really like you to help me. I'm using wget for downloading a list of file URLs via an HTTP proxy. When the proxy server goes offline, wget doesn't retry downloading the files. Can you fix that, or can you tell me how I can fix it?

WGET BUG

2002-07-07 Thread Kempston
:15003/Dragon = `dragon.004 Connecting to 195.108.41.140:3128... failed: Connection refused. FINISHED --01:19:23-- Downloaded: 150,000,000 bytes in 10 files - Original Message - From: Kempston To: [EMAIL PROTECTED] Sent: Monday, July 08, 2002 12:50 AM Subject: WGET BUG

Re: Bug with specific URLs

2002-06-21 Thread Kai Schaetzl
or use version 1.8.2 I see. As I said, I couldn't get it to work on that day and the NEWS file doesn't list this bug. I was able to test this now with 1.8.2 and see that it works. However, shouldn't it grab this header and change the file name, anyway? Content-Disposition: inline; filename=147945

Re: Bug with specific URLs

2002-06-21 Thread Kai Schaetzl
Your message of Thu, 20 Jun 2002 15:49:52 +0200: I supposed people would read the index.html. Since this is becoming something of a faq I've now I've put a 00Readme.txt on the ftp server and a Readme.txt in the binary archives, we'll see if that helps. It should :-) Kai -- Kai Schätzl,

RE: Bug with wget ? I need help.

2002-06-21 Thread Herold Heiko
-5907073 -- I-31021 Mogliano V.to (TV) fax x39-041-5907472 -- ITALY -Original Message- From: Cédric Rosa [mailto:[EMAIL PROTECTED]] Sent: Friday, June 21, 2002 4:37 PM To: [EMAIL PROTECTED] Subject: Bug with wget ? I need help. Hello, First, scuse my english but I'm french

Fwd: Bug with wget ? I need help.

2002-06-21 Thread Cédric Rosa
this problem? Date: Fri, 21 Jun 2002 16:37:02 +0200 To: [EMAIL PROTECTED] From: Cédric Rosa [EMAIL PROTECTED] Subject: Bug with wget? I need help. Hello, First, excuse my English but I'm French. When I try with wget (v 1.8.1) to download a URL which is behind a router, the software waits forever even

Re: Bug with wget ? I need help.

2002-06-21 Thread Hack Kampbjørn
Cédric Rosa wrote: Hello, First, scuse my english but I'm french. When I try with wget (v 1.8.1) to download an url which is behind a router, the software wait for ever even if I've specified a timeout. With ethereal, I've seen that there is no response from the server (ACK never

Re: Bug with wget ? I need help.

2002-06-21 Thread Cédric Rosa
thanks for your help :) I'm installing version 1.9 to check. I think this update may solve my problem. Cedric Rosa. - Original Message - From: Hack Kampbjørn [EMAIL PROTECTED] To: Cédric Rosa [EMAIL PROTECTED] Cc: [EMAIL PROTECTED] Sent: Friday, June 21, 2002 7:27 PM Subject: Re: Bug

(fwd) Bug#149075: wget: option for setting tcp window size

2002-06-16 Thread Noel Koethe
Hello, I got this feature request: http://bugs.debian.org/149075 - Forwarded message from Erno Kuusela [EMAIL PROTECTED] - hello, it would be really useful to be able to set the tcp window size for wget, since the default window size can be much too small for long latency links. also

Re: interesting bug

2002-06-09 Thread Hack Kampbjørn
[EMAIL PROTECTED] wrote: I was using wget to suck a website, and found an interesting problem some of the URLs it found contained a question mark, after which it responded with cannot write to '... insert file/URL here?more text ...' (invalid argument). And - it didn't save

interesting bug

2002-06-07 Thread alex
I was using wget to suck a website, and found an interesting problem some of the URLs it found contained a question mark, after which it responded with cannot write to '... insert file/URL here?more text ...' (invalid argument). And - it didn't save any of those URLs to files (on

Bug ?

2002-06-02 Thread $B4X@n(B $B$"$$(B
6), in addition, to do wget-1.8.1-sol26-sparc-local.gz. Every wget version dumped a core file before connecting! Is the environment related to gcc etc. on my Solaris 2.6 system so wrong??? Or what? Is this wget's bug??? Please let me know when you get time. I would greatly appreciate any help yo

Re: Bug ?

2002-06-02 Thread Hrvoje Niksic
I don't know why Wget dumps core on startup. Perhaps a gettext problem? I have seen reports of failure on startup on Solaris, and it strikes me that Wget could have picked up a wrong or inconsistent gettext. Try unsetting the locale-related environment variables and seeing if Wget works then.

Re: arguably a bug

2002-05-25 Thread Hrvoje Niksic
Henrik van Ginhoven [EMAIL PROTECTED] writes: problem, I agree. On large networks some evil-minded person could write a tiny cron-script that ran once every 5 minutes or so to parse ps-output looking for nothing but passwords, Note that the standard workaround for this problem, which is now

Re: arguably a bug

2002-05-25 Thread Thomas Lussnig
Note that the standard workaround for this problem, which is now even documented in the manual, is to use the `-i -' option. For example: wget -i - http://user:[EMAIL PROTECTED]/directory/file ^D But I agree that's just a workaround. I'm now more open to the idea of introducing a prompting
