On Thu, 23 May 2002, Henrik van Ginhoven wrote:
On Wed, May 22, 2002 at 11:49:42AM -0400, Dan Mahoney, System Admin wrote:
On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
Now, something occurs to ME. There really SHOULD be an alternate means of
prompting the user for a password
On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
Now, something occurs to ME. There really SHOULD be an alternate means of
prompting the user for a password (i.e. something which is not readily
visible through ps, or saved to a history file). I mean REALLY. wget
shows username:*password*
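The behaviour the poster asks for can be approximated today without changing wget: keep the credentials out of argv entirely and assemble the URL from the environment, so `ps` and the shell history see only the host and path. A minimal C sketch; the variable names FTP_USER/FTP_PASS and the helper function are made up for illustration and are not part of wget:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Build an ftp:// URL whose credentials come from environment
 * variables (hypothetical names FTP_USER/FTP_PASS) instead of the
 * command line, so they never appear in `ps` output or the shell
 * history file.  Returns 0 on success, -1 if a variable is unset
 * or the buffer is too small. */
int build_ftp_url(char *out, size_t outlen, const char *host, const char *path)
{
    const char *user = getenv("FTP_USER");
    const char *pass = getenv("FTP_PASS");
    if (!user || !pass)
        return -1;
    int n = snprintf(out, outlen, "ftp://%s:%s@%s/%s", user, pass, host, path);
    return (n < 0 || (size_t)n >= outlen) ? -1 : 0;
}
```

On most systems another user's environment is not visible through `ps`, so this is a reasonable stopgap until a real password prompt exists.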
On Wed, May 22, 2002 at 11:49:42AM -0400, Dan Mahoney, System Admin wrote:
On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
Now, something occurs to ME. There really SHOULD be an alternate means of
prompting the user for a password (i.e. something which is not readily
visible through
On Thu, 23 May 2002, Henrik van Ginhoven wrote:
On Wed, May 22, 2002 at 11:49:42AM -0400, Dan Mahoney, System Admin wrote:
On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
Now, something occurs to ME. There really SHOULD be an alternate means of
prompting the user for a password
On Wed, May 22, 2002 at 09:20:57PM -0700, Doug Kaufman wrote:
On Thu, 23 May 2002, Henrik van Ginhoven wrote: (no I did not ;)
On Wed, May 22, 2002 at 11:49:42AM -0400, Dan Mahoney, System Admin wrote:
On Tue, 21 May 2002, Dan Mahoney, System Admin wrote:
Now, something occurs to
To: Doug Kaufman
Cc: Dan Mahoney, System Admin; [EMAIL PROTECTED]
Subject: Re: arguably a bug
On Wed, May 22, 2002 at 09:20:57PM -0700, Doug Kaufman wrote:
On Thu, 23 May 2002, Henrik van Ginhoven wrote: (no I did not ;)
On Wed, May 22, 2002 at 11:49:42AM -0400, Dan Mahoney,
System
Unlike ncftpget and ncftpput, wget appears to have no mechanism for controlling ftp
authentication. It always appears to perform an anonymous ftp transfer with no
supplied password.
However, it would be desirable to allow a user and password override to allow wget to
retrieve from
On Tue, 21 May 2002, Andrew Mayo wrote:
Unlike ncftpget and ncftpput, wget appears to have no mechanism for
controlling ftp authentication. It always appears to perform an
anonymous ftp transfer with no supplied password.
However, it would be desirable to allow a user and password override
In message Re: bug report and patch, HTTPS recursive get,
Ian Abbott wrote...
Thanks again for the bug report and the proposed patch. I thought some
of the scheme tests in recur.c were getting messy, so propose the
following patch that uses a function to check for similar schemes.
Thanks
Hi,
I have tried to download this page [1] following the links. The initial
page is saved correctly. But then this link [2] shall be loaded which
results in this http-query [3]. The actual problem is that '%2F' is decoded
to '@2F' (whereas e.g. '%5F' is correctly decoded to '_').
René
PS: I
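For reference, correct percent-decoding maps any %XX pair through its hexadecimal value, so '%2F' must become '/' exactly the way '%5F' becomes '_'. A small self-contained sketch of the intended behaviour (an illustration, not wget's actual decoder):

```c
#include <ctype.h>
#include <string.h>

/* Hex digit value, or -1 if c is not a hex digit. */
static int hexval(int c)
{
    if (isdigit(c)) return c - '0';
    c = tolower(c);
    return (c >= 'a' && c <= 'f') ? c - 'a' + 10 : -1;
}

/* Decode %XX escapes in-place and return the string.
 * A '%' not followed by two hex digits is copied through verbatim. */
char *percent_decode(char *s)
{
    char *r = s, *w = s;
    while (*r) {
        if (r[0] == '%' && hexval(r[1]) >= 0 && hexval(r[2]) >= 0) {
            *w++ = (char)(hexval(r[1]) * 16 + hexval(r[2]));
            r += 3;
        } else
            *w++ = *r++;
    }
    *w = '\0';
    return s;
}
```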
On Wed, 15 May 2002 18:44:19 +0900, Kiyotaka Doumae [EMAIL PROTECTED]
wrote:
I found a bug in wget with HTTPS recursive get, and propose
a patch.
Thanks for the bug report and the proposed patch. The current scheme
comparison checks are getting messy, so I'll write a function to check
schemes
On Fri, 3 May 2002 18:37:22 +0200, Emmanuel Jeandel
[EMAIL PROTECTED] wrote:
ejeandel@yoknapatawpha:~$ wget -r a:b
Segmentation fault
Patient: Doctor, it hurts when I do this
Doctor: Well don't do that then!
Seriously, this is already fixed in CVS.
ejeandel@yoknapatawpha:~$ wget -r a:b
Segmentation fault
ejeandel@yoknapatawpha:~$
I encountered this bug while I wanted to do wget ftp://a:b@c/, forgetting the
ftp://
The bug is not present when -r is not there (a:b: Unsupported scheme)
Emmanuel
is not persistent (which in this case
it isn't), it should close immediately after the data is received.
There is/was no problem with wget. Here is the solution/answer
from the bug reporter
--8<--quote--8<--
This bug is to do with `transparent' web proxying in our College (Abstract lives down
the hall
I'm afraid that downloading files larger than 2G is not supported by
Wget at the moment.
Noel Koethe [EMAIL PROTECTED] writes:
the wget 1.8.1 manpage tells me:
--progress=type
Select the type of the progress indicator you wish to
use. Legal indicators are ``dot'' and ``bar''.
The ``dot'' indicator is used by default. It traces
Unfortunately, this bug is not easy to fix. The problem is that `-O'
was originally invented for streaming, i.e. for `-O -'. As a result,
many places in Wget's code assume that they can freely operate on the
file names, and -O seems more like an afterthought.
On the other hand, many people
[ Cc'ing to [EMAIL PROTECTED], as requested by Guillaume. ]
Guillaume Morin [EMAIL PROTECTED] writes:
this is from the advanced usage section of examples (info docs):
* If you want to encode your own username and password to HTTP or
FTP, use the appropriate URL syntax (*note URL
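The URL syntax the manual refers to is scheme://user:password@host/path. A rough sketch of pulling the credentials back out of such a URL (illustrative names only; wget's real parser in src/url.c is far more thorough about edge cases such as '@' appearing later in the URL):

```c
#include <stdio.h>
#include <string.h>

/* Extract "user" and "pass" from scheme://user:pass@host/...
 * Returns 1 and fills the buffers on success, 0 if the URL has no
 * credential part.  Naive: assumes the first '@' ends the
 * credentials, which real parsers must not assume. */
int url_credentials(const char *url, char *user, size_t ulen,
                    char *pass, size_t plen)
{
    const char *p = strstr(url, "://");
    if (!p) return 0;
    p += 3;
    const char *at = strchr(p, '@');
    if (!at) return 0;
    const char *colon = memchr(p, ':', (size_t)(at - p));
    if (!colon) return 0;
    snprintf(user, ulen, "%.*s", (int)(colon - p), p);
    snprintf(pass, plen, "%.*s", (int)(at - colon - 1), colon + 1);
    return 1;
}
```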
Guillaume Morin [EMAIL PROTECTED] writes:
When getting a file in a non-root directory from FTP with wget, wget
always tries CWD to that directory before getting the
file. Unfortunately sometimes you're not allowed to CWD to a
directory, but you're still allowed to list or download files from
I believe this is already on the todo list. However, this is made
harder by the fact that, to implement this kind of reject, you have to
start downloading the file. This is very different from the
filename-based rejection, where the decision can be made at a very
early point in the download
Guillaume Morin [EMAIL PROTECTED] writes:
if I use 'wget ftp://site.com/file1.txt ftp://site.com/file2.txt',
wget will not reuse the ftp connection, but will open one for each
document downloaded from the same site...
Yes, that's how Wget currently behaves. But that's not a bug, or at
least
Good point there. I wonder... is there a legitimate reason to require
atime to be set to the mtime time? If not, we could just make the
change without the new option. In general I'm careful not to add new
options unless they're really necessary.
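If the change were made without a new option, both timestamps can be set in a single POSIX utime(2) call. A sketch under that assumption; stamp_file is a made-up helper name, not a wget function:

```c
#include <stdio.h>
#include <sys/stat.h>
#include <time.h>
#include <utime.h>

/* After a download, stamp the local file so that both atime and
 * mtime match the remote timestamp, using POSIX utime(2).
 * Returns 0 on success, -1 on error (errno set by utime). */
int stamp_file(const char *path, time_t remote_mtime)
{
    struct utimbuf t;
    t.actime = remote_mtime;   /* set atime to the same value, as discussed */
    t.modtime = remote_mtime;
    return utime(path, &t);
}
```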
Guillaume Morin [EMAIL PROTECTED] writes:
If wget fetches a url which redirects to another host, wget
retrieves the file, and there's nothing that can be done to turn
that off.
So, if you do wget -r on a machine that happens to have a redirect to
www.yahoo.com you'll wind up trying to pull
Guillaume Morin [EMAIL PROTECTED] writes:
For example if a link to the URL /foo?bar is seen then the correct
file is downloaded and saved with the name foo?bar. When viewing
the pages with Netscape the '?' character is seen to separate the
URL and the arguments. This makes the link fail.
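One way out of the problem described is to quote the offending character when choosing the local file name, so that a browser viewing the saved copy does not treat everything after '?' as a query string: '?' becomes "%3F". This is a sketch of the idea, not what any wget version of the time actually did:

```c
#include <stdio.h>
#include <string.h>

/* Copy a URL-derived name into `out`, replacing every '?' with
 * "%3F" so the local file name is safe to link to from saved
 * HTML.  Output is always NUL-terminated; characters that would
 * not fit are dropped. */
void quote_query_char(const char *name, char *out, size_t outlen)
{
    size_t w = 0;
    for (; *name && w + 4 < outlen; name++) {
        if (*name == '?') {
            memcpy(out + w, "%3F", 3);
            w += 3;
        } else
            out[w++] = *name;
    }
    out[w] = '\0';
}
```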
the
recursion depth and the maximum amount to download if repeating the
test!)
I found out that more circumstances have to be fulfilled in order
to reproduce this bug. I have a local copy of the website where it
occurred, and it seems necessary to have nearly all the files in the
web server dir
Dear wget team,
recently found a bug in the version 1.8 of the wget program (recursive
retrieval) that did not occur in earlier versions (at least as far as
I can see, 1.7 is definitely not affected).
The new wget version treats single ?xxx hrefs the same way as hrefs to
anchors (#xxx). So e.g
://www.arrl.org/)
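The distinction the report describes comes down to the first character of the href: a leading '#' names a fragment inside the current page and need not be fetched, while a leading '?' names a different resource on the server, which recursion should follow. A minimal sketch with hypothetical names, not wget's actual code:

```c
/* Classify a relative href the way the reporter expects:
 * "#xxx" is an in-page anchor, "?xxx" is a distinct resource. */
enum href_kind { HREF_FRAGMENT_ONLY, HREF_QUERY, HREF_OTHER };

enum href_kind classify_href(const char *href)
{
    if (href[0] == '#')
        return HREF_FRAGMENT_ONLY;   /* skip: same document */
    if (href[0] == '?')
        return HREF_QUERY;           /* fetch: different document */
    return HREF_OTHER;
}
```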
There haven't been any releases since 1.8.1, but this bug is fixed
in the current CVS version.
On 4 Apr 2002 at 13:21, Robert Mücke wrote:
So it seems to be important to correct this behaviour. I think you only need
to set up a test site (maybe with some subdirs) containing one file with
an erroneous href= tag to reproduce this (maybe only in parts
depending on your server
I'm using the NT port of WGET 1.8.1.
FTP retrieval of files works fine, retrieval of directory listings fails.
The problem happens under certain conditions when connecting to OS2 FTP
servers.
For example, if the current directory on the FTP server at login time is
e:/abc, the command wget
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Hallo specialists,
I used wget 1.8.1 on my system to mirror the site www.europa.eu.int.
Transfer was through a proxy and DSL over night.
After about 12-13 hours I found the following situation:
Total download was about 1.8GB of data.
wget process was
Hello,
I got another bug reported (http://bugs.debian.org/139059):
Examples:
roadkill:~/wget-1.8.1/src# wget -r http://www:s/
Segmentation fault
roadkill:~/wget-1.8.1/src# wget -r iftp://www.example.org/
Segmentation fault
Doesn't look to be exploitable, I think. When recursive mode
I found a serious bug in wget, all versions
affected.
Description: It is highly addictive
Solution: You should include a warning about this
somewhere in the product :)
a windows user
Hello,
the wget 1.8.1 manpage tells me:
--progress=type
Select the type of the progress indicator you wish to
use. Legal indicators are ``dot'' and ``bar''.
The ``dot'' indicator is used by default. It traces
the retrieval by printing dots
I'm using -html-extension to append files with the html extension.
Debug log is below. I'm not getting the expected result, and I'm hoping
someone can determine the problem. For testing purposes, I've got a cgi
script that generates the html for a page. The server, that the cgi is
running on,
fbsd1 --- http wget eshop.tar (3.3G) --- fbsd2
command was:
# wget http://kamenica/eshop.tar
at the second G i got the following:
2097050K .. .. .. .. .. 431.03 KB/s
2097100K .. .. .. .. .. 8.14 MB/s
2097150K
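The bogus speed and the stall right at the 2097150K mark are consistent with the 2G limit mentioned elsewhere in this archive: a signed 32-bit byte counter runs out exactly at 2^31 bytes = 2097152 KiB. A tiny illustration of that diagnosis (an assumption about the cause; the fix is keeping byte counts in a 64-bit type):

```c
#include <stdint.h>

/* The largest byte count a signed 32-bit integer can hold,
 * expressed in KiB -- i.e. the line in the log where a 32-bit
 * counter must give out. */
int64_t kib_limit_of_int32(void)
{
    return (int64_t)INT32_MAX / 1024;   /* 2097151 KiB */
}

/* A 64-bit accumulator sails past the 2 GiB boundary. */
int64_t add_bytes64(int64_t total, int64_t chunk)
{
    return total + chunk;
}
```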
On Thursday 21 February 2002 05:44, Ian Abbott wrote:
On 21 Feb 2002 at 1:31, Alan Eldridge wrote:
You can't get it to work for timing out a socket connection, because
that is a bit of code that hasn't been implemented yet.
If no one else wants to, I can work up a patch for this next
I might be wrong but I believe there is a bug in the
--timeout=whatever syntax. I just can't get the
program to obey it under any circumstances, I put in
an ip that I know is non-existent and it takes it
forever to figure that out. I've even tried
changing it in init.c with no better results
On Wed, Feb 20, 2002 at 10:20:38PM -0800, Partycrew Industries wrote:
I might be wrong but I believe there is a bug in the
--timeout=whatever syntax. I just can't get the
program to obey it under any circumstances, I put in
No, that is not a correct statement. The program obeys the timeout
to was sent to [EMAIL PROTECTED]. I'm
continuing the thread on [EMAIL PROTECTED] as there is no bug and
I'm turning it into a discussion about features.]
On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote:
I've tried -w 30
--waitretry=30
--wait=30 (I think this one is for multiple
Hello,
http://bugs.debian.org/134765
--8--
Package: wget
Version: 1.8.1-0.2
I have a ~/.wgetrc that contains the line
login = anonymous
and a .netrc that contains, among other things,
default login raj password xxx
The man page says that wget reads .wgetrc, but actually it also
[The message I'm replying to was sent to [EMAIL PROTECTED]. I'm
continuing the thread on [EMAIL PROTECTED] as there is no bug and
I'm turning it into a discussion about features.]
On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote:
I've tried -w 30
--waitretry=30
--wait=30
Peteris Krumins [EMAIL PROTECTED] writes:
GNU Wget 1.8
wget: progress.c:673: create_image: Assertion `p - bp->buffer <= bp->width' failed.
This problem has been fixed in Wget 1.8.1. Please upgrade.
Hi,
I am forwarding you Debian bug 65971
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=65791&repeatmerged=yes
I can reproduce this problem with 1.8.1
With the '-k' option or the 'convert_links = on' option in .wgetrc the
links in
the downloaded HTML pages are modified to be relative
Hi,
I am forwarding to you Debian bug 88176.
(http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=88176&repeatmerged=yes)
I can reproduce the problem with 1.8.1
The following transcript shows that wget can do the Bad Thing with
-O when timestamping.
It can result in a 0-byte file.
Hi,
I am forwarding Debian bug 106391.
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=106361&repeatmerged=yes
The bug still applies.
this is from the advanced usage section of examples (info docs):
* If you want to encode your own username and password to HTTP or
FTP, use
Hi,
I am forwarding Debian bug 113281
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=113281&repeatmerged=yes
It still applies to 1.8.1. I am sure it is a bug though
wget doesn't wait when retrying to connect to an FTP server. Not sure
if
this affects HTTP downloads.
In the case shown
Hi,
I am forwarding Debian bug #131851
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=131851&repeatmerged=yes
I can reproduce it on 1.8.1.
When getting a file in a non-root directory from FTP with wget, wget
always
tries CWD to that directory before getting the file. Unfortunately
Hi,
I am forwarding Debian wishlist bug 21148
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21148&repeatmerged=yes
While wget allows me to include/exclude documents based on their
extension,
it doesn't allow me to do the same based on mime type (for example,
if I only want to save text
Forward of
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21588&repeatmerged=yes
If I access a server not on the default port, wget does not write that
port in the name of the directory it creates. Here is an example:
--13:43:40-- http://www.center.osaka-u.ac.jp:7080/center
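A sketch of the requested behaviour: include the port in the local directory name whenever it differs from the scheme's default, so mirrors of the same host on different ports cannot collide. The helper name is hypothetical, not wget's actual implementation:

```c
#include <stdio.h>
#include <string.h>

/* Build the local directory name for a mirrored host.  When the
 * server runs on a non-default port (e.g. 7080 for HTTP), encode
 * it as "host:port"; otherwise use the bare host name. */
void host_dir_name(const char *host, int port, int default_port,
                   char *out, size_t outlen)
{
    if (port == default_port)
        snprintf(out, outlen, "%s", host);
    else
        snprintf(out, outlen, "%s:%d", host, port);
}
```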
Debian wishlist bug 104122
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=104122&repeatmerged=yes
It would be extremely useful to have a 'quirks' mode which would do the
following (for instance, other things can be added):
- If a URL with \ characters gets a 404, try again with s
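Assuming the truncated suggestion means retrying the request with backslashes rewritten as forward slashes, the quirk itself is a one-liner over the URL path:

```c
#include <string.h>

/* The quirk sketched above (under that assumption): rewrite every
 * '\' in a URL path to '/' before retrying a request that 404'd. */
char *backslashes_to_slashes(char *path)
{
    for (char *p = path; (p = strchr(p, '\\')) != NULL; p++)
        *p = '/';
    return path;
}
```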
Debian wishlist bug 105278
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=105278&repeatmerged=yes
It would be nice if, upon noticing that it's getting a lot of invalid
port errors, wget would automatically try a passive FTP download unless
there had been some explicit configuration
Guillaume Morin wrote:
Forward of
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21588&repeatmerged=yes
If I access a server not on the default port, wget does not write that
port in the name of the directory it creates. Here is an example:
--13:43:40-- http
]
If you wish to continue to submit further information on your problem,
please send it to [EMAIL PROTECTED], as before.
Please do not reply to the address at the top of this message,
unless you wish to report a problem with the Bug-tracking system.
Debian bug tracking system administrator
Hi,
I am forwarding you this bug. I can reproduce this on 1.8.1
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=117774&repeatmerged=yes
---
wget seems to always return 0 as return code even when it fails, but
AFAIK only when using some wildcard char in the URL. For example:
spiney:~ $ wget
I am forwarding you 15844. I can reproduce it on 1.8.1.
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=15844&repeatmerged=yes
-
I observed this behavior which is not documented, and which I don't
think should
happen.
Suppose ftp://host.com/filename is a symlink to
ftp://host.com/realfilename
/robots.txt !!
Wget is the best downloading program, so I hope this bug get fixed
very soon. Thank you
After the https/robots.txt bug, doing a recursive wget to an https-only server
gives me this error: it searches for http://servername/index.html but there
is no server on port 80, so wget receives a Connection refused error and
quits. It should search for https://servername/index.html
On 01/02/2002 12:10:59 Mr.Fritz wrote:
After the https/robots.txt bug, doing a recursive wget to an https-only
server
gives me this error: it searches for http://servername/index.html but
there
is no server on port 80, so wget receives a Connection refused error and
quits. It should search
' function.
Thanks for the report; this patch should fix the bug:
2002-02-01 Hrvoje Niksic [EMAIL PROTECTED]
* html-url.c (tag_handle_meta): Don't crash on <meta
http-equiv=refresh> where content is missing.
Index: src/html-url.c
On 17 Jan 2002 at 2:15, Hrvoje Niksic wrote:
Michael Jennings [EMAIL PROTECTED] writes:
WGet returns an error message when the .wgetrc file is terminated
with an MS-DOS end-of-file mark (Control-Z). MS-DOS is the
command-line language for all versions of Windows, so ignoring the
WGet returns an error message when the .wgetrc file is terminated
with an MS-DOS end-of-file mark (Control-Z). MS-DOS is the
command-line language for all versions of Windows, so ignoring the
end-of-file mark would make sense.
Ouch, I never thought of that. Wget opens files in binary mode and
On 21 Jan 2002 at 14:56, Thomas Lussnig wrote:
Why not just open the wgetrc file in text mode using
fopen(name, "r") instead of "rb"? Does that introduce other
problems?
I think it has to do with comments, because the definition is that
starting with '#' the rest of the line
is ignored. And
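Independently of the text-mode vs. binary-mode question, the stray mark could simply be tolerated: strip any trailing Ctrl-Z (0x1A) from the buffer before parsing instead of reporting it as an error. A sketch; strip_dos_eof is a made-up name, not a wget function:

```c
#include <string.h>

/* Tolerate a DOS end-of-file mark: after reading .wgetrc into a
 * NUL-terminated buffer, drop trailing Ctrl-Z (0x1A) bytes so the
 * parser never sees them. */
void strip_dos_eof(char *buf)
{
    size_t n = strlen(buf);
    while (n > 0 && (unsigned char)buf[n - 1] == 0x1a)
        buf[--n] = '\0';
}
```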
On 17/01/2002 07:34:05 Herold Heiko wrote:
[proper order restored]
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 17, 2002 2:15 AM
To: Michael Jennings
Cc: [EMAIL PROTECTED]
Subject: Re: Bug report: 1) Small error 2) Improvement to Manual
Herold Heiko [EMAIL PROTECTED] writes:
My personal idea is:
As a matter of fact no *windows* text editor I know of, even the
supplied windows ones (notepad, wordpad) AFAIK will add the ^Z at the
end of file.txt. Wget is a *windows* program (although running in
console mode), not a *Dos*
-
Obviously, this is completely your decision. You are right, only DOS editors make the
mistake. (It should be noted that DOS is MS Windows' only command line language. It
isn't going away; even Microsoft supplies command line utilities with all versions of
its OSs. Yes, Windows will probably
From: Michael Jennings [mailto:[EMAIL PROTECTED]]
Obviously, this is completely your decision. You are right,
only DOS editors make the mistake. (It should be noted that
DOS is MS Windows' only command line language. It isn't going
away; even Microsoft supplies command line utilities with
Ivan Buttinoni [EMAIL PROTECTED] writes:
- for recursive retrieval, multiple simultaneous gets
This is very hard to do, not easy at all.
- last but not least: javascript support (eheheh)
And this is even harder. Javascript is a full programming language
which, as used by the sites,
Ryan Daniels [EMAIL PROTECTED] writes:
The following command line causes a Segfault on my system:
wget -spider http://www.yahoo.com
Note that the correct syntax is `--spider', and that this (currently
defunct) option does not accept arguments.
But the bug you've uncovered is real: you can
Peter Gucwa @ IIS-RTP [EMAIL PROTECTED] writes:
option -k does not work in the following call:
wget -k -r -l 1 http://www.softcomputer.com/cgi/jobs.cgi
What version of Wget are you using?
How exactly does it not work? What did you expect to happen, and what
happened instead?
Brendan Ragan [EMAIL PROTECTED] writes:
This is the problem i'm having with an older wget (1.5.3) when i
enter the url
'http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'
it goes
Connecting to www.tranceaddict.com:80... connected!
HTTP request sent,
On Wednesday 09 January 2002 22:07, you wrote:
Hi ...
... HTTP persistent connections are supported since wget 1.7.1 (I think).
So download wget 1.8.1 and try again ;-)
Bye
Stefan
On Wed, 9 Jan 2002, Ivan Buttinoni wrote:
Hi,
it would be nice if wget could use the web server keep-alive
- https support
It has that, too. Since 1.7. You may need to recompile it yourself with
the openssl libs.
---
Kim Scarborough http://www.unknown.nu/kim/
On Thursday 10 January 2002 16:51, you wrote:
- https support
It has that, too. Since 1.7. You may need to recompile it yourself with
the openssl libs.
Ack! I found these nice options:
--sslcertfile=FILE optional client certificate.
--sslcertkey=KEYFILE optional keyfile
Hi,
it would be nice if wget could use the web server keep-alive feature, sending more GETs on
the same connection
instead of closing the connection and re-opening it.
Ivan
--
=
Bware Technologies - http://www.bware.it - via
My wget starts on computer startup
wget -bc -i list -a log
and if I don't erase the list file, Wget gets the file from
the Internet even though it's already on my PC.
Here is log
--13:44:42-- http://www.mds.ru:80/models/0040.32k.wma
= `0040.32k.wma'
Connecting to www.mds.ru:80... connected!
Hello bug-wget,
This is the problem i'm having with an older wget (1.5.3) when i
enter the url
'http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'
it goes
Connecting to www.tranceaddict.com:80... connected!
HTTP request sent, awaiting response... 302 Found
Location
option -k does not work in the following call:
wget -k -r -l 1 http://www.softcomputer.com/cgi/jobs.cgi
Regards
Peter Gucwa
[ Please mail bug reports to [EMAIL PROTECTED], not to me directly. ]
Nuno Ponte [EMAIL PROTECTED] writes:
I get a segmentation fault when invoking:
wget -r
http://java.sun.com/docs/books/performance/1st_edition/html/JPTOC.fm.html
My Wget version is 1.7-3, the one which
Jean-Edouard BABIN [EMAIL PROTECTED] writes:
I found a little bug when we download from a deleted directory:
[...]
Thanks for the report.
I wouldn't consider it a real bug. Downloading things into a deleted
directory is bound to produce all kinds of problems.
The diagnostic message could
Hi,
I found a little bug when we download from a deleted directory:
[xnu:~/wget-1.8.1] jeb% make
[xnu:~/wget-1.8.1] jeb% su
[xnu:/Users/jeb/wget-1.8.1] root# make install
[xnu:/Users/jeb] root# rm -fr wget-1.8.1/
[xnu:/Users/jeb] root# exit
[xnu:~/wget-1.8.1] jeb% wget ftp://ftp.gnu.org/gnu
Hello ,
wget --version
GNU Wget 1.7
cut from wget-log
37950K .. .. .. .. .. 87% @ 7.51 KB/s
38000K .. .. .. .. .. 87% @ 7.37 KB/s
38050K .. .. .. .. .. 87% @ 7.50 KB/s
Vladimir Volovich [EMAIL PROTECTED] writes:
while downloading some file (via http) with wget 1.8, i got an error:
assertion failed: p - bp->buffer <= bp->width, file progress.c, line 673
Abort (core dumped)
Thanks for the report. It's a known problem in 1.8, fixed by this
patch.
Index:
Peng GUAN [EMAIL PROTECTED] writes:
Maybe a bug in file fnmatch.c, line 54:
(n == string || (flags & FNM_PATHNAME) && n[-1] == '/'))
the n[-1] should be changed to *(n-1).
In C n[-1] is exactly the same as *(n-1).
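The equivalence can be checked directly: the C standard defines E1[E2] as *((E1)+(E2)), so for a pointer strictly inside an array the two spellings denote the same element:

```c
#include <assert.h>

/* E1[E2] is defined as *((E1)+(E2)), so for a pointer n that
 * points at least one element past the start of an array,
 * n[-1] and *(n - 1) are the same lvalue. */
int same_element(const char *n)
{
    return n[-1] == *(n - 1);
}
```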
On 14 Dec 2001 at 14:49, Peng GUAN wrote:
Maybe a bug in file fnmatch.c, line 54:
(n == string || (flags & FNM_PATHNAME) && n[-1] == '/'))
the n[-1] should be changed to *(n-1).
I like the easy ones. Those are equivalent in C. Which of the
two looks nicer is a matter of aesthetics
Hello bug-wget,
$ wget --version
GNU Wget 1.8
$ wget
ftp://password:[EMAIL PROTECTED]:12345/Dir%20One/This.Is.Long.Name.Of.The.Directory/*
Warning: wildcards not supported in HTTP.
Oooops! But this is FTP url, not HTTP!
Please, fix it.
Thank you,
--
Best regards from future,
HillDale
Pavel Stepchenko [EMAIL PROTECTED] writes:
Hello bug-wget,
$ wget --version
GNU Wget 1.8
$ wget
ftp://password:[EMAIL PROTECTED]:12345/Dir%20One/This.Is.Long.Name.Of.The.Directory/*
Warning: wildcards not supported in HTTP.
Oooops! But this is FTP url, not HTTP!
Are you using
Pavel Stepchenko [EMAIL PROTECTED] writes:
Warning: wildcards not supported in HTTP.
Oooops! But this is FTP url, not HTTP!
HN Are you using a proxy?
Yes.
This means that HTTP is used for retrieval, and '*' won't work --
which is what Wget is trying to warn you about.
--17:26:58--
I use a proxy server, and have a line in my .wgetrc that says something like:
http_proxy = http://me:[EMAIL PROTECTED]:8080/
And couldn't get cookies to be passed back to pages correctly.
In wget 1.7 I noticed a line in http.c that says this:
if (header_process (hdr, "Set-Cookie",
I use a proxy server, and have a line in my .wgetrc that says something like:
http_proxy = http://me:[EMAIL PROTECTED]:8080/
And couldn't get cookies to be passed back to pages correctly.
In wget 1.7 I noticed a line in http.c that says this:
if (header_process (hdr, "Set-Cookie",
[EMAIL PROTECTED] writes:
I use a proxy server, and have a line in my .wgetrc that says
something like:
What version of Wget are you using? I believe this bug has been fixed
in Wget 1.7.1 and later.
By the way, your analysis is correct.
Hi,
when i run wget 1.8 with two arguments:
wget http://some/url http://some/url
which are the same, i get:
Assertion failed: !hash_table_contains (dl_url_file_map, url), file recur.c, line 752
Abort (core dumped)
Best,
v.
Dear,
Maybe a bug in file fnmatch.c, line 54:
(n == string || (flags & FNM_PATHNAME) && n[-1] == '/'))
the n[-1] should be changed to *(n-1).
Regards
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
Herold Heiko [EMAIL PROTECTED] writes:
I put up the current cvs, mainly since there have been those patches
to ftp-ls.c and the signal handler. Ok ?
Please don't do that. Although all changes in the current CVS
*should* be stable,
Hi,
Today I downloaded the new wget release (1.8) (I'm a huge fan of the util
btw ;p ) and have been trying out the rate-limit feature.
When I run:
wget --limit-rate=20k
http://www.planetmirror.com/pub/debian-cd/2.1_r4/i386/binary-i386-1.iso
I get a core dump with the following output
[EMAIL PROTECTED] writes:
Today I downloaded the new wget release (1.8) (I'm a huge fan of the
util btw ;p ) and have been trying out the rate-limit feature.
[...]
assertion p - bp->buffer <= bp->width failed: file progress.c,
line 673
Thanks for the report. The bug shows with downloads whose
Herold Heiko [EMAIL PROTECTED] writes:
On windows apparently with beta2 and previous at least on nt4 a
screen size of 81 is needed in order to print the progress bar
correctly ?
Oh. Windows probably doesn't like the fact that Wget attempts to use
the full 80 characters. On most Unix
Herold Heiko [EMAIL PROTECTED] writes:
would insert `make distclean' before the second configure, or *at
least* `rm config.cache'.
Exactly my point - shouldn't everything either be handled gracefully, or
cleaned up automatically when necessary (in ./configure ?),
I don't think so.
I have recently tripped across a bug with the
version of wget shipped with RedHat 7.2. When I attempt to recursively
retrieve a web tree starting with an html link that contains a base href, wget
apparently limits all href to base href even if another absolute path is
specified. You can
William H. Gilmore [EMAIL PROTECTED] writes:
I have recently tripped across a bug with the version of wget shipped
with RedHat 7.2. When I attempt to recursively retrieve a web tree
starting with an html link that contains a base href, wget apparently
limits all href to base href even