Hello,
entering the following command results in an error:
--- command start ---
c:\Downloads\wget_v1.11.3b\wget
ftp://ftp.mozilla.org/pub/mozilla.org/thunderbird/nightly/latest-mozilla1.8-l10n/
-P c:\Downloads\
--- command end ---
wget can't convert the .listing file into an HTML file
regards
Sir Vision wrote:
Hello,
entering the following command results in an error:
--- command start ---
c:\Downloads\wget_v1.11.3b\wget
ftp://ftp.mozilla.org/pub/mozilla.org/thunderbird/nightly/latest-mozilla1.8-l10n/
-P c:\Downloads\
--- command end ---
probably agree with that behavior... most people probably aren't
interested in being informed that a server breaks RFC 2616 mildly;
Generally, if Wget considers a header to be in error (and hence
ignores it), the user probably needs to know about that. After all,
it could be the symptom of a Wget bug
Hrvoje Niksic wrote:
Generally, if Wget considers a header to be in error (and hence
ignores it), the user probably needs to know about that. After all,
it could be the symptom of a Wget bug, or of an unimplemented
extension the server
Hi,
I got a bug on wget when executing:
wget -a log -x -O search/search-1.html --verbose --wait 3
--limit-rate=20K --tries=3
http://www.nepremicnine.net/nepremicninske_agencije.html?id_regije=1
Segmentation fault (core dumped)
I created directory search.
The above creates a file search/search
Diego Campo wrote:
Hi,
I got a bug on wget when executing:
wget -a log -x -O search/search-1.html --verbose --wait 3
--limit-rate=20K --tries=3
http://www.nepremicnine.net/nepremicninske_agencije.html?id_regije=1
Segmentation fault (core
Micah Cowan [EMAIL PROTECTED] writes:
I was able to reproduce the problem above in the release version of
Wget; however, it appears to be working fine in the current
development version of Wget, which is expected to release soon as
version 1.11.*
I think the old Wget crashed on empty
Hrvoje Niksic wrote:
Micah Cowan [EMAIL PROTECTED] writes:
I was able to reproduce the problem above in the release version of
Wget; however, it appears to be working fine in the current
development version of Wget, which is expected to
Mauro Tortonesi wrote:
Micah Cowan wrote:
Update of bug #20323 (project wget):
Status: Ready For Test => In Progress
___
Follow-up Comment #3
), to indicate that it's doing this.
- -Micah
- Original Message
Subject: Bug#281201: wget prints its progress even when background
Resent-Date: Tue, 10 Jul 2007 13:57:01 +0000, Tue, 10 Jul 2007 13:57:02
+0000
Resent-From: Ilya Anfimov [EMAIL PROTECTED]
Resent-To: [EMAIL PROTECTED
Hi,
wget appears to be confused by FTP servers that put only one space
between the file size and the month in their directory listings. We only
came across this problem today, so I don't know how common it is.
pjjH
From: Harrington, Paul
Sent: Thursday, May 31, 2007 12:06 AM
To:
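For what it's worth, the listing parser trips here because it assumes more than one space between fields. A tolerant parser can split on arbitrary runs of whitespace instead; below is a minimal sketch (not wget's actual code, and the field layout assumed is the common Unix `ls -l` style, which may not match every server):

```python
def parse_unix_listing(line):
    # Split on runs of whitespace instead of assuming fixed column
    # positions, so a single space between size and month still parses.
    # Assumed fields: perms links owner group size month day time-or-year name
    fields = line.split(None, 8)
    if len(fields) < 9:
        return None
    perms, _links, _owner, _group, size, month, day, tm, name = fields
    return {"perms": perms, "size": int(size), "month": month,
            "day": int(day), "time": tm, "name": name}
```

The max-split of 8 keeps filenames containing spaces intact in the final field.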
Hi
If I connect with wget 1.10.2 (Debian Etch / Ubuntu Feisty Fawn) to a
secure host that uses multiple cnames in the certificate, I get the
following error:
[EMAIL PROTECTED]:~$ wget https://host.domain.tld
--10:18:55-- https://host.domain.tld/
=> `index.html'
Resolving
Jochen Roderburg wrote:
I have now tested the new wget 1.11 beta1 on my Linux system and the above issue
is solved now. The "Remote file is newer" message now only appears when the
local file exists, and most of the other logic with time-stamping and
file-naming works as expected.
excellent.
Jochen Roderburg wrote:
Quote from Jochen Roderburg [EMAIL PROTECTED]:
Quote from Hrvoje Niksic [EMAIL PROTECTED]:
Mauro, you will need to look at this one. Part of the problem is that
Wget decides to save to index.html.1 although -c is in use. That is
solved with the patch attached
Quote from Jochen Roderburg [EMAIL PROTECTED]:
Quote from Hrvoje Niksic [EMAIL PROTECTED]:
Mauro, you will need to look at this one. Part of the problem is that
Wget decides to save to index.html.1 although -c is in use. That is
solved with the patch attached below. But the other part is
Hrvoje Niksic wrote:
Noèl Köthe [EMAIL PROTECTED] writes:
a wget -c problem report with the 1.11 alpha 1 version
(http://bugs.debian.org/378691):
I can reproduce the problem. If I have already downloaded 1 MB, wget -c
doesn't continue; instead it starts the download again:
Mauro, you
Mauro Tortonesi [EMAIL PROTECTED] writes:
you're right, of course. the patch included in the attachment should fix
the problem. since the new HTTP code supports Content-Disposition
and delays the decision of the destination filename until it
receives the response header, the best solution i could
Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
you're right, of course. the patch included in the attachment should fix
the problem. since the new HTTP code supports Content-Disposition
and delays the decision of the destination filename until it
receives the response header,
Hrvoje Niksic wrote:
Noèl Köthe [EMAIL PROTECTED] writes:
a wget -c problem report with the 1.11 alpha 1 version
(http://bugs.debian.org/378691):
I can reproduce the problem. If I have already downloaded 1 MB, wget -c
doesn't continue; instead it starts the download again:
Mauro, you will
Hello,
a wget -c problem report with the 1.11 alpha 1 version
(http://bugs.debian.org/378691):
I can reproduce the problem. If I have already downloaded 1 MB, wget -c
doesn't continue; instead it starts the download again:
Forwarded message
[EMAIL PROTECTED]:~$ strace
Quote from Hrvoje Niksic [EMAIL PROTECTED]:
Mauro, you will need to look at this one. Part of the problem is that
Wget decides to save to index.html.1 although -c is in use. That is
solved with the patch attached below. But the other part is that
hstat.local_file is a NULL pointer when
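For readers following along, the continuation logic that -c implies can be sketched like this (the helper name is hypothetical, not wget's code); the crash above is the case where the local filename is still unknown, because Content-Disposition support delays the naming decision until the response arrives:

```python
import os

def resume_request_headers(local_file):
    # wget -c style continuation: if a partial file exists, resume from
    # its current size by asking the server for the remaining bytes;
    # otherwise start from offset 0 with no Range header.
    offset = os.path.getsize(local_file) if os.path.exists(local_file) else 0
    headers = {}
    if offset > 0:
        headers["Range"] = "bytes=%d-" % offset
    return offset, headers
```

The bug under discussion corresponds to calling something like this with a null local_file before the destination name has been decided.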
Daniel Richard G. wrote:
Hello,
The MAKEDEFS value in the top-level Makefile.in also needs to include
DESTDIR='$(DESTDIR)'.
fixed, thanks.
--
Aequam memento rebus in arduis servare mentem...
Mauro Tortonesi http://www.tortonesi.com
University of Ferrara -
Hello,
The MAKEDEFS value in the top-level Makefile.in also needs to include
DESTDIR='$(DESTDIR)'.
(build log excerpt)
+ make install DESTDIR=/tmp/wget--1.10.2.build/__dest__
cd src && make CC='cc' CPPFLAGS='-D__EXTENSIONS__ -D_REENTRANT -Dsparc' ...
install.bin
Hello, I'm using wget 1.10.2 on Windows (the Windows binary version), and it has a bug when downloading with -c and with an input file. If the first file of the list is the one to be continued, wget does it fine; if not, wget tries to download the files from the beginning, and it says that is
Noèl Köthe wrote:
Hello,
a forwarded report from http://bugs.debian.org/366434
could this behaviour be added to the doc/manpage?
i wonder if it makes sense to add generic support for multiple headers
in wget, for instance by extending the --header option like this:
wget --header=Pragma:
Mauro Tortonesi [EMAIL PROTECTED] writes:
Noèl Köthe wrote:
Hello,
a forwarded report from http://bugs.debian.org/366434
could this behaviour be added to the doc/manpage?
i wonder if it makes sense to add generic support for multiple headers
in wget, for instance by extending the --header
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
i wonder if it makes sense to add generic support for
multiple headers
in wget, for instance by extending the --header option like this:
wget --header="Pragma: xxx" --header="dontoverride,Pragma: xxx2" someurl
That could be a problem if you
Herold Heiko wrote:
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
i wonder if it makes sense to add generic support for
multiple headers
in wget, for instance by extending the --header option like this:
wget --header="Pragma: xxx" --header="dontoverride,Pragma: xxx2" someurl
That could be
Hello,
a forwarded report from http://bugs.debian.org/366434
could this behaviour be added to the doc/manpage?
thx.
Package: wget
Version: 1.10.2-1
It's meaningful to have multiple 'Pragma:' headers within an http
request, but wget will silently issue only a single one of them if
they
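HTTP allows a sender either to repeat a list-valued header like Pragma or to fold the values into one comma-separated field; either way, dropping all but one of them silently loses information. A sketch of the folding approach (hypothetical helper, not wget's actual code):

```python
def merge_headers(pairs):
    # Combine repeated header names into a single comma-separated value,
    # as RFC 2616 permits for list-valued headers, preserving the order
    # in which names and values first appeared.
    merged = []
    index = {}
    for name, value in pairs:
        key = name.lower()
        if key in index:
            i = index[key]
            merged[i] = (merged[i][0], merged[i][1] + ", " + value)
        else:
            index[key] = len(merged)
            merged.append((name, value))
    return merged
```

Note this folding is only safe for headers whose grammar is a comma-separated list (Set-Cookie is the classic exception).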
done.
==> PORT ... done.  ==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done.
[ <=> ] -673,009,664  113,23K/s
Assertion failed: bytes >= 0, file retr.c, line 292
This application has requested the Runtime to terminate it in an unusual
way.
Please contact the
Tobias Koeck wrote:
done.
==> PORT ... done.  ==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done.
[ <=> ] -673,009,664  113,23K/s
Assertion failed: bytes >= 0, file retr.c, line 292
This application has requested the Runtime to terminate it in an unusual
way.
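The negative byte count is consistent with a 32-bit signed counter wrapping around on a DVD-sized file (an assumption: a build without large-file support, where the byte counter is a 32-bit off_t). A quick check:

```python
def as_int32(n):
    # Reinterpret a byte count as a 32-bit signed integer, the way a
    # build with a 32-bit off_t would store it once it exceeds 2 GiB.
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

# A ~3.4 GB ISO (3,621,957,632 bytes) wraps to exactly the negative
# count shown in the progress line quoted above.
```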
That is, there is HTML like this:
<p>Click the following to go to the
<a href="http://www.something.com/junk.asp?thepageIwant=2">next
page</a>.</p>
What I need is for wget to understand that stuff following a ? in a URL
indicates that it's a distinctly different page, and it should go
recursively
Thanks for the report; I've applied this patch:
2005-08-26 Jeremy Shapiro [EMAIL PROTECTED]
* openssl.c (ssl_init): Set SSL_MODE_AUTO_RETRY.
Index: openssl.c
===
--- openssl.c (revision 2063)
+++ openssl.c (working
I believe I've encountered a bug in wget. When using https, if the
server does a renegotiation handshake wget fails trying to peek for
the application data. This occurs because wget does not set the
openssl context mode SSL_MODE_AUTO_RETRY. When I added the line:
SSL_CTX_set_mode (ssl_ctx
[EMAIL PROTECTED]
Subject: Bug#319088: wget: don't rely on exactly one blank char
between size and month
Date: Wed, 20 Jul 2005 10:26:20 +0200
Package: wget
Version: 1.10-3+1.10.1beta1
Followup-For: Bug #319088
A better patch is the following, which drops the assumption
On Wednesday 15 June 2005 04:57 pm, Ulf Harnhammar wrote:
On Wed, Jun 15, 2005 at 03:53:40PM -0500, Mauro Tortonesi wrote:
the web pages (including the documentation) on gnu.org have just been
updated.
Nice! I have found some broken links and strange grammar, though:
* index.html: There
On Wednesday 15 June 2005 05:14 pm, Ulf Harnhammar wrote:
On Wed, Jun 15, 2005 at 11:57:42PM +0200, Ulf Harnhammar wrote:
* faq.html
** 3.1 [..]
Yes, starting from version 1.10, GNU Wget support files larger than 2GB.
(should be supports)
** 2.0 How I compile GNU Wget?
(should be How
Mauro Tortonesi [EMAIL PROTECTED] writes:
this seems to be already fixed in the 1.10 documentation.
Now that 1.10 is released, we should probably update the on-site
documentation.
On Wednesday 15 June 2005 02:05 pm, Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
this seems to be already fixed in the 1.10 documentation.
Now that 1.10 is released, we should probably update the on-site
documentation.
i am doing it right now.
--
Aequam memento rebus in
On Wednesday 15 June 2005 02:16 pm, Mauro Tortonesi wrote:
On Wednesday 15 June 2005 02:05 pm, Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
this seems to be already fixed in the 1.10 documentation.
Now that 1.10 is released, we should probably update the on-site
On Wed, Jun 15, 2005 at 03:53:40PM -0500, Mauro Tortonesi wrote:
the web pages (including the documentation) on gnu.org have just been updated.
Nice! I have found some broken links and strange grammar, though:
* index.html: There are archives of the main GNU Wget list at
** fly.cc.fer.hr
**
On Wed, Jun 15, 2005 at 11:57:42PM +0200, Ulf Harnhammar wrote:
* faq.html
** 3.1 [..]
Yes, starting from version 1.10, GNU Wget support files larger than 2GB.
(should be supports)
** 2.0 How I compile GNU Wget?
(should be How do I)
// Ulf
On Thursday 02 June 2005 09:33 am, Herb Schilling wrote:
Hi,
On http://www.gnu.org/software/wget/manual/wget.html, the section on
protocol-directories has a paragraph that is a duplicate of the
section on no-host-directories. Other than that, the manual is
terrific! Wget is wonderful also.
Title: Small bug in Wget manual page
Hi,
On http://www.gnu.org/software/wget/manual/wget.html, the section on
protocol-directories has a paragraph that is a duplicate of the
section on no-host-directories. Other than that, the manual is terrific!
Wget is wonderful also. I don't know what I
Wget doesn't recognize the <image> tag,
Aah, thanks.
Should Wget support it to be compatible?
IMHO yes.
Thanks for your help.
Werner
simply doesn't download -- no error message,
no warning. My Mozilla browser displays the page just fine. Since
wget downloads the first thumbnail picture
`../image/ft2-nautilus-thumb.png' without problems I suspect a serious
bug in wget.
I'm running wget on a GNU/Linux box.
BTW
. Since
wget downloads the first thumbnail picture
`../image/ft2-nautilus-thumb.png' without problems I suspect a
serious bug in wget.
ft2-nautilus-thumb.png is referenced using the regular <img> tag.
BTW, it is not possible for CVS wget to have builddir != srcdir
(after creating the configure
I try to do something like
wget http://website.com/ ...
login=username&domain=hotmail%2ecom&_lang=EN
But when wget sends the URL out, the hotmail%2ecom
becomes hotmail.com !!! Is this the supposed
behaviour ? I saw this on the sniffer. I suppose the
translation of %2 to . is done by wget. Because
Will Kuhn [EMAIL PROTECTED] writes:
I try to do something like
wget http://website.com/ ...
login=username&domain=hotmail%2ecom&_lang=EN
But when wget sends the URL out, the hotmail%2ecom
becomes hotmail.com !!! Is this the supposed
behaviour ?
Yes.
I saw this on the sniffer. I suppose
Hrvoje Niksic [EMAIL PROTECTED] writes:
Can I have it not do the translation ??!
Unfortunately, only by changing the source code as described in the
previous mail.
BTW I've just changed the CVS code to not decode the % sequences.
Wget 1.10 will contain the fix.
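The reason the decoded form was considered equivalent: %2e is just the percent-escaped form of ".", an unreserved character, so either spelling addresses the same resource. Illustrated with Python's stdlib (an illustration, not wget's code):

```python
from urllib.parse import unquote

# "%2e" decodes to ".", so "hotmail%2ecom" and "hotmail.com" are
# equivalent on the wire as far as the server is concerned; the
# complaint above is about wget rewriting the literal bytes sent.
decoded = unquote("hotmail%2ecom")
```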
Hello,
here a bugreport:
(http://bugs.debian.org/197916)
-Forwarded message-
From: Antoni Bella Perez [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Subject: Bug#197916: wget: Mutual incompatibility between arguments -k and -O
Date: Wed, 18 Jun 2003 16:49:22 +0200
Package: wget
Hello,
maybe someone can document this (http://bugs.debian.org/182957) in one
or two sentences in wget.texi.
thx.
-Forwarded message-
From: Daniel B. dsb smart.net
...
The wget manual page doesn't document the format of the comma-separated values
for the --rejlist and
Tristan Miller [EMAIL PROTECTED] writes:
There appears to be a bug in the documentation (man page, etc.) for
wget 1.9.1.
I think this is a bug in the man page generation process.
Greetings.
There appears to be a bug in the documentation (man page, etc.) for wget
1.9.1. Specifically, the section about the command-line option for
proxies ends abruptly:
-Y on/off
--proxy=on/off
Turn proxy support on or off. The proxy is on by default if the
D Richard Felker III [EMAIL PROTECTED] writes:
The request log shows that the slashes are apparently respected.
I retried a test case and found the same thing -- the slashes were
respected.
OK.
Then I remembered that I was using -i. Wget seems to work fine with
the url on the command
On Mon, Mar 01, 2004 at 07:25:52PM +0100, Hrvoje Niksic wrote:
Removing the offending code fixes the problem, but I'm not sure if
this is the correct solution. I expect it would be more correct to
remove multiple slashes only before the first occurrence of ?, but
not afterwards.
D Richard Felker III [EMAIL PROTECTED] writes:
The following code in url.c makes it impossible to request urls that
contain multiple slashes in a row in their query string:
[...]
That code is removed in CVS, so multiple slashes now work correctly.
Think of something like
On Mon, Mar 01, 2004 at 03:36:55PM +0100, Hrvoje Niksic wrote:
D Richard Felker III [EMAIL PROTECTED] writes:
The following code in url.c makes it impossible to request urls that
contain multiple slashes in a row in their query string:
[...]
That code is removed in CVS, so multiple
D Richard Felker III [EMAIL PROTECTED] writes:
Think of something like http://foo/bar/redirect.cgi?http://...
wget translates this into: [...]
Which version of Wget are you using? I think even Wget 1.8.2 didn't
collapse multiple slashes in query strings, only in paths.
I was using
The following code in url.c makes it impossible to request urls that
contain multiple slashes in a row in their query string:
else if (*h == '/')
  {
    /* Ignore empty path elements.  Supporting them well is hard
       (where do you save "http://x.com///y.html"?), and
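The suggestion above, collapsing duplicate slashes only before the first '?', can be sketched as follows (a hypothetical helper for illustration, not the actual patch):

```python
def collapse_path_slashes(url):
    # Split off the query string first; only the path part is normalized,
    # so slashes inside "?..." survive untouched.
    path, qmark, query = url.partition("?")
    # Protect the scheme's "//" (e.g. "http://"), then collapse runs of
    # "/" in the remainder of the path.
    scheme, sep, rest = path.partition("://")
    while "//" in rest:
        rest = rest.replace("//", "/")
    return scheme + sep + rest + qmark + query
```

A redirector URL like http://foo/bar/redirect.cgi?http://... keeps its query-string slashes intact under this rule.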
Hello,
I use an extra file with a long list of HTTP entries. I included this
file with the -i option.
After 154 downloads I got an error message: Segmentation fault.
With wget 1.7.1 everything works well.
Is there a new limit of lines?
Regards,
Dieter Drossmann
Dieter Drossmann [EMAIL PROTECTED] writes:
I use an extra file with a long list of HTTP entries. I included this
file with the -i option. After 154 downloads I got an error
message: Segmentation fault.
With wget 1.7.1 everything works well.
Is there a new limit of lines?
No, there's no
Hello,
I think I found a bug in wget.
My GNU wget version is 1.8.2
My system GNU/Debian unstable
I use wget to replay our apache logfiles to a
test webserver to try different tuning parameters.
Wget fails to run through the logfile
and gives out the error message that the assertion "msec >= 0" failed
Boehn, Gunnar von [EMAIL PROTECTED] writes:
I think I found a bug in wget.
You did. But I believe your subject line is slightly incorrect. Wget
handles 0 length time intervals (see the assert message), but what it
doesn't handle are negative amounts. And indeed:
gettimeofday({1063461157
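The negative amount arises because gettimeofday() reports wall-clock time, which can step backwards (an NTP correction, a manual clock change). A defensive sketch (hypothetical helper, not wget's fix):

```python
def elapsed_ms(start_ms, end_ms):
    # Wall-clock time from gettimeofday() can step backwards, making
    # end < start. Clamping the difference to zero avoids tripping an
    # assertion like wget's "msec >= 0" on such negative intervals.
    return max(0, end_ms - start_ms)
```

A more thorough fix is to measure with a monotonic clock so the interval can never be negative in the first place.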
Dear Sir;
We are using wget-1.8.2 and it's very convenient for our routine
program. By the way, now we have a trouble with the return code
from wget when trying to use it with the -r option. When wget with
the -r option fails in an ftp connection, wget returns code 0. If no -r
option, it
!
Regards
Klaus
--- Forwarded message follows ---
From: [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Date sent: Thu, 4 Sep 2003 12:53:39 +0200
Subject: Hostname bug in wget ...
Priority: normal
... or a silly sleepless
[EMAIL PROTECTED] writes:
I found a workaround for the problem described below.
Using option -nh does the job for me.
As the subdomains mentioned below are on the same IP
as the main domain wget seems not to compare their
names but the IP only.
I believe newer versions of Wget don't do
... or a silly sleepless webmaster !?
Hi,
Version
==
I use the GNU wget version 1.7 which is found on
OpenBSD Release 3.3 CD.
I use it on i386 architecture.
How to reproduce
==
wget -r coolibri.com
(adding the span hosts option did not improve)
Problem category
=
The bug appears if you use another output file and try to convert the URLs
at the same time.
If you try to execute the following:
wget -k -O myFile http://www.stud.ntnu.no/index.html
The file will not convert, because wget does not locate the file index.html,
since the output file is not index.html
Hello.
In wget version 1.8.1 I got a segfault after executing:
$wget -c -r -k http://www.repairfaq.orghttp://www.repairfaq.org
The bug is probably with the two http:// URLs run together on the command
line. I've attached strace output, but there's rather nothing useful in it.
I have no source code of this version of wget, so i'm
Hello again.
Regarding wget version 1.8.1:
I downloaded the source code of wget 1.8.1, so I can tell you more
about this bug now :)
Here's more data:
(gdb) set args -c -r -k http://www.repairfaq.orghttp://www.repairfaq.org
(gdb) run
Starting program: /home/byrek/testy/wget-1.8.1/src/wget -c -r
error-description
wget aborts with a segmentation violation when I try to get some files
recursively.
wget -r -l1 http://somewhere/somewhat.htm
(gdb) where
#0 0x080532a2 in fnmatch ()
#1 0x08065788 in fnmatch ()
#2 0x0805e523 in fnmatch ()
#3 0x08060da7 in fnmatch ()
#4
-Original Message-
From: Cédric Rosa [mailto:[EMAIL PROTECTED]]
Sent: Friday, June 21, 2002 4:37 PM
To: [EMAIL PROTECTED]
Subject: Bug with wget ? I need help.
Hello,
First, excuse my English but I'm French
this problem ?
Date: Fri, 21 Jun 2002 16:37:02 +0200
To: [EMAIL PROTECTED]
From: Cédric Rosa [EMAIL PROTECTED]
Subject: Bug with wget ? I need help.
Hello,
First, excuse my English but I'm French.
When I try with wget (v 1.8.1) to download a URL which is behind a router,
the software waits forever even
Cédric Rosa wrote:
Hello,
First, excuse my English but I'm French.
When I try with wget (v 1.8.1) to download a URL which is behind a router,
the software waits forever even if I've specified a timeout.
With ethereal, I've seen that there is no response from the server (ACK
never
thanks for your help :)
I'm installing version 1.9 to check. I think this update may solve my
problem.
Cedric Rosa.
- Original Message -
From: Hack Kampbjørn [EMAIL PROTECTED]
To: Cédric Rosa [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Friday, June 21, 2002 7:27 PM
Subject: Re: Bug
Hello,
I got this feature request:
http://bugs.debian.org/149075
- Forwarded message from Erno Kuusela [EMAIL PROTECTED] -
hello,
it would be really useful to be able to set the tcp window size
for wget, since the default window size can be much too small
for long latency links. also
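The window a receiver can advertise is bounded by its socket receive buffer, so the requested feature would boil down to a setsockopt() call made before connecting. A sketch in Python's socket API (an illustration of the mechanism, not wget's code; the size is an arbitrary example):

```python
import socket

def make_socket_with_rcvbuf(size):
    # The receive buffer must be set before connect(): the TCP window
    # scale is negotiated on the SYN, so raising the buffer afterwards
    # cannot enlarge the advertised window for this connection.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, size)
    return s
```

The kernel may round the value up (Linux doubles it) or clamp it to a system maximum, so reading it back rarely returns exactly what was requested.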
Noel Koethe [EMAIL PROTECTED] writes:
the wget 1.8.1 manpage tells me:
--progress=type
Select the type of the progress indicator you wish to
use. Legal indicators are ``dot'' and ``bar''.
The ``dot'' indicator is used by default. It traces
I believe this is already on the todo list. However, this is made
harder by the fact that, to implement this kind of reject, you have to
start downloading the file. This is very different from the
filename-based rejection, where the decision can be made at a very
early point in the download
Good point there. I wonder... is there a legitimate reason to require
atime to be set to the mtime time? If not, we could just make the
change without the new option. In general I'm careful not to add new
options unless they're really necessary.
Guillaume Morin [EMAIL PROTECTED] writes:
If wget fetches a url which redirects to another host, wget
retrieves the file, and there's nothing that can be done to turn
that off.
So, if you do wget -r on a machine that happens to have a redirect to
www.yahoo.com you'll wind up trying to pull
I'm using the NT port of WGET 1.8.1.
FTP retrieval of files works fine, retrieval of directory listings fails.
The problem happens under certain conditions when connecting to OS2 FTP
servers.
For example, if the current directory on the FTP server at login time is
e:/abc, the command wget
Hello,
the wget 1.8.1 manpage tells me:
--progress=type
Select the type of the progress indicator you wish to
use. Legal indicators are ``dot'' and ``bar''.
The ``dot'' indicator is used by default. It traces
the retrieval by printing dots
Hi,
I am forwarding Debian bug 113281
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=113281repeatmerged=yes
It still applies to 1.8.1. I am sure it is a bug though
wget doesn't wait when retrying to connect to an FTP server. Not sure
if
this affects HTTP downloads.
In the case shown
Hi,
I am forwarding Debian wishlist bug 21148
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=21148repeatmerged=yes
While wget allows me to include/exclude documents based on their
extension,
it doesn't allow me to do the same based on mime type (for example,
if I only want to save
Hi,
I am forwarding you this bug. I can reproduce this on 1.8.1
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=117774repeatmerged=yes
---
wget seems to always return 0 as return code even when it fails, but
only
AFAIK when using some wildcard char in the URL. For example:
spiney:~ $ wget
Vladimir Volovich [EMAIL PROTECTED] writes:
while downloading some file (via http) with wget 1.8, i got an error:
assertion failed: p - bp->buffer <= bp->width, file progress.c, line 673
Abort (core dumped)
Thanks for the report. It's a known problem in 1.8, fixed by this
patch.
Index:
Hi,
Today I downloaded the new wget release (1.8) (I'm a huge fan of the util
btw ;p ) and have been trying out the rate-limit feature.
When I run:
wget --limit-rate=20k
http://www.planetmirror.com/pub/debian-cd/2.1_r4/i386/binary-i386-1.iso
I get a core dump with the following output
[EMAIL PROTECTED] writes:
Today I downloaded the new wget release (1.8) (I'm a huge fan of the
util btw ;p ) and have been trying out the rate-limit feature.
[...]
assertion p - bp->buffer <= bp->width failed: file progress.c,
line 673
Thanks for the report. The bug shows with downloads whose
In wget 1.7 and 1.6, if the WGETRC environment variable is set but the file
specified is inaccessible, the message:
wget: (null): No such file or directory.
is displayed and the program exits with status 1.
Debugging traces the problem to the following function in init.c (ca. line
261)
/*
Hello.
I have discovered a bug in wget 1.7
When I try to get this page: http://www.lehele.de/
this error occurs:
-
wget -d -r -l 1 www.lehele.de
DEBUG output created by Wget 1.7 on linux-gnu.
parseurl (www.lehele.de
On 17 Aug 2001, at 11:41, Dave Turner wrote:
On Fri, 17 Aug 2001, Dave Turner wrote:
By way of a hack I have used the SIZE command, not supported by RFC959 but
still accepted by many of the servers I use, to get the size of the file.
If that fails then it falls back on the old method.
Not sure if this is wget's fault or a broken server, but it happens on a
lot of servers so maybe it should be handled better.
The bug seems to manifest itself when resuming an FTP transfer and the
length is unauthoritative. The reported total length is in fact the
remaining length (i.e. the
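In other words, when the reported figure is only the remainder, the true total has to be reconstructed from the restart point. A sketch of that correction (hypothetical names, not wget's code):

```python
def total_size(restart_point, reported_length, authoritative):
    # Some FTP servers report the length *remaining* after a REST
    # command rather than the full file size. When the figure is
    # unauthoritative, treat it as the remainder and add back the
    # bytes we already have on disk.
    if authoritative:
        return reported_length
    return restart_point + reported_length
```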
hi,
guess there is a bug in the Makefile.in of the doc directory:
wget.1 couldn't be found if the --srcdir option is used ...
regards
michael
8-
$ diff doc/Makefile.in doc/Makefile.in_new
118c118
$(INSTALL_DATA) $(srcdir)/$(MAN)
Hi!
I found the following in the log file of piology.org:
202.108.68.179 - - [15/Jul/2001:10:50:19 +0200] "GET /3.14/ HTTP/1.0"
404 2332 "http://www.go2net.com/useless/useless/pi.html" "Wget/1.6"
202.108.68.179 - - [15/Jul/2001:12:49:38 +0200] "GET /elmi/ HTTP/1.0"
404 2316