Hi
If I connect with wget 1.10.2 (Debian Etch / Ubuntu Feisty Fawn) to a
secure host that uses multiple CNAMEs in the certificate, I get the
following error:
[EMAIL PROTECTED]:~$ wget https://host.domain.tld
--10:18:55-- https://host.domain.tld/
=> `index.html'
Resolving
Hi,
When calling wget -k -K, the backup files (.orig) are missing.
In one case (LOG.Linux.short) one backup file is missing (two files were
converted).
In another case (LOG.IRIX64.short) all backup files are missing.
This is also true when using recursive retrieval
Hi,
I don't really know if this is a Wget bug, or some problem with my
website, but, either way, maybe you can help.
I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred
pages (260MB of storage total). Someone did a Wget on my site, and
managed to log 111,000 hits and
Gary Reysa wrote:
Hi,
I don't really know if this is a Wget bug, or some problem with my
website, but, either way, maybe you can help.
I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred
pages (260MB of storage total). Someone did a Wget on my site, and
managed to log
Sorry for the crosspost, but the wget Web site is a little confusing on the
point of where to send bug reports/patches.
Just installed wget 1.10 on Friday. Over the weekend, my scripts failed
with the following error (once for each wget run):
Assertion failed: wget_cookie_jar != NULL, file
Hello!
I just found a quirk in an embedded system (no source) with an FTP server.
In its listings there are two spaces between the file size and the month.
As a consequence, wget always thinks the size is 0.
The procedure ftp_parse_unix_ls steps back only one blank
before cur.size is calculated.
My quick hack is
I sent this message to [EMAIL PROTECTED] as directed in the wget man page, but it
bounced and said to try this email address.
This bug report is for GNU Wget 1.8.2 tested on both RedHat Linux 7.3 and 9
rpm -q wget
wget-1.8.2-9
When I use a wget with the -S to show the http headers, and I use
Hello. This is a report on some wget bugs. My wgetdir command looks
like the following (wget 1.9.1):
wget -k --proxy=off -e robots=off --passive-ftp -q -r -l 0 -np -U Mozilla $@
Bugs:
Command: wgetdir http://www.directfb.org
Problem: In file www.directfb.org/index.html the hrefs of type
Juhana Sadeharju [EMAIL PROTECTED] writes:
Command: wgetdir http://liarliar.sourceforge.net
Problem: Files are named as
content.php?content.2
content.php?content.3
content.php?content.4
which are interpreted, e.g., by Nautilus as manual pages and are
displayed as plain text. Could
Hi again,
I found something that could be called a bug.
The command line and the output (shortened):
$ wget -k www.seznam.cz
--14:14:28-- http://www.seznam.cz/
=> `index.html'
Resolving www.seznam.cz... done.
Connecting to www.seznam.cz[212.80.76.18]:80... connected.
HTTP request
If wget receives a 302 Temporarily Moved redirection to *another site*,
that site is crawled!
wget -r http://original/index.html
Server reply 302 http://redirect/index.html
WGET goes and downloads from redirect
I also tried adding the -D flag but it doesn't help
wget -r -Doriginal -nh
1/ (serious)
#include <config.h> needs to be replaced by #include "config.h" in several
source files.
The same applies to strings.h.
2/
#ifdef WINDOWS should be replaced by #ifdef _WIN32.
With these two changes it is even possible to compile wget with MSVC[++] and Intel
C[++]. :-)
Jirka
Hello,
I've downloaded wget-1.5.3 from http://ftp.gnu.org/gnu/wget onto our
BSDI version 3.1 OS and used the following commands:
% gunzip wget-1.5.3.tar.gz
% tar -xvf wget-1.5.3.tar
% cd wget-1.5.3
% ./configure
% ./make -f Makefile
% ./make install
But the following error message was displayed:
Hi!
Wget 1.5.3 uses /robots.txt to skip some parts of a web site. But it
doesn't use the <META NAME="ROBOTS" CONTENT="NOFOLLOW"> tag, which serves
the same purpose.
I believe that Wget must also parse and use <META NAME="ROBOTS" ...>
tags
WBR
Stas mailto:[EMAIL PROTECTED]
In message Re: bug report and patch, HTTPS recursive get,
Ian Abbott wrote...
Thanks again for the bug report and the proposed patch. I thought some
of the scheme tests in recur.c were getting messy, so I propose the
following patch, which uses a function to check for similar schemes.
Thanks
On Wed, 15 May 2002 18:44:19 +0900, Kiyotaka Doumae [EMAIL PROTECTED]
wrote:
I found a bug in wget with HTTPS recursive get, and propose
a patch.
Thanks for the bug report and the proposed patch. The current scheme
comparison checks are getting messy, so I'll write a function to check
schemes
On Fri, 3 May 2002 18:37:22 +0200, Emmanuel Jeandel
[EMAIL PROTECTED] wrote:
ejeandel@yoknapatawpha:~$ wget -r a:b
Segmentation fault
Patient: Doctor, it hurts when I do this
Doctor: Well don't do that then!
Seriously, this is already fixed in CVS.
ejeandel@yoknapatawpha:~$ wget -r a:b
Segmentation fault
ejeandel@yoknapatawpha:~$
I encountered this bug when I wanted to do wget ftp://a:b@c/, forgetting
the ftp://
The bug is not present when -r is not there (a:b: Unsupported scheme)
Emmanuel
Hello specialists,
I used wget 1.8.1 on my system to mirror the site www.europa.eu.int.
Transfer was through a proxy and DSL overnight.
After about 12-13 hours I found the following situation:
Total download: about 1.8GB of data.
The wget process was
I found a serious bug in wget, all versions
affected.
Description: It is highly addictive
Solution: You should include a warning about this
somewhere in the product :)
a windows user
On 17 Jan 2002 at 2:15, Hrvoje Niksic wrote:
Michael Jennings [EMAIL PROTECTED] writes:
WGet returns an error message when the .wgetrc file is terminated
with an MS-DOS end-of-file mark (Control-Z). MS-DOS is the
command-line language for all versions of Windows, so ignoring the
WGet returns an error message when the .wgetrc file is terminated
with an MS-DOS end-of-file mark (Control-Z). MS-DOS is the
command-line language for all versions of Windows, so ignoring the
end-of-file mark would make sense.
Ouch, I never thought of that. Wget opens files in binary mode and
On 21 Jan 2002 at 14:56, Thomas Lussnig wrote:
Why not just open the wgetrc file in text mode using
fopen(name, "r") instead of "rb"? Does that introduce other
problems?
I think it has to do with comments, because the definition is that
starting with '#' the rest of the line
is ignored. And
On 17/01/2002 07:34:05 Herold Heiko wrote:
[proper order restored]
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 17, 2002 2:15 AM
To: Michael Jennings
Cc: [EMAIL PROTECTED]
Subject: Re: Bug report: 1) Small error 2) Improvement to Manual
Herold Heiko [EMAIL PROTECTED] writes:
My personal idea is:
As a matter of fact, no *Windows* text editor I know of, not even the
supplied Windows ones (Notepad, WordPad), AFAIK will add the ^Z at the
end of file.txt. Wget is a *Windows* program (although running in
console mode), not a *DOS*
-
Obviously, this is completely your decision. You are right, only DOS editors make the
mistake. (It should be noted that DOS is MS Windows' only command-line language. It
isn't going away; even Microsoft supplies command-line utilities with all versions of
its OSs. Yes, Windows will probably
From: Michael Jennings [mailto:[EMAIL PROTECTED]]
Obviously, this is completely your decision. You are right,
only DOS editors make the mistake. (It should be noted that
DOS is MS Windows' only command-line language. It isn't going
away; even Microsoft supplies command line utilities with
Hello bug-wget,
$ wget --version
GNU Wget 1.8
$ wget
ftp://password:[EMAIL PROTECTED]:12345/Dir%20One/This.Is.Long.Name.Of.The.Directory/*
Warning: wildcards not supported in HTTP.
Oooops! But this is an FTP URL, not HTTP!
Please, fix it.
Thank you,
--
Best regards from future,
HillDale.
Pavel Stepchenko [EMAIL PROTECTED] writes:
Hello bug-wget,
$ wget --version
GNU Wget 1.8
$ wget
ftp://password:[EMAIL PROTECTED]:12345/Dir%20One/This.Is.Long.Name.Of.The.Directory/*
Warning: wildcards not supported in HTTP.
Oooops! But this is an FTP URL, not HTTP!
Are you using a
Pavel Stepchenko [EMAIL PROTECTED] writes:
Warning: wildcards not supported in HTTP.
Oooops! But this is an FTP URL, not HTTP!
HN Are you using a proxy?
Yes.
This means that HTTP is used for retrieval, and '*' won't work --
which is what Wget is trying to warn you about.
--17:26:58--
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
Herold Heiko [EMAIL PROTECTED] writes:
I put up the current cvs, mainly since there have been those patches
to ftp-ls.c and the signal handler. Ok ?
Please don't do that. Although all changes in the current CVS
*should* be stable,