Hello,
I think I found a security issue in wget.
Details:
when you try to download over HTTP from a site requesting a username and
password, wget discloses the password in two ways:
- on the local side, by not changing the argv[] vector to hide the password in
the ps output (OS-dependent, I guess)
- on
At 10:44 on Wednesday, 31 August 2005, Adriaan van Os wrote:
Hello,
When I run
--
wget --debug --append-output=wgetscrybug.log --recursive --no-parent
--no-directories --delete-after --page-requisites
--referer=http://vdg38bis.xs4all.nl/panoramas
http://vdg38bis.xs4all.nl/panoramas
Hello,
When I run
--
wget --debug --append-output=wgetscrybug.log --recursive --no-parent
--no-directories --delete-after --page-requisites
--referer=http://vdg38bis.xs4all.nl/panoramas
http://vdg38bis.xs4all.nl/panoramas
--
on my eComStation box, I get an error. The info on console reads
--
Linda Walsh [EMAIL PROTECTED] writes:
I noticed after my post in the archives that this bug is fixed in
1.10.
Now if I can just get the server-ops to fix their CVS server, that'd
be great -- I've checked out CVS projects from other sites and not
had inbound TCP attempts to some 'auth'
Wget doesn't recognize the image tag,
Aah, thanks.
Should Wget support it to be compatible?
IMHO yes.
Thanks for your help.
Werner
[CVS 2005-05-25]
I tried this command:
wget -r -L -l1 freetype.freedesktop.org/freetype2/screenshots.html
directly from the build directory, without using a .wgetrc file. In
the file `screenshots.html' there is a reference to the file
../image/ft2-kde-thumb.png
(and others) which wget
Werner LEMBERG [EMAIL PROTECTED] writes:
directly from the build directory, without using a .wgetrc file. In
the file `screenshots.html' there is a reference to the file
../image/ft2-kde-thumb.png
The reference looks like this:
<image width=160 height=120 alt="KDE screenshot">
to newadmin.studylight.org.
Escape character is '^]'.
GET / HTTP/1.0
Host: www.studylight.org
User-Agent: Wget/1.9.1
HTTP/1.1 302 Found
Date: Wed, 18 May 2005 08:24:10 GMT
Server: Apache/1.3.33 (Unix) (Gentoo/Linux) mod_perl/1.27
Location: http://localhost/
Connection: close
Content-Type: text/html; charset
Using Fedora Core 3, when I wget http://www.studylight.org/, it prints
out:
--02:52:30-- http://www.studylight.org/
=> `index.html'
Resolving www.studylight.org... 63.164.18.58
Connecting to www.studylight.org[63.164.18.58]:80... connected.
HTTP request sent, awaiting response...
On Tuesday 17 May 2005 01:56 am, Jim Peterson wrote:
Using Fedora Core 3, when I wget http://www.studylight.org/, it prints
out:
--02:52:30-- http://www.studylight.org/
=> `index.html'
Resolving www.studylight.org... 63.164.18.58
Connecting to
Quoting Hrvoje Niksic [EMAIL PROTECTED]:
Dear Hrvoje,
You are right, now I also can't reproduce the bug. I just realized
that in my second download I had missed another file.
When I downloaded the pages, the webserver was a little slow and made
long (10 s) pauses too. Perhaps it reached the wget
... 131.188.3.71
Connecting to ftp.uni-erlangen.de[131.188.3.71]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: -1,542,565,888 [application/octet-stream]
200 OK
0 mandola ~ wget --version
GNU Wget 1.9.1
0 mandola ~/we/wikipedia/ftp.uni-erlangen.de/pub/mirrors/wikipedia.de wget
hi alexander,
this is a known problem which is already fixed in cvs. perhaps you may want to
try using wget 1.10-alpha2:
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha2.tar.gz
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha2.tar.bz2
--
Aequam memento rebus in
wget 1.9.1 fails
when trying to download a very large file.
The download stopped
in between and attempting to resume shows a negative sized balance to be
downloaded.
e.g. ftp://ftp.solnet.ch/mirror/SuSE/i386/9.2/iso/SUSE-Linux-9.2-FTP-DVD.iso 3284710 KB
I read somewhere
that it is due
Sanjay Madhavan [EMAIL PROTECTED] writes:
wget 1.9.1 fails when trying to download a very large file.
The download stopped in between and attempting to resume shows a negative
sized balance to be downloaded.
e.g. ftp://ftp.solnet.ch/mirror/SuSE/i386/9.2/iso/SUSE-Linux-9.2-FTP-DVD.iso
, Hrvoje Niksic [EMAIL PROTECTED] wrote:
Sanjay Madhavan [EMAIL PROTECTED] writes:
wget 1.9.1 fails when trying to download a very large file.
The download stopped in between and attempting to resume shows a negative
sized balance to be downloaded.
e.g. ftp://ftp.solnet.ch/mirror/SuSE/i386
Bryan [EMAIL PROTECTED] writes:
I may run into this in the future. What is the threshold for large
files failing on the -current version of wget?
The threshold is 2G (2147483648 bytes).
I'm not expecting to d/l anything over 200MB, but is that even too
large for it?
That's not too
Hello,
I am behind a firewall and trying to retrieve some pages using
the following command:
wget --tries=0 -r --level=0 -x --proxy=on -N http://some_url
I always get this error : "No such file or
directory."
Please let me know if some other parameters need to be
provided.
Secondly I want
--
Mat Garrett
[EMAIL PROTECTED]
---BeginMessage---
I have some suggestions for small modifications for enhancing future
versions of wget.
I understand the need to first retrieve a .listing when using
--timestamping with an ftp:// URL, but shouldn't that step be skipped
when downloading the
Dear GNU,
I was trying to download SuSE version 9.2 from the local mirror site
thinking that I could get the entire package as a single DVD image (> 2 GB).
So I did the wget command with the appropriate FTP arguments, and ran it in
the background.
The first clue that this was going to have
Dear GNU,
I was trying to download SuSE version 9.2 from the local mirror site
thinking that I could get the entire package as a single DVD image (> 2
GB). So I did the wget command with the appropriate FTP arguments, and ran
it in the background.
The first clue that this was going to
Erik Ohrnberger [EMAIL PROTECTED] writes:
I was trying to download SuSE version 9.2 from the local mirror site
thinking that I could get the entire package as a single DVD image
(> 2 GB). So I did the wget command with the appropriate FTP
arguments, and ran it in the background.
Patrick Pirrotte wrote:
Hello,
I'm having a strange problem using wget-1.9.1-r2 or wget-1.9-r2.
I'm working behind a proxy (that has been duly configured with export
http_proxy) and connecting to
[SNIP]
I noticed the same thing. I think it is due to a patch (2004-11-18
Leonid Petrov [EMAIL
Hello,
I'm having a strange problem using wget-1.9.1-r2 or wget-1.9-r2.
I'm working behind a proxy (that has been duly configured with export
http_proxy) and connecting to
ftp://ftp.ebi.ac.uk/pub/databases/uniprot/knowledgebase/uniprot_sprot.fasta.
gz to get the newest Swissprot database. I
BUG in proxy-password:
It is not possible to pass a proxy password that includes whitespace.
It would be good to have the possibility to enclose the password in
special characters to make it work.
Greetings,
Carsten Giese
There seems to be some quirky code in ftp.c affecting time-stamping
(-N). The following complaints are based on the 1.9.1 released code,
so things may have improved (or at least changed) since then.
I haven't checked elsewhere, but on VMS, setting the file date to
some old value (touch())
I have been trying to mirror part of a remote site, but it hasn't worked
on subsequent mirror updates.
The problem is:
http.c:http_loop does exactly the right thing with regards to
the .html.orig: it identifies that it exists, uses it in determining
whether the timestamp of the remote page
-- -185546781 bytes (long int, must be long unsigned
int, best double long unsigned int)
unfortunately wget 1.9.1 does not have large file support, but there is a patch
waiting to be included in the next release.
--
Aequam memento rebus in arduis servare mentem...
Mauro Tortonesi
University of Ferrara
Dear Sirs,
I did:
> cd /tmp
> /usr/local/bin/wget -HEkpa wget.log http://svnbook.red-bean.com/svnbook-1.1/svn-book.html
and received almost a perfect copy of the Subversion book.
Only three images, which were referenced in styles.css, were missing.
The relevant entries in styles.css look like:
options don't exist, you are not to blame ;)
Should I get a newer version of wget?
1.9.1 is the latest stable version according to http://wget.sunsite.dk/
CU
Jens (just another user)
--
On Mon, 18 Oct 2004, Gerriet M. Denkmann wrote:
So - is this a bug, did I misunderstand the documentation, did I use
the wrong options?
Reasonable request. You just couldn't find the archives:
http://www.mail-archive.com/[EMAIL PROTECTED]/msg06626.html
more:
For those who may be interested, my VMS port of Wget 1.9.1 is now
available. It fixes many (possibly all) of the problems in ftp.c
related to VMS FTP servers.
As usual, there are some changes to source files which have nothing
in particular to do with VMS (elimination of miscellaneous
://www.sysinternals.com/index.shtml
DEBUG output created by Wget 1.9.1 on Windows.
set_sleep_mode(): mode 0x8001, rc 0x8000
--17:50:19-- http://www.sysinternals.com/index.shtml
=> `index.shtml.1'
Resolving www.sysinternals.com... seconds 0.00, 66.193.254.46
Caching www.sysinternals.com
Tristan Miller [EMAIL PROTECTED] writes:
There appears to be a bug in the documentation (man page, etc.) for
wget 1.9.1.
I think this is a bug in the man page generation process.
Greetings.
There appears to be a bug in the documentation (man page, etc.) for wget
1.9.1. Specifically, the section about the command-line option for
proxies ends abruptly:
-Y on/off
--proxy=on/off
Turn proxy support on or off. The proxy is on by default
I have compiled the 1.9.1 source on a Solaris 9 machine with OpenSSL
installed on both the compiling Sun and the operating Sun. I believe the
Sun compiler machine is properly configured, using the configure command
prior to compiling.
When I compile wget, I get a wget binary as
downloaded. Workaround by wget
would be fine.
_
wget and OS configuration
=
ftp://prep.ai.mit.edu/gnu/wget/wget-1.9.1.tar.gz mtime 2003-11-14
file wget-1.9.1/util/wget.spec, line 2:
I changed Version: 1.7 to Version: 1.9.1,
and built a new wget with tar
M P [EMAIL PROTECTED] writes:
I'm also trying to automatically login to
https://online.wellsfargo.com/cgi-bin/signon.cgi using
wget but with no luck so far.
Any ideas to get this working are greatly appreciated.
I'm finding it hard to try this out, but I *think* that a combination
of
Yup; 1.9.1 cannot download large files. I hope to fix this by the
next release.
I'm also trying to automatically login to
https://online.wellsfargo.com/cgi-bin/signon.cgi using
wget but with no luck so far.
Any ideas to get this working are greatly appreciated.
Thanks.
PM
* From: Greg Underwood
* Subject: Re: recursive and form posts in wget
1.9.1
* Date: Mon
Hi,
I use wget on an i386 Red Hat 9 box to download a 4 GB DVD from an ftp site.
The process stops at:
$ wget -c --proxy=off
ftp://redhat.com/pub/fedora/linux/core/2/i386/iso/FC2-i386-DVD.iso
--12:47:24--
ftp://redhat.com/pub/fedora/linux/core/2/i386/iso/FC2-i386-DVD.iso
=
It's a known
bug. I'm waiting for a fix for it myself.
Mark Post
-----Original Message-----
From: Lawrance, Mark [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, May 12, 2004 9:09 AM
To: [EMAIL PROTECTED]
Subject: GNU Wget 1.9.1
GNU Wget 1.9.1
The non-interactive download
Lawrance, Mark [EMAIL PROTECTED] writes:
I am unable to get wget to work via a proxy for HTTPS sites.
It does work via proxy for HTTP
It does work with HTTPS NOT through proxy
Any ideas? Should this work?
It works in the CVS version of Wget. Please try it and see if it
works
Dear Sirs,
I have downloaded and built wget 1.9.1 (on Mac OS X 10.3.3) and I am
quite amazed at the speed and capabilities of wget. Thanks for a great
tool!
One problem:
I did: /usr/local/bin/wget -P ~/Desktop/Wget -a ~/Desktop/wget.log -k
-p -H -E http://cocoadevcentral.com/articles/80
Gerriet M. Denkmann [EMAIL PROTECTED] writes:
So: either the -P option should work as it does, and then the man page
should mention this. Or it is a -P bug.
It's a bug, fixed in CVS.
I discovered a problem with wget 1.9.1 (Windows version). I tried to
download the file listed below in GetRight, and GetRight was able to
resume the file but failed after the filesize exceeded 2 GB. As I had the
remaining file and knew the server is capable of resuming, I thought of
using wget
The filesize limitation of FAT32 is 2 GB, AFAIK.
- Original Message -
From: Sebastian Armbrust [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Sunday, March 21, 2004 1:35 PM
Subject: Wget 1.9.1 resume problems
I discovered a problem with wget 1.9.1 (Windows version). I try to
download
Hello,
I use Wget (1.9.1) to stress-test my custom-built ISAPI application, by
using -r to recursively follow all links. In addition I use
-Dmydomain.org to prevent content from other domains from being requested.
However, when my ISAPI application sends a redirect (HTTP status code
301), wget simply
I've now installed this fix, thanks.
Greg Underwood [EMAIL PROTECTED] writes:
On Tuesday 27 January 2004 05:23 pm, Hrvoje Niksic wrote:
Greg Underwood [EMAIL PROTECTED] writes:
I took a peek at my cookies while logging into the site in a regular
browser. It definitely adds a session cookie when I log in,
I think your
On Tuesday 27 January 2004 05:23 pm, Hrvoje Niksic wrote:
Greg Underwood [EMAIL PROTECTED] writes:
I took a peek at my cookies while logging into the site in a regular
browser. It definitely adds a session cookie when I log in,
I think your problem should be solvable with
Greg Underwood [EMAIL PROTECTED] writes:
I took a peek at my cookies while logging into the site in a regular
browser. It definitely adds a session cookie when I log in,
I think your problem should be solvable with `--keep-session-cookies'.
The server will have no way of knowing that the two
Nicolas,
Thanks for the tip.
I took a peek at my cookies while logging into the site in a regular browser.
It definitely adds a session cookie when I log in, but when I just browse to
the login page, it doesn't appear to be adding a session cookie. There's a
site cookie there, but I don't
So I've got a complex problem. I've been perusing the archives and haven't seen
anything resembling a conversation on it, but I may well have missed one. If
this is a solved problem, I will apologize and disappear for the small
consideration of some pointers to the previous emails. :D
So, my
Hrvoje Niksic [EMAIL PROTECTED] writes:
What do you think about this patch:
+ if (USLEEP_usec > 0) \
+   usleep (USLEEP_usec); \
+} while (0)
Could you change this to have proper number conversions?
usleep ((unsigned long)USLEEP_usec);
works
Georg Bauhaus [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
What do you think about this patch:
+ if (USLEEP_usec > 0) \
+   usleep (USLEEP_usec); \
+} while (0)
Could you change this to have proper number
Hrvoje Niksic [EMAIL PROTECTED] writes:
I'm not sure why the above is in any way improper. [...]
It sounds like your usleep function is undeclared. See if the compiler's
-E output reports the declaration of usleep. I can add the cast to work
around the problem, but it should not be required.
Right,
Quoting Hrvoje Niksic ([EMAIL PROTECTED]):
[...]
Thanks, I'll try to backfeed the information into the PR today, so more
people can look at it.
--
j.
Taking a peek inside the wget 1.9.1 source reveals the following
construction used a number of times in src/retr.c:
usleep(1000000L * opt.wait);
There are a couple of problems with this approach. First, we have from
the NetBSD man-page of usleep(3):
The microseconds argument must
Jesper Louis Andersen [EMAIL PROTECTED] writes:
There are a couple of problems with this approach. First, we have
from the NetBSD man-page of usleep(3):
The microseconds argument must be less than 1,000,000. This renders
the sleep impossible for other values of opt.wait than 1. NetBSD
Quoting Hrvoje Niksic ([EMAIL PROTECTED]):
Thanks for pointing this out -- I had no idea that some systems don't
allow usleep to sleep for more than a second. Why do they do that?
I do not know. It conforms to XPG4.2, so I expect the limit to exist on
other operating systems too.
The latest
Hrvoje Niksic [EMAIL PROTECTED] writes:
The latest source (available in CVS) has a better sleeping function
that uses nanosleep where available, and that handles usleep's
wraparound for long sleeps. But it still can call usleep with values
larger than 1,000,000. I've attached a patch that
Hrvoje Niksic [EMAIL PROTECTED] writes:
May I suggest using sleep(3) instead. It is used in the code in other
places and has the semantics you want.
sleep(3) cannot sleep for less than a second. I like the idea of
being able to specify `--wait 0.5' for Wget to wait for half a
second
Thanks for the patch. A similar fix is already in CVS.
-BEGIN PGP SIGNED MESSAGE-
wget -A gz mplayer.hu/pipermail/mplayer-userss/ isn't working
correctly. Wget downloads the HTML files, then says they are
unwanted and finally deletes them. I have this problem often with
HTTP servers.
with best regards from Dortmund
-BEGIN PGP SIGNED MESSAGE-
wget -A gz mplayer.hu/pipermail/mplayer-userss/ isn't working
correctly. Wget downloads the HTML files, then says they are
unwanted and finally deletes them. I have this problem often with
HTTP servers. But wget 1.7.1 on OS/2 works correctly.
with best
Hans Werner Strube [EMAIL PROTECTED] writes:
There is a name clash in src/connect.c for IRIX 6.2: /usr/include/sys/socket.h
contains a #define sa_len ...
Thanks for the report; this is already fixed in CVS (both in the main
trunk and in the 1.9 branch).
There is a name clash in src/connect.c for IRIX 6.2: /usr/include/sys/socket.h
contains a #define sa_len ...
Hans Werner Strube [EMAIL PROTECTED]
Drittes Physikalisches Institut, Univ. Goettingen
Buergerstr. 42-44, 37073 Goettingen, Germany
Fix:
*** src/connect.c.ORI Sat Nov
To: [EMAIL PROTECTED]
Subject: Wget 1.9.1 has been released
Wget 1.9.1 is now available on ftp.gnu.org and its mirrors. It is a
bugfix release that contains fixes for several problems noted in the
1.9 release. Unless further serious bugs are discovered, it is likely
to remain the last
Wget 1.9.1 is now available on ftp.gnu.org and its mirrors. It is a
bugfix release that contains fixes for several problems noted in the
1.9 release. Unless further serious bugs are discovered, it is likely
to remain the last in the 1.9.x series.
Herold Heiko [EMAIL PROTECTED] writes:
Windows MSVC binary at http://xoomer.virgilio.it/hherold
Thanks. I assume this means that it compiled without a hitch.
Anyone else with a report? Should I release 1.9.1 now?