Hi there!
Since I compiled wget 1.8 some time ago, I've been wondering whether the
different behavior of 'wget -m' is a bug or a feature. All versions up to
1.7 did what I expected, but 1.8 retrieves only a single file (usually
index.html). What is going wrong here? Do I just need a newer version?
Hi!
In wget 1.8, FTP recursion through proxy servers no longer works: when I run
wget --execute 'ftp_proxy = http://some.proxy:3128' -m ftp://some.host/dir/
wget retrieves only the file ftp://some.host/dir/index.html
and stops.
I found the following ChangeLog entries:
* recur.c
Hi, all
I've tried to mirror the Open Group's website as follows:
wget -r -nv -p --convert-links -nH -nd http://www.opengroup.org/onlinepubs/007904975/
and got this:
09:55:36 URL:http://www.opengroup.org/contacts/ [7923] -> index.html.42 [1]
wget: recur.c:752: register_download: Assertion
Vesselin Markov [EMAIL PROTECTED] writes:
I wrote a Bulgarian localization of Wget 1.8.1 which
I hope you'll accept. I am sending
wget-1.8.1-BG/po/bg.po
wget-1.8.1-BG/po/bg.gmo
wget/share/locale/bg/LC_MESSAGES/wget.mo
Thanks a lot -- but please consider joining the Bulgarian translation
cagri coltekin [EMAIL PROTECTED] writes:
Apologies if this is a known issue. However, it seems that as of
wget 1.8, the `?' char is treated as a separator in URLs. But this
feature breaks FTP downloads that use the wild-card `?'. It would be
nice to disable this in url_parse() if url
Hi!
I'm having trouble using wget 1.8.[01] over a (squid24-) proxy
to mirror an FTP directory:
# setenv ftp_proxy http://139.21.68.25:
# wget181 -r -np -l0 ftp://ftp.funet.fi/pub/Linux/mirrors/redhat/redhat/linux/updates
--12:06:58-- ftp://ftp.funet.fi/pub/Linux/mirrors/redhat/redhat/linux
# wget181 -r -np -l0
ftp://ftp.funet.fi/pub/Linux/mirrors/redhat/redhat/linux/updates
ummm... looks like the -l0 might be limiting your recursion level to 0
levels
On 12 Feb 2002 at 12:30, Holger Pfaff wrote:
I'm having trouble using wget 1.8.[01] over a (squid24-) proxy
to mirror an FTP directory:
# setenv ftp_proxy http://139.21.68.25:
# wget181 -r -np -l0 ftp://ftp.funet.fi/pub/Linux/mirrors/redhat/redhat/linux/updates
--12:06:58-- ftp
On 12 Feb 2002 at 7:54, Winston Smith wrote:
# wget181 -r -np -l0
ftp://ftp.funet.fi/pub/Linux/mirrors/redhat/redhat/linux/updates
ummm... looks like the -l0 might be limiting your recursion level to 0
levels
No. '-l0' is the same as '-l inf'.
I've never tried wget through an HTTP-based FTP proxy.
Are there any clues in the file it wrote (presumably an HTML-format
directory listing)?
Are there any more clues if you use the -d (--debug) option?
# wget181 -r -np -l8 --debug
Hi,
I am using the following script to mirror pages from
http://newton.uam.mx/xgeorge/
in the local directory (served as http://www.tarunz.org/~xgeorge/):
wget -q -t10 -c -T120 -nH --cut-dirs=1 -r -m -np http://newton.uam.mx/xgeorge/
Now, if the remote directory index document (mapped to the
children w/old parent broken (wget-1.8)
Hi,
Apologies if this is a known issue. However, it seems that as of
wget 1.8, the `?' char is treated as a separator in URLs. But
this feature breaks FTP downloads that use the wild-card `?'. It
would be nice to disable this in url_parse() if url is not an
https* url.
Regards.
--
cagri
-- https://mylogin:*password*@myhostsip/enderle/
=> `myhostsip/enderle/index.html'
Connecting to cacheip:3128... connected.
zsh: segmentation fault (core dumped) wget -m -r -v
https://mylogin:mypass@myhostsip/enderle/
edv02::/home/enderle % wget --version
GNU Wget 1.8
Hi,
I have downloaded your source code for wget and tried to make it, but
it failed due to a va_list parameter conflict between stdarg.h and
stdio.h. Please advise.
Regards,
Tay Ngak San
Mobile Phone: 9620-9712
Tay Ngak San [EMAIL PROTECTED] writes:
I have downloaded your source code for wget and tried to make it, but
it failed due to a va_list parameter conflict between stdarg.h and
stdio.h. Please advise.
What OS and compiler are you using to compile Wget?
Thomas Reinke [EMAIL PROTECTED] writes:
Neat... not sure that I really know enough to start digging to easily
figure out what went wrong, but it can be reproduced by running the
following:
$ wget -d -r -l 5 -t 1 -T 30 -o x.lg -p -s -P dir -Q 500
--limit-rate=256000 -R mpg,mpeg
The util/wget.spec file in wget-1.8.tar.gz has 1.7 as the version number.
I suggest that this file be replaced with a util/wget.spec.in file that
would use @VERSION@.
--
Pierre Sarrazin sarrazip at sympatico dot ca
On Mon, 17 Dec 2001, Hrvoje Niksic wrote:
Zvi Har'El [EMAIL PROTECTED] writes:
Although wget doesn't dump core, https through a proxy does not
work. Note that wget should send the proxy an HTTP CONNECT request
to establish an SSL tunnel. This doesn't happen, and instead it
sends GET
Zvi Har'El [EMAIL PROTECTED] writes:
Even so, adding support for CONNECT might be non-trivial in Wget's
hairy old HTTP code. I think it will have to wait for a cleanup of
the HTTP backend.
This is your decision, of course, but it should be understood that
right now you cannot use
Although retrieve_tree() stores and retrieves referring URLs in the
URL queue, it does not pass them to retrieve_url(). This seems to
have got lost during the transition from depth-first to breadth-first
retrieval.
This means that HTTP requests for URLs being retrieved at depth
greater than 0
Ian Abbott [EMAIL PROTECTED] writes:
Although retrieve_tree() stores and retrieves referring URLs in the
URL queue, it does not pass them to retrieve_url(). This seems to
have got lost during the transition from depth-first to breadth-first
retrieval.
It was an oversight on my part.
Dear Hrvoje Niksic
Although wget doesn't dump core, https through a proxy does not work. Note that
wget should send the proxy an HTTP CONNECT request to establish an SSL tunnel.
This doesn't happen, and instead it sends GET https://..., which is wrong!
DEBUG output created by Wget 1.8
Zvi Har'El [EMAIL PROTECTED] writes:
Although wget doesn't dump core, https through a proxy does not
work. Note that wget should send the proxy an HTTP CONNECT request
to establish an SSL tunnel. This doesn't happen, and instead it
sends GET https://..., which is wrong!
Would CONNECT work with
On Mon, 17 Dec 2001, Hrvoje Niksic wrote:
Would CONNECT work with HTTP/1.0? My google search indicates that CONNECT
predates HTTP/1.1, but I'm not sure if it would work without trying it.
Yes. CONNECT has been the way to do SSL over HTTP proxies for many years,
even during the
Vladimir Volovich [EMAIL PROTECTED] writes:
while downloading some file (via http) with wget 1.8, i got an error:
assertion failed: p - bp->buffer <= bp->width, file progress.c, line 673
Abort (core dumped)
Thanks for the report. It's a known problem in 1.8, fixed by this
patch.
Index: src
On 2001-12-16 19:02 +0100, Hrvoje Niksic wrote:
Andre Majorel [EMAIL PROTECTED] writes:
On 2001-12-15 07:37 +0100, Hrvoje Niksic wrote:
Is there a good fallback value of RAND_MAX for systems that don't
bother to define it?
The standard (SUS2) says:
The value of the
Andre Majorel [EMAIL PROTECTED] writes:
5th edition, 6th edition, 7th edition and System III all returned
0-32767. As RAND_MAX didn't exist at the time, plenty of code must
have been written that assumed 0-32767. For that reason I think it
unlikely that anybody ever wrote an implementation
On 2001-12-15 07:37 +0100, Hrvoje Niksic wrote:
Is there a good fallback value of RAND_MAX for systems that don't
bother to define it?
The standard (SUS2) says:
The value of the {RAND_MAX} macro will be at least 32767.
--
André Majorel
Work: [EMAIL PROTECTED]
Home: [EMAIL PROTECTED]
Mike Castle [EMAIL PROTECTED] writes:
I'm not on the list, just following via web archive.
In response to:
http://www.mail-archive.com/wget@sunsite.dk/msg02211.html
On my _homegrown_ Linux system, I'm also having an issue.
I have a libmd5, but no md5.h.
I believe this problem has
Greetings,
I have successfully built wget-1.8 under linux-2.4.xx (at home)
and under SunOS-5.5, but under SunOS-4.1.4, I get a compile-time
error:
gcc -I. -I../../../../src/wget-1.8/src -DHAVE_CONFIG_H -DSYSTEM_WGETRC=\"/u/shelton/new/SunOS-4.1/etc/wgetrc\" -DLOCALEDIR=\"/u/shelton/new
Vin Shelton [EMAIL PROTECTED] writes:
I have successfully built wget-1.8 under linux-2.4.xx (at home) and
under SunOS-5.5, but under SunOS-4.1.4, I get a compile-time error:
../../../../src/wget-1.8/src/retr.c:682: `RAND_MAX' undeclared (first use in this function)
[...]
Please let me
Hi,
Setting https_proxy to some proxy, and doing wget https://..., wget dumps core:
(gdb) run https://www.math.technion.ac.il/
Starting program: /usr/local/src/wget-1.8/src/wget
https://www.math.technion.ac.il/
--17:48:32-- https://www.math.technion.ac.il/
=> `index.html'
Resolving
Zvi Har'El [EMAIL PROTECTED] writes:
Setting https_proxy to some proxy, and doing wget https://..., wget
dumps core:
Thanks for the report. This patch should remove the crash.
2001-12-13 Hrvoje Niksic [EMAIL PROTECTED]
* http.c (gethttp): Check for conn->scheme, not u->scheme,
Hi,
when I run wget 1.8 with two arguments:
wget http://some/url http://some/url
which are the same, I get:
Assertion failed: !hash_table_contains (dl_url_file_map, url), file recur.c, line 752
Abort (core dumped)
Best,
v.
hi.
when leeching one file, wget dumped core.. I think it got a timeout from
squid or something.
$ wget -c -t0 http://prdownloads.sourceforge.net/cmusphinx/sphinx2-0.4.tar.gz
--00:04:17-- http://prdownloads.sourceforge.net/cmusphinx/sphinx2-0.4.tar.gz
=> `sphinx2-0.4.tar.gz'
Resolving
Sami Farin [EMAIL PROTECTED] writes:
when leeching one file, wget dumped core..
Thanks for the report. This is a known problem with the 1.8 release,
fixed by this patch:
Index: src/progress.c
===
RCS file:
I'm not on the list, just following via web archive.
In response to:
http://www.mail-archive.com/wget@sunsite.dk/msg02211.html
On my _homegrown_ Linux system, I'm also having an issue.
I have a libmd5, but no md5.h.
Apparently installing w3c-libwww-5.3.2 installs the library (with said
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
Herold Heiko [EMAIL PROTECTED] writes:
I put up the current CVS, mainly since there have been those patches
to ftp-ls.c and the signal handler. OK?
Please don't do that. Although all changes in the current CVS
*should* be stable,
I just downloaded and compiled wget 1.8 on Solaris 8 Intel MU6 with gcc
2.95.3 (configured to use Solaris ld and GNU as) and OpenSSL 0.9.6b. Thanks
for folding the PRNG code into this release.
One problem I have encountered while building wget is a conflict between
/usr/include/md5.h and /usr
Fazal Majid [EMAIL PROTECTED] writes:
I just downloaded and compiled wget 1.8 on Solaris 8 Intel MU6 with
gcc 2.95.3 (configured to use Solaris ld and GNU as) and OpenSSL
0.9.6b. Thanks for folding the PRNG code into this release.
Excellent -- glad it works for you.
One problem I have
Raúl Núñez de Arenas Coronado [EMAIL PROTECTED] writes:
I've downloaded, compiled and installed the new wget 1.8, and I have
a problem with the new progress bar (the default one).
When downloading, at some random point, an assertion fails at
file progress.c, line 673 (function
Hello Hrvoje :))
'p - bp->buffer <= bp->width'
Yes; thanks for the report. This patch should fix the problem:
Thanks a lot for the patch and for the speed :
Shit, you're great :))) Thanks again for wget :))
Raúl
Raúl Núñez de Arenas Coronado [EMAIL PROTECTED] writes:
'p - bp->buffer <= bp->width'
Yes; thanks for the report. This patch should fix the problem:
Thanks a lot for the patch and for the speed :
You're welcome. I'm usually not that fast, but as you can imagine,
you're not the
Sent: Monday, December 10, 2001 9:43 AM
To: Wget List
Subject: Re: Wget 1.8 is released
The new version has appeared on the GNU site:
ftp://ftp.gnu.org/pub/gnu/wget/wget-1.8.tar.gz
war [EMAIL PROTECTED] writes:
gcc -I. -I. -I/app/openssl-0.9.6b/include -DHAVE_CONFIG_H
-DSYSTEM_WGETRC=\"/app/wget-1.8/etc/wgetrc\"
-DLOCALEDIR=\"/app/wget-1.8/share/locale\" -O2 -Wall -Wno-implicit -c
gen-md5.c
In file included from gen-md5.c:31:
/usr/include/md5.h:27:
Could you send
Erik Sigra [EMAIL PROTECTED] writes:
I have compiled the previous versions of Wget without any
problem. But version 1.8 introduced a problem; it can't find md5.h
when compiling gen-md5.c.
I have the file md5.h in /usr/local/ssl/include/openssl but the Wget
compilation seems to look in
war [EMAIL PROTECTED] writes:
This file contains any messages produced by compilers while running
configure, to aid debugging if configure makes a mistake.
This configure run looks totally hosed.
The line we're looking for is the one that attempts to detect
MD5Update in libmd5:
Wget 1.8 is released. It should appear on ftp.gnu.org soon; until it
does, you can get it from:
ftp://ftp.gnjilux.hr/pub/unix/util/wget/wget-1.8.tar.gz
MD5 checksum of the archive is:
000caf43722b46df1f58b6fb2deb5b58
Please send bug reports to [EMAIL PROTECTED].
I will announce
Hrvoje Niksic [EMAIL PROTECTED] writes:
Wget 1.8 is released. It should appear on ftp.gnu.org soon; until it
does, you can get it from:
ftp://ftp.gnjilux.hr/pub/unix/util/wget/wget-1.8.tar.gz
This is a typo: the actual URL is:
ftp://ftp.srk.fer.hr/pub/unix/util/wget/wget-1.8
Stefan Bender [EMAIL PROTECTED] writes:
... but at home I get
gen-md5.c:31: md5.h: No such file or directory
when I try to compile the newer cvs versions.
(debian/potato, openssl installed in /usr/local)
That's weird. Apparently HAVE_SOLARIS_MD5 gets misautodetected for
you.
Do you have a
On 01/12/2001 19:44:44 John Poltorak wrote:
On Sat, Dec 01, 2001 at 04:30:47PM +0100, Hrvoje Niksic wrote:
John Poltorak [EMAIL PROTECTED] writes:
Is it possible to include OBJEXT in Makefile.in to make this more
cross-platform?
I suppose so. I mean, o is already defined to .@U@o, but
/wget-1.8-beta2.tar.gz
Success:
- Debian GNU/Linux woody, 80x86, GCC 2.95.4
- Solaris 7, SPARC, GCC 2.95.2
Failure:
- HP-UX 10.0, PA-RISC, GCC 3.0.1
Problem #1 :
gcc -I. -I. -DHAVE_CONFIG_H -DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\"
-DLOCALEDIR=\"/usr/local/share/locale\" -O2 -Wall -Wno
On Mon, 3 Dec 2001, Andre Majorel wrote:
Problem #2 :
gcc -I. -I. -DHAVE_CONFIG_H -DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\"
-DLOCALEDIR=\"/usr/local/share/locale\" -O2 -Wall -Wno-implicit -c host.c
host.c: In function `lookup_host':
host.c:258: `h_errno' undeclared (first use
On 2001-12-03 18:30 +0100, Hrvoje Niksic wrote:
Andre Majorel [EMAIL PROTECTED] writes:
gcc -I. -I. -DHAVE_CONFIG_H -DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\"
-DLOCALEDIR=\"/usr/local/share/locale\" -O2 -Wall -Wno-implicit -c connect.c
connect.c: In function `test_socket_open':
Andre Majorel [EMAIL PROTECTED] writes:
And weird ones, too. These arguments are of type pointer to
fd_set. What would HPUX like to see there?
HP-UX 10 wants (int *). However it defines fd_set as a struct
containing an array of long, so it works anyway.
HP-UX 10 is wrong. SUS2 (and
/wget-1.8-beta2.tar.gz
Success:
- NCR MP-RAS 3.0, x86, NCR High Performance C Compiler R3.0c
- FreeBSD 4.0, x86, GCC 2.95.2
Thanks !
--
André Majorel URL:http://www.teaser.fr/~amajorel/
(Not speaking for my employer, etc.)
Maciej W. Rozycki [EMAIL PROTECTED] writes:
Better yet:
#if !HAVE_DECL_H_ERRNO
extern int h_errno;
#endif
and use AC_CHECK_DECLS(h_errno,,,[#include <netdb.h>]) somewhere in
configure.in.
My version of Autoconf does not have an AC_CHECK_DECLS macro.
On Mon, 3 Dec 2001, Hrvoje Niksic wrote:
and use AC_CHECK_DECLS(h_errno,,,[#include <netdb.h>]) somewhere in
configure.in.
My version of Autoconf does not have an AC_CHECK_DECLS macro.
Hmm, how about considering autoconf 2.52? It is said to be less broken
than 2.13 and indeed it seems
On 2001-12-03 19:16 +0100, Hrvoje Niksic wrote:
I find describing HP-UX 10 as a modern OS mildly amusing. :-)
How old is it? I used to work on HPUX 9, and I'm not old by most
definitions of the word.
Around 1995.
I completely disagree with your perception that snprintf() is to be
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Mon, 3 Dec 2001, Hrvoje Niksic wrote:
and use AC_CHECK_DECLS(h_errno,,,[#include <netdb.h>]) somewhere in
configure.in.
My version of Autoconf does not have an AC_CHECK_DECLS macro.
Hmm, how about considering autoconf 2.52?
Yes, but not
Bugfixes since 1.8-beta2. Please test it from clean compilation on
Unix (Windows and MacOS are known not to compile without modifications
when SSL is used.)
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta3.tar.gz
(The `.betas' directory is intentionally
On 2001-12-03 21:55 +0100, Hrvoje Niksic wrote:
Bugfixes since 1.8-beta2. Please test it from clean compilation on
Unix (Windows and MacOS are known not to compile without modifications
when SSL is used.)
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta3
/wget/.betas/wget-1.8-beta2.tar.gz
--08:47:50-- ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta2.tar.gz
=> `wget-1.8-beta2.tar.gz/.listing'
Resolving gnjilux.srk.fer.hr... done.
Connecting to gnjilux.srk.fer.hr[161.53.70.141]:21... connected.
Logging in as anonymous
Jochen Roderburg [EMAIL PROTECTED] writes:
wget.18 ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta2.tar.gz
--08:47:50--
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta2.tar.gz
=> `wget-1.8-beta2.tar.gz/.listing
Jochen Roderburg [EMAIL PROTECTED] writes:
I think the 'timestamping' is the point, because without that I see
that no .listing file is used at all, and the problem doesn't
appear.
Yes, with the timestamping turned on, I was able to repeat the
problem. This patch should fix it:
2001-12-02
I have the output listing from ./configure and make for Wget-1.8-Beta under OS 10.1.1
Macintosh.
I don't want to bore the mailing list by including it. To whom should I send it
off-list?
Thanks,
Robin Lake
[EMAIL PROTECTED]
John Poltorak [EMAIL PROTECTED] writes:
Is it possible to include OBJEXT in Makefile.in to make this more
cross-platform?
I suppose so. I mean, o is already defined to .@U@o, but I'm not
exactly sure what the U is supposed to stand for.
Robin B. Lake [EMAIL PROTECTED] writes:
I have the output listing from ./configure and make for
Wget-1.8-Beta under OS 10.1.1 Macintosh. I don't want to bore the
mailing list by including it. To whom should I send it off-list?
If the compilation worked, you needn't send it at all
Robin B. Lake [EMAIL PROTECTED] writes:
I am trying to get real-time stock quotes from my broker's Web site.
If I come in via an http:// request, I get 20-minute delayed data.
If I log in with my name and password via my browser, I get
real-time data. By monitoring the IP packets, it seems
On Sat, Dec 01, 2001 at 04:30:47PM +0100, Hrvoje Niksic wrote:
John Poltorak [EMAIL PROTECTED] writes:
Is it possible to include OBJEXT in Makefile.in to make this more
cross-platform?
I suppose so. I mean, o is already defined to .@U@o, but I'm not
exactly sure what the U is supposed
Here is the next 1.8 beta. Please test it if you can -- try compiling
it on your grandma's Ultrix box, run it on your niece's flashy web
site, see if cookies work, etc.
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta2.tar.gz
(The `.betas' directory
On Sat, 1 Dec 2001, Hrvoje Niksic wrote:
Here is the next 1.8 beta. Please test it if you can -- try compiling
it on your grandma's Ultrix box, run it on your niece's flashy web
site, see if cookies work, etc.
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8
Hello.
Here is the next 1.8 beta. Please test it if you can -- try compiling
it on your grandma's Ultrix box, run it on your niece's flashy web
site, see if cookies work, etc.
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta2.tar.gz
Please pay attention
://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta1.tar.gz
(The `.betas' directory is intentionally unreadable, but the file is
there.)
I got a segmentation fault when retrieving URLs from a file.
2001-11-27 Ian Abbott [EMAIL PROTECTED]
* retr.c (retrieve_from_file): Initialize `new_file' to NULL to
prevent seg fault.
Index: src/retr.c
===
RCS
Ian Abbott [EMAIL PROTECTED] writes:
I got a segmentation fault when retrieving URLs from a file.
2001-11-27 Ian Abbott [EMAIL PROTECTED]
* retr.c (retrieve_from_file): Initialize `new_file' to NULL to
prevent seg fault.
Good catch. I've applied this, thanks!