dear friends,
i have just released the third alpha version of wget 1.10:
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha3.tar.gz
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha3.tar.bz2
as always, you are encouraged to download the tarballs and test whether the code works for you.
Deep Space 6 - IPv6 for Linux    http://www.deepspace6.net
Ferrara Linux User Group         http://www.ferrara.linux.it
[Attachment: 20050428.wget-dep.diff]
[Attachment: 20050420.winreadme.diff]
Herold Heiko [EMAIL PROTECTED] writes:
windows/wget.dep needs an attached patch (change gen_sslfunc to openssl.c,
change gen_sslfunc.h to ssl.h).
Applied, thanks.
src/Makefile.in doesn't contain dependencies for http-ntlm$o
(windows/wget.dep either).
I don't have the dependency-generating
RFC 2817 seems to imply that CONNECT requests should include a `Host'
header, presumably with contents pretty much identical to the argument
of the CONNECT method.
The original CONNECT proposal by Luotonen didn't mention `Host' at
all. curl doesn't send it, while Mozilla does. I haven't checked
On Thu, 28 Apr 2005, Hrvoje Niksic wrote:
The original CONNECT proposal by Luotonen didn't mention `Host' at all.
curl doesn't send it
curl has sent CONNECT for many years without it, and nobody has ever reported
a problem with it.
That doesn't however mean it shouldn't be there to adhere to
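For reference, this is roughly what the request under discussion looks like on the wire; the host and port here are illustrative, not taken from the thread. The `Host' header simply repeats the authority given as the CONNECT target:

```shell
# Print a CONNECT request that includes the Host header discussed above
# (illustrative names; RFC 2817 is the spec that suggests sending Host).
printf 'CONNECT www.example.com:443 HTTP/1.1\r\n'
printf 'Host: www.example.com:443\r\n'
printf '\r\n'
```

A client would send exactly these bytes to the proxy before starting the TLS handshake over the resulting tunnel.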
Compilation fails after ./configure --without-ssl:
===cut on===
gcc -I. -I. -DHAVE_CONFIG_H -DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\"
-DLOCALEDIR=\"/usr/local/share/locale\" -O2 -Wall -Wno-implicit -c init.c
init.c:214: structure has no member named `random_file'
init.c:214: initializer element is not constant
Thanks for the report; this problem is fixed in CVS. The workaround
is to wrap the appropriate init.c line in #ifdef HAVE_SSL.
Dear Photoways customer,
You sent us a message by email, but this address does not accept incoming
mail, so your message cannot be read.
To answer every request quickly, we have set up a form to fill in. This
allows us to sort and
Hi everybody,
I tried:
wget -nd -r -l1 -k -p
'http://www.medienwerkstatt-online.de/lws_wissen/index.php?level=3&kategorie_1=Technik+und+Umwelt&kategorie_2=Jahreszeiten&kategorie_3=Fr%FChling'
The page contains many links with PHP targets similar to above.
Wget downloaded all the linked files,
Joachim Fahnenmueller [EMAIL PROTECTED] writes:
The page contains many links with PHP targets similar to above.
Wget downloaded all the linked files, pictures etc correctly, but
then I had two problems:
1. Some local links don't work. E. g. one of the downloaded pages is saved as
Can I somehow give wget an HTML file's local hard-disk location instead of
a URL, and have it retrieve the files at the URLs referenced in that HTML
file?
Thanks, Alan
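Wget does support this: -i FILE reads URLs from a local file, --force-html tells it to parse that file as HTML rather than as a plain URL list, and --base supplies the prefix for resolving relative links. A sketch, with an illustrative filename, link, and base URL:

```shell
# Stand-in for a locally saved page (illustrative content).
cat > saved-page.html <<'EOF'
<html><body>
<a href="docs/manual.pdf">manual</a>
</body></html>
EOF

# Parse the local file as HTML and fetch the links it references;
# --base resolves relative URLs.  "|| true" only keeps this demo from
# aborting when run without network access.
wget --force-html --base=http://www.example.com/ -i saved-page.html || true
```

With the options above, wget would request http://www.example.com/docs/manual.pdf without ever fetching the page itself.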