Hello,
I got an error (see attachment) while compiling the latest CVS Wget
1.8.1-pre2+cvs. My platform is FreeBSD 4.4. The last version that
compiled successfully was Wget 1.8+cvs.
Regards,
Alexis
gateway:~/work/wget$ make install
cd src && make CC='gcc' CPPFLAGS='' DEFS='-DHAVE_CONFIG_H
Alexey Aphanasyev [EMAIL PROTECTED] writes:
I got an error (see attachment) during latest CVS Wget
1.8.1-pre2+cvs compilation.
[...]
gcc -O2 -Wall -Wno-implicit -o wget cmpt.o connect.o cookies.o fnmatch.o ftp.o
ftp-basic.o ftp-ls.o ftp-opie.o getopt.o hash.o headers.o host.o html-parse.o
Alan Eldridge [EMAIL PROTECTED] writes:
There's a garbage newline output in http.c. A noticeable effect of
this is that when updating a directory using -N, you get a blank line
for each file that is considered for download.
I don't think that's a garbage newline; that newline is intentional, at
On Mon, 17 Dec 2001, Hrvoje Niksic wrote:
Zvi Har'El [EMAIL PROTECTED] writes:
Although wget doesn't dump core, https through a proxy does not
work. Note that wget should send the proxy the HTTP CONNECT request
to establish an SSL tunnel. This doesn't happen; instead it
sends GET
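For reference, the tunnel request itself is tiny; a minimal sketch of what a client sends the proxy before the SSL handshake begins (the function name, host, and port are illustrative, not from the report or Wget's code):

```c
/* Sketch: build the CONNECT request a client sends a proxy to open an
   SSL tunnel.  The proxy splices a raw TCP connection to host:port and
   the TLS handshake then runs end-to-end through it.  Illustrative
   helper, not Wget's actual code. */
#include <stdio.h>

/* Format "CONNECT host:port HTTP/1.0" plus the blank line that ends
   the request headers; returns buf for convenience. */
static char *build_connect_request(char *buf, size_t size,
                                   const char *host, int port)
{
    snprintf(buf, size, "CONNECT %s:%d HTTP/1.0\r\n\r\n", host, port);
    return buf;
}
```

After the proxy answers with a 200 status, the client simply starts the SSL handshake on the same connection; no further HTTP framing is involved.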
On 18 Dec 2001 at 23:13, Hrvoje Niksic wrote:
Ian Abbott [EMAIL PROTECTED] writes:
If I have a website http://somesite/ with three files on it:
index.html, a.html and b.html, such that index.html links only to
a.html and a.html links only to b.html then the following command
will
Ian Abbott [EMAIL PROTECTED] writes:
what I actually used was more like the following:
wget -r -l 1 http://somesite/~user/index.html \
http://somesite/~user/a.html
which resulted in a.html being downloaded twice.
If I replace the ~'s on the command-line with %7E's then it
Yes, sure. Please find it attached below.
Hrvoje Niksic wrote:
Alexey Aphanasyev [EMAIL PROTECTED] writes:
I got an error (see attachment) during latest CVS Wget
1.8.1-pre2+cvs compilation.
[...]
gcc -O2 -Wall -Wno-implicit -o wget cmpt.o connect.o cookies.o fnmatch.o ftp.o
On 19 Dec 2001 at 17:40, Alexey Aphanasyev wrote:
Hrvoje Niksic wrote:
The `gnu-md5.o' object is missing. Can you show us the output from
`configure'?
Yes, sure. Please find it attached below.
Have you tried running make distclean before ./configure? It is
possible that some of your
Zvi Har'El [EMAIL PROTECTED] writes:
Even so, adding support for CONNECT might be non-trivial in Wget's
hairy old HTTP code. I think it will have to wait for a cleanup of
the HTTP backend.
This is your decision, of course, but it should be understood that
right now you cannot use
Several fixes since 1.8.1-pre2. As I said the last time, if all goes
well, I plan to release 1.8.1 some time tomorrow.
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8.1-pre3.tar.gz
(The `.betas' directory is intentionally unreadable, but the file is
there.)
hi chaps,
this is my first post and I must say congratulations to the
initiators and all the contributors: wget is a useful, neat
and efficient utility. As there are so many knowledgeable people
on line ;) and I don't seem to have many docs, does anyone know
the command switch sequence to:-
New binary for wget 1.8-pre3+cvs at http://space.tin.it/computer/hherold
Untested, I'm in a hurry
Bye
Heiko
--
-- PREVINET S.p.A. [EMAIL PROTECTED]
-- Via Ferretto, 1 ph +39-041-5907073
-- I-31021 Mogliano V.to (TV) fax +39-041-5907087
-- ITALY
We've noted in a few cases that wget can hang on connect() due
to a lack of any form of timeout management. We've made a change
to the routine connect_to_one in connect.c that will
implement a timeout mechanism on connect without the use of
signals or alarms. I've attached the modified version
Hi!
this is not strictly speaking a bug, but it is an inconsistency.
when i run
wget -x http://some.host/path%20to%20file/file%20name.html
wget saves the result in some.host/path%20to%20file/file name.html
i.e. it decodes %-characters in filename, but not in directory
name(s).
since these
Thomas Reinke [EMAIL PROTECTED] writes:
We've noted in a few cases that wget can hang on connect() due to a
lack of any form of timeout management. We've made a change to the
routine connect_to_one in connect.c that will implement a
timeout mechanism on connect without the use of signals or
Mike [EMAIL PROTECTED] writes:
Ok thanks, so the full command sequence to
get all the files which have an extension of '.txt' from
http://www.domain.com/subdir1/subdir2 and place them
in my current directory is:-
wget -A '*.txt' -r -l 1 -nd http://www.domain.com/subdir1/subdir2
Vladimir Volovich [EMAIL PROTECTED] writes:
this is not strictly speaking a bug, but it is an inconsistency.
when i run
wget -x http://some.host/path%20to%20file/file%20name.html
wget saves the result in some.host/path%20to%20file/file name.html
i.e. it decodes %-characters in
Alexey Aphanasyev [EMAIL PROTECTED] writes:
Something is very wrong here. Almost every single line of configure
output is cached. What version of Autoconf are you using?
autoconf-2.13
That version should work. Have you performed `make distclean' before
configuring? It sounds like some
Ian Abbott wrote:
On 19 Dec 2001 at 17:40, Alexey Aphanasyev wrote:
Hrvoje Niksic wrote:
The `gnu-md5.o' object is missing. Can you show us the output from
`configure'?
Yes, sure. Please find it attached below.
Have you tried running make distclean before ./configure? It is
Hrvoje == Hrvoje Niksic writes:
this is not strictly speaking a bug, but it is an inconsistency.
when i run
wget -x http://some.host/path%20to%20file/file%20name.html
wget saves the result in some.host/path%20to%20file/file
name.html
i.e. it decodes %-characters in filename,
Vladimir Volovich [EMAIL PROTECTED] writes:
Hrvoje The inconsistency is a bug. It is intended that Wget encodes
Hrvoje all the unsafe characters, both in files and directories.
Hrvoje (It is debatable whether that is a bug.) This patch makes it
Hrvoje consistent, but I will not apply it
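The patch itself isn't shown in the archive. For illustration only, "consistent" here means applying the same %XX decoding to directory components and the filename alike; a sketch of such a decoder (an assumed helper, not the actual patch):

```c
/* Sketch: decode %XX escapes across the whole path, directories and
   filename alike -- the "consistent" behavior discussed above.
   Illustrative helper, not the posted patch. */
#include <ctype.h>
#include <stdlib.h>

/* Decode %XX sequences in-place; malformed escapes are left untouched. */
static void url_unescape(char *s)
{
    char *d = s;
    for (; *s; s++, d++) {
        if (*s == '%' && isxdigit((unsigned char) s[1])
                      && isxdigit((unsigned char) s[2])) {
            char hex[3] = { s[1], s[2], '\0' };
            *d = (char) strtol(hex, NULL, 16);
            s += 2;                     /* skip the two hex digits */
        } else
            *d = *s;
    }
    *d = '\0';
}
```

On the example from the report, this turns path%20to%20file/file%20name.html into "path to file/file name.html" in every component, not just the last one.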
On Wed, 19 Dec 2001, Hrvoje Niksic wrote:
But one problem with this implementation is portability -- I'm pretty sure
that some systems don't support FIONBIO.
Correct. Ancient ones, it seems; I couldn't find a single modern (eh, no,
don't ask me to define that term) system that doesn't do it
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 19 Dec 2001, Hrvoje Niksic wrote:
But one problem with this implementation is portability -- I'm pretty sure
that some systems don't support FIONBIO.
Correct. Ancient ones, it seems; I couldn't find a single modern
(eh, no, don't ask me to
Secondly, why not downgrade to blocking connects if you couldn't
figure out how to do non-blocking ones?
I suppose that's a possibility. Or we could just use FIONBIO which
works on modern systems, and turn off connect timeouts for others.
So far I've been consistently reaching the
On Wed, 19 Dec 2001, Hrvoje Niksic wrote:
You could solve it with a plain and simple alarm() and a signal
handler. It would work on pretty much all unix-systems...
That's true. Again, I just never saw the point. I assume you didn't do
that in libcurl because you didn't want to muck
Herold Heiko [EMAIL PROTECTED] writes:
New binary for wget 1.8-pre3+cvs at
http://space.tin.it/computer/hherold
Huh? You mean 1.8.1-pre3+cvs? And btw, why not make 1.8.1-pre3
available?
Yes, exactly.
I had just a couple of minutes, no time to download manually etc. On the
other hand
Thomas Reinke [EMAIL PROTECTED] writes:
Again, I just never saw the point.
FWIW, as I mentioned to Hrvoje earlier off-line, it can be a reliability
issue. Without it, wget can hang and require some form of intervention
to terminate properly,
I guess I was just lucky never to encounter