Found it.
Using the 23:00 connect.c and the 23:59 retr.c does produce the bug.
Using the 23:59 connect.c and the 23:00 retr.c works ok.
This means the problem must be in retr.c.
Heiko
--
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax
Noèl Köthe [EMAIL PROTECTED] writes:
at the end of the description of the option --http-passwd=password:
For more information about security issues with Wget,
The sentence is incomplete.
wget.texi shows:
For more information about security issues with Wget, @xref{Security
Herold Heiko [EMAIL PROTECTED] writes:
Found it.
Using the 23:00 connect.c and the 23:59 retr.c does produce the bug.
Using the 23:59 connect.c and the 23:00 retr.c works ok.
This means the problem must be in retr.c.
OK, that narrows it down. Two further questions:
1) If you comment out
1), 1a), 2) no, no and no.
Heiko
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 18, 2003 12:16 PM
To: Herold Heiko
Cc:
Hey!
How can I mirror a small ftp location using wget? I want to sync the ftp
location with a local location. Or is it better to use another program?
Thijs
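For what it's worth, the usual approach is wget's --mirror mode. A sketch, with the host and path below as placeholders and the command assembled rather than run:

```python
# --mirror is shorthand for -r -N -l inf --no-remove-listing, i.e. a
# recursive, timestamp-checked download suited to keeping a local copy
# in sync with a remote FTP tree.  Host and path are placeholders.
import subprocess

cmd = ["wget", "--mirror", "--no-host-directories",
       "ftp://ftp.example.com/pub/project/"]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run wget
```

Re-running the same command later re-fetches only files whose remote timestamps changed, which is the sync behaviour asked about.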
I noticed the mistake as soon as I compiled with SSL (and saw the
warnings):
2003-09-18 Hrvoje Niksic [EMAIL PROTECTED]
* retr.c (get_contents): Pass the correct argument to ssl_iread.
Index: src/retr.c
===
RCS file:
Works.
New windows test binary at the usual place.
Heiko
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Thursday, September 18, 2003 1:43 PM
I'd like to stir up again an old (unresolved) problem regarding the
directory structure used to save files.
Currently, if we have a host.site.domain with several services (say http,
https and ftp) and run one or multiple downloads, we could have collisions due
to files with the same name downloaded
Herold Heiko [EMAIL PROTECTED] writes:
Solution 1: have a switch like --use-protocol-dir = [no|most|all]
no would be the current state, where http, https and ftp all collide on the
same path:
./www.some.site/index.html (from http)
./www.some.site/index.html (from https)
./www.some.site/index.html (from ftp)
all would be: always add a directory level for the protocol:
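The no/all split described above can be sketched as a path-mapping function. The switch name is taken from this message; everything else below is hypothetical, not wget code:

```python
# Sketch of the proposed --use-protocol-dir mapping.  "no" is the current
# behaviour (all protocols share one tree); "all" adds a scheme directory.
from urllib.parse import urlsplit

def local_path(url, use_protocol_dir="no"):
    """Map a URL to a local save path, optionally prefixed by its scheme."""
    parts = urlsplit(url)
    path = parts.path if parts.path not in ("", "/") else "/index.html"
    if use_protocol_dir == "all":
        # "all": always add a directory level for the protocol
        return f"./{parts.scheme}/{parts.hostname}{path}"
    # "no": current state -- http, https and ftp collide on one tree
    return f"./{parts.hostname}{path}"

print(local_path("http://www.some.site/", "no"))    # ./www.some.site/index.html
print(local_path("https://www.some.site/", "all"))  # ./https/www.some.site/index.html
```

A "most" mode would presumably fall in between (add the directory only when a collision is possible), but the message is cut off before defining it, so it is omitted here.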
Hello,
When I run wget to download any URL...
http://www.expert.ru/expert/current/data/raznoe.shtml
I *always* want wget to name the final main .html file index.html.
Long story short, I am downloading webpages and then sticking them under
an Apache htdocs dir. After that I
Hello!
Here are some ideas for wget:
1. wget should handle compressed files. Some web sites hold HTML pages as
compressed HTML, so that automatic recursive downloaders like wget won't work.
The browsers just open the data and show it; wget should handle that too
(libz.so and maybe other compression libraries).
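Idea 1 above, transparent decompression, can be sketched like this, with Python's gzip module standing in for libz (an illustration, not wget's implementation):

```python
# Inflate page bytes that a server delivered gzip-compressed, so that a
# recursive downloader can still see and follow the links inside.
import gzip

def maybe_decompress(body, content_encoding):
    """Return the page bytes, inflating them if the server gzipped them."""
    if content_encoding == "gzip":
        return gzip.decompress(body)
    return body

html = b"<html><a href='next.html'>next</a></html>"
compressed = gzip.compress(html)  # what such a server would send
assert maybe_decompress(compressed, "gzip") == html  # links now parseable
```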
Ilya N. Golubev [EMAIL PROTECTED] writes:
Duplicating my [EMAIL PROTECTED] sent on Wed, 10 Sep 2003
19:48:56 +0400 since mailer reports that [EMAIL PROTECTED] does not
work.
wget -mLd http://www.hro.org/docs/rlex/uk/index.htm
does not follow `A HREF=uk1.htm#1' links contained in the
Lucuk, Pete [EMAIL PROTECTED] writes:
as we can see above, wget has raznoe.shtml.html as the main file;
this is *not* what I want. I *always* want the main file to be named
index.html.
Wget doesn't really have the concept of a main file. As a
workaround, you could simply `ln -s
thanks for the response!
I have kicked around some different ideas like `ln -s raznoe.shtml.html
index.html' but the thing is, this whole process is automated, with no
manual intervention.
How do I always know exactly what that final/main file name will be?
I could be feeding it all
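One way to automate the `ln -s' workaround is to derive the saved name from the URL itself. The sketch below assumes --html-extension behaviour (append ".html" to names that lack it, as raznoe.shtml.html above suggests); that is an assumption about this setup, not a guaranteed rule:

```python
# Derive the name wget would save a page under, then symlink index.html
# to it.  Hypothetical helper, assuming --html-extension is in effect.
import os
from urllib.parse import urlsplit

def saved_name(url):
    name = os.path.basename(urlsplit(url).path) or "index.html"
    if not name.endswith(".html"):
        name += ".html"  # assumed --html-extension behaviour
    return name

def link_index(url, directory):
    """Point index.html at the file wget saved for this URL."""
    target = saved_name(url)
    link = os.path.join(directory, "index.html")
    if target != "index.html" and not os.path.exists(link):
        os.symlink(target, link)
    return target
```

For the URL above, saved_name yields "raznoe.shtml.html", matching what wget actually produced, so the symlink step needs no manual intervention.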
I want to exclude GET URLs from --recursive downloads with wget. I've
tried using --reject *\?*, but it's not working. I suspect --reject
only operates on the file name itself, and not on the GET portion of
the URL.
This isn't so much a bug as it is a non-feature. Can wget be extended
to do
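If the suspicion above is right, the pattern can never fire: the query string is not part of the name being matched. A hypothetical sketch of the difference (not wget's actual code; note that ? is itself a single-character wildcard in fnmatch, hence the bracketed literal):

```python
# Contrast matching a reject glob against the file name component versus
# the whole URL.  "*[?]*" means "contains a literal question mark".
import fnmatch
import posixpath
from urllib.parse import urlsplit

def rejected_by_filename(url, pattern):
    filename = posixpath.basename(urlsplit(url).path)  # query not included
    return fnmatch.fnmatch(filename, pattern)

def rejected_by_full_url(url, pattern):
    return fnmatch.fnmatch(url, pattern)

url = "http://example.org/cgi/search?q=wget"
print(rejected_by_filename(url, "*[?]*"))  # False: "search" has no "?"
print(rejected_by_full_url(url, "*[?]*"))  # True: the query URL matches
```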
##
## Hello all:
## I'm sending you this
## BUG report: wget.
## WGET kills internet connection when invoked from WINDOWS
##
## This problem does not appear when invoked from Linux or BeOS (I have
## tested it).
##
## I have tried many versions. I started with the one included with JIGDO
## for
On Thu, 18 Sep 2003, Hrvoje Niksic wrote:
modifying advance_declaration() in html-parse.c. A future version of
Wget will probably parse comments in a non-compliant fashion, by
considering everything between <!-- and --> to be a comment, which is
what most other browsers have been doing since
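The non-compliant behaviour described, treating everything between <!-- and --> as a comment, can be sketched in a few lines (illustrative only; wget's parser is C, in html-parse.c):

```python
# Browser-style comment stripping: from each "<!--", skip non-greedily to
# the nearest "-->", ignoring SGML's "--" pairing rules entirely.
import re

def strip_comments(html):
    return re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)

page = "before<!-- <a href='hidden.html'> -- tricky -- -->after"
print(strip_comments(page))  # "beforeafter"
```

Under strict SGML rules the inner "--" pairs would end the comment early; the point of the change is that browsers (and hence real-world pages) do not behave that way.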