RE: Wget Win32 Visual Studio project

2006-07-19 Thread Herold Heiko
 From: Christopher G. Lewis [mailto:[EMAIL PROTECTED]

 I've uploaded a working Visual Studio project file for the 

 Note that the debug build of Win32 Wget allows debugger attachment,
 stepping through code, etc.

Christopher,

I think this is a really nice contribution. Could you write a short how-to on
creating and maintaining that project? I'd see that as a nice addendum to the
windows/Readme file.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax


RE: 1.11 Alpha 1 Win32 files

2006-06-27 Thread Herold Heiko
FWIW, the alpha1 build runs fine on NT4sp6a, too.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax

 -Original Message-
 From: bruce [mailto:[EMAIL PROTECTED]
 Sent: Monday, June 26, 2006 5:05 PM
 To: 'Christopher G. Lewis'; 'www.mail'; wget@sunsite.dk
 Subject: RE: 1.11 Alpha 1 Win32 files
 
 
 hi...
 
 as you guys create/go forth in dealing with windows.. are you 
 focused on XP, or 2000 as well... keep in mind, there are a 
 lot of 2000 users still around!!
 
 -bruce
 
 
 -Original Message-
 From: Christopher G. Lewis [mailto:[EMAIL PROTECTED]
 Sent: Monday, June 26, 2006 7:27 AM
 To: www.mail; wget@sunsite.dk
 Subject: RE: 1.11 Alpha 1 Win32 files
 
 
 http://support.microsoft.com/default.aspx?scid=kb;en-us;326922
 
 Using the ResKit tool "Depends" on the SSL libs, it looks like
 MSVCR80.dll will need to be included.  However, wget itself
 doesn't have the dependency.
 
 I'm researching this, and should be able to have an answer today.
 
 Chris
 
 Christopher G. Lewis
 http://www.ChristopherLewis.com
  
 
  -Original Message-
  From: www.mail [mailto:[EMAIL PROTECTED] 
  Sent: Monday, June 26, 2006 5:12 AM
  To: Christopher G. Lewis; wget@sunsite.dk
  Subject: Re: 1.11 Alpha 1 Win32 files
  
  Hi Chris,
  
  Thanks for the new binaries.
  
  The new SSL libraries, v0.9.8b, on your site require msvcr80.dll,
  which v0.9.7g didn't.  Could you please tell me where to get this
  DLL?  I tried the one from
  
  http://www.dll-files.com/dllindex/pop.php?msvcr80
  
  but wget gave the error:
  
  The procedure entry point "_encode_pointer" could not be
  located in the dynamic link library MSVCR80.dll
  
  Regards,
  Jonny
  
  At 21:30 25/06/2006, you wrote:
  
  Hi all -
  
 I've published the latest alpha Win32 binaries using a 
  similar format
  to Heiko's Win32 page.  Hopefully I'll be able to keep up with what
  Heiko's done in the past, which has been excellent.  Heiko 
 deserves a
  big round of cheers for his work.
  
  The location for the downloads will be
  http://www.christopherlewis.com/wget/default.htm.  Right 
 now this is
  just a page off my personal web site, hopefully we'll just 
 be able to
  add these to the normal wget site.
  
  Christopher G. Lewis
  http://www.ChristopherLewis.com
  
  
 


RE: wget 1.11 alpha 1 released

2006-06-26 Thread Herold Heiko
The --ignore-case option (and the corresponding wgetrc option) doesn't seem to
be documented in the texi.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, June 13, 2006 5:06 PM
 To: wget@sunsite.dk
 Subject: wget 1.11 alpha 1 released
 
 
 
 hi to everybody,
 
 i've just released wget 1.11 alpha 1:
 
 ftp://alpha.gnu.org/pub/pub/gnu/wget/wget-1.11-alpha-1.tar.gz
 
 you're very welcome to try it and report every bug you might 
 encounter.
 
 
 with this release, the development cycle for 1.11 officially 
 enters the 
 feature freeze state. wget 1.11 final will be released when all the 
 following tasks are completed:
 
 1) win32 fixes (setlocale, fork)
 2) last fixes to -r and --spider
 3) update documentation
 4) return error/warning if multiple HTTP headers w/ same name 
 are given
 5) return error/warning if conflicting options are given
 6) fix "Saving to:" output in case -O is given
 
 unfortunately, this means that all the planned major changes (gnunet 
 support, advanced URL filtering w/ regex, etc...) will have 
 to wait until 
 1.12. however, i think that the many important features and bugfixes 
 recently committed into the trunk more than justify the new, 
 upcoming 1.11 
 release.
 
 -- 
 Aequam memento rebus in arduis servare mentem...
 
 Mauro Tortonesi  http://www.tortonesi.com
 
 University of Ferrara - Dept. of Eng.http://www.ing.unife.it
 GNU Wget - HTTP/FTP file retrieval tool  
 http://www.gnu.org/software/wget
 Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
 Ferrara Linux User Group http://www.ferrara.linux.it
 


RE: building wget 1.10.2 for msoft windows..

2006-06-26 Thread Herold Heiko
 From: bruce [mailto:[EMAIL PROTECTED]

 i'm playing around with building wget 1.10.2 on msoft windows 
 2000, using
 visual studio 6 (i know.. it's old!!)

Bruce,

I don't have time to look into your specific error; however, I can say that
recent wget, up to at least 1.10.2, at least with VC6 on Windows NT 4, exposed
a compiler error in http.c and retr.c.

A workaround is turning off optimization, either completely (remove the /O2
from Makefile.src before running configure.bat; not the best thing) or
selectively with the attached patch.
That patch applies cleanly to 1.10; for 1.10.2 and later you may need to apply
it manually: basically, insert the pragma lines around post_file() in http.c
and around fd_read_body() in retr.c.

Then try configure.bat --msvc and nmake.



wget-1.10-compilererror.diff
Description: Binary data


RE: 1.11 Alpha 1 Win32 files

2006-06-26 Thread Herold Heiko
 From: Christopher G. Lewis [mailto:[EMAIL PROTECTED]

 to Heiko's Win32 page.  Hopefully I'll be able to keep up with what
 Heiko's done in the past, which has been excellent.  Heiko deserves a
 big round of cheers for his work.

blushes, mumbles of not deserving it
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax


RE: Bug in GNU Wget 1.x (Win32)

2006-06-22 Thread Herold Heiko
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] Behalf Of Þröstur
 Sent: Wednesday, June 21, 2006 4:35 PM

There have been some reports in the past, but I don't think it has been acted
upon; one of the problems is that the list of names can be extended at will
(besides the standard COMx, LPTx, CON, PRN). Maybe it is possible to query
the OS about the currently active device names and rename the output files
if necessary?

   I reproduced the bug with Win32 versions 1.5.dontremeber,
 1.10.1 and 1.10.2. I did also test version 1.6 on Linux but it
 was not affected.

That is because the problem is generated by the DOS/Windows filesystem drivers
(or whatever those should be called); basically COM1 and so on are the
equivalent of Unix device files, with the unfortunate difference of being
active in every directory.

 
 Example URLs that reproduce the bug :
 wget g/nul
 wget http://www.gnu.org/nul
 wget http://www.gnu.org/nul.html
 wget -o loop.end http://www.gnu.org/nul.html
 
   I know that the bug is associated with words which are
 devices in the windows console, but i don't understand
 why, since I tried to set the output file to something else.

I think you meant to use -O, not -o.
That doesn't solve the real problem, but it is at least a workaround.

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax


RE: Windows compler need (was RE: wget 1.11 alpha 1 released)

2006-06-15 Thread Herold Heiko
 From: Travis Loyd [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, June 14, 2006 5:45 PM

 Hello gang, I'm considering building the Windows releases for you.
 
 My initial attempts to build wget resulted in an error when performing
 ./configure:

Sorry about the delay.

Travis, in what environment are you building? I see configure and bash, so I
suppose you are trying to build a Cygwin binary. While that one needs to be
tested, too, I was referring to the native Visual C binary, which doesn't need
the Cygwin runtime.
You'll need a current VC environment: build the OpenSSL libraries (the first
time and whenever there are important changes like security fixes), add those
to the VC environment, then run configure.bat --msvc and nmake.

Can anybody confirm whether the current alpha (or at least a recent Subversion
export) builds cleanly? If not, or if there are problems, and this is the
first time you are attempting this, I'd advise trying the 1.10.1 sources first
(since those are known to build cleanly) in order to straighten out your
setup. If nobody else steps in, I'll try to help you whenever possible.

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax


Windows compler need (was RE: wget 1.11 alpha 1 released)

2006-06-14 Thread Herold Heiko
Everybody,
if there is somebody willing to step in as the compiler of official Windows
binaries, please make yourself heard.
As I mentioned before, I currently have neither the means nor the time to
provide even release-version binaries; development-HEAD/alpha/beta builds are
the stuff of dreams. Yet there should be downloadable builds for all of
these, too.

Besides that, I think the current (soon-to-be-old) 1.10 release binary should
be moved from my home page to the official site, possibly the FTP server
(which needs a good cleaning out) or somewhere else.

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, June 13, 2006 5:06 PM
 To: wget@sunsite.dk
 Subject: wget 1.11 alpha 1 released
 
 
 
 hi to everybody,
 
 i've just released wget 1.11 alpha 1:
 
 ftp://alpha.gnu.org/pub/pub/gnu/wget/wget-1.11-alpha-1.tar.gz
 
 you're very welcome to try it and report every bug you might 
 encounter.
 
 
 with this release, the development cycle for 1.11 officially 
 enters the 
 feature freeze state. wget 1.11 final will be released when all the 
 following tasks are completed:
 
 1) win32 fixes (setlocale, fork)
 2) last fixes to -r and --spider
 3) update documentation
 4) return error/warning if multiple HTTP headers w/ same name 
 are given
 5) return error/warning if conflicting options are given
 6) fix "Saving to:" output in case -O is given
 
 unfortunately, this means that all the planned major changes (gnunet 
 support, advanced URL filtering w/ regex, etc...) will have 
 to wait until 
 1.12. however, i think that the many important features and bugfixes 
 recently committed into the trunk more than justify the new, 
 upcoming 1.11 
 release.
 
 -- 
 Aequam memento rebus in arduis servare mentem...
 
 Mauro Tortonesi  http://www.tortonesi.com
 
 University of Ferrara - Dept. of Eng.http://www.ing.unife.it
 GNU Wget - HTTP/FTP file retrieval tool  
 http://www.gnu.org/software/wget
 Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
 Ferrara Linux User Group http://www.ferrara.linux.it
 


RE: [Fwd: Bug#366434: wget: Multiple 'Pragma:' headers not suppor ted]

2006-05-19 Thread Herold Heiko
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 i wonder if it makes sense to add generic support for 
 multiple headers 
 in wget, for instance by extending the --header option like this:
 
 wget --header="Pragma: xxx" --header="dontoverride,Pragma: xxx2" someurl

That could be a problem if you need to send a really weird custom header named
"dontoverride,Pragma". The probability is near nil, but with the whole big bad
internet waiting, maybe separate switches (--header and --header-add) would be
better.

 as an alternative, we could choose to support multiple 
 headers only for 
 a few header types, like Pragma. however, i don't really like this 
 second choise, as it would require to hardcode the above mentioned 
 header names in the wget sources, which IMVHO is a *VERY* bad 
 practice.

Same opinion here: hard-coding the header list would be ugly and will bite
some user in the nose some time in the future: if you need to add several XXXY
headers, either patch and recompile or use at least version x.y.

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax


RE: Windows Title Bar

2006-04-18 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]

 The idea behind the feature is that you can see which URL is
 *currently* being downloaded (you can specify several).  That's
 somewhat different than just seeing the command line.  I still
 consider it a mistake, though.

I like it; for long/slow downloads the tooltip (float your mouse over the
minimized window) beats opening the window (if logging to stdout) or reloading
the log file (if one is used).
I understand why you don't like it, but personally I don't think it should be
removed, or even just defaulted to off, as at least two different Windows-code
contributors did in the past (if I remember correctly).
I suppose it's a question of personal taste, like many other (stupid)
religious wars... does anybody remember the weeks of hate-posts in comp.unix
about displaying or not displaying the current working directory in the xterm
title bar, somewhere around the mid-nineties? I seldom laughed as much; it was
WAY better than vi vs. emacs :-)

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax


RE: regex support RFC

2006-03-30 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 I don't think such a thing is necessary in practice, though; remember
 that even if you don't escape the dot, it still matches the (intended)
 dot, along with other characters.  So for quickdirty usage not
 escaping dots will just work, and those who want to be precise can
 escape them.

I agree. Just how often will there be problems in a single wget run due to
both some.domain.com and somedomain.com being present? (Famous last words...)

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 / +39-041-5917073 ph
-- +39-041-5907472 / +39-041-5917472 fax


RE: wget 1.10.2 released

2005-10-17 Thread Herold Heiko
Windows MSVC binary at http://xoomer.virgilio.it/hherold/
Heiko Herold

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Thursday, October 13, 2005 11:59 AM
 To: wget@sunsite.dk
 Subject: wget 1.10.2 released
 
 
 hi to everybody,
 
 i have just uploaded the wget 1.10.2 tarball on ftp.gnu.org:
 
 ftp://ftp.gnu.org/gnu/wget/wget-1.10.2.tar.gz
 
 you can find the GPG signature of the tarball at these URLs:
 
 ftp://ftp.gnu.org/gnu/wget/wget-1.10.2.tar.gz.sig
 
 and the GPG key i have used for the signature at this URL:
 
 http://www.tortonesi.com/GNU-GPG-Key.txt
 
 the key fingerprint is:
 
 pub  1024D/7B2FD4B0 2005-06-02 Mauro Tortonesi (GNU Wget Maintainer)
 Key fingerprint = 1E90 AEA8 D511 58F0 94E5  B106 7220 24E9 7B2F D4B0
 
 the MD5 checksum of the tarball is:
 
 795fefbb7099f93e2d346b026785c4b8  wget-1.10.2.tar.gz
 
 
 the 1.10.2 release features minor improvements in SSL support 
 and fixes 
 a remotely exploitable buffer overflow. all wget users are strongly 
 encouraged to upgrade.
 
 
 -- 
 Aequam memento rebus in arduis servare mentem...
 
 Mauro Tortonesi  http://www.tortonesi.com
 
 University of Ferrara - Dept. of Eng.http://www.ing.unife.it
 GNU Wget - HTTP/FTP file retrieval tool  
 http://www.gnu.org/software/wget
 Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
 Ferrara Linux User Group http://www.ferrara.linux.it
 


RE: Unifying Windows Makefiles

2005-07-08 Thread Herold Heiko
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 
 Herold Heiko [EMAIL PROTECTED] writes:
 
  What about the doc directory and Makefile.doc ?
 
 I don't see much use for Info files on Windows.  Furthermore, I don't
 think many Windows builders have makeinfo lying around on their hard
 disk...

The windows/Readme instructs where to get it:
ftp://sunsite.dk/projects/wget/makeinfo.zip

 Those who know about Info (or, for that matter, Texinfo) documentation
 can always process it using Cygwin or by getting the release tarball
 which contains pre-processed Info files anyway.

The wget.hlp (Windows help file), wget.rtf and wget.html are generated with
makeinfo.exe, too; I wasn't even thinking about the Info files. Are there
even any native Windows Info viewers? Well, I think there is a native
Windows port of Emacs; that should do it.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: wget 1.10.1 beta 1

2005-07-07 Thread Herold Heiko
Windows MSVC test binary at
http://xoomer.virgilio.it/hherold/

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, July 06, 2005 11:07 PM
 To: wget@sunsite.dk
 Subject: wget 1.10.1 beta 1
 
 
 
 dear friends,
 
 i have just released the first beta of wget 1.10.1:
 
 ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10.1-beta1.tar.gz
 ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10.1-beta1.tar.bz2
 
 you are encouraged to download the tarballs, test if the code 
 works properly
 and report any bug you find.
 
 
 -- 
 Aequam memento rebus in arduis servare mentem...
 
 Mauro Tortonesi  http://www.tortonesi.com
 
 University of Ferrara - Dept. of Eng.http://www.ing.unife.it
 Institute for Human & Machine Cognition  http://www.ihmc.us
 GNU Wget - HTTP/FTP file retrieval tool  
 http://www.gnu.org/software/wget
 Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
 Ferrara Linux User Group http://www.ferrara.linux.it
 


RE: ftp bug in 1.10

2005-06-27 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]

 the 64-bit download sum, doesn't work for you.  What does this
 program print?
 
 #include <stdio.h>
 int
 main (void)
 {
   __int64 n = 10000000000I64;  // ten billion, doesn't fit in 32 bits
   printf("%I64\n", n);
   return 0;
 }
 
 It should print a line containing 10000000000.  If it does, it means
 we're applying the wrong format.  If it doesn't, then we must find
 another way of printing LARGE_INT quantities on Windows.


Folks, sorry for the delay; I've been on vacation and am still trying to
catch up with work and stuff.
%I64 is not OK; %I64d is.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


ftp bug in 1.10

2005-06-15 Thread Herold Heiko
I have a reproducible report (thanks, Igor Andreev) about a little
verbose-log problem with FTP in my Windows binary. Is this reproducible on
other platforms, too?

wget -v ftp://garbo.uwasa.fi/pc/batchutil/buf01.zip
ftp://garbo.uwasa.fi/pc/batchutil/rbatch15.zip  

(it seems to happen with any FTP download I tried, though)

Last line of output is:

Downloaded:  bytes in 2 files

Note the missing number of bytes.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: wget 1.10 released

2005-06-10 Thread Herold Heiko
Windows MSVC binary at http://xoomer.virgilio.it/hherold/
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Friday, June 10, 2005 9:12 AM
 To: wget@sunsite.dk; [EMAIL PROTECTED]
 Subject: wget 1.10 released
 
 
 
 hi to everybody,
 
 i have just uploaded the wget 1.10 tarball on ftp.gnu.org:
 
 ftp://ftp.gnu.org/gnu/wget/wget-1.10.tar.gz
 
 you can find the GPG signature of the tarball at these URLs:
 
 ftp://ftp.gnu.org/gnu/wget/wget-1.10.tar.gz.sig
 
 and the GPG key i have used for the signature at this URL:
 
 http://www.tortonesi.com/GNU-GPG-Key.txt
 
 the key fingerprint is:
 
 pub  1024D/7B2FD4B0 2005-06-02 Mauro Tortonesi (GNU Wget Maintainer) 
 [EMAIL PROTECTED]
  Key fingerprint = 1E90 AEA8 D511 58F0 94E5  B106 7220 
 24E9 7B2F D4B0
 
 the MD5 checksum of the tarball (and signature) are:
 
 caddc199d2cb31969e32b19fd365b0c5  wget-1.10.tar.gz
 7dff7d39129051897ab6268b713766bf  wget-1.10.tar.gz.sig
 
 the long-awaited 1.10 release is a significant improvement 
 over the last 1.9.1 
 release, introducing a few important features like large file 
 support and NTLM 
 authentication, lots of improvements (especially in IPv6 and 
 SSL code) and 
 many bugfixes.
 
 last but not least, a brief personal comment. this is my 
 first release as wget 
 maintainer, and i am very excited about it. however i would 
 like to say that, 
 even if he stepped down from the maintainer position, the 
 main author of wget 
 is still hrvoje niksic, who really did an awesome work on 
 wget 1.10. hrvoje 
 is one of the best developers i have ever worked with and i 
 would like to 
 thank him for all the effort he put on the this release of 
 wget, especially 
 since the last few months have been rather difficult for him.
 
 -- 
 Aequam memento rebus in arduis servare mentem...
 
 Mauro Tortonesi  http://www.tortonesi.com
 
 University of Ferrara - Dept. of Eng.http://www.ing.unife.it
 Institute for Human & Machine Cognition  http://www.ihmc.us
 GNU Wget - HTTP/FTP file retrieval tool  
http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
Ferrara Linux User Group http://www.ferrara.linux.it


RE: wget 1.10 beta 1

2005-05-12 Thread Herold Heiko
Windows MSVC6 binary for testing purposes here:
http://xoomer.virgilio.it/hherold/

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, May 11, 2005 8:41 PM
 To: wget@sunsite.dk
 Subject: wget 1.10 beta 1
 
 
 
 dear friends,
 
 i have just released the first beta version of wget 1.10:
 
 ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-beta1.tar.gz
 ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-beta1.tar.bz2
 
 you are encouraged to download the tarballs, test if the code 
 works properly 
 and report any bug you find.
 
 i am still doing tests on this code, but it seems to work 
 fine, so i think 
 we'll be able to release wget 1.10 in 7-10 days.
 
 -- 
 Aequam memento rebus in arduis servare mentem...
 
 Mauro Tortonesi  http://www.tortonesi.com
 
 University of Ferrara - Dept. of Eng.http://www.ing.unife.it
 Institute of Human & Machine Cognition   http://www.ihmc.us
 GNU Wget - HTTP/FTP file retrieval tool  
 http://www.gnu.org/software/wget
 Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
 Ferrara Linux User Group http://www.ferrara.linux.it
 


RE: wget 1.10 alpha 3

2005-04-28 Thread Herold Heiko
Windows (MSVC) test binary available at http://xoomer.virgilio.it/hherold/

Notes:

windows/wget.dep needs the attached patch (change gen_sslfunc to openssl.c,
and gen_sslfunc.h to ssl.h).
src/Makefile.in doesn't contain dependencies for http-ntlm$o
(and neither does windows/wget.dep).
INSTALL should possibly mention the --disable-ntlm configure option.
I still advocate a warning (placed in windows/Readme or configure.bat) for
old MSVC compilers, like in the attached patch.

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Thursday, April 28, 2005 8:56 AM
 To: wget@sunsite.dk; [EMAIL PROTECTED]
 Subject: wget 1.10 alpha 3
 
 
 
 dear friends,
 
 i have just released the third alpha version of wget 1.10:
 
 ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha3.tar.gz
 ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha3.tar.bz2
 
 as always, you are encouraged to download the tarballs, test 
 if the code works 
 properly and report any bug you find.
 
 
 -- 
 Aequam memento rebus in arduis servare mentem...
 
 Mauro Tortonesi  http://www.tortonesi.com
 
 University of Ferrara - Dept. of Eng.http://www.ing.unife.it
 Institute of Human  Machine Cognition   http://www.ihmc.us
 GNU Wget - HTTP/FTP file retrieval tool  
 http://www.gnu.org/software/wget
 Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
 Ferrara Linux User Group http://www.ferrara.linux.it
 



20050428.wget-dep.diff
Description: Binary data


20050420.winreadme.diff
Description: Binary data


RE: wget 1.10 alpha 2

2005-04-20 Thread Herold Heiko
(sorry for the late answer; three days of 16+ hours/day migration aren't
fun, and a UPS battery exploding inside the UPS almost in my face even less)


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]

 Herold Heiko [EMAIL PROTECTED] writes:
 
  do have a compiler but aren't really developers (yet) (for example
  first year CS students with old lab computer compilers).
 
 From my impressions of the Windows world, non-developers won't touch
 source code anyway -- they will simply use the binary.

I feel I must dissent. Even today I'm not exactly a developer; I certainly
wasn't when I first placed my greedy hands on the wget sources (in order to
add a couple of chars to URL_UNSAFE... back in '98, I think). I just knew
where I could use a compiler and followed instructions.
I'd just like wget to still be compilable in an old setup by (growing)
newbies, for the learning value. Maybe something like a small note in the
windows/Readme instructions would be OK, as in the enclosed patch?

 The really important thing is to make sure that the source works for
 the person likely to create the binaries, in this case you.  Ideally
 he should have access to the latest compiler, so we don't have to
 cater to brokenness of obsolete compiler versions.  This is not about

I must confess I'm torn between the two options. Your point is very valid;
on the other hand, while it is still possible, I'd like to continue using an
old setup exactly because there are still plenty of those around and I'd
like to catch these problems. Unfortunately I don't have the time to test
everything on two setups, so I think I'll continue with the old one as long
as it is easily feasible.

 Also note that there is a technical problem with your patch (if my
 reading of it is correct): it unconditionally turns on debugging,
 disregarding the command-line options.  Is it possible to save the old
 optimization options, turn off debugging, and restore the old options?
 (Borland C seems to support some sort of #pragma push to achieve
 that effect.)

It seems not; MSDN mentions push only for #pragma warning, not for
#pragma optimize :(

   optimization, or with a lesser optimization level.  Ideally this
   would be done by configure.bat if it detects the broken compiler
   version.

I tried but didn't find a portable (Win9x-Win2k) way to do that, since on
Win9x we can't easily redirect the standard error used by cl.exe.
Possibly this could be worked around by running the test from a simple Perl
script; on the other hand, today Perl is required (for released packages)
only in order to build the documentation, not for the binary, and adding
another dependency would be a pity.

 You mean that you cannot use later versions of C++ to produce
 Win95/Win98/NT4 binaries?  I'd be very surprised if that were the

Absolutely not. What I meant is that later versions can't be installed on
older Windows operating systems. I think Visual Studio 6 is the last MS
compiler that runs even on NT4.

  Personally I feel wget should try to still support that not-so-old
  compiler platform if possible,
 
 Sure, but in this case some of the burden falls on the user of the
 obsolete platform: he has to turn off optimization to avoid a bug in
 his compiler.  That is not entirely unacceptable.

I concur; after all, if a note is dropped in the windows/Readme, either they
will read it, or they will stall due to the OpenSSL dependencies (on by
default) anyway.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax



20050420.winreadme.diff
Description: Binary data


RE: wget 1.10 alpha 2

2005-04-15 Thread Herold Heiko
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]

 the patch you've posted is really such an ugly workaround 
 (shame on microsoft 

Exactly the same opinion here.
Please don't misunderstand me; personally, for most of my work on Windows I
use Cygwin (including wget) anyway.
However, there are still lots of people using Windows NT 4 or even
Win95/Win98, with old compilers, where the compilation won't work without
the patch.
Even if we place a comment in the source file or the windows/Readme, many of
those will be discouraged, say those who do have a compiler but aren't
really developers (yet) (for example, first-year CS students with old lab
computer compilers).

I suppose we could leave that stuff present but commented out, and print a
warning when configure.bat --msvc is called.
Maybe we could even make that warning conditional (run cl.exe, use the
DOS/Windows find.exe to check the output, and if it reports 12.00, echo the
warning), but that would be even more hacky.


 have you tried the microsoft visual c++ toolkit 2003? maybe 
 it works. you can 
 download it for free at the following URL:
 
 http://msdn.microsoft.com/visualc/vctoolkit2003/

Not yet, but I certainly will.
Nevertheless, I think the point is the "continue to support existing
installations if possible" issue; after all, VC6 is not free either, and at
least one newer commercial VC version has been reported to compile without
problems. Those, however, certainly don't support Win95, and probably not
Win98/ME and/or NT4 either (I didn't check, though).

Personally, I feel wget should try to keep supporting that not-so-old
compiler platform if possible, even if there are other options, whether the
direct successor (current VC) or not (free alternatives like Cygwin, MinGW
and the Borland compilers), in order to keep the development process easily
accessible from old installations and to have more choices for everybody.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: wget 1.10 alpha 2

2005-04-13 Thread Herold Heiko
With MS Visual Studio 6, the attached patch is still needed in order to
compile (disable optimization for part of http.c and retr.c if the cl.exe
version is <= 12).

Windows msvc test binary at http://xoomer.virgilio.it/hherold/

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, April 13, 2005 12:36 AM
 To: wget@sunsite.dk; [EMAIL PROTECTED]
 Cc: Johannes Hoff; Leonid Petrov; Doug Kaufman; Tobias Tiederle; Jim
 Wright; garycao; Steven M.Schweda
 Subject: wget 1.10 alpha 2
 
 
 
 dear friends,
 
 i have just released the second alpha version of wget 1.10:
[snip]



20050413.diff
Description: Binary data


RE: NTLM authentication in CVS

2005-04-08 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 
  3) As expected msvc still throws compiler error on http.c 

 1. Is it possible to only lessen the optimization level, not entirely
remove optimization?

I fooled around a bit but didn't accomplish anything. However, it is enough
to disable the optimization for those two functions only.
 
 2. Is it possible to test for the offending version of the compiler,
and only then remove optimization?

Yes, since Tobias (thanks!) confirmed that v13 works.
The enclosed patch disables optimization for _MSC_VER <= 1200; if other
reports come in, maybe we can restrict the test further.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax



20050408.diff
Description: Binary data


RE: NTLM authentication in CVS

2005-04-07 Thread Herold Heiko
back after a while

In order to compile current CVS with MSVC, 3 patches are needed (enclosed):

1)
mswindows.c(118) : warning C4005: 'OVERFLOW' : macro redefinition
C:\PROGRA~1\MICROS~2\VC98\INCLUDE\math.h(415) : see previous
definition of 'OVERFLOW'
mswindows.c(119) : warning C4005: 'UNDERFLOW' : macro redefinition
C:\PROGRA~1\MICROS~2\VC98\INCLUDE\math.h(416) : see previous
definition of 'UNDERFLOW'

(a simple rename of course solves this)

2) add http-ntlm.c to windows Makefiles (enclosed Borland/Watcom/Mingw,
tested msvc only)

3) As expected, MSVC still throws a compiler error on http.c and retr.c;
(bad) workaround: disable optimization. Could anybody with a cl.exe newer
than Microsoft (R) 32-bit C/C++ Optimizing Compiler Version 12.00.8804 for
80x86 comment on whether this is needed with newer versions, too?

Binary for evaluation available at http://xoomer.virgilio.it/hherold as
usual.

I still can't access the sunsite ftp repository; maybe somebody who can
should at least remove/update the 00Readme.txt and index.html (extremely
outdated), and probably throw away the plethora of old development binaries
and sources (everything named wgetmmdd[sb].zip).

Besides that, I'm still very short on time and can barely follow
development, and in the future the situation will most probably be worse
(due to personal issues). If anybody else capable were to volunteer to
release MSVC binaries somewhat regularly, I'd be more than glad.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Thursday, April 07, 2005 4:01 PM
 To: wget@sunsite.dk
 Subject: NTLM authentication in CVS
 
 
 As of two days ago, the NTLM authentication support is in CVS,
 although it has undergone very little testing.  (To quote Linus, "If
 it compiles, it's perfect.")
 
 If someone has access to an NTLM-authorizing web server, please try it
 out with Wget and let us know how it goes.  Unfortunately, this
 version of Wget doesn't support NTLM for proxies[1], so you'll need a
 *server* that authorizes over NTLM.
 
 
 [1]
 The proxy limitation will hopefully be fixed after 1.10 is released.
 Fixing it for 1.10 would be too much work since it might require using
 HTTP 1.1.
 



20050407b.diff
Description: Binary data


20050407a.diff
Description: Binary data


20050407c.diff
Description: Binary data


RE: Making passive ftp default

2005-03-04 Thread Herold Heiko
 From: Karsten Hopp [mailto:[EMAIL PROTECTED]

  What do others think about a switch?  Mauro?
 
 That's the default on Red Hat Linux / Fedora Core for some 
 releases now.
 Not even one objection from customers so far.
 

We had a whole TON of problems while switching some of our internal
production servers to RH (with passive ftp as default) due to direct
connections to customers with badly written ACLs on routers (permitting
active ftp only).

Never had any problem on internet ftp servers, though, so I'd say go with
it, too.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Large file problem

2005-03-02 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Gisle Vanem [EMAIL PROTECTED] writes:
 
  It doesn't seem the patches to support 2GB files works on
  Windows. Wget hangs indefinitely at the end of transfer.  E.g.
 [...]
 
 I seem to be unable to repeat this.

Same here. I successfully transferred several 3GB files from Linux and
Solaris/SPARC servers to NT4 by ftp and http (on a LAN though, no restarts).
No help here, sorry.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Large file support

2005-02-28 Thread Herold Heiko
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
 Behalf Of Maciej W. Rozycki

  Huh?  It's you who should actually charge them for doing bug 
 discovery for them.

Yeah. Only please understand: I'm not a C programmer. I'm not an expert in
Microsoft interaction (the less of it the better). I just have (legal!)
access to a Visual Studio kit and use it; anything more in-depth would
unfortunately require familiarizing myself far more with C in general and VS
in particular than the time I have available allows :(

  Anyway, with that level of vendor (non-)support, you should really 
 consider switching to GCC, no matter how much hassle with 
 libraries it 
 involves.  It's just an effort to be done once and *will* pay back.

I know that; after all, I could just download the free Borland compiler or
MinGW, or for myself use the Cygwin port (which I have installed anyway) and
forget about VS.
On the other hand, I started compiling these binaries because I can and
there seemed to be nobody else doing it, and I feel it would be a shame to
drop this compatibility, exactly as it would be a shame to drop support for,
say, Ultrix. It's just my (minimal) bit of support for a great tool: I do
what I can, but I won't become VC proficient; time is never enough, family
(and work) have precedence, and I just can't afford it (although I'd really
like to).

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Large file support

2005-02-25 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Thursday, February 24, 2005 10:50 PM
 To: Maciej W. Rozycki
 Cc: Herold Heiko; wget@sunsite.dk
 Subject: Re: Large file support
 
 
 Maciej W. Rozycki [EMAIL PROTECTED] writes:
 
  Doesn't GCC work for this target?
 
 It does, in the form of Cygwin and MingW.  But Heiko was using MS
 VC before, and we have catered to broken compilers before, so it
 doesn't hurt to try.

Also, Cygwin requires a large installed environment. It may be possible to
link statically and adapt everything in order to produce a single
(large...) binary, but that would be a rather ugly hack.
MinGW could be better; I don't really use it personally, but I know the
(standalone, portable) MAME Win32 binary is compiled with it.
Nevertheless, Visual C++ is the de facto standard on the Windows platform,
so supporting it if possible makes (self-compiled) wget accessible to a
larger audience. Now, if this means hacking the source too much, I think the
more-recent-VC++-only path should be tried first before throwing away VC++
support altogether.

That said, in retr.c simplifying the int rdsize line did not solve it, but I
tried the following. We have:
#ifndef MIN
# define MIN(i, j) ((i) <= (j) ? (i) : (j))
#endif

int rdsize = exact ? MIN (toread - sum_read, dlbufsize) : dlbufsize;
double tmout = opt.read_timeout;

Commenting out the double tmout... line removes the compiler error, OR
exact ? (toread... (without MIN!) does compile, OR commenting out the
#ifndef MIN..#endif does compile (in other words, MIN is already defined
somewhere; how can I discover where?). However, changing every occurrence
of MIN to XXXTEST still generates the compiler error.
Something simple like this
int rdsize = 1;
double tmout = opt.read_timeout;
compiles, as does this
int rdsize = dlbufsize;
double tmout = opt.read_timeout;
or this
int rdsize = toread - sum_read;
double tmout = opt.read_timeout;
while this one 
int rdsize;
rdsize = 1;
double tmout = opt.read_timeout;
fails with
retr.c(263) : error C2143: syntax error : missing ';' before 'type'
retr.c(269) : error C2065: 'tmout' : undeclared identifier
(where line 263 is the double tmout=...).

Any suggestions for other tests ?
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Large file support

2005-02-25 Thread Herold Heiko
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
 Behalf Of Maciej W. Rozycki

  Well, instead of scratching the head, how about filing a bug 
 report?  

Ha :), that would be nice.
I suppose it would mean calling PSS, which (if things haven't changed) means
an immediate billing on your credit card (to be refunded later if there
really was a problem).
Anyway, in this case the answer would probably be upgrade to VS.NET 2002.
But see my other email... disabling optimization as Hrvoje suggested does
work around the problem, and indeed I found some Microsoft articles
suggesting exactly that.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Large file support

2005-02-25 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Friday, February 25, 2005 1:29 PM

 The whole point of #ifndef MIN ... #endif is to *use* the
 compiler-provided MIN where available.
 
 Dealing with a compiler crash is tricky, as it seems that every change
 has a potential of causing it to occur.  Could you maybe tone down or
 remove the optimization flags when compiling retr.c?

It does solve it; in fact, I found some MS articles suggesting the same
thing. The attached patch works around the problem by disabling optimization
selectively.
I was able to retrieve a 2.5GB file with ftp.

I tried to put a copy in the usual place (sunsite.dk by
[EMAIL PROTECTED]), but got "authentication required" - I suppose the
password has changed. If still possible I'd like to put the files there; for
now a copy for general testing is at
http://xoomer.virgilio.it/hherold/

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Large file support

2005-02-24 Thread Herold Heiko
Sorry, replied to wget-patches instead of redirecting it.

I tried a test compile just now; with Visual C++ 6 I get different errors:

string_t.[ch] - iswblank doesn't seem to be available; however, there are:
int isspace( int c );
int iswspace( wint_t c );

Routine    Required Header       Compatibility
isspace    ctype.h               ANSI, Win 95, Win NT
iswspace   ctype.h or wchar.h    ANSI, Win 95, Win NT

isspace returns a non-zero value if c is a white-space character (0x09 -
0x0D or 0x20). iswspace returns a non-zero value if c is a wide character
that corresponds to a standard white-space character or is one of an
implementation-defined set of wide characters for which iswalnum is false.
Each of these routines returns 0 if c does not satisfy the test condition.


Also, the large file support patch introduced syntax errors in mswindows.h
and an internal compiler error in retr.c and http.c:

Microsoft (R) Program Maintenance Utility   Version 6.00.8168.0
Copyright (C) Microsoft Corp 1988-1998. All rights reserved.

cd src
NMAKE

Microsoft (R) Program Maintenance Utility   Version 6.00.8168.0
Copyright (C) Microsoft Corp 1988-1998. All rights reserved.

cl /nologo /MT /O2 /I. /DWINDOWS /D_CONSOLE /DHAVE_CONFIG_H
/DHAVE_SSL /c http.c hash.c retr.c gen-md5.c gnu-md5.c
http.c
.\mswindows.h(109) : warning C4005: 'stat' : macro redefinition
.\mswindows.h(101) : see previous definition of 'stat'
http.c(503) : warning C4090: 'function' : different 'const' qualifiers
http.c(503) : warning C4022: 'xrealloc_real' : pointer mismatch for actual
parameter 1
http.c(521) : warning C4090: 'function' : different 'const' qualifiers
http.c(521) : warning C4022: 'xrealloc_real' : pointer mismatch for actual
parameter 1
http.c(675) : warning C4090: 'function' : different 'const' qualifiers
http.c(675) : warning C4022: 'xfree_real' : pointer mismatch for actual
parameter 1
http.c(1891) : warning C4133: 'function' : incompatible types - from 'struct
_stati64 *' to 'struct _stat *'
http.c(1900) : warning C4133: 'function' : incompatible types - from 'struct
_stati64 *' to 'struct _stat *'
http.c(1959) : warning C4133: 'function' : incompatible types - from 'struct
_stati64 *' to 'struct _stat *'
hash.c
.\mswindows.h(109) : warning C4005: 'stat' : macro redefinition
.\mswindows.h(101) : see previous definition of 'stat'
.\mswindows.h(121) : error C2143: syntax error : missing ')' before '*'
.\mswindows.h(121) : error C2143: syntax error : missing '{' before '*'
.\mswindows.h(121) : error C2059: syntax error : ')'
.\mswindows.h(121) : error C2059: syntax error : ';'
retr.c
.\mswindows.h(109) : warning C4005: 'stat' : macro redefinition
.\mswindows.h(101) : see previous definition of 'stat'
retr.c(920) : warning C4133: 'function' : incompatible types - from 'struct
_stati64 *' to 'struct _stat *'
gen-md5.c
.\mswindows.h(109) : warning C4005: 'stat' : macro redefinition
.\mswindows.h(101) : see previous definition of 'stat'
.\mswindows.h(121) : error C2143: syntax error : missing ')' before '*'
.\mswindows.h(121) : error C2143: syntax error : missing '{' before '*'
.\mswindows.h(121) : error C2059: syntax error : ')'
.\mswindows.h(121) : error C2059: syntax error : ';'
gnu-md5.c
.\mswindows.h(109) : warning C4005: 'stat' : macro redefinition
.\mswindows.h(101) : see previous definition of 'stat'
.\mswindows.h(121) : error C2143: syntax error : missing ')' before '*'
.\mswindows.h(121) : error C2143: syntax error : missing '{' before '*'
.\mswindows.h(121) : error C2059: syntax error : ')'
.\mswindows.h(121) : error C2059: syntax error : ';'
Generating Code...
retr.c(261) : fatal error C1001: INTERNAL COMPILER ERROR
(compiler file 'E:\8966\vc98\p2\src\P2\main.c', line 494)
Please choose the Technical Support command on the Visual C++
Help menu, or open the Technical Support help file for more information
http.c(1412) : warning C4761: integral size mismatch in argument; conversion
supplied
http.c(381) : fatal error C1001: INTERNAL COMPILER ERROR
(compiler file 'E:\8966\vc98\p2\src\P2\main.c', line 494)
Please choose the Technical Support command on the Visual C++
Help menu, or open the Technical Support help file for more information
NMAKE : fatal error U1077: 'cl' : return code '0x2'
Stop.
NMAKE : fatal error U1077: 'C:\PROGRA~1\MICROS~2\VC98\BIN\NMAKE.EXE' :
return code '0x2'
Stop.

Bye
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, February 23, 2005 11:14 PM
 To: [EMAIL PROTECTED]
 Subject: Large file support
 
 
 The most requested feature of the last several years finally arrives
 -- large file support.  With this patch Wget should be able to
 download files larger than 2GB on systems that support them.
 
 ChangeLog:
 2005-02-20  Hrvoje Niksic  [EMAIL 

RE: Large file support

2005-02-24 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Thursday, February 24, 2005 1:50 PM

 Thanks for checking it.

Don't thank me, I'd like to do more but I have no time available :(

  http.c(503) : warning C4090: 'function' : different 'const'
  qualifiers
 [...]
 
 I don't quite understand these warnings.  Did they occur before?

Definitely; I tried with a revision from March 2004 and got the same
warnings.

  Generating Code...
  retr.c(261) : fatal error C1001: INTERNAL COMPILER ERROR
  (compiler file 
 'E:\8966\vc98\p2\src\P2\main.c', line 494)
  Please choose the Technical Support command on the Visual C++
  Help menu, or open the Technical Support help file for 
 more information
  http.c(1412) : warning C4761: integral size mismatch in 
 argument; conversion
  supplied
  http.c(381) : fatal error C1001: INTERNAL COMPILER ERROR
  (compiler file 
 'E:\8966\vc98\p2\src\P2\main.c', line 494)
  Please choose the Technical Support command on the Visual C++
  Help menu, or open the Technical Support help file for 
 more information
 
 I don't quite know what to make of this.  Any ideas?

The problem was introduced with the large file patch. Using the http.c and
retr.c from before that (with current headers and everything else) I get
http.c(502) : warning C4090: 'function' : different 'const' qualifiers
http.c(502) : warning C4022: 'xrealloc_real' : pointer mismatch for actual
parameter 1
http.c(520) : warning C4090: 'function' : different 'const' qualifiers
http.c(520) : warning C4022: 'xrealloc_real' : pointer mismatch for actual
parameter 1
http.c(674) : warning C4090: 'function' : different 'const' qualifiers
http.c(674) : warning C4022: 'xfree_real' : pointer mismatch for actual
parameter 1
http.c(1733) : warning C4133: 'function' : incompatible types - from 'long
*' to '__int64 *'
http.c(1733) : warning C4133: 'function' : incompatible types - from 'long
*' to '__int64 *'
http.c(1874) : warning C4133: 'function' : incompatible types - from 'struct
stat *' to 'struct _stati64 *'
http.c(1883) : warning C4133: 'function' : incompatible types - from 'struct
stat *' to 'struct _stati64 *'
http.c(1942) : warning C4133: 'function' : incompatible types - from 'struct
stat *' to 'struct _stati64 *'
retr.c(197) : warning C4028: formal parameter 3 different from declaration
retr.c(197) : warning C4028: formal parameter 4 different from declaration
retr.c(197) : warning C4028: formal parameter 5 different from declaration
retr.c(197) : warning C4028: formal parameter 6 different from declaration
retr.c(493) : warning C4028: formal parameter 1 different from declaration
retr.c(513) : warning C4028: formal parameter 1 different from declaration
retr.c(918) : warning C4133: 'function' : incompatible types - from 'struct
stat *' to 'struct _stati64 *'

but it does compile.

I tried to reverse things; for retr.c, changing wgint back to long makes it
compile with these warnings:
retr.c(199) : warning C4028: formal parameter 3 different from declaration
retr.c(199) : warning C4028: formal parameter 4 different from declaration
retr.c(199) : warning C4028: formal parameter 5 different from declaration
retr.c(199) : warning C4028: formal parameter 6 different from declaration
retr.c(495) : warning C4028: formal parameter 1 different from declaration
retr.c(515) : warning C4028: formal parameter 1 different from declaration

while http.c with long instead of wgint fails with
http.c(1474) : error C2065: 'long_MAX' : undeclared identifier
http.c(1750) : warning C4133: 'function' : incompatible types - from 'long
*' to '__int64 *'
http.c(1750) : warning C4133: 'function' : incompatible types - from 'long
*' to '__int64 *'

so the problem seems to be in there. Anything else I can try ?

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: wGET - NTLM Support

2004-12-31 Thread Herold Heiko



There has been some discussion of that in the past.
IIRC there is no current implementation; however, please see the following
message in the archives:

From: Daniel Stenberg [EMAIL PROTECTED]
Subject: RE: Is NTLM authentication with wget possible?
Date: 25/05/2004 13.14

 On Tue, 25 May 2004, Herold Heiko wrote:

  Wget does not support NTLM auth (yet).

 For the ones who feel like helping out, I have already donated (and
 properly assigned copyright to FSF with papers and everything) fully
 working NTLM code to the wget project (mailed privately to Hrvoje) that
 I'm sure everyone would be happy if it was incorporated in the wget code
 base "properly".

 My code - proved working in curl - is based on the wonderfully detailed
 notes done by Eric Glass: http://davenport.sourceforge.net/ntlm.html

 I'm not motivated nor involved enough in wget to do the necessary
 changes myself.

 -- 
  -=- Daniel Stenberg -=- http://daniel.haxx.se
  -=- ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol

I suppose this wasn't implemented due to Hrvoje disappearing into real life
again.
Daniel, could you resend that code to the current co-maintainer Mauro
Tortonesi [EMAIL PROTECTED] ? Maybe sooner or later he will find some time
for this.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


 -Original Message-
 From: Mudliar, Anand [mailto:[EMAIL PROTECTED]
 Sent: Friday, December 31, 2004 7:02 AM
 To: [EMAIL PROTECTED]
 Subject: wGET - NTLM Support
 
 Hi,
 I started using wget recently and found it a very useful and wonderful
 tool, but I am kind of surprised to see that it currently does not support
 NTLM (maybe because it's MS proprietary). I would like to know if any
 development is going on in this area. If yes, then when can I expect NTLM
 support (time frame)?
 
 I'll appreciate it if you could throw some light on this.
 
 Thanks,
 Anand Mudliar
 INTEL Corporation


RE: Newbie needs to start wget in background

2004-11-10 Thread Herold Heiko



This is not a wget problem.
Your task scheduler runs wget in the foreground, on top of whatever console
application (the movie) you are currently running.
THEN wget immediately and correctly puts itself into the background; the
window closes and your previous topmost application (the movie) is topmost
again (although possibly without keyboard/mouse input focus!).
You need to investigate how to run wget in a different way. With the old
scheduler (Windows NT 4) you could configure the service accordingly; I
don't know the impact of that on the IE5.5/W2K/XP Task Scheduler (which
replaced the previous scheduler).

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mike Andersen [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, November 10, 2004 12:19 AM
 To: [EMAIL PROTECTED]
 Subject: Newbie needs to start wget in background
 
 Hi,
 Can anyone on the list tell me how to have wget start and run entirely in
 the background on Windows XP, so no console window ever opens, not even
 briefly.
 
 I would like to use wget for automatically updating files on Windows XP
 client machines, however it needs to be completely invisible to the user.
 The client machines will be playing small movies, which will update from
 time to time, and wget seems like a natural solution to have the client
 machines pull down their own updates. My strategy so far has been to use
 Windows Task Scheduler and schedule at regular intervals a command line
 call to wget, like this:
 
 C:\WINDOWS\wget.exe -b -q -m -np -nH -nd -l 1 -Pc:/test
 http://www.mydomain.com/movies/
 
 As you can see, I'm using the -b (background) and -q (quiet) options, and
 this *almost* works. A Windows console appears for just a fraction of a
 second, but it appears on top of the movie and I know my client will not
 accept that. I even tried putting the & on the end of the command, but no
 dice.
 
 I've also tried using the -o option, but that didn't help either.
 
 I've looked through the list and noticed a thread or two about making wget
 run completely in the background, but this appears to involve patching
 wget, and I'm not sure if it works in Windows, and I doubt I have the
 know-how to patch the application correctly. And that's assuming that a
 version of wget that runs entirely in the background does actually exist.
 
 I'd really appreciate it if anyone on the list could help me out with
 this.
 
 I'm not subscribed to the list, so please copy me in your responses:
 [EMAIL PROTECTED].
 
 Many Thanks,
 
 Mike


RE: Is NTLM authentication with wget possible?

2004-05-25 Thread Herold Heiko
Wget does not support NTLM auth (yet).
At http://ntlmaps.sourceforge.net/ there is a project which could help.
I've never tried it myself, but I have had one report that it works OK.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Jeroen Pluimers (mailings) [mailto:[EMAIL PROTECTED]
 Sent: Monday, May 24, 2004 7:42 PM
 To: [EMAIL PROTECTED]
 Subject: Is NTLM authentication with wget possible?
 
 
 Hi,
 
 I'm trying to wget from an IIS server of a vendor of us.
 
 Their server asks for NTLM authentication, which IE6 can 
 correctly supply.
 
 However, wget 1.9.1 under win32 doesn't seem to work:
 
 E:\downloadswget -m -np -d -d -d -d -d -d -d
 --http-user=server.domain.com\username --http-passwd=password
 --user-agent=Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 
 5.1; MyIE2; .NET
 CLR 1.1.4322)  http://server.domain.com/directory/
 DEBUG output created by Wget 1.9.1 on Windows.
 
 set_sleep_mode(): mode 0x8001, rc 0x8000
 Enqueuing http://server.domain.com/directory/ at depth 0
 Queue count 1, maxcount 1.
 Dequeuing http://server.domain.com/directory/ at depth 0
 Queue count 0, maxcount 1.
 --19:36:47--  http://server.domain.com/directory/
= `server.domain.com/directory/index.html'
 Resolving server.domain.com... seconds 0.00, 999.999.999.999
 Caching server.domain.com = 999.999.999.999
 Connecting to server.domain.com[999.999.999.999]:80... seconds 0.00,
 connected.
 Created socket 1940.
 Releasing 00392958 (new refcount 1).
 ---request begin---
 GET /directory/ HTTP/1.0
 User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 
 5.1; MyIE2; .NET
 CLR 1.1.4322)
 Host: server.domain.com
 Accept: */*
 Connection: Keep-Alive
 Authorization: Basic YmRudHYuYm9ybGFuZC5jb21cYnRwdXNlcjpuZEBzdHVmZg==
 
 ---request end---
 HTTP request sent, awaiting response... HTTP/1.1 401 Unauthorized
 Content-Length: 1656
 Content-Type: text/html
 Server: Microsoft-IIS/6.0
 WWW-Authenticate: Negotiate
 WWW-Authenticate: NTLM
 X-Powered-By: ASP.NET
 Date: Mon, 24 May 2004 17:37:07 GMT
 Connection: keep-alive
 
 
 Found server.domain.com in host_name_addresses_map (00392958)
 Registered fd 1940 for persistent reuse.
 Closing fd 1940
 Releasing 00392958 (new refcount 1).
 Invalidating fd 1940 from further reuse.
 Unknown authentication scheme.
 
 FINISHED --19:36:48--
 Downloaded: 0 bytes in 0 files
 
 
 
 Any idea how to get this working?
 
 
 --jeroen
 


RE: wget hangs or downloads end up incomplete in Windows 2000 X P.

2004-05-12 Thread Herold Heiko
 From: Phillip Pi [mailto:[EMAIL PROTECTED]

 OK, I did more tests. I noticed -v is already enabled by 
 default since the

you probably have verbose=on in your wgetrc file.

  5250K .. .. .. .. ..
 
 
 The timestamp was from almost an hour ago (I was in a 
 meeting) during the 
 download test. Notice it never timed out to retry or abort! Please 

What happens if you restart wget with mirror-like options on the same
directory tree? Does it hang again on the same file? If yes, what happens if
you try to download that file only?
If not, could you by any chance run a sniffer on that machine (Ethereal is
free)?
It would be useful to know whether really everything is frozen or whether,
for example, for some reason the data is just trickling down at 1
byte/minute or something similar (stuck in retransmission?).

Heiko Herold

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: wget hangs or downloads end up incomplete in Windows 2000 X P.

2004-05-11 Thread Herold Heiko
Did you ever run the download with -v?
What did the log say when wget seemed to hang, or regarding the missing or
corrupt files, or regarding the parsing of the directory index (or whatever
it was) linking to those files?
If nothing useful is logged, try again with -d (but be prepared: a huge
amount of information will be logged, so better redirect it to a file with
-o log.txt or -a log.txt).
If still nothing comes up, take a look at the server logs, if you can.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Phillip Pi [mailto:[EMAIL PROTECTED]
 Sent: Monday, May 10, 2004 10:40 PM
 To: [EMAIL PROTECTED]
 Subject: wget hangs or downloads end up incomplete in Windows 
 2000  XP.
 
 
 Hello. 
 
 I downloaded wget v1.9.1 (complete) from
 http://xoomer.virgilio.it/hherold/. I am having problems in 
 downloading
 almost a GB of files (over 2600 files and 250 folders). Randomly, the
 download will just stall completely and never resume OR the download
 completes, but download is corrupted (sometimes missing files and
 subfolders). I also had this problem with old v1.8.2. The 
 only way to fix
 is to set pause one second for each process, but this takes too long 
 (almost an hour) with so many files and folders! Here's what 
 I am using in 
 batch file (download.bat -- changed URLs, account, and passwords for 
 sample):
 
 @call wget -c -l0 -r -nH -w0 
 ftp://domain\username:[EMAIL PROTECTED]/Unreleasedbuilds
 /blah/blah/1/setups/SUBSETUP/*
 @call wget -c -l0 -r -nH -w0 
 ftp://domain\username:[EMAIL PROTECTED]/Unreleasedbuilds
 /blah/blah/1/setups/SUBSETUP/*
 
 I used a Pentium 3 1 Ghz system (512 MB of RAM) with Windows 
 2000 SP4 (all
 updates) and a Pentium 4 3 Ghz (HT enabled) with Windows XP 
 Home SP1 (all
 updates). Each computer is using 100mb connection for network.
 
 Doing a copy in Windows' Explorer through network share has 
 NO problems. I 
 assume this method is slower transfer compared to wget 
 command. I'd like 
 to use script so I don't have to do this manually. ;)
 
 Thank you in advance. :)
 -- 
 Ants can carry twenty times their weight, which is useful information
if you're moving out and you need help getting a potato chip across
town. --Ron Darian
   /\___/\
  / /\ /\ \   Phillip Pi (Ant) @ The Ant Farm: http://antfarm.ma.cx
 | |o   o| |   E-mail: [EMAIL PROTECTED] or [EMAIL PROTECTED]
\ _ /Be sure you removed ANT from e-mail address if you get
 ( ) a returned e-mail.
 


RE: Large Files Support for Wget

2004-05-10 Thread Herold Heiko
 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 
 * Profit!

I think you'd really deserve some.
Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: How to download last cvs version behind a firewall ?

2004-04-08 Thread Herold Heiko
I was going to say
ftp://ftp.sunsite.dk/projects/wget/snapshots/
but that seems to have stopped being updated after 2003-09-09.

Get the latest source zip from
http://xoomer.virgilio.it/hherold/
Pay attention: these files have DOS line endings, so if you are going to use
them on Unix you need to run dos2unix on everything.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Emmanuel EISENSTAEDT [mailto:[EMAIL PROTECTED]
 Sent: Thursday, April 08, 2004 12:29 PM
 To: [EMAIL PROTECTED]
 Subject: How to download last cvs version behind a firewall ?
 
 
 Hi Listers,
 
 I am a newbie to wget. First of all, I thank the authors for this work;
 it is really a great tool.
 I need the latest version of wget because I am interested in the
 --keep-session-cookies option.
 My goal is to check some websites in https through a form behind a proxy;
 can you confirm that this is possible with the latest CVS version?
 My problem is that I cannot run cvs or ssh because I am behind a firewall.
 I was wondering if there is a machine making a tar recursively, or a place
 where I can find the latest beta version... does it exist and do you have
 an address?
 
 Regards and thanks for your answer,
 Emmanuel Eisenstaedt
 


RE: I have no idea how to work wget...

2004-03-30 Thread Herold Heiko
Keep in mind wget is a command-line application - double-clicking on wget.exe
won't accomplish anything.
Opening a command prompt and running wget is a bit better; you'll get this
output:

wget: missing URL
Usage: wget [OPTION]... [URL]...

Try `wget --help' for more options.

Here
http://xoomer.virgilio.it/hherold/index.html#Howto
is a basic minimal starter howto.

If you don't like command line programs maybe you'll prefer to use a wrapper
like wgetgui
http://www.jensroesner.de/wgetgui/
although I really don't use it and can't comment on it.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, March 30, 2004 6:07 AM
 To: [EMAIL PROTECTED]
 Subject: I have no idea how to work wget...
 
 
 Hello,
 I tried simply opening the executable, but that just flashed 
 up like a DOS 
 prompt and then disappeared. I've read the documentation, but 
 I simply don't 
 understand how to get it to work. Are there any REALLY 
 elementary step-by-step 
 resources out there? I AM NOT ON THE MAILING LIST!!! cc: responses to 
 [EMAIL PROTECTED] Thanks.
 God bless,
 Justin
 


Win msvc binary

2004-03-29 Thread Herold Heiko
http://xoomer.virgilio.it/hherold/
I removed the test version and added a binary from cvs with the
OpenFileMapping() patch.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: [PATCH] Windows console control event handling

2004-03-23 Thread Herold Heiko
Windows MSVC binary at http://xoomer.virgilio.it/hherold/
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Saturday, March 20, 2004 12:58 AM
 To: David Fritz
 Cc: [EMAIL PROTECTED]
 Subject: Re: [PATCH] Windows console control event handling
 
 
 Thanks for looking into this.  I don't pretend to understand the logic
 behind this part of the Windows code.
 
 I applied your patch, thanks.
 


RE: [PATCH] A working implementation of fork_to_background() under Windows – please test

2004-03-23 Thread Herold Heiko
MSVC binary at http://xoomer.virgilio.it/hherold/ for public testing.
I performed only basic tests on NT4 sp6a, everything performed fine as
expected.

Some ideas on this thing:

In verbose mode the child should probably acknowledge in the log file the
fact that it was invoked as a child.

In debug mode the child should probably also log the name of the section
object and any information retrieved from it (currently only the flag).

In quiet mode the parent's log message (child pid, log on wget-log or
whatever) probably should not be printed.

A possible fix for the wgetrc race condition could be caching the content of
the whole wgetrc in the parent and transmitting it to the child in the
section object. A bit messy, I must admit, but a possible solution if that
race condition is considered a Bad Thing. About the only scenario I can
think of is one where a script creates a custom wgetrc, runs wget, then
changes the wgetrc: introduce -b and the script could change the wgetrc
after running wget but before the child parses it; a rather remote but
possible scenario.

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: David Fritz [mailto:[EMAIL PROTECTED]
 Sent: Saturday, March 20, 2004 2:37 AM
 To: [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: [PATCH] A working implementation of 
 fork_to_background() under
 Windows – please test
 
 
 Attached is an implementation of fork_to_background() for 
 Windows that (I hope) 
 has the desired effect under both 9x and NT.
 
 _This is a preliminary patch and needs to be tested._
 
 The patch depends on the fact that the only time 
 fork_to_background() is 
 called is on start-up when -b is specified.
 
 Windows of course does not support the fork() call, so it 
 must be simulated. 
 This can be done by creating a new process and using some 
 form of inter-process 
 communication to transfer the state of the old process to the 
 new one.  This 
 requires the parent and child to cooperate and when done in a 
 general way (such 
 as by Cygwin) requires a lot of work.
 
 However, with Wget since we have a priori knowledge of what 
 could have changed 
 in the parent by the time we call fork(), we could implement 
 a special purpose 
 fork() that only passes to the child the things that we know 
 could have changed. 
   (The initialization done by the C run-time library, etc. 
 would be performed 
 anew in the child, but hold on a minute.)
 
 The only real work done by Wget before calling fork() is the 
 reading of wgetrc 
 files and the processing of command-line arguments.  Passing 
 this information 
 directly to the child would be possible, but the 
 implementation would be complex 
 and fragile. It would need to be updated as changes are made 
 to the main code.
 
 It would be much simpler to simply perform the initialization 
 (reading of config 
 files, processing of args, etc.) again in the child.  This 
 would have a small 
 performance impact and introduce some race-conditions, but I 
 think the 
 advantages (having -b work) outweigh the disadvantages.
 
 The implementation is, I hope, fairly straightforward.  I 
 have attempted to 
 explain it in moderate detail in an attached README.
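 In outline, the respawn-and-reinitialize approach might be sketched
 like this (a toy Python stand-in, purely illustrative: the real patch
 is C and passes its flag through a named section object rather than
 the environment, and WGET_FAKE_FORK is an invented marker name):

```python
import os
import subprocess

def fake_fork(argv):
    """Toy model of fake_fork(): Windows has no fork(), so -b is simulated
    by re-launching the same program with a marker set.  The respawned child
    redoes its own initialization (config files, command line) instead of
    inheriting the parent's state."""
    if os.environ.get("WGET_FAKE_FORK") == "1":
        # We are the respawned copy: re-read wgetrc, re-parse argv, carry on.
        return "child"
    child_env = dict(os.environ, WGET_FAKE_FORK="1")
    subprocess.Popen(argv, env=child_env)  # fire-and-forget background child
    return "parent"  # the real parent would print the child's pid and exit
```

 The parent would then exit; the child, seeing the marker, repeats the
 cheap startup work and keeps downloading in the background.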
 
 I'm hoping others can test it with various operating systems 
 and compilers. 
 Also, any feedback regarding the design or implementation 
 would be welcome.  Do 
 you feel this is the right way to go about this?
 
 Cheers,
 David Fritz
 
 
 2004-03-19  David Fritz  [EMAIL PROTECTED]
 
   * mswindows.c (make_section_name, fake_fork, 
 fake_fork_child): New
   functions.
  (fork_to_background): Replace with new implementation.
 
 
 
 


RE: Wget - relative links within a script call aren't followed

2004-03-15 Thread Herold Heiko
No way, sorry.
wget does not support javascript, so there is no way to have it follow that
kind of links.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Raydeen A. Gallogly [mailto:[EMAIL PROTECTED]
 Sent: Friday, March 12, 2004 4:20 PM
 To: [EMAIL PROTECTED]
 Subject: Wget - relative links within a script call aren't followed
 
 
 I'm new to Wget but have learned a lot in the last week.  We are
 successfully running Wget to mirror a website existing on the 
 other side of
 a firewall within our own agency.  We can retrieve all 
 relative links from
 existing HTML files with the exception of those that are 
 contained within a
 script.
 
 For example, this is an excerpt from a script call to load an 
 image within
 an HTML document that is not being followed:
 MM_preloadImages('pix/lats_but_lite.gif',)
 
 The only fix to this problem so far that we have been able to 
 implement is
 to have the webmaster on the site that we want to mirror 
 create a small
 HTML file named 'wgetfixes.html', link to it from the home 
 page using style
 (display:none;) so that users won't see.  Within the file, 
 list all the
 files that they are calling from within their scripts 
 individually using
 the following syntax: <img src="pix/lat_but_lite.gif"> -- 
 this works fine
 but I'm hopeful that there is a better way using a switch within Wget.
 
 Thanks for any input, it is truly appreciated.  - Raydeen
 
 ..
 
 
 Raydeen Gallogly
 Web Manager
 NYS Department of Health, Wadsworth Center
 http://www.wadsworth.org
 email: [EMAIL PROTECTED]
 
 
 
 
 
 
 
 


RE: Wget - relative links within a script call aren't followed

2004-03-15 Thread Herold Heiko
This has been discussed several times in the past. For a complete solution
a LOT of work would be needed (a complete javascript engine would be
necessary for a start), and there are also several semantic problems (for
example, if a pic is loaded only during mouseover, without preload, we
still would not get it, since there is no mouse).
Possibly some very partial, incomplete solution would be possible, but
frankly that would be an ugly hack.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Fred Holmes [mailto:[EMAIL PROTECTED]
 Sent: Monday, March 15, 2004 3:09 PM
 To: Herold Heiko; 'Raydeen A. Gallogly'; [EMAIL PROTECTED]
 Subject: RE: Wget - relative links within a script call 
 aren't followed
 
 
 It surely would be nice if some day WGET could support 
 javascript.  Is that something to put on the wish list or 
 is it substantially impossible to implement?  Do folks use 
 Java to load images in order to thwart 'bots such as WGET?
 
 I run into the same problem regularly, and simply create a 
 series of lines in a batch file that download each of the 
 images by explicit filename.  Very doable, but requires 
 manual setup, rather than having WGET automatically follow 
 the links.  This will test for/download files that are known 
 to be there, but won't find files that are newly added.
 
 Thanks,
 
 Fred Holmes
 
 At 05:07 AM 3/15/2004, Herold Heiko wrote:
 No way, sorry.
 wget does not support javascript, so there is no way to have 
 it follow that
 kind of links.
 Heiko
 
 -- 
 -- PREVINET S.p.A. www.previnet.it
 -- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
 -- +39-041-5907073 ph
 -- +39-041-5907472 fax
 
 


RE: [PATCH] Don't launch the Windows help file in response to --h elp

2004-02-23 Thread Herold Heiko
MSVC binary at http://xoomer.virgilio.it/hherold/
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Sunday, February 22, 2004 4:47 PM
 To: David Fritz
 Cc: [EMAIL PROTECTED]; [EMAIL PROTECTED]
 Subject: Re: [PATCH] Don't launch the Windows help file in response to
 --help
 
 
 David Fritz [EMAIL PROTECTED] writes:
 
  Attached is a patch that removes the ws_help() function from
  mswindows.[ch] and the call to it from print_help() in main.c.  Also
  attached is an alternate patch that will fix ws_help(), which I
  neglected to update when I changed ws_mypath().
 
  I find this behavior inconsistent with the vast majority of other
  command line tools.  It's something akin to popping-up a web browser
  with the HTML version of the docs in response to `wget --help' when
  running in a terminal under X.
 
 You are right.  Popping up the help screen probably seemed like a good
 idea to an early porter of Wget to Windows, and it stuck because no one
 bothered to remove it.
 
 Thanks for the patch, I've now applied it to CVS.
 


RE: delete-before switch

2004-02-16 Thread Herold Heiko
[resubmitted to wget@ instead of wget-patches]

 From: Rupert Levene [mailto:[EMAIL PROTECTED]
 
..
 
 My vote: keep the option for either behaviour :-) As written, the
 patch only changes behaviour if the --timestamping and 
--delete-before
 options are in effect.
 
 Rupert

I understand that you want that feature for your own special 
needs, on the
other hand there is Hrvoje's (more than reasonable!) desire 
to avoid option
proliferation and creeping featuritis.
So why not a more general option - you could code a 
run-external-command
feature before and after downloading a file, passing a number 
of arguments.
Something like

command BEFORE [LOC=location, url] [SAVE_PATH=path where 
the file will be
saved] [REF=possibly referring url] [ORG_SIZE=...] [STARTTIME=] ...

then download, followed by

command AFTER SUCCESS|FAILURE [NUM_ATTEMPTS=..]
[ERRTYPE=TIMEOUT|MAX_ATTEMPTS|NOT_RESOLVED] [FINAL_SIZE=] 
[USERTIME=...]
[EFFECTIVETIME=usertime except the retry waiting periods] ...

just as an example of syntax and parameters; probably somebody could come
up with a better syntax, and possibly some other interesting data could be
gathered. Possibly the data could be passed in the environment instead of
arguments (this would avoid the need for getopts or string operations in
simple shell scripts).

This would solve a whole lot of wanted features with just one 
option, for
example from time to time somebody wants to know how to get 
an exact list of
downloaded files, currently the log must be parsed or 
something similar.
You would just write a small script in order to unlink the 
SAVE_PATH file
and run wget --run-before=dounlink.pl or whatever.

I suppose for a starter just the basic data already available (url, path,
filename, SUCCESS|FAILURE) would keep the amount of work needed for this
contained.
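To make the idea concrete, here is a minimal sketch of such a hook runner
(hypothetical throughout: run_hook, the BEFORE/AFTER phases, and the WGET_*
variable names are just the illustrative ones from this mail, not real wget
features):

```python
import os
import subprocess

def run_hook(command, phase, **info):
    """Run a user hook command for the given phase (BEFORE or AFTER),
    passing the download metadata through the environment so that simple
    shell scripts need no getopts or string parsing."""
    env = dict(os.environ)
    for key, value in info.items():
        env["WGET_" + key.upper()] = str(value)  # e.g. WGET_SAVE_PATH, WGET_URL
    return subprocess.run([command, phase], env=env).returncode
```

The dounlink example would then amount to run_hook(script, "BEFORE",
save_path=..., url=...), with the script simply reading $WGET_SAVE_PATH.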

Hrvoje, what do you think about this ? Acceptable ? Horrible ?

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: apt-get via Windows with wget

2004-02-02 Thread Herold Heiko
 From: Jens Rösner [mailto:[EMAIL PROTECTED]

 Note:
 Mail redirected from bug to normal wget list.

  H 
 ftp://ftp.sunsite.dk/projects/wget/windows/wget-1.9.1b-complete.zip,
  OK, but too bad there's no stable second link .../latest.zip so I
  don't have to update my web page to follow the link.
 Yep, this would make things much easier for applications like yours.

Dan, Jens,
I added a wget-complete-stable.zip; if you want to link to a fixed url use
that, and I'll update it whenever needed. Currently it is the same archive
as wget-1.9.1b-complete.zip.


  Furthermore, they don't need SSL, but I don't see any 'diet'
  versions...
 Right, Heiko is so kind to compile the SSL enabled wget binaries. 
 If you need it without SSL, you would have to compile it yourself.
 But since you don't have windows...

For some time I provided a binary both with and without ssl; then I started
to get 15 emails/month saying "I can't dl https sitez" (change to your
preferred hax0r speak) and grew tired of answering "read the description in
the index, you want the ssl version". As long as the libraries are placed
somewhere in the path OR simply kept in the same directory where wget is,
the ssl version is fine for everything after all.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: [PATCH] implementation of determine_screen_width() for Window s

2004-01-28 Thread Herold Heiko
Works fine for me on Winnt 4.0 sp6a (cmd windows with column sizes != 80 now
use the whole line for the progress bar), compiled with MSVC.
A binary with that patch is available from
http://xoomer.virgilio.it/hherold/ 
In order to test just open a command window with a buffer column size !=80,
the progress bar now should use the whole line.
Anyone who can test this on different platforms (windows 95/98/ME/2000/XP),
please report to the list.
Anyone who can compile this with different compilers (watcom, cygwin, mingw,
borland), please report to the list.
If no problems arise I'd vote for inclusion in cvs.

Note: for a complete look-and-feel similar to the unix version we still need
to detect when the size changes (on unix this is done with
received_sigwinch in bar_create and bar_update), if this is possible.
Currently if the cmd window is resized on the fly (as a matter of fact the
window buffer size, not the real window) the progress bar continues to use
the old size.
Still from a usability point of view this patch is already a lot better than
the old behaviour (DEFAULT_SCREEN_WIDTH).

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: David Fritz [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, January 28, 2004 4:33 AM
 To: [EMAIL PROTECTED]
 Subject: [PATCH] implementation of determine_screen_width() 
 for Windows
 
 
 Attached is a small patch that implements 
 determine_screen_width() for 
 the Windows build.
 
 Cheers
 
 


RE: [PATCH] implementation of determine_screen_width() for Window s

2004-01-28 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
..
 Yes.  Specifically, Unix's SIGWINCH simply sets a flag that means
 window size might have changed, please check it out.  That is
 because checking window size on each refresh would perform an
 unnecessary ioctl.
 
 One thing we could do for Windows is check for window size every
 second or so.

I agree, but I have no idea how taxing those GetStdHandle() and
GetConsoleScreenBufferInfo() calls are.
Maybe David can shed more light on this, or even profile a bit.
Possibly the handle could be cached, saving at least the GetStdHandle() bit.
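For what it's worth, the poll-and-cache idea looks roughly like this (Python
used here as a portable stand-in; on Windows the actual query would be the
GetStdHandle()/GetConsoleScreenBufferInfo() pair, with the handle cached as
suggested):

```python
import shutil
import time

_cache = {"width": 0, "stamp": 0.0}

def screen_width(poll_interval=1.0):
    """Return the terminal width, re-querying the OS at most once per
    poll_interval seconds: the check-every-second-or-so idea, instead of
    a SIGWINCH-style notification (which Windows does not have)."""
    now = time.monotonic()
    if not _cache["width"] or now - _cache["stamp"] >= poll_interval:
        _cache["width"] = shutil.get_terminal_size(fallback=(80, 24)).columns
        _cache["stamp"] = now
    return _cache["width"]
```

Each progress-bar refresh would call screen_width(); at most one OS query
per interval is paid, however often the bar redraws.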

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: POST trouble

2003-12-17 Thread Herold Heiko
Works ok.
windows MSVC binary at http://xoomer.virgilio.it/hherold/
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, December 16, 2003 9:41 PM
 To: Herold Heiko
 Cc: List Wget (E-mail); [EMAIL PROTECTED]
 Subject: Re: POST trouble
 
 
 Herold Heiko [EMAIL PROTECTED] writes:
 
  Content-Length: Content-Length: 35
 [...]
  The line
  Content-Length: Content-Length: 35
  certainly seems strange.
 
 Yup, that's where the bug is.  This should fix it:
 
 2003-12-16  Hrvoje Niksic  [EMAIL PROTECTED]
 
   * http.c (gethttp): Fix generation of `Content-Length'.
 
 Index: src/http.c
 ===================================================================
 RCS file: /pack/anoncvs/wget/src/http.c,v
 retrieving revision 1.137
 diff -u -r1.137 http.c
 --- src/http.c	2003/12/12 22:55:19	1.137
 +++ src/http.c	2003/12/16 20:39:30
 @@ -1253,8 +1253,7 @@
 	    }
 	}
       request_set_header (req, "Content-Length",
 -			  aprintf ("Content-Length: %ld", post_data_size),
 -			  rel_value);
 +			  aprintf ("%ld", post_data_size), rel_value);
     }
 
   /* Add the user headers. */
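 The bug in miniature: the value handed to request_set_header() already
 contained the header name, so serialization emitted it twice. A
 hypothetical sketch of the effect (not wget's actual C code):

```python
def serialize(headers):
    """Mimic request serialization: each entry becomes one 'Name: value' line."""
    return "".join("%s: %s\r\n" % (name, value) for name, value in headers.items())

headers = {"Content-Length": "Content-Length: 35"}  # buggy: name baked into value
assert serialize(headers) == "Content-Length: Content-Length: 35\r\n"

headers["Content-Length"] = "35"                    # fixed: value only
assert serialize(headers) == "Content-Length: 35\r\n"
```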
 


RE: Wget 1.9 error

2003-12-09 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
..
 released one that we want (most) users to download.  Heiko, would you
 consider reordering the table so that the 1.9.1 release row comes
 first, followed by development version, (optionally) followed by older
 versions?

Ha! :-)
Some time ago it was like that, but during periods of frequent activity
people often bugged me for a chronological order.
I changed the first column somewhat in order to make the distinction between
stable and development versions more obvious.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Wget 1.9 error

2003-12-09 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 
 Hmm.  Then how about separating the development snapshots, and older
 entries, to a separate page?  It seems simpler for there to be only

Yes, ok, I'll do something like that.
Maybe even one single archive with the ssl libraries included, although
since I started to include a small readme.txt in every zip I almost don't
get any more mail asking "what is ssleay32.dll, why is wget asking for it
and where do I get it?".

 BTW do people really download old snapshots?  Other than by mistake,
 that is?

The answer is definitely yes.
I don't know how much or how often, but the few times I deleted a bunch of
old stuff I always got some mail saying "please send me old_version you had
until yesterday". Just don't ask me why, I don't understand it either :(

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Testing on BEOS?

2003-11-26 Thread Herold Heiko
Windows MSVC binary compiled and basic tests performed (download of the same
site with http and https; got exactly the same files).
Binary at the usual place; unfortunately my crappy ISP's webserver seems to
be in Guru Meditation just now and refuses access (not the first problem
after the recent merger-induced changes), so here are the direct links for
binary and sources:
ftp://ftp.sunsite.dk/projects/wget/windows/wget20031126b.zip
ftp://ftp.sunsite.dk/projects/wget/windows/wget20031126s.zip

Whenever that webserver decides to return to earth, the usual description
(stating nothing special in this case) will again be available at
http://xoomer.virgilio.it/hherold

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, November 26, 2003 2:38 PM
 To: [EMAIL PROTECTED]
 Subject: Testing on BEOS?
 
 
 Does someone have access to a BEOS machine with a compiler?  I'd like
 to verify whether the current CVS works on BEOS, i.e. whether it's
 still true that BEOS doesn't support MSG_PEEK.
 
 Speaking of testing, please be sure to test the latest CVS on Windows
 as well, where MSG_PEEK is said to be flaky.  HTTPS is another thing
 that might work strangely because SSL_peek is undocumented (!).
 


ipv6 patch

2003-11-19 Thread Herold Heiko
Attached is a little patch needed for current cvs in order to compile on
windows nt 4 (or any system without IPV6, really).

Changelog:
connect.c (socket_has_inet6): don't use AF_INET6 without ENABLE_IPV6
main.c (main): don't test opt.ipv[46]_only without ENABLE_IPV6

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax



20031119.diff
Description: Binary data


RE: windows devel binary

2003-11-17 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]

  This is a binary compiled and run on windows nt 4, which 
 doesn't support
  IPV6, so the -4 should probably be a no-op ?
 
 Or not work at all.
 


I was thinking (rather late, I see you have changed other IPV6 stuff in the
meantime): why cut the -4 switch if no IPV6 is present? The principle of
least surprise would say to leave the switch there in order to avoid an
unknown switch error.

Suppose you have a bunch of machines, some with and some without IPV6
support, and you always want to enforce IPV4 usage. With a -4 switch always
supported, a simple wget -4 would do the trick in any script used on all
those machines. Without that you'd need some means to detect the IPV6
support and change the wget switches used accordingly.

Same thing for -6, in fact: leave the switch even if no IPV6 is present and
supported, and die with a meaningful error message (much better than an
unknown switch failure).

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: feature request: --second-guess-the-dns

2003-11-17 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]

 Dan Jacobson [EMAIL PROTECTED] writes:
 
  But I want a
 --second-guess-the-dns=ADDRESS
 
 Aside from `--second-guess-the-dns' being an awful name (sorry), what
 is the usage scenario for this kind of option?  I.e. why would anyone
 want to use it?

Just yesterday I did something similar (by changing the local /etc/hosts) in
order to directly test different web servers behind a farming device.
Multiple servers behind a round-robin dns or similar setup could be another
possible scenario where this would be useful. Not your daily usage, though.
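What the requested option amounts to can be sketched as follows (a
hypothetical Python helper, not an existing wget feature): open the TCP
connection to an address of your choosing, but keep sending the original
Host header, so one particular backend of a farm can be tested directly.

```python
import http.client

def fetch_via_forced_ip(host, forced_ip, port=80, path="/"):
    """Bypass DNS the way an /etc/hosts edit does: connect to forced_ip,
    but present the original host name in the Host header."""
    conn = http.client.HTTPConnection(forced_ip, port, timeout=10)
    try:
        conn.request("GET", path, headers={"Host": host})
        resp = conn.getresponse()
        return resp.status, resp.read()
    finally:
        conn.close()
```

Run once per backend address and the same virtual host can be checked on
each physical server behind the balancer.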
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Wget 1.9.1 has been released

2003-11-14 Thread Herold Heiko
Windows MSVC binary at http://xoomer.virgilio.it/hherold
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Friday, November 14, 2003 2:55 AM
 To: [EMAIL PROTECTED]
 Subject: Wget 1.9.1 has been released
 
 
 Wget 1.9.1 is now available on ftp.gnu.org and its mirrors.  It is a
 bugfix release that contains fixes for several problem noted in the
 1.9 release.  Unless further serious bugs are discovered, it is likely
 to remain the last in the 1.9.x series.
 


windows devel binary

2003-11-14 Thread Herold Heiko
Windows MSVC binary for current cvs at http://xoomer.virgilio.it/hherold/

This is a bit of a "Doctor, if I do this it hurts." - "So don't do that!",
but I think this should not happen:

D:\Wip\Wget\wget.wip\src>wget -4
Assertion failed: 0 <= comind && comind < countof (commands), file init.c,
line 589

This is a binary compiled and run on windows nt 4, which doesn't support
IPV6, so the -4 should probably be a no-op?  As I said, this shouldn't
really be done; the IPV6 code is in development and so on, so this is just
a FYI.

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: keep alive connections

2003-11-11 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]

 With the HEAD method you never know when you'll stumble upon a CGI
 that doesn't understand it and that will send the body anyway.  But
 maybe it would actually be a better idea to read (and discard) the
 body than to close the connection and reopen it.

Wouldn't that be suboptimal in case the page is huge (and/or the connection
slow)?
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: wget v1.9 (Windows port) newbie needs help in download files recursively...

2003-11-10 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 
 To get the stable sources that have this bug fixed, you might want to
 check out the head of the wget-1_9 branch in CVS.  Heiko, how about
 creating a bugfix 1.9 release for Windows?

No problem with that, but wouldn't a dot release be better?

I'm not too comfortable with the idea of a windows binary based on the cvs
sources for an already released version; confusion could easily arise from
different behaviour between the latest somewhat-official released windows
binary and the released 1.9 sources. The real problem is that most people
on windows will not compile but use a downloaded binary, while most people
on unix will use the released 1.9 sources (or whatever the latest rpm
included in their distribution is, probably based on the released 1.9
sources, not the latest 1.9 cvs sources).

Or do you prefer to wait another bit before a dot release?

Your judgement call; I'll happily compile this or that as you prefer.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


Windows msvc 1.9-stable bugfix release

2003-11-05 Thread Herold Heiko
Binary for current 1.9 cvs branch at http://xoomer.virgilio.it/hherold/

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: wget 1.9 for windows: no debug support?

2003-10-27 Thread Herold Heiko
Confirmed, missed that. Updated binary available at the usual place.
This does the trick for the 1.9 tree.

Changelog: rename DEBUG to ENABLE_DEBUG in windows\config.h.{ms,bor}

diff -ubBr wget-1.9/windows/config.h.bor wget-1.9+debug/windows/config.h.bor
--- wget-1.9/windows/config.h.bor   Mon Oct 13 15:20:52 2003
+++ wget-1.9+debug/windows/config.h.bor Mon Oct 27 09:47:08 2003
@@ -76,7 +76,7 @@
 #define USE_DIGEST 1

 /* Define if you want the debug output support compiled in.  */
-#define DEBUG
+#define ENABLE_DEBUG

 /* Define if you have sys/time.h header. */
 #undef HAVE_SYS_TIME_H
diff -ubBr wget-1.9/windows/config.h.ms wget-1.9+debug/windows/config.h.ms
--- wget-1.9/windows/config.h.ms	Mon Oct 13 15:20:52 2003
+++ wget-1.9+debug/windows/config.h.ms  Mon Oct 27 09:36:44 2003
@@ -55,7 +55,7 @@
 #define USE_DIGEST 1

 /* Define if you want the debug output support compiled in.  */
-#define DEBUG
+#define ENABLE_DEBUG

 /* Define if you have sys/time.h header. */
 #undef HAVE_SYS_TIME_H

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Sunday, October 26, 2003 9:16 PM
 To: [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: Re: wget 1.9 for windows: no debug support?
 
 
 [EMAIL PROTECTED] writes:
 
  Well, to find out what was happening, I specified -d for the debug
  output. The message was: debug support not compiled in
 [...]
  Is this an oversight or does it serve a purpose?
 
 Heiko will know for sure, but it's most likely an oversight.  The
 Windows config.h.* files still enabled debugging with #define DEBUG,
 which has in the meantime been renamed to ENABLE_DEBUG.
 


RE: Problem with wget 1.9 and question mark at least on windows

2003-10-23 Thread Herold Heiko
Also note, I didn't yet compile and publish the msvc windows binary for 1.9
- I suppose that was one of the beta binaries.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Thursday, October 23, 2003 12:12 PM
 To: Boris New
 Cc: [EMAIL PROTECTED]
 Subject: Re: Problem with wget 1.9 and question mark at least 
 on windows
 
 
 Sorry about that, Wget currently applies -R and -A only to file names,
 not to the query part of the URL.  Therefore there is currently no
 built-in way to do what you want.
 
 I do plan to fix this, but Wget 1.9 was too late in the works to add
 such a feature.
 
 The current behavior is due to many people using -R to restrict based
 on file names and file name extensions; this usage might break if -R
 also matched the query portion of the URL by default.
 


RE: Wget 1.9 has been released

2003-10-23 Thread Herold Heiko
Windows MSVC binary present at
http://xoomer.virgilio.it/hherold

Attention if you want to compile your own: the configure.bat.in file is
still there; in released packages it is usually already renamed to
configure.bat.

Heiko


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, October 22, 2003 11:50 PM
 To: [EMAIL PROTECTED]
 Subject: Wget 1.9 has been released
 
 
 I've announced the 1.9 release on freshmeat and will send a mail to
 [EMAIL PROTECTED] shortly.  You can get it from ftp.gnu.org or from a mirror
 site.
 
 ftp://ftp.gnu.org/pub/gnu/wget/wget-1.9.tar.gz
 
 The MD5 checksum of the archive should be:
 
 18ac093db70801b210152dd69b4ef08a  wget-1.9.tar.gz
 
 Again, thanks to everyone who made this release possible by
 contributing bug reports, help, suggestions, test cases, code,
 documentation, or support -- in no particular order.
 
 A summary of the user-visible changes since 1.8, borrowed from `NEWS',
 follows:
 
 * Changes in Wget 1.9.
 
 ** It is now possible to specify that POST method be used for HTTP
 requests.  For example, `wget --post-data=id=foo&data=bar URL' will
 send a POST request with the specified contents.
 
 ** IPv6 support is available, although it's still experimental.
 
 ** The `--timeout' option now also affects DNS lookup and establishing
 the TCP connection.  Previously it only affected reading and writing
 data.  Those three timeouts can be set separately using
 `--dns-timeout', `--connect-timeout', and `--read-timeout',
 respectively.
 
 ** Download speed shown by the progress bar is based on the data
 recently read, rather than the average speed of the entire download.
 The ETA projection is still based on the overall average.
 
 ** It is now possible to connect to FTP servers through FWTK
 firewalls.  Set ftp_proxy to an FTP URL, and Wget will automatically
 log on to the proxy as [EMAIL PROTECTED].
 
 ** The new option `--retry-connrefused' makes Wget retry downloads
 even in the face of refused connections, which are otherwise
 considered a fatal error.
 
 ** The new option `--dns-cache=off' may be used to prevent Wget from
 caching DNS lookups.
 
 ** Wget no longer escapes characters in local file names based on
 whether they're appropriate in URLs.  Escaping can still occur for
 nonprintable characters or for '/', but no longer for frequent
 characters such as space.  You can use the new option
 --restrict-file-names to relax or strengthen these rules, which can be
 useful if you dislike the default or if you're downloading to
 non-native partitions.
 
 ** Handling of HTML comments has been dumbed down to conform to what
 users expect and other browsers do: instead of being treated as SGML
 declaration, a comment is terminated at the first occurrence of -->.
 Use `--strict-comments' to revert to the old behavior.
 
 ** Wget now correctly handles relative URIs that begin with //, such
 as //img.foo.com/foo.jpg.
 
 ** Boolean options in `.wgetrc' and on the command line now accept
 values yes and no along with the traditional on and off.
 
 ** It is now possible to specify decimal values for timeouts, waiting
 periods, and download rate.  For instance, `--wait=0.5' now works as
 expected, as does `--dns-timeout=0.5' and even `--limit-rate=2.5k'.
 


RE: Wget 1.9-rc1 available for testing

2003-10-17 Thread Herold Heiko
Windows MSVC binary at
http://xoomer.virgilio.it/hherold/
Heiko


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Thursday, October 16, 2003 4:42 PM
 To: [EMAIL PROTECTED]
 Subject: Wget 1.9-rc1 available for testing
 
 
 As the name implies, this should be 1.9 (with only version changed)
 unless a show-stopper is discovered.  Get it from:
 
 http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-rc1.tar.gz
 


RE: Error in wget-1.9-b5.zip

2003-10-15 Thread Herold Heiko
Seems more like a DNS issue with that domain to me, at least from here, right
now. That page redirects to www.yourworstenemy.com, which doesn't resolve:

whois www.yourworstenemy.com
Registrant:
HUMMEL, GREG (YOURWORSTENEMY-DOM)
...
  Domain servers in listed order:

   NS1.BEST.COM 128.121.101.11
   NS2.BEST.COM 161.58.9.11
   NS3.BEST.COM 128.121.101.19

dig @ns1.best.com www.yourworstenemy.com A

; <<>> DiG 2.2 <<>> @ns1.best.com www.yourworstenemy.com A
; (1 server found)
;; res options: init recurs defnam dnsrch
;; got answer:
;; ->>HEADER<<- opcode: QUERY, status: SERVFAIL, id: 55757
;; flags: qr rd ra; Ques: 1, Ans: 0, Auth: 0, Addit: 0
;; QUESTIONS:
;;  www.yourworstenemy.com, type = A, class = IN

;; Total query time: 1001 msec
;; FROM: ns to SERVER: ns1.best.com  128.121.101.11
;; WHEN: Wed Oct 15 14:46:06 2003
;; MSG SIZE  sent: 40  rcvd: 40

in other words, no answer. If you prefer:

c:\>nslookup
..
> server ns1.best.com
Server predefinito:  ns1.best.com
Address:  128.121.101.11

> www.yourworstenemy.com.
Server:  ns1.best.com
Address:  128.121.101.11

*** ns1.best.com non trova www.yourworstenemy.com.: Server failed

Heiko


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, October 15, 2003 2:22 PM
 To: David Drobny
 Cc: [EMAIL PROTECTED]
 Subject: Re: Error in wget-1.9-b5.zip
 
 
 Unfortunately, I don't know what the problem is here.  Perhaps some of
 the Windows people can take over this one?
 


RE: Wget 1.9-beta5 available for testing

2003-10-13 Thread Herold Heiko
 
 This beta includes portability tweaks and minor improvements.  Please
 test it on as many diverse platforms as possible, preferably with
 both gcc and non-gcc compilers.  If all goes well, I'd like to release
 1.9 perhaps as early as tomorrow.

Windows, msvc:

host.c
host.c(604) : error C2065: 'u_int32_t' : undeclared identifier
host.c(604) : error C2146: syntax error : missing ';' before identifier
'addr_ipv4'
host.c(604) : error C2065: 'addr_ipv4' : undeclared identifier
host.c(605) : error C2275: 'ip_address' : illegal use of this type as an
expression
host.h(74) : see declaration of 'ip_address'
host.c(605) : error C2146: syntax error : missing ';' before identifier
'addr'
host.c(605) : error C2065: 'addr' : undeclared identifier
host.c(615) : error C2146: syntax error : missing ';' before identifier
'inet_addr'

Heiko



RE: Wget 1.9-beta5 available for testing

2003-10-13 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Does it compile if you change #define HAVE_U_INT32_T 1 to #undef
 HAVE_U_INT32_T in config.h.ms?

It does.
Windows msvc binary at http://xoomer.virgilio.it/hherold

Heiko



RE: Wget 1.9-beta5 available for testing

2003-10-13 Thread Herold Heiko
C:\Programmi\Microsoft Visual Studio\VC98\Include\Native.h has:
typedef long int32_t;
However the comment for Native.h says:
// Public header for facilities provided by MSJava.dll
so I don't know if that one should be used. As a matter of fact the only
other header including Native.h is Nativcom.h:
// Public header for COM-marshaling facilities provided by MSJava.dll

Heiko


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Monday, October 13, 2003 4:02 PM
 To: Herold Heiko
 Cc: [EMAIL PROTECTED]
 Subject: Re: Wget 1.9-beta5 available for testing
 
 
 Herold Heiko [EMAIL PROTECTED] writes:
 
  From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
  Does it compile if you change #define HAVE_U_INT32_T 1 to #undef
  HAVE_U_INT32_T in config.h.ms?
 
  It does.
  Windows msvc binary at http://xoomer.virgilio.it/hherold
 
 Cool.  BTW does MSVC have int32_t?
 


windows patch for cvs

2003-10-09 Thread Herold Heiko
Changelog:

Remove fnmatch.[ch] from windows\Makefile.src, windows\Makefile.dep.

Heiko


diff -urbB wget/windows/Makefile.src wget.wip/windows/Makefile.src
--- wget/windows/Makefile.src   Thu Oct  9 07:17:09 2003
+++ wget.wip/windows/Makefile.src   Thu Oct  9 09:56:49 2003
@@ -66,13 +66,13 @@
 SRC = cmpt.c safe-ctype.c convert.c connect.c host.c http.c netrc.c \
   ftp-basic.c ftp.c ftp-ls.c ftp-opie.c getopt.c hash.c headers.c \
   html-parse.c html-url.c progress.c retr.c recur.c res.c url.c cookies.c \
-  init.c utils.c main.c version.c mswindows.c fnmatch.c gen-md5.c \
+  init.c utils.c main.c version.c mswindows.c gen-md5.c \
   gnu-md5.c rbuf.c log.c $(SSLSRC)
 
 OBJ = cmpt$o safe-ctype$o convert$o connect$o host$o http$o netrc$o \
   ftp-basic$o ftp$o ftp-ls$o ftp-opie$o getopt$o hash$o headers$o \
   html-parse$o html-url$o progress$o retr$o recur$o res$o url$o cookies$o \
-  init$o utils$o main$o version$o mswindows$o fnmatch$o gen-md5$o gnu-md5$o\
+  init$o utils$o main$o version$o mswindows$o gen-md5$o gnu-md5$o\
   rbuf$o log$o $(SSLOBJ)
 
 .SUFFIXES: .c .obj
diff -urbB wget/windows/wget.dep wget.wip/windows/wget.dep
--- wget/windows/wget.dep   Mon Apr 15 19:10:17 2002
+++ wget.wip/windows/wget.dep   Thu Oct  9 09:51:16 2003
@@ -3,11 +3,10 @@
 cmpt$o: cmpt.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h
 connect$o: connect.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
utils.h connect.h host.h
 cookies$o: cookies.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
cookies.h hash.h url.h utils.h
-fnmatch$o: fnmatch.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
fnmatch.h
 ftp-basic$o: ftp-basic.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
utils.h rbuf.h connect.h host.h ftp.h
 ftp-ls$o: ftp-ls.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
utils.h ftp.h url.h
 ftp-opie$o: ftp-opie.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
gen-md5.h
-ftp$o: ftp.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
url.h rbuf.h retr.h ftp.h connect.h host.h fnmatch.h netrc.h
+ftp$o: ftp.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
url.h rbuf.h retr.h ftp.h connect.h host.h netrc.h
 gen-md5$o: gen-md5.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
gen-md5.h
 gen_sslfunc$o: gen_sslfunc.c config.h wget.h sysdep.h mswindows.h options.h 
safe-ctype.h utils.h connect.h host.h url.h
 getopt$o: getopt.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
getopt.h
@@ -17,7 +16,7 @@
 host$o: host.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
host.h url.h hash.h
 html-parse$o: html-parse.c config.h wget.h sysdep.h mswindows.h options.h 
safe-ctype.h html-parse.h
 html-url$o: html-url.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
html-parse.h url.h utils.h
-http$o: http.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
url.h host.h rbuf.h retr.h headers.h connect.h fnmatch.h netrc.h gen-md5.h
+http$o: http.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
url.h host.h rbuf.h retr.h headers.h connect.h netrc.h gen-md5.h
 init$o: init.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
init.h host.h recur.h netrc.h cookies.h progress.h
 log$o: log.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h
 main$o: main.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
getopt.h init.h retr.h rbuf.h recur.h host.h gen_sslfunc.h getopt.h
@@ -25,10 +24,10 @@
 netrc$o: netrc.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
netrc.h init.h
 progress$o: progress.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h 
utils.h retr.h rbuf.h
 rbuf$o: rbuf.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h rbuf.h 
connect.h host.h
-recur$o: recur.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h url.h 
recur.h utils.h retr.h rbuf.h ftp.h fnmatch.h host.h hash.h
+recur$o: recur.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h url.h 
recur.h utils.h retr.h rbuf.h ftp.h host.h hash.h
 retr$o: retr.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
retr.h rbuf.h url.h recur.h ftp.h host.h connect.h hash.h
 safe-ctype$o: safe-ctype.c config.h safe-ctype.h
 snprintf$o: snprintf.c config.h safe-ctype.h
 url$o: url.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
url.h host.h hash.h
-utils$o: utils.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
fnmatch.h hash.h
+utils$o: utils.c config.h wget.h sysdep.h mswindows.h options.h safe-ctype.h utils.h 
hash.h
 version$o: version.c


RE: Bug in Windows binary?

2003-10-06 Thread Herold Heiko
 From: Gisle Vanem [mailto:[EMAIL PROTECTED]

 Jens Rösner [EMAIL PROTECTED] said:
 
...
 
 I assume Heiko didn't notice it because he doesn't have that function
 in his kernel32.dll. Heiko and Hrvoje, will you correct this ASAP?
 
 --gv

Probably.
Currently I'm compiling and testing on NT 4.0 only.
Besides that, I'm VERY tight on time at the moment, so testing usually means
"does it run? Does it download one sample http and one https site? Yes?
Put it up for testing!".

Heiko



New win binary was (RE: Compilation breakage in html-parse.c)

2003-10-06 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 
 This might be one cause for compilation breakage in html-parse.c.
 It's a Gcc-ism/c99-ism/c++-ism, depending on how you look at it, fixed
 by this patch:
 
 2003-10-03  Hrvoje Niksic  [EMAIL PROTECTED]
 
   * html-parse.c (convert_and_copy): Move variable declarations
   before statements.

Either this or another patch resolved it - I didn't have time to track it down
for good. I didn't even read the ChangeLog; just a quick export, make, minimal
test, and put it up on the site.
New msvc binary from current cvs at http://xoomer.virgilio.it/hherold
(yes, the ISP decided to change the URL; old URLs do still work).

Heiko



RE: Option to save unfollowed links

2003-10-02 Thread Herold Heiko
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, October 01, 2003 9:20 PM
 
 Tony Lewis [EMAIL PROTECTED] writes:
 
  Would something like the following be what you had in mind?
 
  301 http://www.mysite.com/
  200 http://www.mysite.com/index.html
  200 http://www.mysite.com/followed.html
  401 http://www.mysite.com/needpw.html
  --- http://www.othersite.com/notfollowed.html
 
 Yes, with the possible extensions of file name where the link was
 saved, sensible status for non-HTTP (currently FTP) links, etc.
 

The URL which contained the first encountered link to that object, all URLs
pointing to that page, number of retries used, total time needed, mean
download bandwidth...
Lots of interesting data could be logged that way. The collection of desired
fields should definitely be configurable at runtime.

Heiko



Another windows binary

2003-09-29 Thread Herold Heiko
Description and ChangeLogs at
http://space.tin.it/computer/hherold
as usual.

Heiko



RE: Wget 1.9-beta1 is available for testing

2003-09-25 Thread Herold Heiko
Hrvoje,

please add this patch:

--- wget-1.9-beta1\windows\Makefile.src Sat May 18 02:16:36 2002
+++ wget-1.9-beta1.wip\windows\Makefile.src Thu Sep 25 08:09:26 2003
@@ -63,15 +63,17 @@

 RM  = del

-SRC = cmpt.c safe-ctype.c connect.c host.c http.c netrc.c ftp-basic.c ftp.c
\
-  ftp-ls.c ftp-opie.c getopt.c hash.c headers.c html-parse.c html-url.c
\
-  progress.c retr.c recur.c res.c url.c cookies.c init.c utils.c main.c
\
-  version.c mswindows.c fnmatch.c gen-md5.c gnu-md5.c rbuf.c log.c
$(SSLSRC)
+SRC = cmpt.c safe-ctype.c convert.c connect.c host.c http.c netrc.c \
+  ftp-basic.c ftp.c ftp-ls.c ftp-opie.c getopt.c hash.c headers.c \
+  html-parse.c html-url.c progress.c retr.c recur.c res.c url.c
cookies.c \
+  init.c utils.c main.c version.c mswindows.c fnmatch.c gen-md5.c \
+  gnu-md5.c rbuf.c log.c $(SSLSRC)

-OBJ = cmpt$o safe-ctype$o connect$o host$o http$o netrc$o ftp-basic$o ftp$o
\
-  ftp-ls$o ftp-opie$o getopt$o hash$o headers$o html-parse$o html-url$o
\
-  progress$o retr$o recur$o res$o url$o cookies$o init$o utils$o main$o
\
-  version$o mswindows$o fnmatch$o gen-md5$o gnu-md5$o rbuf$o log$o
$(SSLOBJ)
+OBJ = cmpt$o safe-ctype$o convert$o connect$o host$o http$o netrc$o \
+  ftp-basic$o ftp$o ftp-ls$o ftp-opie$o getopt$o hash$o headers$o \
+  html-parse$o html-url$o progress$o retr$o recur$o res$o url$o
cookies$o \
+  init$o utils$o main$oversion$o mswindows$o fnmatch$o gen-md5$o
gnu-md5$o\
+  rbuf$o log$o $(SSLOBJ)

 .SUFFIXES: .c .obj

A windows binary ONLY (sorry, I'm in a real hurry these days) for testing is
available at
http://space.tin.it/computer/hherold
However the binaries are, as usual, at sunsite.dk, and presently anonymous
ftp access at sunsite.dk seems to be down.

Heiko 


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, September 23, 2003 11:29 PM
 To: [EMAIL PROTECTED]
 Subject: Wget 1.9-beta1 is available for testing
 
 
 After a lot of time of sitting in CVS, a beta of Wget 1.9 is
 available.  To see what's new since 1.8, check the `NEWS' file in the
 distribution.  Get it from:
 
 http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta1.tar.gz
 
 Please test it on as many different platforms as possible and in the
 places where Wget 1.8.x is currently being used.  I expect this
 release to be extremely stable, but noone can guarantee that without
 wider testing.  I didn't want to call it pre1 or rc1 lest I anger
 the Gods.
 
 One important addition scheduled for 1.9 and *not* featured in this
 beta are Mauro's IPv6 improvements.  When I receive and merge Mauro's
 changes, I'll release a new beta.
 
 As always, thanks for your help.
 


RE: Wget 1.9-beta1 is available for testing

2003-09-25 Thread Herold Heiko
Unmangled patch attached.

Heiko 


 -Original Message-
 From: Herold Heiko [mailto:[EMAIL PROTECTED]
 Sent: Thursday, September 25, 2003 10:45 AM
 To: 'Hrvoje Niksic'; [EMAIL PROTECTED]
 Subject: RE: Wget 1.9-beta1 is available for testing
 
 
 Hrvoje,
 
 please add this patch:
 
 --- wget-1.9-beta1\windows\Makefile.src Sat May 18 02:16:36 2002
 +++ wget-1.9-beta1.wip\windows\Makefile.src Thu Sep 25 
 08:09:26 2003
 @@ -63,15 +63,17 @@
 
  RM  = del
 
 -SRC = cmpt.c safe-ctype.c connect.c host.c http.c netrc.c 
 ftp-basic.c ftp.c
 \
 -  ftp-ls.c ftp-opie.c getopt.c hash.c headers.c 
 html-parse.c html-url.c
 \
 -  progress.c retr.c recur.c res.c url.c cookies.c init.c 
 utils.c main.c
 \
 -  version.c mswindows.c fnmatch.c gen-md5.c gnu-md5.c 
 rbuf.c log.c
 $(SSLSRC)
 +SRC = cmpt.c safe-ctype.c convert.c connect.c host.c http.c netrc.c \
 +  ftp-basic.c ftp.c ftp-ls.c ftp-opie.c getopt.c hash.c 
 headers.c \
 +  html-parse.c html-url.c progress.c retr.c recur.c res.c url.c
 cookies.c \
 +  init.c utils.c main.c version.c mswindows.c fnmatch.c 
 gen-md5.c \
 +  gnu-md5.c rbuf.c log.c $(SSLSRC)
 
 -OBJ = cmpt$o safe-ctype$o connect$o host$o http$o netrc$o 
 ftp-basic$o ftp$o
 \
 -  ftp-ls$o ftp-opie$o getopt$o hash$o headers$o 
 html-parse$o html-url$o
 \
 -  progress$o retr$o recur$o res$o url$o cookies$o init$o 
 utils$o main$o
 \
 -  version$o mswindows$o fnmatch$o gen-md5$o gnu-md5$o 
 rbuf$o log$o
 $(SSLOBJ)
 +OBJ = cmpt$o safe-ctype$o convert$o connect$o host$o http$o netrc$o \
 +  ftp-basic$o ftp$o ftp-ls$o ftp-opie$o getopt$o hash$o 
 headers$o \
 +  html-parse$o html-url$o progress$o retr$o recur$o res$o url$o
 cookies$o \
 +  init$o utils$o main$oversion$o mswindows$o fnmatch$o gen-md5$o
 gnu-md5$o\
 +  rbuf$o log$o $(SSLOBJ)
 
  .SUFFIXES: .c .obj
 
 A windows binary ONLY (sorry I'm in a real hurry these days) 
 for testing
 available at 
 http://space.tin.it/computer/hherold
 However the binaries are as usualy at sunsite.dk, and 
 presently anonymous
 ftp access at sunsite.dk seems to be down.
 
 Heiko 
 
 
  -Original Message-
  From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
  Sent: Tuesday, September 23, 2003 11:29 PM
  To: [EMAIL PROTECTED]
  Subject: Wget 1.9-beta1 is available for testing
  
  
  After a lot of time of sitting in CVS, a beta of Wget 1.9 is
  available.  To see what's new since 1.8, check the `NEWS' 
 file in the
  distribution.  Get it from:
  
  http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta1.tar.gz
  
  Please test it on as many different platforms as possible and in the
  places where Wget 1.8.x is currently being used.  I expect this
  release to be extremely stable, but noone can guarantee that without
  wider testing.  I didn't want to call it pre1 or rc1 
 lest I anger
  the Gods.
  
  One important addition scheduled for 1.9 and *not* featured in this
  beta are Mauro's IPv6 improvements.  When I receive and 
 merge Mauro's
  changes, I'll release a new beta.
  
  As always, thanks for your help.
  
 

--- wget-1.9-beta1\windows\Makefile.src Sat May 18 02:16:36 2002
+++ wget-1.9-beta1.wip\windows\Makefile.src Thu Sep 25 08:09:26 2003
@@ -63,15 +63,17 @@

 RM  = del

-SRC = cmpt.c safe-ctype.c connect.c host.c http.c netrc.c ftp-basic.c ftp.c \
-  ftp-ls.c ftp-opie.c getopt.c hash.c headers.c html-parse.c html-url.c \
-  progress.c retr.c recur.c res.c url.c cookies.c init.c utils.c main.c \
-  version.c mswindows.c fnmatch.c gen-md5.c gnu-md5.c rbuf.c log.c $(SSLSRC)
+SRC = cmpt.c safe-ctype.c convert.c connect.c host.c http.c netrc.c \
+  ftp-basic.c ftp.c ftp-ls.c ftp-opie.c getopt.c hash.c headers.c \
+  html-parse.c html-url.c progress.c retr.c recur.c res.c url.c cookies.c \
+  init.c utils.c main.c version.c mswindows.c fnmatch.c gen-md5.c \
+  gnu-md5.c rbuf.c log.c $(SSLSRC)

-OBJ = cmpt$o safe-ctype$o connect$o host$o http$o netrc$o ftp-basic$o ftp$o \
-  ftp-ls$o ftp-opie$o getopt$o hash$o headers$o html-parse$o html-url$o \
-  progress$o retr$o recur$o res$o url$o cookies$o init$o utils$o main$o \
-  version$o mswindows$o fnmatch$o gen-md5$o gnu-md5$o rbuf$o log$o $(SSLOBJ)
+OBJ = cmpt$o safe-ctype$o convert$o connect$o host$o http$o netrc$o \
+  ftp-basic$o ftp$o ftp-ls$o ftp-opie$o getopt$o hash$o headers$o \
+  html-parse$o html-url$o progress$o retr$o recur$o res$o url$o cookies$o \
+  init$o utils$o main$oversion$o mswindows$o fnmatch$o gen-md5$o gnu-md5$o\
+  rbuf$o log$o $(SSLOBJ)

 .SUFFIXES: .c .obj


RE: windows compile error

2003-09-18 Thread Herold Heiko
Found it.
Using the 23:00 connect.c and the 23:59 retr.c does produce the bug.
Using the 23:59 connect.c and the 23:00 retr.c works ok.
This means the problem must be in retr.c.

Heiko 


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Thursday, September 18, 2003 12:38 AM
 To: Herold Heiko
 Cc: List Wget (E-mail)
 Subject: Re: windows compile error
 
 
 Herold Heiko [EMAIL PROTECTED] writes:
 
  Repeatable, and it seems to appear with this:
 
  2003-09-15  Hrvoje Niksic  [EMAIL PROTECTED]
 
  * retr.c (get_contents): Reduce the buffer size to the amount of
  data that may pass through for one second.  This prevents long
  sleeps when limiting bandwidth.
 
  * connect.c (connect_to_one): Reduce the socket's RCVBUF when
  bandwidth limitation to small values is requested.
 
  Previous checkout (checkout -D 23:30 15 sep 2003) wget works fine.
  I also found a public site which seems to expose the 
 problem (at least from
  my machine):
  wget -dv https://www.shavlik.com/pHome.aspx
  dies after
  DEBUG output created by Wget 1.9-beta on Windows.
 [...]
 
 Herold, I'm currently having problems obtaining a working SSL build,
 so I'll need your help with this.
 
 Notice that the above change in fact consists of two changes: one to
 `retr.c', and the other to `connect.c'.  Please try to figure out
 which one is responsible for the crash.  Then we'll have a better idea
 of what to look for.
 


RE: windows compile error

2003-09-18 Thread Herold Heiko
1), 1a), 2) no, no and no.

Heiko


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Thursday, September 18, 2003 12:16 PM
 To: Herold Heiko
 Cc: List Wget (E-mail)
 Subject: Re: windows compile error
 
 
 Herold Heiko [EMAIL PROTECTED] writes:
 
  Found it.
  Using the 23:00 connect.c and the 23:59 retr.c does produce the bug.
  Using the 23:59 connect.c and the 23:00 retr.c works ok.
  This means the problem must be in retr.c .
 
 OK, that narrows it down.  Two further questions:
 
 1) If you comment out lines 180 and 181 of retr.c, does the problem go
away?
 
 1a) How about if you replace line 181 with `dlbufsize = 
 sizeof(dlbuf)'?
 
 2) Do you even specify --limit-rate?  If so, to what size?
 


RE: windows compile error

2003-09-18 Thread Herold Heiko
Works.
New windows test binary at the usual place.
Heiko


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Thursday, September 18, 2003 1:43 PM
 To: Herold Heiko
 Cc: List Wget (E-mail); [EMAIL PROTECTED]
 Subject: Re: windows compile error
 
 
 I've noticed the mistake as soon as I compiled with SSL (and saw the
 warnings):
 
 2003-09-18  Hrvoje Niksic  [EMAIL PROTECTED]
 
   * retr.c (get_contents): Pass the correct argument to ssl_iread.
 
 Index: src/retr.c
 ===
 RCS file: /pack/anoncvs/wget/src/retr.c,v
 retrieving revision 1.57
 diff -u -r1.57 retr.c
 --- src/retr.c2003/09/15 21:48:43 1.57
 +++ src/retr.c2003/09/18 11:41:56
 @@ -191,7 +191,7 @@
   ? MIN (expected - *len, dlbufsize) 
 : dlbufsize);
  #ifdef HAVE_SSL
 if (rbuf->ssl!=NULL)
 - res = ssl_iread (rbuf->ssl, dlbufsize, amount_to_read);
 + res = ssl_iread (rbuf->ssl, dlbuf, amount_to_read);
else
  #endif /* HAVE_SSL */
   res = iread (fd, dlbuf, amount_to_read);
 
 


protocols directories ?

2003-09-18 Thread Herold Heiko
I'd like to stir up again an old (unresolved) problem regarding the
directory structure used to save files.
Currently if we have a host.site.domain with several services (say, http,
https and ftp) and run one or multiple downloads we could have collisions due
to files with the same name downloaded with different protocols.

Consider a http://www.some.site/index.html with links to
https://www.some.site/index.html (and why not, even to
ftp://www.some.site/index.html), all three files different. Currently wget
will try to save all three files to ./www.some.site/index.html (usually).

Solution 1: have a switch like --use-protocol-dir = [no|most|all]

"no" would be the current state:
./www.some.site/index.html
./www.some.site/index.html
./www.some.site/index.html

"all" would be: always add a directory level for the protocol:
./http/www.some.site/index.html
./https/www.some.site/index.html
./ftp/www.some.site/index.html

"most" would be mixed: http (used most) without a protocol directory, other protocols with one:
./www.some.site/index.html
./https/www.some.site/index.html
./ftp/www.some.site/index.html)

However I think the third solution would be ugly and confusing. The second
would solve the problem, but most people won't like the extra directory, so
probably the default should still be "no".

Solution 2 would be not adding the protocol as an extra directory level,
but embedding it in the directory name. Really ugly and confusing however
it is implemented (for example, is ./httpss.domain.org/ a
http://ss.domain.org or a https://s.domain.org?), but another possibility.

This whole thing was already discussed some time ago; in the end the
behavior was left unchanged. I still think an option would be the right
approach.

Heiko



RE: windows compile error

2003-09-17 Thread Herold Heiko
It does compile now, but I managed to produce an application error during a
test run on an https site.

I produced a debug build with /DDEBUG /Zi /Od /Fd /FR and generated the
wget.bsc by running bscmake on all the .sbr files, but I don't yet
understand how to use it in VC++ in order to get a meaningful stack
trace and so on.
The only thing I got so far is "SSLEAY32! 0023ca38()" as the breaking
point.
Does anybody know how to debug this beast, or how to generate a working
project file in order to make the source browser work?

Heiko Herold


 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, September 16, 2003 11:02 PM
 To: Herold Heiko
 Cc: List Wget (E-mail)
 Subject: Re: windows compile error
 
 
 Herold Heiko [EMAIL PROTECTED] writes:
 
  Just a quick note, the current cvs code on windows during 
 compile (with
  VC++6) stops with
 
  cl /I. /DWINDOWS /D_CONSOLE /DHAVE_CONFIG_H 
 /DSYSTEM_WGETRC=\wgetrc\
  /DHAVE_SSL /nologo /MT /W0 /O2 /c utils.c
  utils.c
  utils.c(1651) : error C2520: conversion from unsigned 
 __int64 to double not
  implemented, use signed __int64
 
  The culprit seems to be (in wtimer_sys_diff)
 
  #ifdef WINDOWS
  return (double)(wst1->QuadPart - wst2->QuadPart) / 1;
  #endif
 
 Does this patch help?
 
 2003-09-16  Hrvoje Niksic  [EMAIL PROTECTED]
  
   * utils.c (wtimer_sys_diff): Convert the time difference to signed
   __int64, then to double.  This works around MS VC++ 6 which can't
   convert unsigned __int64 to double directly.
 
 Index: src/utils.c
 ===
 RCS file: /pack/anoncvs/wget/src/utils.c,v
 retrieving revision 1.54
 diff -u -r1.54 utils.c
 --- src/utils.c   2003/09/15 21:14:15 1.54
 +++ src/utils.c   2003/09/16 21:01:02
 @@ -1648,7 +1648,10 @@
  #endif
  
  #ifdef WINDOWS
 -  return (double)(wst1->QuadPart - wst2->QuadPart) / 1;
 +  /* VC++ 6 doesn't support direct cast of uint64 to double.  To work
 + around this, we subtract, then convert to signed, then finally to
 + double.  */
 +  return (double)(signed __int64)(wst1->QuadPart - wst2->QuadPart) / 1;
  #endif
  }
  
 


RE: windows compile error

2003-09-17 Thread Herold Heiko
Repeatable, and it seems to appear with this:

2003-09-15  Hrvoje Niksic  [EMAIL PROTECTED]

* retr.c (get_contents): Reduce the buffer size to the amount of
data that may pass through for one second.  This prevents long
sleeps when limiting bandwidth.

* connect.c (connect_to_one): Reduce the socket's RCVBUF when
bandwidth limitation to small values is requested.

Previous checkout (checkout -D 23:30 15 sep 2003) wget works fine.
I also found a public site which seems to expose the problem (at least from
my machine):
wget -dv https://www.shavlik.com/pHome.aspx
dies after
DEBUG output created by Wget 1.9-beta on Windows.

--17:23:55--  https://www.shavlik.com/pHome.aspx
   = `www.shavlik.com/pHome.aspx'
Resolving www.shavlik.com... 65.173.207.46
Caching www.shavlik.com = 65.173.207.46
Connecting to www.shavlik.com[65.173.207.46]:443... connected.
Created socket 112.
Releasing 009D00D0 (new refcount 1).
---request begin---
GET /pHome.aspx HTTP/1.0
User-Agent: Wget/1.9-beta
Host: www.shavlik.com
Accept: */*
Connection: Keep-Alive
Pragma: no-cache

---request end---
HTTP request sent, awaiting response... HTTP/1.1 200 OK
Cache-Control: private
Content-Length: 14115
Content-Type: text/html; charset=utf-8
X-Powered-By: ASP.NET
X-AspNet-Version: 1.1.4322
Server: WEBSERVER
Date: Wed, 17 Sep 2003 15:27:28 GMT
Connection: keep-alive


Found www.shavlik.com in host_name_addresses_map (009D00D0)
Registered fd 112 for persistent reuse.
Length: 14,115 [text/html]

 0% [ ] 0 --.--K/s


Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, September 17, 2003 12:44 PM
 To: Herold Heiko
 Cc: List Wget (E-mail)
 Subject: Re: windows compile error
 
 
 Herold Heiko [EMAIL PROTECTED] writes:
 
  Does compile now, but I managed to produce an application 
 error during a
  test run on a https site.
 
  I produced a debug build with /DDEBUG /Zi /Od /Fd /FR and 
 produced the
  wget.bsc by running bscmake on all the sbr files, but I didn't yet
  understand how to use that one in VC++ in order to get a 
 meaningful stack
  trace and so on.
  The only thing I got for now is :SSLEAY32! 0023ca38() as 
 the breaking
  point.
 
 It sounds like an https thing.
 
 Is the error repeatable?  If so, can you repeat it an earlier CVS
 snapshot?
 


windows compile error

2003-09-16 Thread Herold Heiko
Just a quick note: the current cvs code on windows stops during compilation
(with VC++6) with

cl /I. /DWINDOWS /D_CONSOLE /DHAVE_CONFIG_H /DSYSTEM_WGETRC=\wgetrc\
/DHAVE_SSL /nologo /MT /W0 /O2 /c utils.c
utils.c
utils.c(1651) : error C2520: conversion from unsigned __int64 to double not
implemented, use signed __int64

The culprit seems to be (in wtimer_sys_diff)

#ifdef WINDOWS
  return (double)(wst1->QuadPart - wst2->QuadPart) / 1;
#endif

But this isn't really my area, anyone ?

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


New windows binary

2003-09-15 Thread Herold Heiko
http://space.tin.it/computer/hherold

New windows binary for the current 1.9-dev.
Be sure to get the correct ssl libraries (as linked in the description) if
you didn't get them before (that is, if you did not use my previous
1.9-dev-unoff binary, every other binary used the older libraries).

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


Windows filename patch

2003-09-09 Thread Herold Heiko
Hrvoje,

could you please check the thread Windows filename patch for 1.8.2 from
around 24-05-2002 (Hack Kampbjørn, Ian Abbott) ?
That patch (url.c) got committed to the 1.8 branch but not to the 1.9
branch.
Also, it is comprised of two parts, the first one:

@@ -1273,7 +1273,12 @@
   if (u->port != scheme_default_port (u->scheme))
{
  int len = strlen (dirpref);
+#if WINDOWS || __CYGWIN__
+ /* Use '_' instead of ':' here for Windows. */
+ dirpref[len] = '_';
+#else
  dirpref[len] = ':';
+#endif
  number_to_string (dirpref + len + 1, u->port);
}
 }

should still apply cleanly to url.c around line 1406, but I'm not sure where
the other part:

   if (query && to - result < sizeof (result))
 {
+#if WINDOWS || __CYGWIN__
+  /* Temporary fix.  Use '@' instead of '?' here for Windows. */
+  *to++ = '@';
+#else
   *to++ = '?';
+#endif
 
   /* Copy QUERY to RESULT and encode all '/' characters. */
   from = query;

should go now, or even whether it is still needed. At that time Hack said:

 When generating filenames from URLs with queries, Wget puts a '?'
 character in the filename, which is an illegal filename character for
 Windows.  This patch changes that character to '@' when compiled for
 Windows.


Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Windows filename patch

2003-09-09 Thread Herold Heiko
 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, September 09, 2003 3:00 PM
 To: Herold Heiko
 Cc: [EMAIL PROTECTED]; 'Hack Kampbjørn'
 Subject: Re: Windows filename patch
 
 
 Herold Heiko [EMAIL PROTECTED] writes:
 
  could you please check the thread Windows filename patch for 1.8.2
  from around 24-05-2002 (Hack Kampbjørn, Ian Abbott) ?  That patch
  (url.c) got committed to the 1.8 branch but not to the 1.9 branch.
  Also, it is comprised of two parts, the first one:
 
 Part of the reason it wasn't applied was that I wanted to fix the
 problem properly for 1.9.  I guess I could apply your patch now and
 remove it if/when the proper fix is in place.
 

Not mine, but I agree, a temporary fix is better than nothing, without that
one currently files are lost.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


RE: Download an article for off-line browsing problem

2003-09-02 Thread Herold Heiko
At http://space.tin.it/computer/hherold/ you can find some examples in the
minihowto section; get wgetbatch.zip. Although those are windows batch
files, the wget syntax is the same - in this case gettree.cmd should do the
trick.
That would be something like
wget -vkKrp -l0 -np 
although, at least due to the presence of javascript in that page, complete
offline browsing won't work (the banner ads still reference the real
locations).

You still should have the complete wget manual in your distribution,
probably as info files.
Otherwise you could get the source (see http://wget.sunsite.dk/ for
instructions), get the source of the manual page only (via anonymous cvs,
explained on the previous page, or from 
http://cvs.sunsite.dk/viewcvs.cgi/wget/doc/wget.texi?only_with_tag=branch-1_
8_2 (mind the wrap, will need to be converted from tex), or get my windows
binary package from http://space.tin.it/computer/hherold/ which includes the
complete manual as html file and windows help file.

I really suppose the complete manual is included in RH8 but don't have one
handy. Anyone ?

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Stephen Liu [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, September 02, 2003 6:35 AM
 To: [EMAIL PROTECTED]
 Subject: Download an article for off-line browsing problem
 
 
 Hi all folks,
 
 RH8.0
 wget-1.8.2-5
 
 
 I just joined this list.  I encountered a problem using the
 following syntax 
 $ wget --no-parent
 http://www.linuxplanet.com/linuxplanet/tutorials/3174/1
 $ wget --no-parent
 http://www.linuxplanet.com/linuxplanet/tutorials/3174/1/
 $ wget http://www.linuxplanet.com/linuxplanet/tutorials/3174/1
 $ wget -r --no-parent
 http://www.linuxplanet.com/linuxplanet/tutorials/3174/1
 $ wget -r --no-parent
 http://www.linuxplanet.com/linuxplanet/tutorials/3174/1/
 $ wget -Sr --no-parent
 http://www.linuxplanet.com/linuxplanet/tutorials/3174/1
 etc.
 
 to download
 http://www.linuxplanet.com/linuxplanet/tutorials/3174/1
 (Linux Partition HOWTO which consists several pages) for off-line
 browsing.
 
 Either only index.html or all files were downloaded.  Starting with the
 1st page was no problem.  But on clicking the Next: Section 1:
 Introduction etc. link to the 2nd page, the following warning popped up
 
 ***
 Unable to run the command specified. The file or directory
 file:/linuxplanet/tutorials/3174/1/ does not exist.
 ***
 
 Kindly advise what will be the correct syntax to download all pages of
 an Article/URL for off-line browsing.
 
 I could not find answer from man wget and from examples on following
 link as well
 
 http://www.lns.cornell.edu/public/COMP/info/wget/wget_7.html
 
 Where can I find a comprehensive examples of wget application?
 
 Thanks in advance.
 
 B.Regards
 Stephen Liu
 
 
 To Get Your Own iCareHK.com Email Address?  Go To www.iCareHK.com.
 


RE: autoconf 2.5x and automake support for wget 1.9 beta

2003-09-02 Thread Herold Heiko
Did you know wget currently lacks a maintainer?

If you have time to perform the enhancements you listed, you should
probably apply as (at least temporary) maintainer.
This wouldn't mean you would be forced to devote more time to wget than you
can and want to, or tackle issues you really aren't interested in -
but the community would still be grateful.

Even a maintainer only partially present would be better than the current
hiatus and a branch.
On the other hand, if you could/would take over full wget development that
would be even better.

Heiko Herold

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
 Sent: Monday, September 01, 2003 7:20 PM
 To: [EMAIL PROTECTED]; [EMAIL PROTECTED]; Deep Space 6 Development
 Mailing List
 Subject: autoconf 2.5x and automake support for wget 1.9 beta
 
 
 
 hi to everybody,
 
   i have just imported into the deepspace6.net CVS repository a
 modified version of wget 1.9 beta:
 
 http://cvs.deepspace6.net/view/wget/
 
 i have fixed a bug in the parsing of urls with embedded ipv6 addresses
(the previous check was not exhaustive, the new code is taken 
 from glibc
 2.3.2 and modified) and i have repackaged wget 1.9 beta to make use of
 autoconf 2.57, automake >= 1.6, gettext 0.12.1 and libtool.
 
 you can access the CVS repository from the handy ViewCVS 
 interface at the
 URL above, or via anonymous CVS, following the instructions below:
 
 http://www.deepspace6.net/sections/cvs.html
 
 to build the package from the CVS snapshot you will need 
 autoconf 2.57 or
 better, automake 1.6 or better, gettext 0.12.1 (yes, exactly 
 0.12.1) and a
 recent libtool (versions >= 1.4.0 should work).
 
 just type:
 
 make -f Makefile.cvs prep
 
 and then build wget as usual:
 
 ./configure && make
 su
 make install
 
 you're encouraged to perform the build from the CVS snapshot, but if
 there is a particular reason for which you just don't want to 
 install/use
 the GNU autotools, or if you're simply just too lazy, you may 
 want to try
 one of these tarballs instead:
 
 ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.9-beta-ma
uro.tar.gz
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.9-beta-mauro.tar.bz2

i hope that you will like the new autoconf-2.5x/automake-1.6-ed package.
i also hope that the wget maintainer will consider accepting this patch.

in the meanwhile, i am going to add autoconf checks for IPv6 support
to my wget tree above, just like what i've done with oftpd:

http://cvs.deepspace6.net/view/oftpd/

other things in my TODO:

autoconf:

 - use AC_SYS_LARGEFILE in configure.ac?
 - handle automatic de-ansi-fication better or get rid of it
   (i would prefer the second choice, as i like the autobook approach:
http://sources.redhat.com/autobook/autobook/autobook_51.html)
 - handle md5 support with AC_LIBOBJs instead of libmd5?
 - check build process on windows:
   a) fix broken windows/config.h.ms?
   b) update windows/config.h.*
   c) verify windows/Makefile*
   d) check with DJGPP
   e) check with cygwin
   f) check with gnuwin32
 - check opie, ssl and md5 support

C code:

 - fix gcc >= 3.2 warnings
 - check IPv6 support
 - fix uncorrected bugs previously reported on the [EMAIL PROTECTED]
   mailing list
 - check FTP mirroring via HTTP proxy (i had problems with 1.8.2)

please, feel free to send me feedback and bug reports ;-)



BTW: in the same CVS repository:

http://cvs.deepspace6.net/view/wget-1.8/

you can find an IPv6-enabled version of wget 1.8.2. beware, the code is
still incomplete and has not been fully tested, yet.

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi [EMAIL PROTECTED]
[EMAIL PROTECTED]
[EMAIL PROTECTED]
Deep Space 6 - IPv6 with Linux  http://www.deepspace6.net
Ferrara Linux User Grouphttp://www.ferrara.linux.it



RE: Win32 binary without FTP vulnerability

2003-08-29 Thread Herold Heiko
Either I didn't understand your point, or possibly you didn't follow the
problem.

The vulnerability in question happens when wget tries to get a file with
ftp, named for example "file", and the (rogue) ftp server instead returns
a file named "../../../../../../../../etc/passwd" or similar.

On Windows, beside that, we should also check whether the returned filename
is (for example) c:/winnt/calc.exe or c:\winnt\calc.exe, which would be
interpreted as a relative directory on unix (no harm, isn't checked
currently) but is an absolute directory path on windows (attack possibility).
Similarly, we need to check whether possible rogue filenames like
\\some\thing\here could be harmful.

This is different from "wget -O \\server\share\dir" or similar.

Heiko Herold

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Winter Christopher [mailto:[EMAIL PROTECTED]
 Sent: Thursday, August 28, 2003 8:50 AM
 To: [EMAIL PROTECTED]
 Subject: AW: Win32 binary without FTP vulnerability
 
 
 Hello Heiko,
 
 in that case you'll lose the ability to write to network shares,
 which don't have a ':' but normally a '/' in that place.
 
 Regards,
 
 Christopher
 
 -Ursprüngliche Nachricht-
 Von: Herold Heiko [mailto:[EMAIL PROTECTED]
 Gesendet am: Montag, 25. August 2003 15:46
 An: [EMAIL PROTECTED]
 Betreff: RE: Win32 binary without FTP vulnerability
 
 However, as a matter of fact, that could still suffer from a similar
 direct drive access bug (instead of dot dot, use driveletter:). I don't
 think anybody ever checked whether in that case access to an absolute
 path on another drive would be possible, or whether that would be
 thwarted later by the file renaming routine (which would change ':' to
 '@').
 
 I've always wanted to implement a small additional path for 
 that but never
 did it since I don't have a patched rogue ftp server handy to test it.
 
 What would be needed to be patched is has_insecure_name_p() 
 in fnmatch.c,
 #ifdef WINDOWS check if the second character is ':' .
 
 Heiko
 


RE: Bug in total byte count for large downloads

2003-08-26 Thread Herold Heiko
Wget 1.5.3 is ancient.
You would be well advised to upgrade to the current stable version (1.8.2)
or better the latest development version (1.9beta), even if wget is currently
in development stasis due to lack of a maintainer.
You can find more information how to get the sources at
http://wget.sunsite.dk/
There are about 35 user visible changes mentioned in the news file after
1.5.3, so take a look at that before upgrading.
Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Stefan Recksiegel 
 [mailto:[EMAIL PROTECTED]
 Sent: Monday, August 25, 2003 6:49 PM
 To: [EMAIL PROTECTED]
 Subject: Bug in total byte count for large downloads
 
 
 Hi,
 
 this may be known, but
 
 [EMAIL PROTECTED]:/scratch/suse82 wget --help
 GNU Wget 1.5.3, a non-interactive network retriever.
 
 gave me
 
 FINISHED --18:32:38--
 Downloaded: -1,713,241,830 bytes in 5879 files
 
 while
 
 [EMAIL PROTECTED]:/scratch/suse82 du -c
 6762560 total
 
 would be correct.
 
 Best wishes,  Stefan
 
 -- 
 
 * Stefan Recksiegelstefan AT recksiegel.de *
 * Physikdepartment T31 office +49-89-289-14612 *
 * Technische Universität München home +49-89-9547 4277 *
 * D-85747 Garching, Germanymobile +49-179-750 2854 *
 
 
 


RE: help with wget????

2003-08-25 Thread Herold Heiko
http://space.tin.it/computer/hherold

Read the instructions (the mini HOWTO quick start will be ok).
You should probably get the 1.8.2 binary and the relevant ssl libraries
mentioned in the same paragraph.
If you got binaries from ftp://ftp.sunsite.dk/projects/wget/windows/ you'll
see an index.html and 00Readme.txt directing you to the above site for the
descriptions of the files.
If you got a binary from somewhere else I can't help you, sorry.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Shell Gellner [mailto:[EMAIL PROTECTED]
 Sent: Thursday, August 14, 2003 12:33 AM
 To: [EMAIL PROTECTED]
 Subject: help with wget
 
 
 Dear Sirs,
 
 I've downloaded the GNU software but when I try to run the 
 WGET.exe file
 it keeps telling me 'is linked to missing export 
 LIBEAY32.DLL:3212' and
 also ' A device attached to the system is not working'.
  
  I've scoured the different links trying to find some help 
 with the setup
 but can find nothing.  Can you help??  Is there a source that 
 covers setup of this program
 for beginners like me?
 
 I'd really appreciate your help.
 Shell
 
 Guitar Musician
 http://www.guitarmusician.com
 


RE: Win32 binary without FTP vulnerability

2003-08-25 Thread Herold Heiko
However, as a matter of fact, that could still suffer from a similar direct
drive access bug (instead of dot dot, use driveletter:). I don't think
anybody ever checked whether in that case access to an absolute path on
another drive would be possible, or whether that would be thwarted later by
the file renaming routine (which would change ':' to '@').

I've always wanted to implement a small additional path for that but never
did it since I don't have a patched rogue ftp server handy to test it.

What would need to be patched is has_insecure_name_p() in fnmatch.c:
#ifdef WINDOWS, check if the second character is ':'.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Doug Kaufman [mailto:[EMAIL PROTECTED]
 Sent: Sunday, August 17, 2003 6:51 AM
 To: Vesselin Peev
 Cc: [EMAIL PROTECTED]
 Subject: Re: Win32 binary without FTP vulnerability
 
 
 On Sun, 17 Aug 2003, Vesselin Peev wrote:
 
  I have previously looked at the same downloads.
  However, the security advisory is dated December 2002, 
 while the 1.8.2
  version I downloaded from Heiko Herold's wget sport is 
 dated 2002/05/29. Is
 
 You need to download the 1.9 beta version. It is available there.
  Doug
 
 
 -- 
 Doug Kaufman
 Internet: [EMAIL PROTECTED]
 


RE: FTP Change Directories?

2003-07-29 Thread Herold Heiko
If I remember correctly this has been corrected in recent versions, but I
don't remember when, sorry.
Try wget 1.8 or 1.9-dev.
See http://wget.sunsite.dk/wgetdev.html#development for getting the
development sources from cvs, or grab a copy of the windows binary from
http://space.tin.it/computer/hherold/ (read the table and take the correct
ssl libraries) or get the (zipped) sources from the same place. Dos2unix
them if you plan to use those zipped sources on unix.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: !jeff!{InterVerse} [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, July 29, 2003 4:34 PM
 To: [EMAIL PROTECTED]
 Subject: FTP Change Directories?
 
 
 Hi,
 
 I am using wget to download a file on an ftp server.  the FTP 
 server logs 
 me into /incoming, but the file is in /outbound.
 
 when I try this...
 
 ftp://user:[EMAIL PROTECTED]/outbound/myfile.txt
 
 I get
 unknown directory /incoming/outbound/
 
 How do I tell wget to go UP a directory and then into outbound?
 
 TIA,
 Jeff
 


RE: WELCOME to wget@sunsite.dk

2003-07-22 Thread Herold Heiko
Did you know you can get ntp software (server/client) for windows? See
http://www.ntp.org , get the source and compile it or grab a binary from the
links page http://www.ntp.org/links.html .

Then install the server on a windows or unix pc with access to the internet,
configured to synchronize the clock with a free public ntp server on
the internet  - a good list is here:
http://www.eecis.udel.edu/~mills/ntp/clock2a.html .
Then sync the as400 clock to that pc, periodically with ntpdate or, if
possible, in the proper way: with another ntp server installed on the as400
and configured to sync to the pc (if ntp is available on the as400,
that is).

Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Mr. Wilson [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, July 22, 2003 11:54 AM
 To: [EMAIL PROTECTED]
 Subject: RE: WELCOME to [EMAIL PROTECTED]
 
 
 You can use ntpd or ntpdate to connect to this server.
 At first glance (I will do a bigger read later) this will not help me.
 
 The reason it will not help and why I use wget is this:
 I am in charge of various aspects of a mid-range computer 
 called an AS/400
 (IBM) due to company policy this machine is allowed NO direct 
 connection to
 the outside world, no VPN, no www, no HTML server etc. I have 
 been asked by
 the auditors to automate a 'check' on the system time so that it is
 accurate. I am allowed access to a PC that has www access via 
 *REXEC (note:
 PC control and access is monitored by a separate division).  
 I am allowed to
 run commands to the www (once they are approved by PC Admin) 
 as long as they
 have NO effect on the PC involved. OK, still with me :) this 
 means that a
 program like wget is perfect. It allows me to issue commands 
 via *REXEC to
 scrape sites and bring back that info to the 400 in a text 
 file. I then
 write a string extractor and compare the time etc. Thus my question.
 
 I apologise for being verbose but perhaps an example of how 
 wget is used by
 me in the bureaucratic world of computer management would be 
 of interest to
 you all.
 
 -Original Message-
 From: Nicolas Schodet [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, 22 July 2003 5:36 AM
 To: [EMAIL PROTECTED]
 Subject: Re: WELCOME to [EMAIL PROTECTED]
 
 
 * Mr. Wilson [EMAIL PROTECTED] [030718 14:56]:
  I am running the wget (freeware batch web scraping) program 
 on the NTP
 time
  server at http://132.163.4.101:14
  and can not scrape this site. I have tried with the proxy 
 setting both on
  and off but I get the following error code.
  ( wget -Oh.txt -Yoff http://132.163.4.101:14 ) gives
  Connecting to 132.163.4.101:14... connected!
  HTTP request sent, fetching headers... done.
  22:23:21 ERROR -1: Malformed status line.
  I have 2 questions:
  1: even if the ntp site above has no html headers etc, why 
 doesn't this
  work?
  2: how can I make it work.
 
 GNU Wget is a [1]free software package for retrieving files using
 [2]HTTP, HTTPS and FTP.
 
 
  [1] GNU Wget is free software, not freeware, see the page below to
  learn the difference :
 
  http://www.gnu.org/philosophy/free-sw.html
 
  [2] GNU Wget only retrieves files using HTTP, HTTPS and FTP. 
 Your date
  server is likely to use NTP, so wget can not connect to it 
 to retrieve
  the date. You can use ntpd or ntpdate to connect to this server.
 
 Ni.
 
 --
 Pouvez-vous faire confiance a votre ordinateur ?
 http://www.gnu.org/philosophy/can-you-trust.fr.html
 
 


RE: Feature Request: Fixed wait

2003-06-18 Thread Herold Heiko
Quite the opposite at that time, wait was used for retries and between
normal connections, so a high wait time (avoid hammering) meant slow
downloads even for working connections.
So the idea at that time was having a possibility of wait 0 (between normal
connections) and waitretry 0..x (used between retries, in order to avoid
hammering).

If I remember correctly you should be able to explicitly set wait to some
value and waitretry to 0; in that case it should use the wait time only (but
I can't test that just now, so do check).
Hmm, the code says:
if (opt.waitretry && count > 1)
wait `count' seconds, maxing out at opt.waitretry
else if (opt.wait)
wait opt.wait
so I think that's correct (but still, test it, as I didn't just now).

Now, if you want one fixed time between normal downloads and another fixed
time between retries... that is not available currently, but is probably
trivial to add - just look for "wait" in options.h, main.c, init.c, retr.c
and change what you need.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Wu-Kung Sun [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, June 17, 2003 11:13 PM
 To: Aaron S. Hawley
 Cc: [EMAIL PROTECTED]
 Subject: Re: Feature Request: Fixed wait
 
 
 --- Aaron S. Hawley [EMAIL PROTECTED] wrote:
  how is your request different than --wait ?
  
 
 I'm not in position to verify right now and it's been
 a while since I really knew the ins and outs of wget. 
 But IIRC, --wait is only the time between getting
 files and not the time between initial connection
 attempts (and maybe time between broken connections?).
  Hence, --waitretry was added.
 
 __
 Do you Yahoo!?
 SBC Yahoo! DSL - Now only $29.95 per month!
 http://sbc.yahoo.com
 


RE: request for features

2003-06-18 Thread Herold Heiko
Just a quick note regarding the "trash at end of file" problem: usually that
means a broken/braindead proxy (possibly transparent), not a wget fault.
For the rest, don't expect too much; currently wget is in stasis for lack of
an active maintainer.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: dEth [mailto:[EMAIL PROTECTED]
 Sent: Wednesday, June 18, 2003 2:46 AM
 To: [EMAIL PROTECTED]
 Subject: request for features
 
 
 Hi all!
 
 My situation: a modem on an unlimited night dialup access.
 About 4 people edit their url.txt files to post there files they want.
 Then I use cron to divide 9.5 hours of online-time to download all
 these files.
 
 I face some problems and need some features:
 If there's a working copy of wget in memory and you try to exec a
 second one with all of the keys the same as in the first copy, the second
 copy must just exit with an error message. Here in Russia we call it
 "stupid user protection".
 
 Seems like if wget was killed using kill -KILL, the tail of the
 downloaded file is being filled with trash (i'll test it today more
 precisely). This could be solved in many ways:
 - correct termination on receiving SIGTERM (now I use KILL because
   there's no reaction on plain `kill ID`);
 - cropping the tail of the file if wget was started with -c and the
   file isn't finished yet
 
 It would also be nice to see a couple of these features:
 - reading command-line options from specified file... now I use to put
   a special script that generates a really long command 
 lines, i can see
   almost nothing using `ps aux`. if we make it, it'll be enough to put
   just `wget -C /path/user1.conf`; wgetrc are very different thing.
 - temporary rename files that are not finished yet. GetRight for Win
   uses this way, it adds .GetRight extension for such files so you can
   always see what's ready.
   
 Don't say I need too much. I actually can make almost all of this using
 sh-scripts and perl, but writing feature implementations is not the thing
 that a user has to do.
 
 I also program some C, but.. maybe there are some guys who are already
 inside-the-code and don't need time to look around inside the
 sources?
 -- 
 Best regards,
  dEth
 


RE: Downloading from a Site, not a Subdirectory

2003-06-16 Thread Herold Heiko
Well, since you are using -np it shouldn't be very difficult to add a -l0.
Or, look at the structure of the pages and try to understand why wget
doesn't download anymore.
Or, run wget with -d -a wget.log and check the logfile afterwards in order
to understand EXACTLY why wget stopped, but you'll have to wade through a
lot of output.
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: Kyle Lundstedt [mailto:[EMAIL PROTECTED]
 Sent: Monday, June 16, 2003 2:11 AM
 To: [EMAIL PROTECTED]
 Subject: Downloading from a Site, not a Subdirectory
 
 
 Hi,
 I'm trying to mirror a site which includes a large number of PDF
 files.  If I use the following command:
  
 wget -m -A pdf -np http://faculty.haas.berkeley.edu/wallace/
  
 I'm able to obtain all of the PDF files in the wallace directory and
 its subdirectories.
 However, I'm interested in obtaining all of the PDF files on the
 faculty.haas.berkeley.edu site.
 When I use the following command:
  
 wget -m -A pdf -np http://faculty.haas.berkeley.edu/
  
 I don't get any PDF files at all.  Anyone know how I can make 
 this work?
  
 Thanks,
 Kyle
 


RE: Only get the index page

2003-06-13 Thread Herold Heiko
Try wget -v and check the output.
Try wget -d and check the output.
Check the wget version (1.8.2 or 1.9-beta is recommended).
For me the usual 
wget -vkKrp -l0 -np 
did the trick. 
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
 Sent: Thursday, June 12, 2003 11:32 AM
 To: [EMAIL PROTECTED]
 Subject: Only get the index page
 
 
 Moin,
 
 I tried to mirror a website with «wget -m 
 http://www.mteege.de» but I only
 get index.html. If I try «wget -m http://www.gnu.org» wget 
 mirrors the
 complete site.
 
 What am I missing?
 
 I use GNU Wget 1.8.2 und FreeBSD.
 
 Many thanks
 Matthias
  
 


RE: i tried to run the new wget for windows and this is what i got

2003-06-04 Thread Herold Heiko
If you got that binary from my site you should have read the relevant
description.
So you'd have downloaded and installed the correct ssl libraries.
If you've got it from somewhere else contact who provided that binary.
Beside that, sending a screenshot in order to transmit a simple text error
message usually is considered rude.

Heiko 

 -Original Message-
 From: Ernst, Yehuda [mailto:[EMAIL PROTECTED]
 Sent: Tuesday, June 03, 2003 11:15 AM
 To: [EMAIL PROTECTED]
 Subject: i tried to run the new wget for windows and this is 
 what i got
 
 
  ole0.bmp 
 
 
 any ides?
 **
 *
 Information contained in this email message is intended only 
 for use of the individual or entity named above. If the 
 reader of this message is not the intended recipient, or the 
 employee or agent responsible to deliver it to the intended 
 recipient, you are hereby notified that any dissemination, 
 distribution or copying of this communication is strictly 
 prohibited. If you have received this communication in error, 
 please immediately notify the [EMAIL PROTECTED] and 
 destroy the original message.
 **
 *
 

