Patches for building Wget with Mingw32

2003-07-29 Thread MadDog1202
I have built Wget on Mingw32 using MSYS to run configure.
I had to edit the Makefiles manually, and (since I have Emacs) I am using
the Info documentation.
The changes to the Makefile consisted of changing DEFS to "-DWINDOWS 
-DHAVE_CONFIG_H" and adding "-lwsock32" to LIBS.
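For reference, the two Makefile edits described above amount to the following (only these two variables change; the rest of the generated src/Makefile is left as configure produced it):

```make
DEFS = -DWINDOWS -DHAVE_CONFIG_H
# -lwsock32 is appended to whatever LIBS already held
LIBS = -lwsock32
```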
Below is a unified diff of my changes to the code (I hope AOL doesn't mangle
this):

--- src/ftp.c.orig  2002-05-17 22:05:16.0 -0500
+++ src/ftp.c   2003-07-28 13:55:50.0 -0500
@@ -837,9 +837,15 @@
  /* This will silently fail for streams that don't correspond
 to regular files, but that's OK.  */
  rewind (fp);
+#if !defined(WINDOWS) || !defined(__GNUC__)
  /* ftruncate is needed because opt.dfp is opened in append
 mode if opt.always_rest is set.  */
  ftruncate (fileno (fp), 0);
+#else /* GCC on Windows (Mingw32) */
+ /* SetEndOfFile is like ftruncate, but the second
+argument to ftruncate is taken from the file pointer.  */
+ SetEndOfFile ((HANDLE) _get_osfhandle (_fileno (fp)));
+#endif
  clearerr (fp);
}
 }
--- src/http.c.orig 2002-05-18 22:04:54.0 -0500
+++ src/http.c  2003-07-28 13:56:16.0 -0500
@@ -1356,9 +1356,15 @@
  /* This will silently fail for streams that don't correspond
 to regular files, but that's OK.  */
  rewind (fp);
+#if !defined(WINDOWS) || !defined(__GNUC__)
  /* ftruncate is needed because opt.dfp is opened in append
 mode if opt.always_rest is set.  */
  ftruncate (fileno (fp), 0);
+#else /* GCC on Windows (Mingw32) */
+ /* SetEndOfFile is like ftruncate, but the second
+argument to ftruncate is taken from the file pointer.  */
+ SetEndOfFile ((HANDLE) _get_osfhandle (_fileno (fp)));
+#endif
  clearerr (fp);
}
 }
--- src/init.c.orig 2002-05-17 22:05:20.0 -0500
+++ src/init.c  2003-07-25 18:29:54.0 -0500
@@ -192,7 +192,9 @@
   { "sslcertkey",  &opt.sslcertkey,cmd_file },
   { "egdfile", &opt.sslegdsock,cmd_file },
 #endif /* HAVE_SSL */
+#ifdef HAVE_SELECT
   { "timeout", &opt.timeout,   cmd_time },
+#endif /* HAVE_SELECT */
   { "timestamping",&opt.timestamping,  cmd_boolean },
   { "tries",   &opt.ntry,  cmd_number_inf },
   { "useproxy",&opt.use_proxy, cmd_boolean },
--- src/main.c.orig 2002-05-17 22:05:20.0 -0500
+++ src/main.c  2003-07-25 19:19:12.0 -0500
@@ -87,7 +87,9 @@
 void log_close PARAMS ((void));
 void log_request_redirect_output PARAMS ((const char *));
 
+#ifdef HAVE_SIGNAL
 static RETSIGTYPE redirect_output_signal PARAMS ((int));
+#endif /* HAVE_SIGNAL */
 
 const char *exec_name;
 
--- src/mswindows.c.orig2002-05-17 22:05:20.0 -0500
+++ src/mswindows.c 2003-07-25 20:16:32.0 -0500
@@ -81,7 +81,7 @@
   HKEY result;
   DWORD size = *len;
   DWORD type = REG_SZ;
-  if (RegOpenKeyEx (hkey, subkey, NULL, KEY_READ, &result) != ERROR_SUCCESS)
+  if (RegOpenKeyEx (hkey, subkey, 0L, KEY_READ, &result) != ERROR_SUCCESS)
 return NULL;
  if (RegQueryValueEx (result, valuename, NULL, &type, buf, &size) != ERROR_SUCCESS)
 buf = NULL;
@@ -123,7 +123,7 @@
   int changedp = 0;
 
   if (!opt.lfilename)
-{
+{/* GCC 2.95.2 warns about this but I don't know why */
   opt.lfilename = unique_name (DEFAULT_LOGFILE);
   changedp = 1;
 }
@@ -210,7 +210,7 @@
   if (stat (buf, &sbuf) == 0) 
{
   printf (_("Starting WinHelp %s\n"), buf);
-  WinHelp (NULL, buf, HELP_INDEX, NULL);
+  WinHelp (NULL, buf, HELP_INDEX, (DWORD) NULL);
 }
   else
 {
--- src/sysdep.h.orig   2002-05-17 22:05:22.0 -0500
+++ src/sysdep.h2003-07-28 13:44:54.0 -0500
@@ -53,6 +53,10 @@
their declarations, as well as some additional declarations and
macros.  This must come first, so it can set things up.  */
 #include 
+/* This should never hurt on Windows, and is needed with Mingw32
+   for http.c and ftp.c.  Putting it here just seems more
+   appropriate.  */
+#include 
 #endif /* WINDOWS */
 
 /* Watcom-specific stuff.  In practice this is probably specific to
--- src/utils.c.orig2002-05-17 22:05:22.0 -0500
+++ src/utils.c 2003-07-25 20:00:48.0 -0500
@@ -1504,8 +1504,13 @@
   SYSTEMTIME st;
   GetSystemTime (&st);
   SystemTimeToFileTime (&st, &ft);
+#ifndef __GNUC__
   wt->wintime.HighPart = ft.dwHighDateTime;
   wt->wintime.LowPart  = ft.dwLowDateTime;
+#else
+  wt->wintime.u.HighPart = ft.dwHighDateTime;
+  wt->wintime.u.LowPart  = ft.dwLowDateTime;
+#endif
 #endif
 }
 
@@ -1527,14 +1532,19 @@
   return 1000 * (now - wt->secs);
 #endif
 
-#ifdef WINDOWS
+#ifdef TIMER_WINDOWS
   FILETIME ft;
   SYSTEMTIME st;
   ULARGE_INTEGER uli;
   GetSystemTime (&st);
   SystemTimeToFileTime (&st, &ft);
+#ifndef __GNUC__
   uli.HighPart = ft.dwHighDateTime;
   uli.LowPart = ft.dwLowDateTime;
+#else
+  uli.u.HighPart = ft.dwHighDateTime;
+  uli.u.LowPart = ft.dwLowDateTime;
+#endif
   retur

RE: -N option

2003-07-29 Thread Post, Mark K
Other than the "--ignore-length" option I mentioned previously, no.  Sorry.

Mark Post

-Original Message-
From: Preston [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 29, 2003 7:01 PM
To: [EMAIL PROTECTED]
Subject: Re: -N option


Aaron S. Hawley wrote:

>On Tue, 29 Jul 2003, Post, Mark K wrote:
>
>  
>
>>..
>>So, perhaps you need to modify your work practices rather than diddle with
>>the software.  Copy the locally updated files to another location so they're
>>not clobbered when the remote version changes.
>>
>>
>
>indeed.  consider creating local "copies" by instead just tracking
>versions of your image files with RCS if it's available for your system
>(and if you aren't already using it):
>
>http://www.gnu.org/software/rcs/
>  
>

To answer questions asked so far:  We are using wget version 1.8.2.
I have checked the dates on the local file and the remote file and the 
local file date is newer.  The reason I thought it was still clobbering 
despite the newer date on the local was because of the size difference.  
I read that in the online manual here:
 http://www.gnu.org/manual/wget/html_chapter/wget_5.html#SEC22

At the bottom it says,

"If the local file does not exist, or the sizes of the files do not 
match, Wget will download the remote file no matter what the time-stamps 
say."

I do want newer files on the remote to replace older files on the local 
server.  Essentially, I want the newest file to remain on the local.  
The problem I am having, however, is that if we change/update files on 
the local, if they are of a different size, the remote copy is 
downloaded and clobbers the local no matter what the dates are.  I hope 
this is clear; sorry if I have not explained the problem well.  Let me 
know if you have any more ideas and if you need me to try again to 
explain.  Thanks for your help.

Preston
[EMAIL PROTECTED]


[Fwd: Recursive]

2003-07-29 Thread Morris Hooten - SES System Admin


I'm having a problem with downloading all files within the 
ftp://sun:[EMAIL PROTECTED]/
directory.

I've tried to use -r for recursive, but the only way I can download a 
file is to specify the file name, as in example 2.



1ST EXAMPLE---

/usr/local/bin/wget -r ftp://sun:[EMAIL PROTECTED]/
--13:17:16--  ftp://sun:[EMAIL PROTECTED]/
  => `ftp.certmanager.net/index.html'
Resolving webcache1.central.sun.com... done.
Connecting to webcache1.central.sun.com[129.147.62.16]:8080... connected.
Proxy request sent, awaiting response... 200 Ok
Length: unspecified [text/html]
    [ <=>                                    ]         454          4.62K/s

13:17:18 (4.62 KB/s) - `ftp.certmanager.net/index.html' saved [454]

FINISHED --13:17:18--
Downloaded: 454 bytes in 1 files




2ND EXAMPLE---

/usr/local/bin/wget 
ftp://sun:[EMAIL PROTECTED]/DD_WR_Training_Centers_Update_04_22_03.sxc
--12:48:45--  
ftp://sun:[EMAIL PROTECTED]/DD_WR_Training_Centers_Update_04_22_03.sxc
  => 
`ftp.certmanager.net/DD_WR_Training_Centers_Update_04_22_03.sxc'
Resolving webcache1.central.sun.com... done.
Connecting to webcache1.central.sun.com[129.147.62.16]:8080... connected.
Proxy request sent, awaiting response... 200 Ok
Length: unspecified [application/octet-stream]

    [ <=>                                    ]      25,532         74.21K/s
12:48:47 (74.21 KB/s) - 
`ftp.certmanager.net/DD_WR_Training_Centers_Update_04_22_03.sxc' saved 
[25532]

FINISHED --12:48:47--
Downloaded: 25,532 bytes in 1 files








Re: -N option

2003-07-29 Thread Preston
Aaron S. Hawley wrote:

On Tue, 29 Jul 2003, Post, Mark K wrote:

 

..
So, perhaps you need to modify your work practices rather than diddle with
the software.  Copy the locally updated files to another location so they're
not clobbered when the remote version changes.
   

indeed.  consider creating local "copies" by instead just tracking
versions of your image files with RCS if it's available for your system
(and if you aren't already using it):
http://www.gnu.org/software/rcs/
 

To answer questions asked so far:  We are using wget version 1.8.2.
I have checked the dates on the local file and the remote file and the 
local file date is newer.  The reason I thought it was still clobbering 
despite the newer date on the local was because of the size difference.  
I read that in the online manual here:
http://www.gnu.org/manual/wget/html_chapter/wget_5.html#SEC22

At the bottom it says,

"If the local file does not exist, or the sizes of the files do not 
match, Wget will download the remote file no matter what the time-stamps 
say."

I do want newer files on the remote to replace older files on the local 
server.  Essentially, I want the newest file to remain on the local.  
The problem I am having, however, is that if we change/update files on 
the local, if they are of a different size, the remote copy is 
downloaded and clobbers the local no matter what the dates are.  I hope 
this is clear; sorry if I have not explained the problem well.  Let me 
know if you have any more ideas and if you need me to try again to 
explain.  Thanks for your help.

Preston
[EMAIL PROTECTED]


RE: wget and procmail

2003-07-29 Thread Post, Mark K
Does the PATH of procmail contain the directory where wget lives?
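If PATH turns out to be the culprit, a recipe along these lines works around it by calling wget with an absolute path; everything here (the condition, the URL, wget's location) is a hypothetical sketch, not Michel's actual setup:

```
# Hypothetical .procmailrc sketch -- paths and URL are examples only.
# procmail often runs with a minimal PATH, so either extend it here
# or call wget by absolute path.
PATH=/usr/local/bin:/usr/bin:/bin
LOGFILE=$HOME/procmail.log
VERBOSE=yes

:0
* ^To:.*mailgust
| /usr/local/bin/wget -q -O /dev/null "http://localhost/gateway.php"
```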


Mark Post

-Original Message-
From: Michel Lombart [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 29, 2003 6:51 PM
To: [EMAIL PROTECTED]
Subject: wget and procmail


Hello,

I've an issue with wget and procmail.

I installed the forum software mailgust ( http://mailgust.phpoutsourcing.com/
) on a Cobalt/Sun Raq4.  To handle incoming e-mail, I need to install
a .procmailrc file that calls wget.

When I type the complete command on the console, wget works fine.  When
wget is called by procmail, it does nothing.

I've enabled a verbose logfile for procmail, and I can see the call to
wget in the log, with no error.

Any idea?

Thanks for your help.

Michel Lombart


wget and procmail

2003-07-29 Thread Michel Lombart
Hello,

I've an issue with wget and procmail.

I installed the forum software mailgust ( http://mailgust.phpoutsourcing.com/ ) on a 
Cobalt/Sun Raq4.  To handle incoming e-mail, I need to install a .procmailrc 
file that calls wget.

When I type the complete command on the console, wget works fine.  When wget is 
called by procmail, it does nothing.

I've enabled a verbose logfile for procmail, and I can see the call to wget in 
the log, with no error.

Any idea?

Thanks for your help.

Michel Lombart

RE: -N option

2003-07-29 Thread Post, Mark K
Preston,

The "--ignore-length" option _may_ do what you want.  As Aaron pointed out,
if they update the file on the remote server after the photo has been
updated locally, it will get wiped out based on the date, not the length.
So, perhaps you need to modify your work practices rather than diddle with
the software.  Copy the locally updated files to another location so they're
not clobbered when the remote version changes.


Mark Post

-Original Message-
From: Preston [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 29, 2003 5:03 PM
To: [EMAIL PROTECTED]
Subject: -N option


Hello all,

I have searched the mailing list archives, but cannot find an answer to 
my problem.  The company I work for currently uses wget to get photos 
for our website from an ftp server.  Sometimes, photos are updated with 
newer versions and we always only want to get the newer versions.  We 
are currently calling wget as follows:

wget --passive-ftp -N -r -l2 --no-parent -A.jpg 
ftp://username:[EMAIL PROTECTED]/photo_dir/

This works great with one exception:  If we want to go in and change the 
photos locally, then when our wget script runs, it clobbers any photos 
we have added.  This, as I understand it, is because when wget uses the 
-N flag it checks timestamps and file sizes.  If the file sizes are 
different, then it gets the file from the remote no matter what.  The 
problem is that almost always, the photo we update is a different size 
than the old photo. 

My question:  Is there any way to use wget where it only downloads based 
upon dates and not file sizes?  Thanks for your help.

Sincerely,

Preston
[EMAIL PROTECTED]


Re: -N option

2003-07-29 Thread Aaron S. Hawley
i'm not so sure about the file size issue, but the documentation says:

"Wget will ask the server for the last-modified date. If the local file is
newer, the remote file will not be re-fetched. However, if the remote file
is more recent, Wget will proceed fetching it normally."



does the FTP server give dates with ls -l? is there a substantial
difference between the local file system and the ftp server?

which version of Wget are you using?

/a

On Tue, 29 Jul 2003, Preston wrote:

> Hello all,
>
> I have searched the mailing list archives, but cannot find an answer to
> my problem.  The company I work for currently uses wget to get photos
> for our website from an ftp server.  Sometimes, photos are updated with
> newer versions and we always only want to get the newer versions.  We
> are currently calling wget as follows:
>
> wget --passive-ftp -N -r -l2 --no-parent -A.jpg
> ftp://username:[EMAIL PROTECTED]/photo_dir/
>
> This works great with one exception:  If we want to go in and change the
> photos locally, then when our wget script runs, it clobbers any photos
> we have added.  This, as I understand it, is because when wget uses the
> -N flag it checks timestamps and file sizes.  If the file sizes are
> different, then it gets the file from the remote no matter what.  The
> problem is that almost always, the photo we update is a different size
> than the old photo.
>
> My question:  Is there any way to use wget where it only downloads based
> upon dates and not file sizes?  Thanks for your help.
>
> Sincerely,
>
> Preston
> [EMAIL PROTECTED]


-N option

2003-07-29 Thread Preston
Hello all,

I have searched the mailing list archives, but cannot find an answer to 
my problem.  The company I work for currently uses wget to get photos 
for our website from an ftp server.  Sometimes, photos are updated with 
newer versions and we always only want to get the newer versions.  We 
are currently calling wget as follows:

wget --passive-ftp -N -r -l2 --no-parent -A.jpg 
ftp://username:[EMAIL PROTECTED]/photo_dir/

This works great with one exception:  If we want to go in and change the 
photos locally, then when our wget script runs, it clobbers any photos 
we have added.  This, as I understand it, is because when wget uses the 
-N flag it checks timestamps and file sizes.  If the file sizes are 
different, then it gets the file from the remote no matter what.  The 
problem is that almost always, the photo we update is a different size 
than the old photo. 

My question:  Is there any way to use wget where it only downloads based 
upon dates and not file sizes?  Thanks for your help.

Sincerely,

Preston
[EMAIL PROTECTED]


RE: FTP Change Directories?

2003-07-29 Thread Herold Heiko
If I remember correctly, this has been corrected in recent versions, but I
don't remember when, sorry.  Try wget 1.8 or 1.9-dev.
See http://wget.sunsite.dk/wgetdev.html#development for getting the
development sources from cvs, or grab a copy of the windows binary from
http://space.tin.it/computer/hherold/ (read the table and take the correct
ssl libraries) or get the (zipped) sources from the same place. Dos2unix
them if you plan to use those zipped sources on unix.

Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

> -Original Message-
> From: !jeff!{InterVerse} [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, July 29, 2003 4:34 PM
> To: [EMAIL PROTECTED]
> Subject: FTP Change Directories?
> 
> 
> Hi,
> 
> I am using wget to download a file on an ftp server.  the FTP 
> server logs 
> me into /incoming, but the file is in /outbound.
> 
> when I try this...
> 
> ftp://user:[EMAIL PROTECTED]/outbound/myfile.txt
> 
> I get
> unknown directory /incoming/outbound/
> 
> How do I tell wget to go UP a directory and then into outbound?
> 
> TIA,
> Jeff
> 


FTP Change Directories?

2003-07-29 Thread !jeff!{InterVerse}
Hi,

I am using wget to download a file on an ftp server.  The FTP server logs 
me into /incoming, but the file is in /outbound.

when I try this...

ftp://user:[EMAIL PROTECTED]/outbound/myfile.txt

I get
unknown directory /incoming/outbound/

How do I tell wget to go UP a directory and then into outbound?

TIA,
Jeff