Re: Large Files Support for Wget

2004-05-10 Thread Hrvoje Niksic
Daniel Stenberg [EMAIL PROTECTED] writes:

 On Mon, 10 May 2004, Dražen Kačar wrote:

  * Change most (all?) occurrences of `long' in the code to `off_t'.  Or
should we go the next logical step and just use uintmax_t right
away?

 Just use off_t.

 ... but Windows has no off_t... ;-)

That in itself is not a problem because, under Windows, off_t will be
typedeffed to a 64-bit type if LFS is available, and to `int'
otherwise.

The point of my question was: should low-level code even care whether
LFS is in use (by using off_t for various variables), or should it use
intmax_t to get the largest representation available on the system?
The latter is in principle sort of like using `long', except you're
not tied to the actual size of `long'.
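
To make the trade-off concrete, here is a rough illustration (the struct
and field names are made up for the example; they are not from wget's
sources):

    #include <sys/types.h>   /* off_t: follows the LFS compilation flags */
    #include <stdint.h>      /* uintmax_t: widest unsigned integer type  */

    struct transfer_totals {
      off_t     expected;    /* 64-bit only when LFS is actually enabled */
      uintmax_t downloaded;  /* always the largest representation        */
    };

With off_t the low-level code silently inherits whatever the configure
machinery selected; with uintmax_t it always gets the widest type, at the
price of converting at the boundaries where off_t-based calls (lseek,
stat and friends) are involved.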



RE: Large Files Support for Wget

2004-05-10 Thread Herold Heiko
 -Original Message-
 From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
 
 * Profit!

I think you'd really deserve some.
Heiko 

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax


Re: Large Files Support for Wget

2004-05-10 Thread Dražen Kačar
Hrvoje Niksic wrote:
 David Fritz [EMAIL PROTECTED] writes:
 
  IIUC, GNU coreutils uses uintmax_t to store large numbers relating to
  the file system and prints them with something like this:
 
 char buf[INT_BUFSIZE_BOUND (uintmax_t)];
  printf (_("The file is %s octets long.\n"), umaxtostr (size, buf));
 
 That's probably the most portable way to do it.

For the time being.  However, in C99 `%ju' is the correct format for
printing uintmax_t.  There are systems which have uintmax_t but don't have
the `j' modifier, so the whole thing is a problem if you want to write a
failsafe configure check.  And there might be run-time problems as well.
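
If one wanted to avoid the `j' modifier altogether, the conversion can be
done by hand into a caller-supplied buffer.  A minimal sketch (same idea
as coreutils' umaxtostr, but not its actual implementation):

    #include <stdio.h>    /* size_t */
    #include <stdint.h>   /* uintmax_t */

    /* Write VALUE in decimal at the end of BUF (BUFSIZE bytes, including
       room for the terminating NUL) and return a pointer to the digits. */
    static char *
    umax_to_string (uintmax_t value, char *buf, size_t bufsize)
    {
      char *p = buf + bufsize;
      *--p = '\0';
      do
        *--p = '0' + (char) (value % 10);
      while ((value /= 10) != 0);
      return p;
    }

The result can then be printed with a plain %s, regardless of whether the
host printf understands %ju.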

 * Change most (all?) occurrences of `long' in the code to `off_t'.  Or
   should we go the next logical step and just use uintmax_t right
   away?

Just use off_t.

-- 
 .-.   .-.Yes, I am an agent of Satan, but my duties are largely
(_  \ /  _)   ceremonial.
 |
 |[EMAIL PROTECTED]


Re: Large Files Support for Wget

2004-05-10 Thread Daniel Stenberg
On Mon, 10 May 2004, Dražen Kačar wrote:

  * Change most (all?) occurrences of `long' in the code to `off_t'.  Or
should we go the next logical step and just use uintmax_t right
away?

 Just use off_t.

... but Windows has no off_t... ;-)

-- 
 -=- Daniel Stenberg -=- http://daniel.haxx.se -=-
  ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol


Site Mirror

2004-05-10 Thread Kelvin F
Hi,

I am trying to mirror a web site that has many
hierarchical levels.  

I am using the command 

wget -m -k $site

which allows me to view the site fine.  

However, I would like the mirror to create a directory
structure that mimics the website's, rather than
having all the HTML files in a single directory.

Would anybody have any suggestions on how this could
be achieved?

Kelvin Francis
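
One thing that might be worth trying, assuming the flat layout comes from
a directory-suppressing setting (for instance in .wgetrc) rather than from
the site itself, is to force the hierarchy explicitly:

    wget -m -k -x $site

where -x (--force-directories) makes wget recreate the remote directory
structure locally even when it would not otherwise be created.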





Re: wget: strdup: Not enough memory

2004-05-10 Thread Axel Pettinger
David Fritz wrote:
 
 Hmm, you might try upgrading to a newer version of mingw (see
 http://www.mingw.org/).  

Thanks, I wasn't aware of the fact that there's a newer version. I
downloaded MinGW from http://gnuwin.epfl.ch/apps/mingw/en/index.html.
They offer version 2.0.0.3.

 Alternatively, you could try to comment-out the #define
 HAVE_UTIME_H 1 line in config.h.mingw or add a utime.h to your mingw 
 include directory that consists of the following line:
 
 #include <sys/utime.h>

I did the latter, it worked! :) Thank you!
And now I'm going to replace MinGW 2.0.0.3 with the current version ...

Regards,
Axel Pettinger
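
For reference, the wrapper header mentioned above is simply a one-line
file placed in MinGW's include directory:

    /* utime.h -- forwards to the header MinGW actually ships;
       a sketch of the workaround described in this thread. */
    #include <sys/utime.h>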


Re: wget: strdup: Not enough memory

2004-05-10 Thread Axel Pettinger
Hrvoje Niksic wrote:
 
 This patch should fix the problem.  Please let me know if it works for
 you:
 
 2004-05-08  Hrvoje Niksic  [EMAIL PROTECTED]
 
 * ftp-basic.c (ftp_pwd): Handle PWD response not containing 
 double quote.
 
 Index: src/ftp-basic.c
 ===
 RCS file: /pack/anoncvs/wget/src/ftp-basic.c,v
 retrieving revision 1.23.2.2
 diff -u -r1.23.2.2 ftp-basic.c
 --- src/ftp-basic.c 2003/11/16 19:19:03 1.23.2.2
 +++ src/ftp-basic.c 2004/05/08 15:18:15
 @@ -815,6 +815,11 @@
   and everything following it. */
   strtok (respline, "\"");
   request = strtok (NULL, "\"");
 +  if (request == NULL)
 +    {
 +      xfree (respline);
 +      return FTPSRVERR;
 +    }
 
/* Has the `pwd' been already allocated?  Free! */
FREE_MAYBE (*pwd);

I added the five lines to ftp-basic.c and recompiled Wget. Now I can say
that your patch is indeed working for me![1] Thank you very much.

BTW, I was a little bit confused because of the last line in your patch.
Instead of "FREE_MAYBE (*pwd);", my ftp-basic.c contains "xfree_null
(*pwd);".  I wasn't sure whether I should replace it or not. I didn't
replace the line, and everything worked fine.

Again, thank you. Wget is a great tool!

Regards,
Axel Pettinger


[1]
DEBUG output created by Wget 1.9+cvs-dev on mingw32.

--07:22:28--  ftp://anonymous:*password*@ip:port/up.exe;type=i
           => `up.exe.1'
Connecting to ip:port... seconds 0.00, connected.
Created socket 108.
Releasing 00440DD0 (new refcount 0).
Deleting unused 00440DD0.
Logging in as anonymous ... 
220 OK
--> USER anonymous


331 OK
--> PASS Turtle Power!

230 OK
--> SYST


226 OK
--> PWD


226 OK
--> TYPE I


226 OK
==> CWD not needed.
conaddr is: 217.234.180.133
Local socket fd 24 bound.
binding to address 217.234.180.133 using port 1550.
--> PORT 217,234,180,133,6,14


200 OK
--> RETR up.exe


150 OK
Accepted client at socket 120.

0K .. . 10.82
KB/s

Closed fd 120
Closed fd 24
226 OK
Closed fd 108
07:22:31 (10.82 KB/s) - `up.exe.1' saved [15872]


Re: wget: strdup: Not enough memory

2004-05-10 Thread Hrvoje Niksic
Axel Pettinger [EMAIL PROTECTED] writes:

 I added the five lines to ftp-basic.c and recompiled Wget. Now I can say
 that your patch is indeed working for me![1] Thank you very much.

 BTW, I was a little bit confused because of the last line in your patch.
 Instead of FREE_MAYBE (*pwd); my ftp-basic.c contains xfree_null
 (*pwd);.

xfree_null is in CVS; I thought you said you were using Wget 1.9.1, so
I provided a patch for that version.  Anyway, as you correctly
surmised, the two are functionally equivalent.
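
Both spellings boil down to the same thing, roughly (a sketch, not the
exact definition from either tree):

    /* Sketch: free P only when it is non-NULL. */
    #define FREE_MAYBE(p) do { if (p) xfree (p); } while (0)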

 I wasn't sure whether I should replace it or not. I didn't replace
 the line, and everything worked fine.

Thanks for verifying this.



Re: Large Files Support for Wget

2004-05-10 Thread David Fritz
IIUC, GNU coreutils uses uintmax_t to store large numbers relating to the file 
system and prints them with something like this:

  char buf[INT_BUFSIZE_BOUND (uintmax_t)];
  printf (_("The file is %s octets long.\n"), umaxtostr (size, buf));

where umaxtostr() has the following prototype:

char *umaxtostr (uintmax_t, char *);

and it returns its second argument (the address of the buffer provided by the 
caller) so it can be used easily as an argument in printf calls.





Re: Large Files Support for Wget

2004-05-10 Thread Hrvoje Niksic
David Fritz [EMAIL PROTECTED] writes:

 IIUC, GNU coreutils uses uintmax_t to store large numbers relating to
 the file system and prints them with something like this:

char buf[INT_BUFSIZE_BOUND (uintmax_t)];
    printf (_("The file is %s octets long.\n"), umaxtostr (size, buf));

That's probably the most portable way to do it.

I guess that solves the remaining technical difficulty.  The things
that need to be done for large file support then are:

* Use AC_SYS_LARGEFILE in configure.in.  Make sure the compilation
  flags contain the proper large file support incantations.

* Change most (all?) occurrences of `long' in the code to `off_t'.  Or
  should we go the next logical step and just use uintmax_t right
  away?

* Profit!

Some of this has already been done in the submitted patch.
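
Regarding the AC_SYS_LARGEFILE item above, a quick way to verify that the
LFS flags really took effect is to compile something like this with the
CFLAGS/CPPFLAGS that configure produced (just a sanity check, not part of
the proposed patch):

    #include <stdio.h>
    #include <sys/types.h>

    int
    main (void)
    {
      /* With _FILE_OFFSET_BITS=64 in effect (or a natively 64-bit off_t)
         this prints 8; without LFS it typically prints 4. */
      printf ("sizeof (off_t) = %d\n", (int) sizeof (off_t));
      return 0;
    }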