Re: wget and VMS

2001-08-13 Thread Jan Prikryl

 I've recently upgraded from wget-1.5.3 to wget-1.7 and now
 cannot download files from OpenVMS via ftp.
 My home directory at VAX is USERB2:[SYS] and file I need is in
USERB4:[RA].
 wget-1.5.3 can download it with this command:

 export WGETRC=authfile # here are my username and password
 wget ftp://host/../userb4/ra/filename

 Now, with wget-1.7 this does not work, wget tries to CWD to nonexistent
 directory [userb4.ra] instead of userb4:[ra] and fails.
 What is the correct syntax now?

Your syntax is the correct one. The problem is that the current version
contains new code written specifically to deal with VMS, and that code
does not expect you to distinguish between different disks on the
machine. I will have to have a look at what the URL specification says
in this case, as my feeling is that there is no way of specifying
things like C:/directory/file or USERDISK:[DIRECTORY.FILE] given the
current URL form. Let's hope I'm wrong ...

--jan




Re: error building the doc/ directory

2001-08-01 Thread Jan Prikryl

Quoting Ivan D Nestlerode ([EMAIL PROTECTED]):

 [...] It compiled the stuff in src, and then tried
 to make the documents in doc.
 
 This is where the trouble started.

Dear Ivan,

this is an error in the 1.7 makefiles. It is repaired in CVS - see
http://sunsite.dk/wget/wgetdev.html for more information about wget
CVS.

The release that repairs this and some other build problems (mainly
related to SSL library detection) will be 1.7.1.

Thanks for reporting this.

-- jan




Re: Segfault on Linux/390 for wget 1.6 and 1.7

2001-07-19 Thread Jan Prikryl

Quoting Post, Mark K ([EMAIL PROTECTED]):

 When I compile wget with -O0 to turn off optimization, wget works, but I get
 some garbage in the output as follows:

Could you please try 

(1) to run wget with the -d parameter to switch on the debugging
output 

(2) to compile wget using -O2 -g and have a look at what

  gdb wget core

reports? It should be able to give us the contents of the call stack
at the moment of the crash, which in turn would reveal the place where
wget crashes.
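For completeness, the whole session I have in mind would look roughly
like this (the URL and paths are placeholders, of course):

  $ make CFLAGS='-O2 -g'
  $ ulimit -c unlimited          # make sure a core file gets written
  $ src/wget http://some/url     # reproduce the crash
  $ gdb src/wget core
  (gdb) bt                       # prints the call stack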

Thanks,

-- jan




Re: make problem - not *really* a bug!?!

2001-07-16 Thread Jan Prikryl

Quoting PONA-Boy ([EMAIL PROTECTED]):

 Here's the error I'm getting:
 
 make[1]: Entering directory `/root/save_stuff/wget-1.7/src' gcc -I. -I.
-DHAVE_CONFIG_H
 -DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\" -DLOCALEDIR=\"/usr/local/share/locale\" -g 
-O2 -c utils.c
 utils.c: In function `read_file':
 utils.c:980: `MAP_FAILED' undeclared (first use in this function)
 utils.c:980: (Each undeclared identifier is reported only once
 utils.c:980: for each function it appears in.)
 make[1]: *** [utils.o] Error 1
 make[1]: Leaving directory `/root/save_stuff/wget-1.7/src'
 make: *** [src] Error 2

This error is repaired in the current CVS version and in versions
1.7.1 upwards (please note that 1.7.1 has possibly not been released
yet - I've just returned from two weeks of holidays).

Consult http://sunsite.dk/wget/ for more information about read-only
CVS access to wget sources.
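If you cannot use CVS right away, a minimal local workaround - an
untested sketch that assumes your mmap() follows POSIX and returns
(void *) -1 on failure - is to define the missing macro near the
mmap-related includes in src/utils.c:

  #include <sys/mman.h>

  #ifndef MAP_FAILED
  # define MAP_FAILED ((void *) -1)
  #endif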

-- jan




Re: Ever heard of that version?

2001-07-16 Thread Jan Prikryl

Quoting Jens Roesner ([EMAIL PROTECTED]):

 I just stumbled over 
 http://bay4.de/FWget/
 Are his changes incorporated into Wget 1.7?
 Any opinions on that software?
 I think with WinME *yuck* as OS, this is out of question for me, 
 but...

1.7 uses hash tables instead of linked lists. I think that code comes
directly from Hrvoje, not from this German guy.

-- jan




Re: wget hp-ux/sco

2001-07-04 Thread Jan Prikryl

Quoting R. Daneel Olivaw ([EMAIL PROTECTED]):

 are there any known issues regarding wget retrieving files through
 ftp from hp-ux 10.20/11.00 servers ?  I'm using wget version 1.6 on
 a RH7.0 linux server (installed from RPM).

As far as I remember (I do not have access to any HP-UX machine right
now), wget was capable of FTP downloads from an HP-UX 10.20 FTP
server. If you are experiencing any difficulties, please give us more
details - preferably the debug output generated by the `-d' option.

-- jan




Re: wget hp-ux/sco

2001-07-04 Thread Jan Prikryl

Quoting R. Daneel Olivaw ([EMAIL PROTECTED]):

 wget -d ftp://user:pass@HOST//tmp/TESTDIR

Given the URL syntax, TESTDIR is a file. If you need the contents of a
directory, ftp://user:pass@HOST//tmp/TESTDIR/ (note the trailing
slash) is the correct syntax.

-- jan




Re: wget hp-ux/sco

2001-07-04 Thread Jan Prikryl

Quoting R. Daneel Olivaw ([EMAIL PROTECTED]):

 I tried that, and it gives me even more strange results ;p

It actually works. It's the listing parser that fails in this case. 

 -rw-rw-rw-   1 taranistaranis207  4 juil 11:21 TESTFILE1
 
 PLAINFILE; perms 666;
 Skipping.

I would bet that this "Skipping." results from the French date
output. The listing parser assumes that the dates are in English
(actually, even my fully Czech-localized machine sends FTP listings
completely in English).

-- jan



Re: [Fwd: type=a problem with wget 1.6]

2001-06-18 Thread Jan Prikryl

Quoting Maureen O'Drisceoil ([EMAIL PROTECTED]):

  I'm not sure what part of the debug log is relevant, so here's the 
 whole thing.  Thank you.
 
 wget -d ftp://hostname.harvard.edu/CURRENT.URLS2.TXT;type=a

Try the following

wget -d 'ftp://hostname.harvard.edu/CURRENT.URLS2.TXT;type=a'

(one has to quote or escape characters that have a special meaning to
your shell, such as `?', `&', `*', or `;').

-- jan




Re: build outside of source dir breaks installation of wget.1

2001-06-14 Thread Jan Prikryl

Mike Castle wrote:

 I try to build all autoconfed packages outside of the source directory.
 (It is suggested that they allow this type of build in the GNU Coding
 Standards.)
 
 The generated man page, wget.1, ends up in the build directory, but install
 looks for it in srcdir:

Yes, this is a bug in the Makefile system of wget 1.7. The current CVS
version of wget (see http://sunsite.dk/wget/) should work properly, so
you may expect wget 1.7.1 to be OK.

Thanks for the patch anyway.

-- jan



Re: Problems with wget - Bulletproof ftpd (Win32)

2001-06-13 Thread Jan Prikryl

Dings-Da wrote:

 I got a question on Wget's behaviour with Windows-based Bulletproof FTPd.
 Though i am aware that wget had some problems with Windows-based ftp
 servers, which were quite often discussed here in the mailing list, i tried
 out the latest wget-version v1.7 (not dev!) and encountered the following:

 [...]

 421 Too many users logged for this account. Try again later.

That's it. You are logged in more times than you are allowed to
be. Seems clear to me.

 Is this a known issue? Perhaps it's more a wrong behaviour of Bulletproof
 instead of wget, but since i'm not sure about that, i decided to post it
 here :)

We can easily verify that: the moment your wget session fails, just
try a normal FTP session (preferably without any proxies in between,
as these may cause additional problems). If, after your login, the FTP
session fails as well, it's the Bulletproof FTPd. Otherwise it may be
some problem in wget (although I personally doubt it). In that case
please send us a complete log of the "normal" FTP session with "set
debug on" so that we can compare what is going wrong.

-- jan



Re: dynamic web page using wget?

2001-06-09 Thread Jan Prikryl

Quoting Jingwen Jin ([EMAIL PROTECTED]):

 Hi, Do any of you know if wget allows us to retrieve dynamic query pages?

In certain cases, yes.

 I tested
   wget http://altavista.com/sites/search/web?q=music&kl=XX&pg=q
 
 which queries music at altavista. But wget doesn't work with this...

Try

wget 'http://altavista.com/sites/search/web?q=music&kl=XX&pg=q'

On UNIX one has to quote URLs that contain special shell characters
like `?', `&', or `*' - if not quoted, these characters will be
interpreted by your shell.

-- jan




Re: How do I get SSL support to work in 1.7?

2001-06-07 Thread Jan Prikryl

Quoting [EMAIL PROTECTED] ([EMAIL PROTECTED]):

 Is Wget available via CVS somewhere or should patches be against 1.7?

See http://sunsite.dk/wget/wgetdev.html - I guess patches against 1.7
are fine, as the current difference from CVS is almost nil.

Thanks for your help!

-- jan




Re: Wget 1.7-pre1 available for testing

2001-06-06 Thread Jan Prikryl

 Jan Prikryl [EMAIL PROTECTED] writes:
 
  It seems that -lsocket is not found as it requires -lnsl for
  linking. -lnsl is not detected as it does not contain
  `gethostbyname()' function.
 
 That's weird.  What does libnsl contain if not gethostbyname()?

It seems to contain `gethostname()' ... see the config.log submitted
in one of the previous emails. But it's a very long-distance shot: if,
after adding -lsocket -lnsl, everything works correctly, and if with
-lsocket alone the linker complains about missing `yp_*()' functions
and also missing `gethostname()' and `getdomainname()', I think it's
likely that these functions are defined in -lnsl. Of course, if -lnsl
has a built-in dependency on some other library, the situation might
be completely different.

 Jan, you must be confusing something here.  gethostname() only gets
 the local host name, and is just a wrapper for the appropriate
 uname() or sysinfo() call.  It has nothing to do with name server
 lookups, which is what libnsl is supposed to do.

Probably, but are you sure that this is true on _all_ systems?

 Perhaps we really should try to write a libtool-based macro named
 WGET_CHECK_EXTERNAL_LIB.

Perhaps it would be more portable then.

-- jan




Re: wget 1.7, linux, -rpath

2001-06-06 Thread Jan Prikryl

Quoting [EMAIL PROTECTED] ([EMAIL PROTECTED]):

 The ssl support is much appreciated in wget 1.7.  But there is a problem
 with the configure support that makes it think ssl can't be used, at
 least with gcc 2.95.2 on my redhat 6.2 system:

Thanks for the report. Unfortunately the SSL test does not work on
Linux at all. Replacing `-rpath' with `-Wl,-rpath' will solve part of
the problems. You may want to try whether the attached patch works for
you. Note that this is an unofficial patch and, while it may help
solve the SSL check problem, it may break other things.

-- jan



Index: configure.in
===
RCS file: /pack/anoncvs/wget/configure.in,v
retrieving revision 1.17
diff -u -r1.17 configure.in
--- configure.in2001/05/28 22:02:47 1.17
+++ configure.in2001/06/06 05:26:20
@@ -174,8 +174,12 @@
 AC_CHECK_FUNCS(strdup strstr strcasecmp strncasecmp)
 AC_CHECK_FUNCS(gettimeofday mktime strptime)
 AC_CHECK_FUNCS(strerror snprintf vsnprintf select signal symlink access isatty)
-AC_CHECK_FUNCS(uname gethostname)
+AC_CHECK_FUNCS(uname)
 
+AC_CHECK_FUNCS(gethostname, [], [
+  AC_CHECK_LIB(nsl, gethostname)
+])
+
 AC_CHECK_FUNCS(gethostbyname, [], [
   AC_CHECK_LIB(nsl, gethostbyname)
 ])
@@ -205,14 +209,18 @@
 AC_MSG_CHECKING(for runtime libraries flag)
 case $host_os in
   sol2 ) dash_r=-R ;;
-  decosf* | linux* | irix*) dash_r=-rpath  ;;
+  decosf* | irix*) dash_r=-rpath  ;;
+  linux*) dash_r=-Wl,-rpath  ;;
   *)
 dash_r=
 for try_dash_r in -R -R  -rpath ; do
   OLD_LDFLAGS=$LDFLAGS
   LDFLAGS=${try_dash_r}/no/such/file-or-directory $LDFLAGS
+  dnl gcc seems to only produce a warning about nonexistent option
+  dnl `-R/no/such/file-or-directory' so the test comes thru
+  dnl (tested with gcc version 3.0 20010308 (prerelease))
   AC_TRY_LINK(, , dash_r=$try_dash_r)
-  LDFLAGS=$ODL_LDFLAGS
+  LDFLAGS=$OLD_LDFLAGS
   test -n $dash_r  break
 done ;;
 esac
@@ -235,9 +243,6 @@
 ssl_all_roots=$with_ssl
   fi
 
-  OLD_LIBS=$LIBS
-  OLD_LDFLAGS=$LDFLAGS
-
   dnl Unfortunately, as of this writing (OpenSSL 0.9.6), the libcrypto
   dnl shared library doesn't record its dependency on libdl, so we
   dnl need to check for it ourselves so we won't fail to link due to a
@@ -245,6 +250,9 @@
   dnl shl_load().
   AC_CHECK_LIB(dl,dlopen)
   AC_CHECK_LIB(dl,shl_load)
+
+  OLD_LIBS=$LIBS
+  OLD_LDFLAGS=$LDFLAGS
 
   ssl_linked=no
 



Re: WGET is changing my URL!

2001-06-06 Thread Jan Prikryl

Quoting Kohler Roberto ([EMAIL PROTECTED]):

 I am using wget 1.5.3 to get the URL
 cache_object://localhost:/info (cache statistics from Squid proxy)
 and the program is changing the URL to
 ftp://cache_object:21/%2Flocalhost/info (guess because the
 protocol cache_object is not known).

Exactly. Wget knows nothing about the cache_object: URL scheme and
assumes you wanted an FTP one.

 I could not find an option so the program would accept the URL as
 is.

There is no such option, sorry. You may only use http://, https://,
or ftp:// URLs with wget.

-- jan




Re: make install, wget.1, builddir != srcdir

2001-06-05 Thread Jan Prikryl

Quoting Ryan Lovett ([EMAIL PROTECTED]):

 `make install' fails to install the man page if the build directory
 is not the same as the source directory. It tries to find the man
 page in srcdir/doc/, but the man page gets built into builddir/doc/.

Thanks for the report. Apparently wget.info and the message catalogues
are also either not built or not installed when builddir != srcdir.
I'm working on that.

-- jan




gettext not found after libssl check failed

2001-06-05 Thread Jan Prikryl

Hello,

the subject says it all:

When trying to compile with -lssl, an already-reported bug in the
libssl detection causes the library not to be found. However, it seems
that after the failure of the libssl test some things are seriously
broken: at least on my system the configure script will later not find
`gettext' ...

(1) ./configure --with-ssl=/usr

| checking for runtime libraries flag... -rpath 
| checking for dlopen in -ldl... yes
| checking for shl_load in -ldl... no
| Looking for SSL libraries in /usr
| checking for RSA_new in -lcrypto... no
| checking for SSL_new in -lssl... no
| 
| WARNING: Failed to link with OpenSSL libraries in /usr/lib.
|  Wget will be built without support for https://... URLs.
| 
| checking whether NLS is requested... yes
| language catalogs: cs da de el es et fr gl hr it ja nl no pl pt_BR ru
| sk sl sv tr zh_TW
| checking for msgfmt... /usr/bin/msgfmt
| checking for xgettext... /usr/bin/xgettext
| checking for gmsgfmt... /usr/bin/msgfmt
| checking for locale.h... yes
| checking for libintl.h... yes
| checking for gettext... no
| checking for gettext in -lintl... no
| gettext not found; disabling NLS
| checking for makeinfo... makeinfo

(It seems that the SSL check forces `-rpath' and gcc does not like
it. Moreover, the SSL check tests for -ldl and then ignores it.)

(2) ./configure

| checking for socket in -lsocket... no
| checking for runtime libraries flag... -rpath 
| checking whether NLS is requested... yes
| language catalogs: cs da de el es et fr gl hr it ja nl no pl pt_BR ru
| sk sl sv tr zh_TW
| checking for msgfmt... /usr/bin/msgfmt
| checking for xgettext... /usr/bin/xgettext
| checking for gmsgfmt... /usr/bin/msgfmt
| checking for locale.h... yes
| checking for libintl.h... yes
| checking for gettext... yes
| checking for makeinfo... makeinfo
 
The system is RedHat 6.2 CZ + numerous updates; the compiler is a
prerelease of gcc 3.0 (gcc version 3.0 20010308 (prerelease)).

I'm attaching a patch against the current CVS version, which is almost
1.7 anyway. It's WORKSFORME quality - the SSL handling is certainly
not optimal.

-- jan



Index: configure.in
===
RCS file: /pack/anoncvs/wget/configure.in,v
retrieving revision 1.17
diff -u -r1.17 configure.in
--- configure.in2001/05/28 22:02:47 1.17
+++ configure.in2001/06/05 21:27:09
@@ -205,14 +205,18 @@
 AC_MSG_CHECKING(for runtime libraries flag)
 case $host_os in
   sol2 ) dash_r=-R ;;
-  decosf* | linux* | irix*) dash_r=-rpath  ;;
+  decosf* | irix*) dash_r=-rpath  ;;
+  linux*) dash_r= ;;
   *)
 dash_r=
 for try_dash_r in -R -R  -rpath ; do
   OLD_LDFLAGS=$LDFLAGS
   LDFLAGS=${try_dash_r}/no/such/file-or-directory $LDFLAGS
+  dnl gcc seems to only produce a warning about nonexistent option
+  dnl `-R/no/such/file-or-directory' so the test comes thru
+  dnl (tested with gcc version 3.0 20010308 (prerelease))
   AC_TRY_LINK(, , dash_r=$try_dash_r)
-  LDFLAGS=$ODL_LDFLAGS
+  LDFLAGS=$OLD_LDFLAGS
   test -n $dash_r  break
 done ;;
 esac
@@ -235,9 +239,6 @@
 ssl_all_roots=$with_ssl
   fi
 
-  OLD_LIBS=$LIBS
-  OLD_LDFLAGS=$LDFLAGS
-
   dnl Unfortunately, as of this writing (OpenSSL 0.9.6), the libcrypto
   dnl shared library doesn't record its dependency on libdl, so we
   dnl need to check for it ourselves so we won't fail to link due to a
@@ -245,6 +246,9 @@
   dnl shl_load().
   AC_CHECK_LIB(dl,dlopen)
   AC_CHECK_LIB(dl,shl_load)
+
+  OLD_LIBS=$LIBS
+  OLD_LDFLAGS=$LDFLAGS
 
   ssl_linked=no
 
Index: doc/Makefile.in
===
RCS file: /pack/anoncvs/wget/doc/Makefile.in,v
retrieving revision 1.13
diff -u -r1.13 Makefile.in
--- doc/Makefile.in 2001/04/12 12:25:22 1.13
+++ doc/Makefile.in 2001/06/05 21:27:09
@@ -66,7 +66,7 @@
sed s/@/@@/g $  $@
 
 wget.info: $(SAMPLERCTEXI) $(srcdir)/wget.texi
-   -$(MAKEINFO)
+   $(MAKEINFO) -I$(srcdir)
 
 $(TEXI2POD): $(srcdir)/$(TEXI2POD).in
sed s,/usr/bin/perl,@PERL@, $  $@
@@ -115,7 +115,7 @@
 # install man page, creating install directory if necessary
 install.man: $(MAN)
$(top_srcdir)/mkinstalldirs $(DESTDIR)$(mandir)/man$(manext)
-   $(INSTALL_DATA) $(srcdir)/$(MAN) $(DESTDIR)$(mandir)/man$(manext)/$(MAN)
+   $(INSTALL_DATA) $(MAN) $(DESTDIR)$(mandir)/man$(manext)/$(MAN)
 
 # install sample.wgetrc
 install.wgetrc: $(srcdir)/sample.wgetrc



Re: wget 1.7 configure errors

2001-06-05 Thread Jan Prikryl

Quoting tenthumbs ([EMAIL PROTECTED]):

 I said ./configure --with-ssl but the script said it couldn't find
 ssl libs. They're in the default /usr/local/ssl location. Looking at
 config.log, I see that gcc is upset, claiming that -rpath is an
 invalid option. That's right. It's a linker option so gcc should see
 -Wl,-rpath.  If I make that change, then configure finds the ssl
 libs and the build proceeds.

Yes, the SSL detection is broken on some systems. We are working on a
fix. 

 In the midst of trying to debug this, I tried using another gcc by
 saying
   CC=/usr/new/bin/gcc ./configure ...
 This syntax is described in every GNU INSTALL document I've ever seen.
 Your script does not honor it. Your script really should.

Could you provide an example? On my machine, GCC is in /usr/local/bin
by default (I'm using the GCC 3.0 development version, which has some
bugs, so I prefer to keep two versions of the compiler). In this
setup, `CC=/usr/bin/gcc ./configure ... ; make' will build wget with
the old GCC 2.95.2 just fine.

-- jan




Re: Wget 1.7-pre1 available for testing

2001-06-05 Thread Jan Prikryl

Quoting Andre Majorel ([EMAIL PROTECTED]):

 Tuesday is today. config.log for 1.6 and 1.7-pre1 attached. 1.7
 is identical to 1.7-pre1.

It seems that -lsocket is not found as it requires -lnsl for
linking. -lnsl is not detected as it does not contain
`gethostbyname()' function.

Would the attached patch to configure.in solve the problem? Please
note that the patch tries to correct some other problems with
configure.in as well, and these corrections may make it crash on your
system. The important change for you would be the
AC_CHECK_FUNCS(gethostname, [] ...) part.

-- jan



Index: configure.in
===
RCS file: /pack/anoncvs/wget/configure.in,v
retrieving revision 1.17
diff -u -r1.17 configure.in
--- configure.in2001/05/28 22:02:47 1.17
+++ configure.in2001/06/06 05:26:20
@@ -174,8 +174,12 @@
 AC_CHECK_FUNCS(strdup strstr strcasecmp strncasecmp)
 AC_CHECK_FUNCS(gettimeofday mktime strptime)
 AC_CHECK_FUNCS(strerror snprintf vsnprintf select signal symlink access isatty)
-AC_CHECK_FUNCS(uname gethostname)
+AC_CHECK_FUNCS(uname)
 
+AC_CHECK_FUNCS(gethostname, [], [
+  AC_CHECK_LIB(nsl, gethostname)
+])
+
 AC_CHECK_FUNCS(gethostbyname, [], [
   AC_CHECK_LIB(nsl, gethostbyname)
 ])
@@ -205,14 +209,18 @@
 AC_MSG_CHECKING(for runtime libraries flag)
 case $host_os in
   sol2 ) dash_r=-R ;;
-  decosf* | linux* | irix*) dash_r=-rpath  ;;
+  decosf* | irix*) dash_r=-rpath  ;;
+  linux*) dash_r=-Wl,-rpath  ;;
   *)
 dash_r=
 for try_dash_r in -R -R  -rpath ; do
   OLD_LDFLAGS=$LDFLAGS
   LDFLAGS=${try_dash_r}/no/such/file-or-directory $LDFLAGS
+  dnl gcc seems to only produce a warning about nonexistent option
+  dnl `-R/no/such/file-or-directory' so the test comes thru
+  dnl (tested with gcc version 3.0 20010308 (prerelease))
   AC_TRY_LINK(, , dash_r=$try_dash_r)
-  LDFLAGS=$ODL_LDFLAGS
+  LDFLAGS=$OLD_LDFLAGS
   test -n $dash_r  break
 done ;;
 esac
@@ -235,9 +243,6 @@
 ssl_all_roots=$with_ssl
   fi
 
-  OLD_LIBS=$LIBS
-  OLD_LDFLAGS=$LDFLAGS
-
   dnl Unfortunately, as of this writing (OpenSSL 0.9.6), the libcrypto
   dnl shared library doesn't record its dependency on libdl, so we
   dnl need to check for it ourselves so we won't fail to link due to a
@@ -245,6 +250,9 @@
   dnl shl_load().
   AC_CHECK_LIB(dl,dlopen)
   AC_CHECK_LIB(dl,shl_load)
+
+  OLD_LIBS=$LIBS
+  OLD_LDFLAGS=$LDFLAGS
 
   ssl_linked=no
 



Re: WGET suggestion

2001-06-04 Thread Jan Prikryl

Quoting Michael Widowitz ([EMAIL PROTECTED]):

 I'm using wget and prefer it to a number of GUI-programs. It only
 seems to me that Style Sheets (css-files) aren't downloaded. Is this
 true, or am I doing something wrong? If not, I would suggest that
 stylesheets should also be retrieved by wget.

Michael,

which version of wget do you use? I guess (but maybe I'm mistaken)
that versions 1.6 and upwards do download CSS when doing a recursive
traversal (or with --page-requisites).

-- jan




Re: Page move

2001-06-04 Thread Jan Prikryl

Quoting Herold Heiko ([EMAIL PROTECTED]):

 I'd like to move that page from
 http://www.geocities.com/heiko_herold to a host at my ISP's domain
 (having a bit more control there) at
 http://space.tin.it/computer/hherold/ .
 
 However I've no idea how reachable that host is from around the world
 (at different times of the day), geocities (as other free sites living
 from ads in fact) at least should be somewhat reachable from everywhere.

Tried it now (2001/06/04 18:10 CET, a holiday in Austria); it works
fast enough (13-31 kB/s). Your server seems to reset the connection
sometimes (it happened twice).

I'll update the WWW page accordingly.

-- jan




Re: cgi scripts and wget

2001-06-04 Thread Jan Prikryl

Quoting Samer Nassar ([EMAIL PROTECTED]):

 I am an undergrad student in University of Alberta, and downloaded
 wget recently to mirror a site for research purposes. However, wget
 seems to be having trouble pulling pages whose urls are cgi. I went
 through wget manual and didn't see anything about this. Any hints?

Pages that are generated automatically (by a CGI, for example) are not
always easy to download. As long as the CGI does not require any human
input (forms, etc.), I guess wget should be able to download those
pages. If this does not work, it would help to see an example where
wget fails. But before that, please upgrade to wget 1.7, which was
released just a few moments ago.

-- jan




Re: What do you think my chances are of getting wget to work on HP-UX are ? :-))))))))) aarrrggggghhh !!!

2001-05-31 Thread Jan Prikryl

Quoting Alan Barrow ([EMAIL PROTECTED]):

 Hi, back again, just a quick one, ever come across :execute
 permission denied when you try and run wget on HP-UX ?

Next time, please contact the wget list and not me directly.

As you say that the execute permissions have been checked ... have you
tried to execute wget with the full pathname
(/installation_path/bin/wget)? Isn't it on a partition that has been
mounted 'noexec' or similar?
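Roughly like this - the exact flags and the spelling of the mount
option differ between systems, so take it as a sketch:

  $ /installation_path/bin/wget -V    # run via the full path
  $ mount -v | grep noexec            # look for a no-exec mount option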

-- jan




Re: Query regarding wget SSL...

2001-05-30 Thread Jan Prikryl

Quoting Dominic Caffey ([EMAIL PROTECTED]):

 Has anyone had any experience with using wget to retrieve data from
 secure sites using SSL?  I scanned the manpage but can't find any
 references on how to use wget to make data requests of a secure site
 using SSL.

You will need the forthcoming 1.7 version for that. If you search the
mailing list archive about a week back, you will find a posting
regarding the pre-1.7 tarball. Alternatively, you may check it out
from CVS at sunsite.dk. Please consult http://sunsite.dk/wget/ for
more info (the page is in the process of moving to www.gnu.org).

-- jan




Percentage indicator still does not work properly with --continue

2001-05-24 Thread Jan Prikryl

Hi,

while downloading the RedHat 7.1 CZ ISO images I have just witnessed
the following (sorry for the long lines):

| erwin:honza ~/Devel/wget/src/wget --continue 
|ftp://ftp.linux.cz/pub/linux/redhat-cz/7.1/iso/redhat-7.1cz-disk1-respin.iso.bz2
| --09:09:11--  
|ftp://ftp.linux.cz/pub/linux/redhat-cz/7.1/iso/redhat-7.1cz-disk1-respin.iso.bz2
|= `redhat-7.1cz-disk1-respin.iso.bz2'
| Connecting to ftp.linux.cz:21... connected!
| Logging in as anonymous ... Logged in!
| == SYST ... done.== PWD ... done.
| == TYPE I ... done.  == CWD /pub/linux/redhat-cz/7.1/iso ... done.
| == PORT ... done.== REST 474641832 ... done.
| == RETR redhat-7.1cz-disk1-respin.iso.bz2 ... done.
| Length: 116,502,509 [-358,139,323 to go] (unauthoritative)
| 
|   [ skipping 463500K ]
| 463500K ,, ,,,... .. .. ..407% @  71.93 KB/s
| 463550K .. .. .. .. ..407% @  17.17 KB/s
| 463600K .. .. .. .. ..407% @  33.36 KB/s
| 463650K .. .. .. .. ..407% @  33.33 KB/s
| 463700K .. .. .. .. ..407% @  49.31 KB/s
| 463750K .. .. .. .. ..407% @  31.59 KB/s
| 463800K .. .. .. .. ..407% @  35.36 KB/s


This is with the current CVS version; 1.6 behaves the same way.

-- jan




Re: --spider download

2001-05-19 Thread Jan Prikryl

Quoting Tom Gordon ([EMAIL PROTECTED]):

 [...]
 --09:58:49--  ftp://metalab.unc.edu/README
 [...]

Oh, I see. I'm afraid `--spider' will not work with FTP; it was
probably meant for HTTP only (Hrvoje?).

The wget user manual says:

|  --spider
|  When invoked with this option, Wget will behave as a
|  Web spider, which means that it will not download the
|  pages, just check that they are there.  You can use it
|  to check your bookmarks, e.g. with:
| 
| 
|  wget --spider --force-html -i bookmarks.html
| 
|  This feature needs much more work for Wget to get
|  close to the functionality of real WWW spiders.

Actually, I guess wget should be able to work as a spider with FTP
links as well. I will look into that.

-- jan




Re: recursive web-suck ignores directory contents

2001-05-16 Thread Jan Prikryl

Quoting J Scott Jaderholm ([EMAIL PROTECTED]):

 It will download the files in foo for me, and the directories also,
 but often not the contents of those directories.

Please note that by default the hierarchy depth of a recursive
download is limited - try `-l inf' as a parameter.
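For example (the URL is a placeholder):

  wget -r -l inf --no-parent http://host/foo/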

-- jan




Re: wget reading urls from files

2001-05-14 Thread Jan Prikryl

Quoting Arkem ([EMAIL PROTECTED]):

 I'm currently wondering how/when wget reads URLs from files.  I'm
 wondering if it is possible to append URLs to the file list while
 wget is running or whether these would just be ignored by wget.

As far as I understand the code, they will just be ignored. The input
file is parsed once, at the beginning of the retrieval; the URLs are
stored in a list and then retrieved.

 My goal is to have it so I could append URLs on the fly to a file or
 a directory so that wget would download them when it got down to
 that part of the file list.

It should be possible to write a simple script that makes this work:
start a wget session with a list of URLs to download; in the meantime,
put new URLs one after another into another file, and when the current
wget session finishes, replace the wget input file with the file
containing the new URLs and start wget again ...
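A minimal sketch of such a wrapper (the file names are arbitrary, and
there is no locking, so avoid appending at the exact moment the queue
is swapped):

  #!/bin/sh
  # feed wget in batches as long as new URLs keep arriving in queue.txt
  while [ -s queue.txt ]; do
    mv queue.txt batch.txt    # snapshot the current queue
    : > queue.txt             # URLs appended from now on land here
    wget -i batch.txt
  done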

-- jan




Re: Recursive output fails after first file when writing to stdout

2001-05-14 Thread Jan Prikryl

Quoting Greg Robinson ([EMAIL PROTECTED]):

 I'm having a problem with wget.  I need to have the program (while
 running recursively) output to stdout so that I can pipe the output to a
 separate filter process.  Unfortunately, wget will only download the
 first file from any site I point it at when stdout is specified as the
 file to write to.

The difficulty here is the recursive download: when downloading
recursively, wget requires physical copies of the files to exist in
order to extract URLs from those files. At the moment there is no way
to do this on the fly while downloading, sorry.
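As a workaround, you can let wget write the tree into a scratch
directory first and run the filter over the files afterwards, along
these lines (`my_filter' is a placeholder for your program):

  wget -r -P /tmp/suck http://www.example.com/
  find /tmp/suck -type f | xargs cat | my_filter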

 Does that mean it's trying to write to the non-existant www.google.com
 directory on my drive, or does it mean that there's no index.html file
 on any server I want to suck from?

The message probably comes from the URL parser, and it means that no
local copy of index.html from www.google.com exists on your drive (so
no URLs can be extracted and the recursive download will fail).

-- jan




Re: ftp_bug

2001-05-14 Thread Jan Prikryl

Quoting [EMAIL PROTECTED] ([EMAIL PROTECTED]):

 Is it correct?
 Command: wget -m -o log -d ftp://user:passwd@host/

Well. I guess you're using either 1.5.3 or 1.6. This bug has been
removed in the current CVS version.

 P.S. have you FAQs?

No. But look at the wget homepage at http://sunsite.dk/wget/ - you
will find some links there to searchable archives of this mailing
list.

-- jan




Re: POST encoding

2001-05-01 Thread Jan Prikryl

Quoting Haydn Haines - Sun UK - Partner Support ([EMAIL PROTECTED]):

 Where's the archive for the wget-patches alias? I would like to try a 
 few of these patches as I would prefer to use wget instead of that 
 stupid lwp-request when I need a post method.

To my knowledge, every mailing list at sunsite.dk is automatically
archived at the site itself. You will have to send a request to the
mail server (ezmlm) asking for the messages you want. See
http://sunsite.dk/wget/wgetdoc.html for some pointers to the
documentation. As far as I know, you may request a listing of message
headers as well.

-- jan




make uninstall (was Re: Make Errors)

2001-04-30 Thread Jan Prikryl

Quoting SoloCDM ([EMAIL PROTECTED]):

 After a make install, is there a make uninstall -- if an installation
 doesn't go as planned (btw, everything is just fine)?

A quick look at the Makefile reveals that although there are
sub-targets prepared for the uninstall procedure (uninstalling the
binary, uninstalling the info files and manpages), there is no generic
`make uninstall' target.

I do not know whether it has been left out intentionally or simply
forgotten. Maybe Hrvoje can comment on that.

-- jan




Re: Netrc Error

2001-04-29 Thread Jan Prikryl

Quoting SoloCDM ([EMAIL PROTECTED]):

 Yesterday evening, I downloaded the most recent CVS version of wget
 with cvs -d :pserver:[EMAIL PROTECTED]:/pack/anoncvs checkout wget. 
 Does that include the wget-dev?

The most recent CVS version of wget is indeed the current development
version.

-- jan




Re: Make Errors

2001-04-29 Thread Jan Prikryl

Quoting SoloCDM ([EMAIL PROTECTED]):

 I executed the following commands without any errors.  I'm still
 getting version 1.6 with wget -V.  Is there a make install or is
 wget officially installed?
 
 autoconf;configure;make

I suppose you have 1.6 installed somewhere on your computer. Just try
the wget executable in the src directory of the wget source tree. And
yes, there is `make install' - it will install the whole lot (binary,
manpage, po files, system-wide configuration file) into /usr/local by
default.
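That is, from the top of the source tree:

  ./src/wget -V       # the freshly built binary; should report 1.7-dev
  make install        # installs under /usr/local by default
  /usr/local/bin/wget -V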

-- jan




Re: Make Errors

2001-04-27 Thread Jan Prikryl

Quoting SoloCDM ([EMAIL PROTECTED]):

 The following errors occurred at the end of the make command:
 
 hr.po:188: `msgid' and `msgstr' entries do not both begin with '\n'
 hr.po:550: `msgid' and `msgstr' entries do not both begin with '\n'
 [...]
 
 Do I need to be concerned; if so, what must I do?

Update hr.po in your CVS sources (cvs update ...). I have just added
the two missing newlines to the CVS version of hr.po.

-- jan




Re: Netrc Error

2001-04-27 Thread Jan Prikryl

Quoting SoloCDM ([EMAIL PROTECTED]):

 Why do I get the following error from ~/.netrc when I execute
 wget -v --follow-ftp --retr-symlinks --glob=on -N -r -l0 --tries=0
 --no-parent -k -o log - $(date +%r %Z -:- %A, %B %d, %Y)':
 
 wget: /home/[user_dir]/.netrc:4: unknown token 

Are you sure you have the latest CVS version of wget (1.7-dev)? As far
as I remember, at some point there was a change in the code that
caused the .netrc parser to choke on empty lines, but that error has
already been corrected.

-- jan




Re: Is there a version of Wget that is 100% Java under GPL or LGPL?

2001-04-26 Thread Jan Prikryl

Quoting Mike Kanaley ([EMAIL PROTECTED]):

 Subject: Re: Is there a version of Wget that is 100% Java under GPL
 or LGPL?

No, wget is written in C.

-- jan




Re: Windows FTP servers

2001-04-08 Thread Jan Prikryl

Quoting Joseph A. Knapka ([EMAIL PROTECTED]):

 OK, I looked at the new ftp-ls code, and of course it is much
 nicer than my patch for Windows FTP. However, I noticed one
 tiny problem on line 460 of ftp-ls.c:
 
   if (*tok == 'P') hour += 12;
 
 This code works fine except for one case: for "12:01PM", it gives us
 hour 24, minute 01, which is probably not what we want :-)

No, this is really not exactly what we want ;-). I have just fixed it
in CVS.
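For the archive, the fix amounts to normalizing the 12 o'clock cases
before applying the PM offset - in outline (not a verbatim copy of the
CVS change):

  if (hour == 12)
    hour = 0;          /* 12:xxAM is 00:xx; 12:xxPM becomes 12 below */
  if (*tok == 'P')
    hour += 12;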

-- jan




Re: wget 1.6 porting problem with snprintf and isdigit on Solaris 2.5.1

2001-04-08 Thread Jan Prikryl

Quoting Paul Eggert ([EMAIL PROTECTED]):

 When building wget 1.6 on Solaris 2.5.1 with GCC 2.95.3, I ran
 into the following porting problem.
 
 snprintf.c: In function `dopr':
 snprintf.c:230: warning: subscript has type `char'
 snprintf.c:254: warning: subscript has type `char'
 
 This is warning that isdigit doesn't work on negative characters
 (which are possible on hosts where characters are signed).
 Here is a patch.

Wouldn't just an explicit type cast to `(unsigned char)ch' suffice?
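That is, at the two offending spots in snprintf.c, something along
these lines:

  /* before */
  if (isdigit (ch))
  /* after - the cast avoids a negative subscript on hosts where
     plain char is signed */
  if (isdigit ((unsigned char) ch))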

-- jan




Re: Windows FTP servers

2001-04-05 Thread Jan Prikryl

Dear Joseph,

thanks for the patch.

 Recently I discovered that Wget would not talk to Windows FTP
 servers.

This applies to all wget versions released until now. The current
development version, which is going to be released in a month or so
(see Hrvoje's posting in this list from a few days ago), can cope with
MS IIS and some other non-UNIX FTP servers. As a release candidate,
this version is pretty stable and can be checked out anonymously from
CVS. If you are interested, consult
http://sunsite.dk/wget/wgetdev.html for more info.
 
 I searched around on the web a bit and found references
 to a patch to make it work, but I could not find the patch
 itself. Therefore, to preserve my own sanity, I hacked together
 my own Windows FTP patch. 

The problem is that the mail archive of this list seems to be partly
broken. For various reasons it only contains messages starting from
roughly the middle of February. We are working on this, but it's going
to take some time [1].

 It applies cleanly against the Wget-1.6 source tree.
 It has been tested with only a single Windows NT FTP server,
 one with which I must frequently have contact; I have no
 idea if it will work in general, since it makes a very simple
 and probably wrong assumption about how to detect a Windows
 server

Actually you did the best thing. In practice, MS IIS FTP may deliver
either "UNIX compatible" listings or "DOS-like" listings, so it is not
sufficient to test for the server system type at the very
beginning. The current CVS version does exactly the same check as you
did.

If you feel like that, you may try the MS IIS support in the current
CVS version of wget.

Thanks for your support.

-- jan

[1] I had to resubmit the old messages to www.mail-archive.com after
the change of SunSite's address. However, it seems that due to some
"clever" anti-spam [2] e-mail checks at my university, only about 40
e-mails really got resubmitted. The rest has probably been classified
as spam and silently (!) dropped. I will resubmit the rest once more
after some further testing at mail-archive.com, this time from another
address.

[2] (Off topic: as I recently learned, these rules check only
_outgoing_ e-mail. Spam coming from outside gets happily routed
through.)




Re: Compiling problems on 1.6

2001-03-19 Thread Jan Prikryl

Quoting Volker Moell ([EMAIL PROTECTED]):

 [...]
 file=./`echo sl | sed 's,.*/,,'`.gmo \
rm -f $file  PATH=../src:$PATH msgfmt -o $file sl.po
 usage: msgfmt [ -dv ] [ - ] [ name ... ]
  ^

Could it be that configure picked up a shell script called `msgfmt'
instead of the real `msgfmt' program that comes with the gettext
package (/usr/bin/msgfmt on my computer)? What happens when you run
`msgfmt --help' from your command prompt? What kind of Linux system is
that?

-- jan




Re: suggestion for wget

2001-03-18 Thread Jan Prikryl

Quoting Jonathan Nichols ([EMAIL PROTECTED]):

i have a suggestion for the wget program.  would it be possible to
 have a command line option that, when invoked, would tell wget to
 preserve the modification date when transfering the file?

I guess that `-N' (or `--timestamping') is what you're looking
for.

-- jan




Re: MISSING FILES USING WGET

2001-03-07 Thread Jan Prikryl

Quoting Thierry Pichevin ([EMAIL PROTECTED]):

 I used the command:
 wget -r -l6 -np -k http://www.apec.asso.fr/metiers/environnement
 
 1. small problem: it creates an arborescence 
 www.apec.asso.fr/metiers/environnement, whereas I would have
 expected only the subdirectories of 'environnement' to come

This is the general behaviour of wget. If you want to get just the
subdirectories, you will need to use `--cut-dirs' and
`--no-host-directories'.
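For example, something along these lines (untested) should place the
subdirectories of `environnement' directly into the current directory:

  wget -r -l6 -np -k -nH --cut-dirs=2 http://www.apec.asso.fr/metiers/environnement/

Here `-nH' (--no-host-directories) drops the www.apec.asso.fr
directory and `--cut-dirs=2' drops the metiers/environnement
components.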

 2. Big problem: many files don't come in: for example file
 'environnement/directeur_environnement/temoignage.html'.  This file
 is normally obtained from the main page by cliking
 'directeur_environnement' (Under title "communication et mediation")
 and on the next page by clicking on 'Délégué Régional de l'Ademe
 Haute-Normandie' (under title 'temoignage', on the right).  Note
 that other in 'environnement/directeur_environnement/' come
 in... The missing files seem to have a common feature: they are
 viewed via a popup window when clicking on the link.. is this the
 problem?

These URLs are actually JavaScript calls. Wget ignores JavaScript, as
it cannot interpret it in any way. It would probably be possible to
modify wget's internal HTML parser to try some heuristic for
extracting possible URLs from a `javascript:' URL, but no one has
written that code yet.

-- jan




Re: for the wishlist

2001-03-06 Thread Jan Prikryl

Quoting Dan Harkless ([EMAIL PROTECTED]):

  the file's size). This feature would enable the writing of cool scripts to
  do something like multi-threaded retrieval at file level.
 [...]
 
 Hi, Alec.  You're the second person within a few days to ask for such a
 feature.  I've added it to the TODO list.

I would object in this case. While the ability to retrieve only part
of a file might make sense, we should bear in mind that a tool like
wget ought to behave decently with respect to the servers on the other
side of the connection. I have nothing against starting several wget
jobs to download data from several sites. But hammering one server
with X connections, each of them retrieving 1/X of the file in
question, does not look like good policy to me.

Has anyone really measured how much can be gained this way, compared
to, say, persistent HTTP connections? I would say that several partial
downloads only make sense when the HTTP server limits the bandwidth of
a single connection ...
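For reference, such download accelerators work by issuing one HTTP
byte-range request per connection, along these lines:

  GET /big.iso HTTP/1.1
  Host: server.example.com
  Range: bytes=0-10485759

while a second connection asks for `Range: bytes=10485760-20971519',
and so on - each connection costing the server a full request cycle.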
 
-- jan




Re: TODO list

2001-03-01 Thread Jan Prikryl

 Just browsing through the 1.7+dev TODO file:
 
 * Recognize more FTP servers (VMS).
 
 I thought this has been implemented in the latest code, or did I
 misunderstand ?

Right. It seems that I forgot to commit this particular change to TODO
(the same thing that happens to me all the time with ChangeLog
entries).

I'll change it.
-- jan




Re: wget ftp url syntax is wrong

2001-02-28 Thread Jan Prikryl

  By the way, neither "//" nor "/%2F" works in 1.7-dev.  Perhaps we
  broke that when we fixed the problem where recursive FTP 'wget's
  assumed that logging in always put you in '/'?
 
 I believe some of Jan's changes broke it.  Also, the standard idiom:
 
 wget -r ftp://username:password@host//path/to/home/something
 
 no longer works.

Aargh. I will have a look at it.

-- jan 




Re: wget ftp url syntax is wrong

2001-02-26 Thread Jan Prikryl

Quoting Hanno Foest ([EMAIL PROTECTED]):

 On Mon, Feb 26, 2001 at 12:46:51AM -0800, Jamie Zawinski wrote:
 
  Netscape can retrieve this URL: 
 
ftp://ftp.redhat.com/pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
  
  wget cannot.   wget wants it to be:
  
ftp://ftp.redhat.com//pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
  
  I believe the Netscape behavior is right and the wget behavior is wrong.
 
 I don't think so. The double slash in front of the path part of the URL
 starts the path in the ftp server's root, while the single slash starts
 it in the default directory you log into when doing anonymous ftp. The
 default directory isn't the server's root in this case, but "pub".

Right. On the other hand, wget should probably be able to handle the
missing slash at the beginning (as Netscape does).

-- jan




Re: wget ftp url syntax is wrong

2001-02-26 Thread Jan Prikryl

Quoting Jamie Zawinski ([EMAIL PROTECTED]):

 However, that said, I still think wget should do what Netscape does,
 because that's what everyone expects.  The concept of a "default 
 directory" in a URL is silly.

The correct approach would be to try "CWD url/dir/path/" (the correct
meaning) and if this does not work, try "CWD /url/dir/path/".
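On the wire, the fallback would look something like this (the reply
codes are illustrative):

  ==> CWD url/dir/path ... failed (550).
  ==> CWD /url/dir/path ... done (250).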

-- jan




Re: FTP retrieval not functioning

2001-02-26 Thread Jan Prikryl

Quoting Chunks ([EMAIL PROTECTED]):

 I did RTFM, and the links to any mailing list archives I could find
 were broken. Please accept my apologies in advance if this is
 something covered elsewhere. Perhaps ignoring permissions will take
 care of it?

Could you tell us which links were actually broken? 

 I am running GNU Wget 1.5.3.1, win32 compilation and have also tried
 wget 1.5.3 linux compilation with identical results.

As Hack already suggested, try using the latest CVS version - it may
solve your problems. If not, please send us a complete debug output so
that we can try to fix what is broken.

-- jan




Re: problem with recursive ftp downloading

2001-02-15 Thread Jan Prikryl

Quoting Simon Windows ([EMAIL PROTECTED]):

 [...] For example...
 
 wget connects to an ftp and the default directory is "/pub" rather than 
 "/". Wget looks and sees a directory called "upload" and tries to change 
 into it using the command "cd /upload" when the directory is actually 
 "/pub/upload" and cannot continue.
 
 Is there a workaround to this behaviour (for example a way to make wget 
 execute a pwd command on logging in)? If so I would really appreciate 
 any help you can offer me.

I guess you're using 1.5.3 or 1.6 ... try the latest CVS version; it
should work as expected.

-- jan




Re: FTP directory listing

2001-02-15 Thread Jan Prikryl

Quoting Florian Fuessl ([EMAIL PROTECTED]):

 # wget -O- ftp://ftp.mcafee.com -d
 DEBUG output created by Wget 1.5.3 on linux-gnu.
 [...]
 Logging in as anonymous ... 220 sncwebftp2 Microsoft FTP Service 
 (Version 5.0).

Problem #1: 1.5.3 does not support MS FTP service.

 [...]
 226 Transfer complete.
 16:52:06 (289.06 KB/s) - `-' saved [296]

I'm not sure if you wanted to save to "-" ...

 I think the "UNKOWN; perms 0;" is the cause for the error ...
 Does there already exist a newer version of wget? I'm currently using 
 the copy of the Debian 2.2r2 Distribution.

As the last release of wget (1.6) still lacks support for MS IIS, I'd
recommend trying the CVS version. Have a look at
http://sunsite.dk/wget/wgetdev.html for more information about how to
check out the CVS version.

-- jan




Re: Any way to delete (synchronize)? (fwd)

2001-01-21 Thread Jan Prikryl

Quoting Harry Putnam ([EMAIL PROTECTED]):

  Given there is a well-working implementation of rsync server and rsync
  client, does it make sense to add this functionality to wget?
 
 Whoa, wait a minute.  Doesn't this assume the user has control over
 both master and slave site?  Of course that is wrong for many of us.

Many sites that run anonymous FTP allow anonymous rsync access as
well. But in principle you are right - a site that offers FTP access
is not necessarily accessible via rsync.

 I would think the full mirroring capability would be a fairly
 important addition.

Right.

-- jan




Re: Wget and secure servers

2001-01-19 Thread Jan Prikryl

Quoting Randy Sweeten ([EMAIL PROTECTED]):

 I just tried wget for Windows.  It looks like it would do everything I need,
 except that next month the website I need to access will be on a secure
 server, https.  It looks like wget 1.5.3 does not support https.  Any chance
 for such an enhancement soon?

The current development version, 1.7-dev, supports secure HTTP; the
current release (1.6), however, does not. 1.7-dev is available as C
source via CVS; Windows binaries are kindly produced from time to time
by Heiko Herold (see http://sunsite.dk/wget for details). I do not
know, however, whether Heiko's binaries already contain support for
HTTP over SSL.

-- jan




Re: undocumented behavior of wget

2001-01-15 Thread Jan Prikryl

Quoting [EMAIL PROTECTED] ([EMAIL PROTECTED]):

 The problem ("bad gateway") happens for the second file here. From this uotput,
 can you tell me if this error message comes from wget or from the system ?
 
 [...]
 --07:00:47--
 ftp://***:[EMAIL PROTECTED]:21/consensus/uk/weekly/tcd_
 uk.zip
= `tcd_uk.zip'
 Connecting to 194.3.173.3:3128... connected!
 Proxy request sent, awaiting response... 502 Bad Gateway
 07:00:47 ERROR 502: Bad Gateway.

It seems that the "502 Bad Gateway" is a HTTP response of your
proxy; is does not necessarily have anything to do with wget, it might
be a transient failure of your proxy.

To say more, one would really need a debug log to see the proxy
request header and the corresponding answer.

-- jan




Re: How do I update working dir to wget-1.5.3?

2001-01-15 Thread Jan Prikryl

Quoting Adrian Aichner ([EMAIL PROTECTED]):

 [...]
 I have still used sunsite.auc.dk accidentally until today.
 
 Today I have started over with a cvs login and fresh checkout into
 non-existing directory.
 
 After that I still cannot update to WGET_1_6.
 
 [...]

As "cvs", I was able to checkout wget, and update to tag
`WGET_1_5_3', and also to `release-1_6', but I got number of
complaints about no `WGET_1_6' tag available.

As "janp", I was able to checkout, and update to tags `WGET_1_6',
`WGET_1_5_3', and `release-1_6'. The last time I was probably logged
in as "janp" and I did not notice.

Might it be a file permission problem?

-- jan




Re: FORW: Mailinglist setup again

2001-01-11 Thread Jan Prikryl

Quoting Dan Harkless ([EMAIL PROTECTED]):

 Given the following (attached mail), I'll change the website to
 reference sunsite.dk instead of sunsite.auc.dk, and change the
 warning to just say that you have to be careful to avoid
 sunsite.auc.dk when doing administrative commands.  Unless Jan beats
 me to these changes, that is.

I'd be more strict and ask people to avoid using addresses at
sunsite.auc.dk completely. Maybe we can, for some time from now on,
regularly post a message to the list asking people to change their
e-mail settings to [EMAIL PROTECTED].

BTW, I've just found that the newer messages from the list are also
being archived at http://www.geocrawler.com/archives/3/409/ . And that
there is  a lot of places that reference the current e-mail list
address (Google found about 300 references).  

I'll also check with the people at www.mail-archive.com about which
changes are needed at their site, as I suppose that as soon as the
mailing list starts using the new address, it will no longer be
archived at their website.

-- jan





Re: WGET: failure for latest WGET to get ftp site

2001-01-11 Thread Jan Prikryl

Quoting Dan Harkless ([EMAIL PROTECTED]):

  I'm afraid so.  Case in point: ftpparse.c compiles with some
  warnings.  Am I allowed to modify it?  The copyright statement is
  vague, and Jan's inquiries have fallen on deaf ears.
 
 And I guess it'd be tough to find some other open source FTP listing
 parser out there (except maybe in the form of a Perl module or
 something).  Not the type of thing most tools need to do.

Well, we almost have it: only minimal changes were needed to support
the MacOS FTP server (NetPresenz), and VMS seems to work as well. Both
have been working for Lachlan Cranswick for more than a week now, so
it's less likely that I did something seriously wrong [1]. I'll check
the changes in during the weekend. After that, wget should support
UNIX ls output, Microsoft IIS output, VMS, and MacOS. I'd say that
this covers a substantial portion of the servers in use today.
Naturally, if someone points me to an Amiga, Atari, IBM VM-SP, DOS, or
whatever other server that is not completely broken, a parser for that
server will be implemented.

 I know the "lftp" client also implements some degree of
 listing-parsing, as it implements timestamp preservation.  Maybe we
 could take a look at it.  Here's what I had on where to find it as
 of 1999-04-12:

Thanks, I'll have a look what they have.

-- jan

[1] Imagine: I spent more than twenty minutes today trying to find out
why scanf("%f %f %f",a,b,c), where a, b, and c were _integers_, gives
very suspicious results. Bah. I should probably return my diploma.





Re: SUGGESTION: rollback like GetRight

2001-01-10 Thread Jan Prikryl

Quoting ZIGLIO Frediano ([EMAIL PROTECTED]):

 I suggest two parameter:
 - rollback-size
 - rollback-check-size
 where 0 <= rollback-check-size <= rollback-size
 The first for calculate the beginning of range (filesize - rollback-size)
 and the second for check (wget should check the range [filesize -
 rollback-size,filesize - rollback-size + rollback-check-size) )

My understanding of the rollback problem is that there are some broken
proxies that add some additional text garbage after the connection has
timed out, for example. Then, with `--rollback-size=NUM', after timing
out wget should cut off the last NUM bytes of the file and try to
resume the download.
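In code terms I would expect something as simple as the following
sketch (the option and variable names are hypothetical, not existing
wget identifiers):

  /* chop off the possibly garbled tail, then resume from there */
  off_t restval = filesize - opt.rollback_size;
  if (restval < 0)
    restval = 0;
  ftruncate (fileno (fp), restval);
  /* ... then send "REST <restval>" and "RETR <file>" as usual ... */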

Could you elaborate more on a situation where something like
`--rollback-check-size' would be needed? What would be checked there?

-- jan





Re: WGET: failure for latest WGET to get ftp site

2001-01-10 Thread Jan Prikryl

Quoting Dan Harkless ([EMAIL PROTECTED]):

   It really might be a ftpparse "bug" - the version included in 1.7dev
   does not support certain NT servers. [...]
  
  Yes, it appears that we'll be forced to remove ftpparse.  :-(
 
 Does this remain true?

If you mean NT support: no, the native NT support has already been
checked in. If you mean removing ftpparse.c: given that I have got
almost no response to my inquiries about the copyright, and no
permission to change the code in some places, I'm afraid we will have
to drop it ...

I have a patch that hopefully adds full VMS support, and I have also
resolved the MacOS timestamping issue. I have to review the changes
once more, though, to increase my confidence that I'm not submitting
complete crap (as happened with the '@'-in-passwords issue).

-- jan
