Re: Help needed! How to pass XML message to webserver
On Mon, 8 Sep 2003, Vasudha Chiluka wrote: Hi, I need to pass an XML message to a webserver using HTTP. Could anybody tell me how I can accomplish this using wget? Any help is greatly appreciated. why would you want to use wget for this? try with nc6: echo -en "PUT http://theurl HTTP/1.1\n`cat file.xml`" | nc6 servername 80 you can find nc6 at http://www.deepspace6.net. if the server is ipv4 and you don't need the advanced functions of nc6, you can also use the plain old nc instead of nc6. -- Aequam memento rebus in arduis servare mentem... Mauro Tortonesi [EMAIL PROTECTED] [EMAIL PROTECTED] [EMAIL PROTECTED] Deep Space 6 - IPv6 with Linux http://www.deepspace6.net Ferrara Linux User Group http://www.ferrara.linux.it
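The nc6 one-liner above is missing the headers most servers insist on (Host, Content-Length) and the blank line that separates headers from the body. A rough sketch of building a complete request before piping it to nc6; the host name, path, and payload here are placeholders, and POST is used since that is what most XML-accepting endpoints expect:

```shell
# Build a complete HTTP/1.1 request around the XML payload before piping it
# to nc/nc6.  Host, path, and payload below are placeholders for illustration.
printf '<doc>hello</doc>' > file.xml   # stand-in for the real XML message

len=$(wc -c < file.xml | tr -d ' ')
body=$(cat file.xml)

request=$(printf 'POST /endpoint HTTP/1.1\r\nHost: servername\r\nContent-Type: text/xml\r\nContent-Length: %s\r\nConnection: close\r\n\r\n%s' "$len" "$body")

printf '%s' "$request"
# to actually send it:  printf '%s' "$request" | nc6 servername 80
```

One caveat: command substitution strips trailing newlines from the body, so a payload that must end in a newline (or contains binary data) is better streamed with cat directly rather than stored in a variable.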
Re: --disable-dns-cache patch
On Sun, 7 Sep 2003, Hrvoje Niksic wrote: Jeremy Reeve [EMAIL PROTECTED] writes: Please consider this, my trivial --disable-dns-cache patch for wget. ChangeLog should read something like: 2003-09-07 Jeremy S. Reeve [EMAIL PROTECTED] * host.c, init.c, main.c, options.h: Added --disable-dns-cache option to turn off caching of hostname lookups. Thanks for the patch. I'm curious, in what circumstances would one want to use this option? (I'm also asking because of the manual in which I'd like to explain why the option is useful.) e.g., with RFC 3041 temporary ipv6 addresses.
content-disposition header support
Hi, recursive downloading of CGI output gave me useless filenames. wget doesn't seem to support the Content-Disposition header that is part of RFC 2616 (section 19.5.1); at least the string "disposition" occurs nowhere in the source distribution. In the list archive I found messages from 2001 which said it would be put in the TODO file. Anything new on the matter? -- Tobias Weber
Re: --disable-dns-cache patch
Mauro Tortonesi [EMAIL PROTECTED] writes: Thanks for the patch. I'm curious, in what circumstances would one want to use this option? (I'm also asking because of the manual in which I'd like to explain why the option is useful.) e.g., with RFC 3041 temporary ipv6 addresses. Do they really change within a Wget run? Remember that Wget's cache is not written anywhere on disk.
Re: --disable-dns-cache patch
On Wed, 10 Sep 2003, Hrvoje Niksic wrote: Mauro Tortonesi [EMAIL PROTECTED] writes: Thanks for the patch. I'm curious, in what circumstances would one want to use this option? (I'm also asking because of the manual in which I'd like to explain why the option is useful.) e.g., with RFC 3041 temporary ipv6 addresses. Do they really change within a Wget run? Remember that Wget's cache is not written anywhere on disk. if, e.g., wget made a new TCP connection for each HTTP retrieval, there could be a problem with temporary ipv6 addresses. the probability of running into this situation nears 0%, though ;-)
Re: autoconf 2.5 patch for wget
[ I'm Cc-ing the list because this might be interesting to others. ] Mauro Tortonesi [EMAIL PROTECTED] writes: ok, i agree here. but, in order to help me with my work on wget, could you please tell me: * how do you generate a wget tarball for a new release With the script `dist-wget' in the util directory. Ideally the `make dist' target should do the same job, but it gets some things wrong. Take a look at what `dist-wget' does, AFAIR it's pretty clearly written. * how do you generate/maintain gettext-related files (e.g. the files in the po directory The `.po' files are from the translation project. POTFILES.IN is generated by hand when a new `.c' file is added. * how do you generate/maintain libtool-related files (e.g. ltmain.sh) When a new libtool release comes out, ltmain.sh is replaced with the new one and aclocal.m4 is updated with the latest libtool.m4. config.sub and config.guess are updated as needed. * how do you generate/maintain automake-related files (e.g. aclocal.m4, mkinstalldirs, install-sh, etc...) I don't use Automake. mkinstalldirs and install-sh are standard Autoconf stuff that probably hasn't changed for years. If a bug is discovered, you can get the latest version from the latest Autoconf or wherever. it would be impossible for me to keep working on the autoconf-related part of wget without these info. I hope the above helped. There's really not much into it. BTW: could you please tell me what of these changes are acceptable for you: * Re-organized all wget-specific autoconf macros in the config directory As long as you're very careful not to break things, I'm fine with that. But be careful: take into account that Wget doesn't ship with libintl, that it doesn't use Automake, etc. When in doubt, ask. If possible, start with small things. * Re-libtoolized and re-gettextized the package I believe that libtoolization and gettextization are tied with Automake, but I could be wrong. I'm pretty sure that the gettextization process was wrong for Wget. 
* Updated aclocal.m4, config.guess, config.sub Note that Wget doesn't use a pre-generated (or auto-generated) aclocal.m4. Updating config.guess and config.sub is, of course, fine. * Added IPv6 stack detection to the configuration process Please be careful: Wget doesn't need the kind of stack detection that I've seen in many programs patched to support IPv6. Specifically, I don't want to cater to old buggy or obsolete IPv6 stacks. That's what I liked about Daniel's patch: it was straightforward and seemed to do the trick. If at all possible, go along those lines. * Re-named configure.in to configure.ac and modified the file for better autoconf 2.5x compliance That's fine, as long as it's uncoupled from other changes. Specifically, it should be possible to test all Autoconf-related changes. * Added profiling support to the configure script I'm not sure what you mean here. Why does configure need to be aware of profilers? * Re-named the realclean target to maintainer-clean in the Makefiles for better integration with po/Makefile.in.in and conformance to the de-facto standards That should be fine. * Modified the invocation of config.status in the targets in the Dependencies for maintenance section of Makefile.in, according to the new syntax introduced by autoconf 2.5x I haven't studied the new Autoconf in detail, but I trust that you know what you're doing here. util/Makefile.in: added rmold.pl target, just like texi2pod.pl in doc/Makefile.in src/wget.h: added better handling of HAVE_ALLOCA_H and changed USE_NLS to ENABLE_NLS Sounds fine. BTW what do you mean by better handling of HAVE_ALLOCA_H? Do you actually know that Wget's code was broken on some platforms, or are you just replacing old Autoconf boilerplate code with new one? Thanks for the work you've put in.
Re: autoconf 2.5 patch for wget
On Wed, 10 Sep 2003, Hrvoje Niksic wrote: [ I'm Cc-ing the list because this might be interesting to others. ] Mauro Tortonesi [EMAIL PROTECTED] writes: ok, i agree here. but, in order to help me with my work on wget, could you please tell me: * how do you generate a wget tarball for a new release With the script `dist-wget' in the util directory. Ideally the `make dist' target should do the same job, but it gets some things wrong. Take a look at what `dist-wget' does, AFAIR it's pretty clearly written. right. * how do you generate/maintain gettext-related files (e.g. the files in the po directory The `.po' files are from the translation project. POTFILES.IN is generated by hand when a new `.c' file is added. ok, but what about Makefile.in.in and wget.pot? * how do you generate/maintain libtool-related files (e.g. ltmain.sh) When a new libtool release comes out, ltmain.sh is replaced with the new one and aclocal.m4 is updated with the latest libtool.m4. config.sub and config.guess are updated as needed. do you mean that you simply copy these files manually from other packages? how do you update aclocal.m4? please, notice that i am __NOT__ criticizing this. i only want to know what's the update/maintenance procedure for these files, and possibly the rationale behind it. * how do you generate/maintain automake-related files (e.g. aclocal.m4, mkinstalldirs, install-sh, etc...) I don't use Automake. mkinstalldirs and install-sh are standard Autoconf stuff true. that probably hasn't changed for years. i am not so sure about this. If a bug is discovered, you can get the latest version from the latest Autoconf or wherever. ok. it would be impossible for me to keep working on the autoconf-related part of wget without these info. I hope the above helped. There's really not much into it. yes, certainly. 
BTW: could you please tell me what of these changes are acceptable for you: * Re-organized all wget-specific autoconf macros in the config directory As long as you're very careful not to break things, I'm fine with that. But be careful: take into account that Wget doesn't ship with libintl, that it doesn't use Automake, etc. When in doubt, ask. If possible, start with small things. ok. * Re-libtoolized and re-gettextized the package I believe that libtoolization and gettextization are tied with Automake, but I could be wrong. I'm pretty sure that the gettextization process was wrong for Wget. i agree with you here. * Updated aclocal.m4, config.guess, config.sub Note that Wget doesn't use a pre-generated (or auto-generated) aclocal.m4. Updating config.guess and config.sub is, of course, fine. how do you maintain aclocal.m4, then? by hand? this seems a bit too manual for me :-) and, more important, with this approach your package may keep using broken/obsoleted autoconf macros without your knowledge. that's one of the best reasons for using automake. * Added IPv6 stack detection to the configuration process Please be careful: Wget doesn't need the kind of stack detection that I've seen in many programs patched to support IPv6. i am afraid you're wrong here. usagi or kame stack detection is necessary to link the binary to libinet6 (if present). this lets wget use a version of getaddrinfo which is RFC3493-compliant and supports the AI_ALL, AI_ADDRCONFIG (which is __VERY__ important) and AI_V4MAPPED flags. the implementation of getaddrinfo shipped with glibc is not RFC3493-compliant. Specifically, I don't want to cater to old buggy or obsolete IPv6 stacks. nor do i. That's what I liked about Daniel's patch: it was straightforward and seemed to do the trick. If at all possible, go along those lines. if you want a good IPv6 support, then you'll have to add a bit of complexity to the autoconfiguration process. 
Daniel's patch is incomplete because:
* it does not detect the ipv6 stack flavour (and, most important, it doesn't check if wget should be linked against libinet6)
* it doesn't check if getaddrinfo supports the AI_ADDRCONFIG, AI_ALL and AI_V4MAPPED flags
* it does not check if the system supports struct sockaddr_in6
* it does not check if struct sockaddr_in6 has the sin6_scope_id member (not so important for wget)
* it checks at compile time if ipv6 is supported, disabling ipv6 support if it cannot create a PF_INET6 socket - this is broken behaviour, because the user may have forgotten to load the ipv6 kernel module and still want wget to compile with ipv6 support
for a good IPv6 autoconfiguration example, see nc6: http://cvs.deepspace6.net/view/nc6/ or, even better, oftpd: http://cvs.deepspace6.net/view/oftpd/ * Re-named configure.in to configure.ac and modified the file for better autoconf 2.5x compliance That's fine, as long as it's uncoupled from other changes. Specifically, it should be possible to test all Autoconf-related changes.
Re: autoconf 2.5 patch for wget
Mauro Tortonesi [EMAIL PROTECTED] writes: * how do you generate/maintain gettext-related files (e.g. the files in the po directory The `.po' files are from the translation project. POTFILES.IN is generated by hand when a new `.c' file is added. ok, but what about Makefile.in.in and wget.pot? AFAIR wget.pot is generated by Makefile. (It should probably not be in CVS, though.) Makefile.in.in is not generated, it was originally adapted from the original Makefile.in.in from the gettext distribution. It has served well for years in the current form. * how do you generate/maintain libtool-related files (e.g. ltmain.sh) When a new libtool release comes out, ltmain.sh is replaced with the new one and aclocal.m4 is updated with the latest libtool.m4. config.sub and config.guess are updated as needed. do you mean that you simply copy these files manually from other packages? Yes. I don't do that very often. how do you update aclocal.m4? Wget's aclocal.m4 only contains Wget-specific stuff so it doesn't need special updating. The single exception is, of course, the `libtool.m4' part which needs to be updated along with ltmain.sh, but that is also rare. I really think aclocal.m4 should simply be INCLUDEing libtool.m4, but I wasn't sure how to do that, so I left it at that. (Note that I wasn't the one who introduced libtool to Wget, so it wasn't up to me originally.) please, notice that i am __NOT__ criticizing this. Don't worry, I'm not reading malice in your questions. All your questions are in fact quite valid and responding to them serves to remind myself of why I made the choices I did. I don't use Automake. mkinstalldirs and install-sh are standard Autoconf stuff true. that probably hasn't changed for years. i am not so sure about this. If they've changed and if updating them won't break anything, feel free to update them. (In a separate patch if possible.:-)). 
* Updated aclocal.m4, config.guess, config.sub Note that Wget doesn't use a pre-generated (or auto-generated) aclocal.m4. Updating config.guess and config.sub is, of course, fine. how do you maintain aclocal.m4, then? by hand? this seems a bit too manual for me :-) I believe Wget's aclocal.m4 is quite different from the ones in Automake-influenced software. I could be wrong, though. Please take another look at it, and please do ignore the libtool stuff which should really be handled with an include. and, more important, with this approach your package may keep using broken/obsoleted autoconf macros without your knowledge. I'm not so sure about that. The way I see it, Wget's configure.in and aclocal.m4 use documented Autoconf macros. Unless Autoconf changes incompatibly (which they shouldn't do without changing the major version), they should keep working. * Added IPv6 stack detection to the configuration process Please be careful: Wget doesn't need the kind of stack detection that I've seen in many programs patched to support IPv6. i am afraid you're wrong here. usagi or kame stack detection is necessary to link the binary to libinet6 (if present). this lets wget use a version of getaddrinfo which is RFC3493-compliant and supports the AI_ALL, AI_ADDRCONFIG (which is __VERY__ important) and AI_V4MAPPED flags. the implementation of getaddrinfo shipped with glibc is not RFC3493-compliant. Shouldn't we simply check for libinet6 in the usual fashion? Furthermore, I don't think that Wget uses any of those flags. Why should an application that doesn't use them care? Note that I ask this not to annoy you but to learn; you obviously know much more about IPv6 than I do. I have to go now; I'll answer the rest of your message separately. Thanks for your patience and for the detailed reply.
Re: autoconf 2.5 patch for wget
On Wed, 10 Sep 2003, Hrvoje Niksic wrote: Mauro Tortonesi [EMAIL PROTECTED] writes: * how do you generate/maintain gettext-related files (e.g. the files in the po directory The `.po' files are from the translation project. POTFILES.IN is generated by hand when a new `.c' file is added. ok, but what about Makefile.in.in and wget.pot? AFAIR wget.pot is generated by Makefile. (It should probably not be in CVS, though.) Makefile.in.in is not generated, it was originally adapted from the original Makefile.in.in from the gettext distribution. It has served well for years in the current form. ok. i'll see if the new Makefile.in.in which ships with the latest gettext is worth an upgrade. * how do you generate/maintain libtool-related files (e.g. ltmain.sh) When a new libtool release comes out, ltmain.sh is replaced with the new one and aclocal.m4 is updated with the latest libtool.m4. config.sub and config.guess are updated as needed. do you mean that you simply copy these files manually from other packages? Yes. I don't do that very often. i can imagine ;-) how do you update aclocal.m4? Wget's aclocal.m4 only contains Wget-specific stuff so it doesn't need special updating. The single exception is, of course, the `libtool.m4' part which needs to be updated along with ltmain.sh, but that is also rare. I really think aclocal.m4 should simply be INCLUDEing libtool.m4, but I wasn't sure how to do that, so I left it at that. (Note that I wasn't the one who introduced libtool to Wget, so it wasn't up to me originally.) ok, so you simply take libtool.m4 or maybe only a part of it, and add all wget-specific macros to it. please, notice that i am __NOT__ criticizing this. Don't worry, I'm not reading malice in your questions. All your questions are in fact quite valid and responding to them serves to remind myself of why I made the choices I did. very good :-) that probably hasn't changed for years. i am not so sure about this. 
If they've changed and if updating them won't break anything, feel free to update them. (In a separate patch if possible.:-)). ok, but i don't care too much about this. * Updated aclocal.m4, config.guess, config.sub Note that Wget doesn't use a pre-generated (or auto-generated) aclocal.m4. Updating config.guess and config.sub is, of course, fine. how do you maintain aclocal.m4, then? by hand? this seems a bit too manual for me :-) I believe Wget's aclocal.m4 is quite different from the ones in Automake-influenced software. I could be wrong, though. Please take another look at it, and please do ignore the libtool stuff which should really be handled with an include. ok. and, more important, with this approach your package may keep using broken/obsoleted autoconf macros without your knowledge. I'm not so sure about that. The way I see it, Wget's configure.in and aclocal.m4 use documented Autoconf macros. Unless Autoconf changes incompatibly (which they shouldn't do without changing the major version), they should keep working. ok. * Added IPv6 stack detection to the configuration process Please be careful: Wget doesn't need the kind of stack detection that I've seen in many programs patched to support IPv6. i am afraid you're wrong here. usagi or kame stack detection is necessary to link the binary to libinet6 (if present). this lets wget use a version of getaddrinfo which is RFC3493-compliant and supports the AI_ALL, AI_ADDRCONFIG (which is __VERY__ important) and AI_V4MAPPED flags. the implementation of getaddrinfo shipped with glibc is not RFC3493-compliant. Shouldn't we simply check for libinet6 in the usual fashion? this could be another solution. but i think it would be much better to do it only for kame and usagi stack. Furthermore, I don't think that Wget uses any of those flags. Why should an application that doesn't use them care? Note that I ask this not to annoy you but to learn; you obviously know much more about IPv6 than I do.
well, it is very important to use AI_ADDRCONFIG with getaddrinfo. in this way you get resolution of AAAA records only if you have ipv6 working on your host (and, less important, resolution of A records only if you have ipv4 working on your host). dns resolution in a mixed ipv4 and ipv6 environment is a nightmare, and AI_ADDRCONFIG can save you a lot of headaches. but you have to handle the AI_ADDRCONFIG flag with care: first of all you must check that the system supports it, and then you mustn't treat unsupported-family socket errors as fatal. please, see the nc6 code for this. i'll send a patch for this ASAP.
wget --spider issue
Hi, i have found a problem regarding wget --spider. It works great for any files over http or ftp, but as soon as one of these two conditions occurs, wget starts downloading the file: 1. linked files (i'm not 100% sure about this) 2. download scripts (i.e. http://www.nothing.com/download.php?file=12345) i have included one link that starts downloading even if using the --spider option: http://club.aopen.com.tw/downloads/Download.asp?RecNo=3587&Section=5&Product=Motherboards&Model=AX59%20Pro&Type=Manual&DownSize=8388 (MoBo Bios file); so this actually starts downloading: $ wget --spider 'http://club.aopen.com.tw/downloads/Download.asp?RecNo=3587&Section=5&Product=Motherboards&Model=AX59%20Pro&Type=Manual&DownSize=8388' If there is no conclusion to this problem using wget, can anyone recommend another Link-Verifier? What i want to do is: check the existence of some 200k links stored in a database. So far i was trying to use /usr/bin/wget --spider \' . $link . \' 2>&1 | tail -2 | head -1 in a simple php script. Thanks for any help! - Best Regards, Andreas Belitz CIO TCTK - Database Solutions Nordanlage 3 35390 Giessen Germany Phone: +49 (0) 641 3019 446 Fax : +49 (0) 641 3019 535 Mobile : +49 (0) 176 700 16161 E-mail : mailto:[EMAIL PROTECTED] Internet : http://www.tctk.de
Re: wget --spider issue
On Wed, 10 Sep 2003, Andreas Belitz wrote: Hi, i have found a problem regarding wget --spider. It works great for any files over http or ftp, but as soon as one of these two conditions occurs, wget starts downloading the file: 1. linked files (i'm not 100% sure about this) 2. download scripts (i.e. http://www.nothing.com/download.php?file=12345) i have included one link that starts downloading even if using the --spider option: http://club.aopen.com.tw/downloads/Download.asp?RecNo=3587&Section=5&Product=Motherboards&Model=AX59%20Pro&Type=Manual&DownSize=8388 (MoBo Bios file); so this actually starts downloading: $ wget --spider 'http://club.aopen.com.tw/downloads/Download.asp?RecNo=3587&Section=5&Product=Motherboards&Model=AX59%20Pro&Type=Manual&DownSize=8388' actually, what you call download scripts are actually HTTP redirects, and in this case the redirect is to an FTP server and if you double-check i think you'll find Wget does not know how to spider in ftp. end run-on-sentence. If there is no conclusion to this problem using wget, can anyone recommend another Link-Verifier? What i want to do is: check the existence of some 200k links stored in a database. So far i was trying to use /usr/bin/wget --spider \' . $link . \' 2>&1 | tail -2 | head -1 in a simple php script. I do something similar with Wget (using shell scripting instead), and I am pleased with the outcome. Since you are calling Wget for each link, and if you note that Wget does a good job of returning success or failure, you can actually do this: wget --spider '$link' || echo '$link' >> badlinks.txt I can send you my shell scripts if you're interested. /a -- Our armies do not come into your cities and lands as conquerors or enemies, but as liberators. - British Lt. Gen. Stanley Maude. Proclamation to the People of the Wilayat of Baghdad. March 8, 1917.
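The one-liner above can be expanded into the loop Andreas needs. A sketch, not Aaron's actual script: the file names are placeholders, and the WGET variable is only there so the wget binary can be swapped out (handy for testing without touching the network):

```shell
# Read URLs one per line from $1 and record every URL that wget --spider
# cannot reach into $2.  WGET defaults to wget but can be overridden.
WGET="${WGET:-wget}"

check_links() {
    infile=$1; badfile=$2
    : > "$badfile"                        # start with an empty bad-link list
    while IFS= read -r link; do
        [ -n "$link" ] || continue        # skip blank lines
        "$WGET" --spider "$link" >/dev/null 2>&1 || echo "$link" >> "$badfile"
    done < "$infile"
}
```

Typical use: `check_links links.txt badlinks.txt`. Note the double quotes around `"$link"`: with single quotes, as in the quoted one-liner, the shell would pass the literal string `$link` to wget rather than the URL.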
Re: wget --spider issue
Hi Aaron S. Hawley, On Wed, 10. September 2003 you wrote: ASH actually, what you call download scripts are actually HTTP redirects, and ASH in this case the redirect is to an FTP server and if you double-check i ASH think you'll find Wget does not know how to spider in ftp. end ASH run-on-sentence. Ok. This seems to be the reason. Thanks. Is there any way to make wget spider ftp addresses? ASH I can send you my shell scripts if you're interested. ASH /a That would be great! - Mit freundlichen Grüssen Andreas Belitz
Re: autoconf 2.5 patch for wget
Mauro Tortonesi [EMAIL PROTECTED] writes: AFAIR wget.pot is generated by Makefile. (It should probably not be in CVS, though.) Makefile.in.in is not generated, it was originally adapted from the original Makefile.in.in from the gettext distribution. It has served well for years in the current form. ok. i'll see if the new Makefile.in.in which ships with the latest gettext is worth an upgrade. Note that Wget's Makefile.in.in is likely quite different than the canonical version because of the lack of libintl bundling. That's as it should be. how do you update aclocal.m4? Wget's aclocal.m4 only contains Wget-specific stuff so it doesn't need special updating. The single exception is, of course, the `libtool.m4' part which needs to be updated along with ltmain.sh, but that is also rare. I really think aclocal.m4 should simply be INCLUDEing libtool.m4, but I wasn't sure how to do that, so I left it at that. (Note that I wasn't the one who introduced libtool to Wget, so it wasn't up to me originally.) ok, so you simply take libtool.m4 or maybe only a part of it, and add all wget-specific macros to it. Or the other way around: leave Wget-specific macros and replace libtool.m4 contents. aclocal.m4 has this part: # We embed libtool.m4 from libtool distribution. # -- embedded libtool.m4 begins here -- [ ... contents of libtool.m4 follows ... ] # -- embedded libtool.m4 ends here -- When you need to update libtool.m4, you do the obvious -- replace the old contents of libtool.m4 with the new contents. As I said, it would be even better if it said something like AC_INCLUDE([libtool.m4]) (or whatever the correct syntax is), so you can simply drop in the new libtool.m4 without the need for editing. Shouldn't we simply check for libinet6 in the usual fashion? this could be another solution. but i think it would be much better to do it only for kame and usagi stack. Hmm. Checking for stacks by names is not the Autoconf way. Isn't it better to test for needed features? 
Daniel's test was written in that spirit. Furthermore, I don't think that Wget uses any of those flags. Why should an application that doesn't use them care? Note that I ask this not to annoy you but to learn; you obviously know much more about IPv6 than I do. well, it is very important to use AI_ADDRCONFIG with getaddrinfo. in this way you get resolution of AAAA records only if you have ipv6 working on your host (and, less important, resolution of A records only if you have ipv4 working on your host). dns resolution in a mixed ipv4 and ipv6 environment is a nightmare and AI_ADDRCONFIG can save you a lot of headaches. Very interesting. So what you're saying is that programs that simply follow the getaddrinfo man page (including IPv6-enabled Wget in Debian) don't work in mixed environments? That's really strange.
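The marker-delimited embedding of libtool.m4 described earlier in the thread lends itself to a mechanical update. A hypothetical sketch of that drop-in idea; this is not how the wget build actually does it, and the marker comments and file names are taken on the assumption that they match the ones quoted above:

```shell
# Replace everything between the "embedded libtool.m4" begin/end markers
# in aclocal.m4 with the contents of a fresh libtool.m4, leaving the
# package-specific macros around the markers untouched.
splice_libtool() {
    aclocal=$1; libtool=$2
    awk -v src="$libtool" '
        /-- embedded libtool\.m4 begins here --/ {
            print                                        # keep the begin marker
            while ((getline line < src) > 0) print line  # splice in new contents
            close(src)
            skip = 1; next                               # drop the old contents
        }
        /-- embedded libtool\.m4 ends here --/ { skip = 0 }   # keep the end marker
        !skip { print }
    ' "$aclocal" > "$aclocal.new" && mv "$aclocal.new" "$aclocal"
}
```

This mimics what an `AC_INCLUDE`-style include would give for free, which is why Hrvoje notes an include would be the cleaner arrangement.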
Re: using host-cache configurable via command line
Patrick Cernko [EMAIL PROTECTED] writes: I discovered a small problem with the increasing number of servers with changing IPs but constant names (provided by nameservers like dyndns.org). If the download with wget is interrupted by an IP change (e.g. a dialup host whose provider killed the connection), wget retries the download using the previously cached IP. This will fail, as the host (specified by its dyndns hostname) is no longer reachable via this old IP. Instead it is reachable over a new IP (assigned by its provider). But it is still reachable via its hostname, as the host updated the DNS entry with its new IP. So I patched wget to tell it not to use the cached IPs from earlier, but instead to do a new host lookup as when connecting to the host for the first time. Patrick, thanks for the patch and the explanation. A similar change, probably with invocation `--dns-cache=off', is scheduled to appear in the next release. Your contribution is also important because we've been looking for a suitable text for the manual that explains why it is sometimes beneficial to turn off the DNS cache.
Re: wget --spider issue
On Wed, 10 Sep 2003, Andreas Belitz wrote: Hi Aaron S. Hawley, On Wed, 10. September 2003 you wrote: ASH actually, what you call download scripts are actually HTTP redirects, and ASH in this case the redirect is to an FTP server and if you double-check i ASH think you'll find Wget does not know how to spider in ftp. end ASH run-on-sentence. Ok. This seems to be the reason. Thanks. Is there any way to make wget spider ftp adresses? I sent a patch to this list over the winter. it's included with the shell scripts i spoke of and have attached to this message. ASH I can send you my shell scripts if you're interested. ASH /a That would be great! gnurls-0.1.tar.gz Description: Binary data
wget and MM_openBrWindow
Hi, I've found a page with a definition function MM_openBrWindow(theURL,winName,features) { //v2.0 window.open(theURL,winName,features); } and its use <a href="andressa.htm#"><img src="f4andremin.jpg" width=90 height=120 border=0 onClick="MM_openBrWindow('f4and.htm','','width=315,height=420')"></a> The idea is to click and a window pops up (f4and.htm) which simply loads an image. The trouble is that wget does not follow the link, base-url/f4and.htm, so the page and its contents (the image) are not downloaded. Any help would be appreciated. Regards, Wilson
Re: wget and MM_openBrWindow
Wget doesn't interpret Javascript, only regular HTML (AFAIK). Wget won't be able to follow any links that are only setup thru a Javascript function like MM_openBrWindow(). Adam -- Adam Stein @ Xerox Corporation Email: [EMAIL PROTECTED] Disclaimer: All views expressed here have been proved to be my own. [http://www.csh.rit.edu/~adam/]
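One workaround, since wget only sees real HTML links: scrape the popup URLs out of the saved page first and fetch them in a second pass. A sketch assuming the MM_openBrWindow('url',...) pattern from the original message; the input file name and the base URL in the comment are hypothetical:

```shell
# Print the first quoted argument of every MM_openBrWindow(...) call in the
# given HTML file -- i.e. the URL the popup would have opened.
extract_popup_links() {
    grep -o "MM_openBrWindow('[^']*'" "$1" |
        sed "s/MM_openBrWindow('//; s/'\$//"
}

# typical use (base URL is a placeholder):
#   for u in $(extract_popup_links page.html); do wget "http://example.com/$u"; done
```

The extracted names are relative, so they still have to be resolved against the page's own URL before being handed to wget, as the loop in the comment sketches.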