Re: Wget 1.7-pre1 available for testing
On 2001-05-26 13:35 +0200, Hrvoje Niksic wrote:

> The pre-release is available at:
>
> ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.7-pre1.tar.gz

Tried it on NCR MP-RAS 3.0:

- The CC "-cX" problem in wget 1.6 is fixed. Good. :-)
- The empty LIBS problem remains (add -lsocket -lnsl).
- utils.c doesn't compile because MAP_FAILED is not defined in the
  system headers, so:

diff -rc wget-1.7-pre1/src/utils.c wget-1.7-pre1_aym/src/utils.c
*** wget-1.7-pre1/src/utils.c   Sat May 12 15:06:41 2001
--- wget-1.7-pre1_aym/src/utils.c       Fri Jun  1 13:46:34 2001
***************
*** 976,983 ****
       efficiency, but at some cost to generality.  */
    fm->content = mmap (NULL, fm->length, PROT_READ | PROT_WRITE,
                        MAP_PRIVATE, fd, 0);
!   if (fm->content == (char *)MAP_FAILED)
!     goto mmap_lose;
    if (!inhibit_close)
      close (fd);
--- 976,990 ----
       efficiency, but at some cost to generality.  */
    fm->content = mmap (NULL, fm->length, PROT_READ | PROT_WRITE,
                        MAP_PRIVATE, fd, 0);
!   {
! #ifdef MAP_FAILED
!     const char *const fail = (char *)MAP_FAILED;
! #else
!     const char *const fail = (char *)-1;
! #endif
!     if (fm->content == fail)
!       goto mmap_lose;
!   }
    if (!inhibit_close)
      close (fd);

Otherwise, it seems to work (passes a few simple tests).

-- 
André Majorel
Work: <[EMAIL PROTECTED]>  Home: <[EMAIL PROTECTED]>
http://www.teaser.fr/~amajorel/
Re: SVR4 compile error
On 2001-05-26 11:10 +0200, Hrvoje Niksic wrote:

> Andre Majorel <[EMAIL PROTECTED]> writes:
>
> > Compiling Wget 1.6 on an SVR4 derivative (NCR MP-RAS 3.0), I got
> > this strange error:
>
> I think the problem is that Wget 1.6 tried to force "strict ANSI mode"
> out of the compiler.
>
> Try running make like this:
>
>     make CC=cc CFLAGS=-g
>
> See if it compiles then.

After removing "-cX" from $(CC) and adding "-lsocket -lnsl" to $(LIBS),
it compiled. I guess autoconf has not been given much testing on this
platform. :-) The binary seems fine.

Is there a central repository for wget binaries?

-- 
André Majorel
Work: <[EMAIL PROTECTED]>  Home: <[EMAIL PROTECTED]>
http://www.teaser.fr/~amajorel/
Any speculation as to when GA (generally available & stable for public consumption) version of wget v1.7 w/SSL will be released...
Any speculation as to when a GA (generally available and stable for
public consumption) version of wget 1.7 with SSL will be released?
I have a project that wget with SSL would be ideal for.

Thanks,
Dom
RE: Is there a way to override wgetrc options on command line?
Thanks! That worked.

--Dave

-----Original Message-----
From: Hack Kampbjørn [mailto:[EMAIL PROTECTED]]
Sent: Friday, June 01, 2001 2:25 AM
To: Humes, David G.
Cc: '[EMAIL PROTECTED]'
Subject: Re: Is there a way to override wgetrc options on command line?

"Humes, David G." wrote:
>
> Hello,
>
> I have several cronjobs using wget and the wgetrc file turns on
> passive-ftp by default. I have one site where strangely enough passive
> ftp does not work but active does work. I'd rather leave the passive
> ftp default set and just change the one cronjob that requires active
> ftp. Is there any way to tell wget to either disregard the wgetrc file
> or to override one or more of its options?
>
> Thanks.

What about --execute=COMMAND?

$ wget --help
GNU Wget 1.7-pre1, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
  -V,  --version           display the version of Wget and exit.
  -h,  --help              print this help.
  -b,  --background        go to background after startup.
  -e,  --execute=COMMAND   execute a `.wgetrc'-style command.
[...]

-- 
Med venlig hilsen / Kind regards

Hack Kampbjørn              [EMAIL PROTECTED]
HackLine                    +45 2031 7799
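For the cronjob in question, the override might look like this (a sketch: `passive_ftp` is the option name the wgetrc documentation uses, and the host and path are placeholders):

```shell
# Global default stays in ~/.wgetrc:
#   passive_ftp = on
#
# This one cronjob overrides it on the command line with -e,
# which executes a .wgetrc-style command after the startup files:
wget -e passive_ftp=off ftp://ftp.example.com/pub/data.tar.gz
```

Because -e commands are applied after the startup files are read, they win over anything set in wgetrc.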
Re: fooling around...
Herold Heiko <[EMAIL PROTECTED]> writes:

> in a moment of madness. Windows binary pre-1.7 is involved, so beware.

> First problem: possibly there is some problem in the win binary related
> to cookies, with a large cookies.txt file (>500) wget does seem to
> segfault ("Memory could not be read...") while loading the cookies.

It would be of great help if you could run Wget under a debugger and
pinpoint at which line the problem happens. It is quite possible that
Wget has a bug that only shows under Windows.

I'll take a look at the "goofball" thing.
fooling around...
in a moment of madness. Windows binary pre-1.7 is involved, so beware.

First problem: possibly there is some problem in the win binary related
to cookies. With a large cookies.txt file (>500 entries) wget does seem
to segfault ("Memory could not be read...") while loading the cookies.
However, I haven't yet been able to pin it down further, except that it
doesn't happen with the same cookies.txt on linux (same version of
wget).

OTOH, consider www.goofball.com: create a login, log in with the
"remember me" checkbox set, take the cookies and put them in a separate
file, say cookies2.txt. It will be something like:

.goofball.com   TRUE    /       FALSE   1300736097      gbperm
.goofball.com   TRUE    /       FALSE   1937832397      password
.goofball.com   TRUE    /       FALSE   1937832397      username
.goofball.com   TRUE    /       FALSE   991831757       gbsession

Now try to index the whole site with something like:

wget --load-cookies=cookies2.txt --save-cookies=cookies2s.txt \
  -e "robots=off" -nc -e "timestamping=off" -v -a html.log \
  -r -l0 -Ahtm,html -X/searchbin http://www.goofball.com

wget does seem to go into an infinite loop, trying to load /index.html
over and over again. I expect it to load every page once only, even if
generated on the fly without timestamps (due to no timestamping, -r and
no-clobber). Either I don't understand something in the manual or (I
fear) there's a bug. Anyone?

Heiko

-- 
-- PREVINET S.p.A.            [EMAIL PROTECTED]
-- Via Ferretto, 1            ph  x39-041-5907073
-- I-31021 Mogliano V.to (TV) fax x39-041-5907087
-- ITALY
Re: wget 1.6 problems with FTP globbing through a Squid firewall
Paul Eggert <[EMAIL PROTECTED]> writes:

> * A command like "wget -rl1 'ftp://elsie.nci.nih.gov/pub/' -A'tz*.tar.gz'"
>   is less natural than "wget 'ftp://elsie.nci.nih.gov/pub/tz*.tar.gz'".

Yes. I feel that the Right Thing would be for Wget to DWIM and
translate the glob pattern into a -A form when required. I might
attempt to implement something like that for the next release, but I'm
not sure if that can work in the general case -- merging the glob with
a bunch of existing -R/-A options might be non-trivial.

> Sorry, I don't understand this point. I thought that (in principle,
> at least) wget should interpret file name globbing consistently,
> regardless of whether it is using a proxy.

It's true that this should be the case, but that's not the current
situation. Currently if you want to retrieve a '*' file from FTP, you
have to use '\*', whereas on HTTP '*' will suffice and '\*' will
retrieve literal '\*'. I will try to consolidate all this for the next
release.

> (Please understand that I am not complaining -- I'm just trying to
> help wget get better.)

Don't worry, I'm taking your comments in good spirit!
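The inconsistency described above can be summarised with two hypothetical invocations (placeholder hosts and paths; the behaviour is as described in this thread, not re-verified here):

```shell
# FTP: '*' is a glob, so fetching a file literally named '*'
# requires escaping it:
wget 'ftp://host.example.com/pub/\*'

# HTTP: there is no globbing, so '*' is already taken literally
# (and '\*' would ask for a file literally named '\*'):
wget 'http://host.example.com/files/*'
```

Consolidating the two would mean either globbing on both schemes or escaping on both, which is what is proposed for the next release.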