Re: Annoying behaviour with --input-file

2003-11-25 Thread Fred Holmes
I pointed this out about a year ago. As I recall, the response I got back then was that fixing it was too hard. I'm looking for any way to download new/newer files from a specific list (wildcards won't make the proper selection) where wget makes one connection and keeps it for the entire

GNU Wget 1.8.2 --output-document and --page-requisites incompatible

2003-11-25 Thread Lars Noodén
I've been using wget for a few years now (it's been great) and find it increasingly useful. Right now I've got GNU Wget 1.8.2 and have noticed a quirk: --output-document and --page-requisites don't seem to work together, e.g. bash-2.05a$ wget --non-verbose
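
For reference, the conflicting combination would be invoked something like this (the URL and output name here are hypothetical, not from the original message):

    bash-2.05a$ wget --page-requisites --output-document=page.html \
        http://example.com/index.html

Since --output-document concatenates everything wget retrieves into the single named file, the requisites (images, stylesheets) presumably end up appended to page.html rather than saved as separate files.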

Re: keep alive connections

2003-11-25 Thread Hrvoje Niksic
Alain Bench [EMAIL PROTECTED] writes:

| /* Return if we have no intention of further downloading.  */
| if (!(*dt & RETROKF) || (*dt & HEAD_ONLY))
|   {
|     /* In case the caller cares to look...  */
|     hs->len = 0L;
|     hs->res = 0;
|     FREE_MAYBE (type);
|     FREE_MAYBE

Re: Recursive ftp broken

2003-11-25 Thread Hrvoje Niksic
Thanks for the report; this is most likely caused by my recent changes that eliminate rbuf* from the code. (Unfortunately, the FTP code kept some state in struct rbuf, and my changes might have broken things.) To be absolutely sure, see if it works under 1.9.1 or under CVS from one week ago.

Re: Wget dies with file size limit exceeded on files > 2 gigs

2003-11-25 Thread Hrvoje Niksic
Tony Lewis [EMAIL PROTECTED] writes: A patch was recently submitted for this issue. I don't know if anything has made it into CVS or not. Hrvoje didn't like its dependence on `long long', so it might not have. The patch uses `long long' without bothering to check whether the compiler accepts
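
The check being asked for could be done at build time, roughly like this configure-style probe (a sketch of the idea, not the actual patch):

    # Compile a one-line program to see whether `long long' is accepted;
    # a portable patch would fall back to plain `long' when it is not.
    cat > conftest.c <<'EOF'
    int main (void) { long long x = 0LL; return (int) x; }
    EOF
    if cc -c conftest.c 2>/dev/null; then
        echo "compiler accepts long long"
    else
        echo "compiler lacks long long"
    fi
    rm -f conftest.c conftest.o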

Re: GNU Wget 1.8.2 --output-document and --page-requisites incompatible

2003-11-25 Thread Hrvoje Niksic
Thanks for the report. This is a known bug, which is unfortunately also present in 1.9.x. I hope to fix it for the next release.

correct processing of redirections

2003-11-25 Thread Peter Kohts
Hi there. Let me explain the problem: 1) I'm trying to prepare to be a mirror of www.gnu.org (which is not the most shameful thing to do, I suppose). 2) I'm somewhat devoted to wget and do not want to use other software. 3) There are some redirects at www.gnu.org to other hosts like
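
The message is cut off here, but for redirects that cross hosts, one standard approach is to allow host spanning while restricting the accepted domains (a sketch under that assumption; Peter's actual redirect targets are not shown):

    bash-2.05a$ wget --mirror --span-hosts --domains=gnu.org http://www.gnu.org/

--span-hosts lets the recursion follow links and redirects off www.gnu.org, while --domains keeps the crawl from wandering beyond the listed domains.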

Re: Recursive ftp broken

2003-11-25 Thread Hrvoje Niksic
Gisle Vanem [EMAIL PROTECTED] writes:

[...]
==> SYST ... done.    ==> PWD ... done.     ! is '/' here
==> TYPE I ... done.  ==> CWD not required.
==> PORT ... done.    ==> RETR BAN-SHIM.ZIP ...
No such file `BAN-SHIM.ZIP'.
[...]

Interestingly, I can't repeat this. Still, to be on the safe side,

Re: Annoying behaviour with --input-file

2003-11-25 Thread Fred Holmes
At 06:30 PM 11/25/2003, Hrvoje Niksic wrote: Are you using --timestamping (-N)? If so, can you do without it, or replace it with --no-clobber? But then won't I only download new files, not newer files? I want the newer files (updated virus definition files from ftp.f-prot.com). And I
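
Concretely, the setup under discussion would look something like this (files.txt is a hypothetical list of the exact URLs, since wildcards won't select them):

    bash-2.05a$ wget --timestamping --input-file=files.txt

--timestamping re-fetches a file only when the remote copy is newer than the local one, whereas --no-clobber merely skips anything that already exists locally, which is why it only picks up new files, never updated ones.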

can you authenticate to an HTTP proxy with a username that contains a space?

2003-11-25 Thread antonio taylor
example: http://firstname lastname:[EMAIL PROTECTED] Thanks, T

Re: can you authenticate to an HTTP proxy with a username that contains a space?

2003-11-25 Thread Tony Lewis
antonio taylor wrote: http://firstname lastname:[EMAIL PROTECTED] Have you tried http://firstname%20lastname:[EMAIL PROTECTED] ?
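
Spelled out with placeholder credentials and proxy host (the archive has masked the real address), the suggestion is to percent-encode the space, %20 being the standard URL escape for it:

    bash-2.05a$ http_proxy='http://firstname%20lastname:secret@proxy.example.com:8080/' \
        wget http://example.com/file.txt

Alternatively, wget's --proxy-user and --proxy-passwd options pass the credentials outside the URL, sidestepping the escaping question entirely.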