Wget 1.9.1 is now available on ftp.gnu.org and its mirrors. It is a
bugfix release that contains fixes for several problems noted in the
1.9 release. Unless further serious bugs are discovered, it is likely
to remain the last in the 1.9.x series.
[EMAIL PROTECTED] writes:
Thank you for your reply. Yes the gif file was downloaded.
Then this sounds like a bug. Many bugs have been fixed in the several
years between the 1.7 and the 1.9.1 release, so it's all the more
reason to try the new one.
I am not the administrator of the server, this shouldn't really be
done, IPv6 code is in development and so on, so this is just an FYI.
Thanks for the report. This patch should fix it:
2003-11-14 Hrvoje Niksic [EMAIL PROTECTED]
* main.c: Enable -4 and -6 only if IPv6 is enabled.
Index: src/main.c
Hrvoje Niksic [EMAIL PROTECTED] writes:
Gisle Vanem [EMAIL PROTECTED] writes:
Running wget -6 url.. on a machine with no IPv6 installed
silently uses IPv4. A warning with fallback to IPv4 is IMHO
okay. Or an exit?
I gave this question some more thought. Here are some conclusions.
* -4
Maarten Boekhold [EMAIL PROTECTED] writes:
Is there anything obvious here that I'm doing wrong?
Mirroring FTP over proxies is broken in Wget 1.8 and later, sorry.
I'll look into fixing them for 1.10.
Dan Jacobson [EMAIL PROTECTED] writes:
But I want a
--second-guess-the-dns=ADDRESS
Aside from `--second-guess-the-dns' being an awful name (sorry), what
is the usage scenario for this kind of option? I.e. why would anyone
want to use it?
Perhaps the user should do all this in the
Post, Mark K [EMAIL PROTECTED] writes:
You can do this now:
wget http://216.46.192.85/
Using DNS is just a convenience after all, not a requirement.
Unfortunately, widespread use of name-based virtual hosting made it a
requirement in practice. ISP's typically host a bunch of web sites on
Andrew Bachmann [EMAIL PROTECTED] writes:
To compile wget, I had to comment out this line from ftp-basic.c:
#include <arpa/inet.h>
Thanks, I'll remove it. In fact, I suspect that the inet includes are
no longer even necessary in ftp-basic.c.
I'm not sure how to manage the MACHINES file in the distribution. As
very few people keep testing the old operating systems documented in
MACHINES, it's impossible to guarantee that new versions of Wget will
compile or work on them.
One way to fix this would be to accompany the OS entries in
Marius Andreiana [EMAIL PROTECTED] writes:
wget -r --no-parent http://www.desktoplinux.com/files/article003/
stops at the first slides (4-6) instead of getting the whole presentation.
Which version of Wget are you using? Can you send the debug output
from the Wget run?
Marius Andreiana [EMAIL PROTECTED] writes:
On Du, 2003-11-16 at 23:08, Hrvoje Niksic wrote:
Which version of Wget are you using? Can you send the debug output
from the Wget run?
GNU Wget 1.8.2 (Fedora Core 1).
Attached is the debugged output.
As you see, it won't get at slide 7.
Maximum
DervishD [EMAIL PROTECTED] writes:
One way to fix this would be to accompany the OS entries in MACHINES
with the version of Wget that they apply to. But the problem is that,
as each version is released, you will only see which machines the
*previous* versions worked on.
But you have
Sergey Vasilevsky [EMAIL PROTECTED] writes:
Wget 1.9.1
.wgetrc:
reject = *.[Ee][xX][Ee]*
follow_ftp = off
Follow ftp is off by default, so you shouldn't need to set it
explicitly.
What might have happened in your case is that a http URL *redirected*
to ftp, which was followed as a
Come to think of it, I've had need for this before; the switch makes
at least as much sense as `--bind-address', which I've never needed
myself.
Maybe `--connect-address' would be a good name for the option? It
would nicely parallel `--bind-address'.
Are there any takers to implement it?
Herold Heiko [EMAIL PROTECTED] writes:
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
This is a binary compiled and run on windows nt 4, which
doesn't support
IPV6, so the -4 should probably be a no-op ?
Or not work at all.
I was thinking (rather late, I see you have changed other
Hrvoje Niksic [EMAIL PROTECTED] writes:
* If the machine doesn't support AI_ADDRCONFIG and Wget sets -4
behind your back, then you shouldn't be allowed to specify -6
because it clearly contradicts the automagically set -4.
(But even then you can still use `--no-inet4-only -6
Herold Heiko [EMAIL PROTECTED] writes:
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Maybe `--connect-address' would be a good name for the option? It
would nicely parallel `--bind-address'.
I was wondering if it should be possible to pass more than one name
to address change
Maciej W. Rozycki [EMAIL PROTECTED] writes:
Hmm, couldn't --header Host: hostname work? I think it could,
but now wget appends it instead of replacing its own generated
one...
It's not very hard to fix `--header' to replace Wget-generated
values.
Is there consensus that this is a good
Dan Jacobson [EMAIL PROTECTED] writes:
$ wget --spider BAD_URL GOOD_URL; echo $?
0
$ wget --spider GOOD_URL BAD_URL; echo $?
1
I say they both should be 1.
If anything bad happens, return 1 or some other non-zero value.
By BAD, I mean a producer of e.g.,
ERROR 503: Service Unavailable.
[EMAIL PROTECTED] writes:
Upgrading wget to 1.8.1 (my server admin won't put a newer version
on)
??? Why 1.8.1? 1.8.2 fixed many *bugs* that were present in 1.8.1.
This sounds like the sticking-to-Debian-stable brain damage.
If I were you, I would ask the admin for permission to compile the
I'm considering the use of lrand48/drand48 (where available) to
generate random integer and floats. The code Wget uses now is
portable, but very primitive, especially for generating floats. But
the Linux man page says this about *rand48:
NOTES
These functions are declared obsolete
Manfred Schwarb [EMAIL PROTECTED] writes:
I just installed wget 1.9.1, works fine. But on my machine,
translations are broken somehow, all special characters are
scrambled. With wget 1.9 this didn't happen.
There was no change in handling translation from 1.9 to 1.9.1, except
perhaps for a
Dan Jacobson [EMAIL PROTECTED] writes:
I sure hope that when one sees
Connecting to jidanni.org[216.46.192.85]:80... connected.
that there is no interference along the way, that that IP is really
where we are going, to wget's best ability.
I can guarantee that much -- the entire point of
Dan Jacobson [EMAIL PROTECTED] writes:
And stop making me have to confirm each and every mail to this list.
Hrvoje Currently the only way to avoid confirmations is to
Hrvoje subscribe to the list. I'll try to contact the list owners
Hrvoje to see if the mechanism can be improved.
Herold Heiko [EMAIL PROTECTED] writes:
Attached a little patch needed for current cvs in order to compile
on windows nt 4 (any system without IPV6 really).
Thanks. Note that the function isn't even called when IPv6 is
unavailable, so you can feel free to wrap the entire function in
#ifdef
Gisle Vanem [EMAIL PROTECTED] writes:
Due to lack of inet_ntop() on Windows, I used this instead:
struct sockaddr_in6 addr6;
addr6.sin6_family = AF_INET6;
memcpy (&addr6.sin6_addr, address, sizeof (addr6.sin6_addr));
if (getnameinfo ((const struct sockaddr *) &addr6, sizeof (addr6),
Gisle Vanem [EMAIL PROTECTED] writes:
but Wget doesn't check return value of inet_ntop(). Hint hint.
I wasn't aware that inet_ntop could really fail. Why did
getaddrinfo return the address if I can't print it?
getaddrinfo() on Win-XP seems to be a thin wrapper over the DNS
client which
Does anyone know whether the MSG_PEEK flag can be relied upon? I'd
like to use peeking to get rid of the ad hoc rbuf layer used in Wget
since time immemorial.
Peeking would require additional work under SSL, but I think I know
how to make it work. But I'm more worried about TCP/IP stacks that
Mauro Tortonesi [EMAIL PROTECTED] writes:
hi to hrvoje and all, i am still alive ;-) and i am finally catching
up with the changes you've done at wget ipv6 code. from what i've
seen so far, it seems that you've done a great job (especially on
lookup_host and resolve_bind_address and on the
Alain Bench [EMAIL PROTECTED] writes:
| /* Return if we have no intention of further downloading. */
| if (!(*dt & RETROKF) || (*dt & HEAD_ONLY))
|   {
|     /* In case the caller cares to look... */
|     hs->len = 0L;
|     hs->res = 0;
|     FREE_MAYBE (type);
|     FREE_MAYBE
Thanks for the report, this is most likely caused by my recent changes
that eliminate rbuf* from the code. (Unfortunately, the FTP code kept
some state in struct rbuf, and my changes might have broken things.)
To be absolutely sure, see if it works under 1.9.1 or under CVS from
one week ago.
Tony Lewis [EMAIL PROTECTED] writes:
A patch was recently submitted for this issue. I don't know if
anything has made it into the CVS or not. Hrvoje didn't like its
dependence on long long so it might not have.
The patch uses `long long' without bothering to check whether the
compiler accepts
Thanks for the report. This is a known bug, that is unfortunately
also present in 1.9.x. I hope to fix it for the next release.
Gisle Vanem [EMAIL PROTECTED] writes:
[...]
==> SYST ... done.  ==> PWD ... done.   ! is '/' here
==> TYPE I ... done.  ==> CWD not required.
==> PORT ... done.  ==> RETR BAN-SHIM.ZIP ...
No such file `BAN-SHIM.ZIP'.
...
Interestingly, I can't repeat this. Still, to be on the safe side,
Adam Klobukowski [EMAIL PROTECTED] writes:
Adam Klobukowski [EMAIL PROTECTED] writes:
If wget is used with --input-file option, it gets directory
listing for each file specified in input file (if ftp protocol)
before downloading each file,
This is not specific to --input-file, it
By the way, can you please clarify the intention behind AI_V4MAPPED
and AI_ALL, which configure tests for, but nothing uses?
Tony Lewis [EMAIL PROTECTED] writes:
antonio taylor wrote:
http://fisrtname lastname:[EMAIL PROTECTED]
Have you tried http://fisrtname%20lastname:[EMAIL PROTECTED] ?
Or simply quotes, as in wget http://firstname lastname:[EMAIL PROTECTED].
Does someone have access to a BEOS machine with a compiler? I'd like
to verify whether the current CVS works on BEOS, i.e. whether it's
still true that BEOS doesn't support MSG_PEEK.
Speaking of testing, please be sure to test the latest CVS on Windows
as well, where MSG_PEEK is said to be
Daniel Stenberg [EMAIL PROTECTED] writes:
Out of curiosity, why are you introducing this peeking? I mean,
what's the gain?
Simplifying the code. Getting rid of the unfinished and undocumented
rbuf abstraction layer. Buffering is unnecessary when downloading
the body, and is mostly
Peter Kohts [EMAIL PROTECTED] writes:
4) When I'm doing straightforward wget -m -nH http://www.gnu.org
everything is excellent, except the redirections: the files which we
get because of the redirections overwrite any currently existing
files with the same filenames.
I see your point.
anything, and if it makes wget more
useful, i can think of no reason this capability shouldn't be added.
Agreed. This patch should fix your case. It applies to the latest
CVS sources, but it can be easily retrofitted to earlier versions as
well.
2003-11-26 Hrvoje Niksic [EMAIL PROTECTED
.
Here's a fix, against the current CVS. (I applied a similar fix to
the 1.9 branch as well.) Thanks for the report.
2003-11-27 Hrvoje Niksic [EMAIL PROTECTED]
* connect.c (bind_local): Rename sa_len to addrlen because IRIX
headers define sa_len as a macro.
Index: src/connect.c
Karsten Hopp [EMAIL PROTECTED] writes:
I'm working on it ;-) I'd like to put a CVS version into rawhide to
get more feedback about the current status but that makes only sense
when the next official version will be released before 'Fedora Core
2' as I won't put a beta version into a Fedora
Alain Bench [EMAIL PROTECTED] writes:
Wget 1.9.1: I sometimes seem to be stuck in an overly long (like
more than 1 hour) timeout on closing connection, when server went
down or modem hangup during a read or just before close. I use
Wget's default timeout (0, 0, 900), or sometimes --timeout=30
John Burwell [EMAIL PROTECTED] writes:
I'm having trouble with wget burrowing infinitely into circular
symlinks when I don't think it's supposed to be following these
symlinks at all.
Indeed it isn't. What does the directory listing look like?
John Burwell [EMAIL PROTECTED] writes:
lrwxrwxrwx 1 40 Nov 13 16:27 ad@ -> /Documents/WebServer/ads
[...]
drwxrwxrwx 1 5610 Nov 13 16:27 beta@
You can see 'ad' pointing to 'ads' in this case, and wget behaves.
However, you can see where beta is marked as a
Stan Behrens [EMAIL PROTECTED] writes:
i mean this is a bug:
--- code --
# wget -dr ftp://hladdons:[EMAIL PROTECTED]/metamod
DEBUG output created by Wget 1.9.1 on linux-gnu.
Using `80.239.144.146/.listing' as listing tmp file.
--14:22:12-- ftp://hladdons:[EMAIL
Erez Doron [EMAIL PROTECTED] writes:
i am trying to mirror a site. wget keeps escaping the filenames (
e.g. i get '%20' instead of a space character)
Which version of Wget are you using?
Noèl Köthe [EMAIL PROTECTED] writes:
I built 1.9.1 with these options:
CFLAGS=$$CFLAGS -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -g -Wall \
NB: don't use _LARGEFILE_SOURCE with Wget, it won't work correctly.
./configure \
--prefix=/usr \
Erez Doron [EMAIL PROTECTED] writes:
I am using the latest RPM i found which was 1.8.2
Yes, 1.8.2 still aggressively quoted file names derived from URLs.
I'll upgrade at least to 1.9.1
It shouldn't be hard to compile Wget from source. In most cases, it
amounts to unpacking the source and
Erez Doron [EMAIL PROTECTED] writes:
I have downloaded and compiled 1.9.1 (which seems to be the latest).
Now I get space characters instead of '%20', but I still get other
non-English chars as escaped chars.
I tried both '--restrict-file-names=unix' and
'--restrict-file-names=windows'
[...]
Erez Doron [EMAIL PROTECTED] writes:
Use `--restrict-file-names=nocontrol'. Have you looked at the
documentation?
I did 'wget --restrict-file-names=help' and got the options 'unix' and
'windows' only.
`help' is not a valid keyword; what you got was just an error
message, not full
[EMAIL PROTECTED] [EMAIL PROTECTED] writes:
New info: I found and tried a different version (1.9.1) and it
worked correctly. So it looks like the problem is just with the dev
version (1.9+cvs-dev-200311280902) I was using.
Ah, I see. Heiko's page doesn't make it obvious which version is the
It's hard to tell without more information, but I don't think the
problem is Solaris or sparc specific. Later versions support the
norobots specs that the earlier ones didn't, which is sometimes a
problem. A good start would be to try `-e robots=off' and see if it
helps.
Michael Helm [EMAIL PROTECTED] writes:
What should I do to debug this, or what additional info can I
provide to help figure out what's gone wrong with the solaris
version?
Sending the `-d' output of both would probably be very helpful.
Dan Jacobson [EMAIL PROTECTED] writes:
excuse me, you said will not download the file. I just wanted to
know how big it was, not get it. GNU Wget 1.8.2
This should work better with the latest version.
Herold Heiko [EMAIL PROTECTED] writes:
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
..
released one that we want (most) users to download. Heiko, would you
consider reordering the table so that the 1.9.1 release row comes
first, followed by development version, (optionally) followed
Herold Heiko [EMAIL PROTECTED] writes:
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Hmm. Then how about separating the development snapshots, and older
entries, to a separate page? It seems simpler for there to be only
Yes, ok, I'll do something like that. Maybe even one single
In this case, the problem is that Wget 1.7.1 considered www.es.net
and es.net to be the same host, while Wget 1.9.1 doesn't.
(Conflation of hosts that resolve to the same address caused problems
with virtual hosts.)
A workaround is to either download from www.es.net to begin with, or
to allow
You're not making a mistake, recursive download over FTP proxies is
currently broken.
Currently there is no way around it. In fact, the problem might get
worse when we implement the support for the `Content-Disposition'
header.
My plan for a future version was to change the behavior of `-P' so
that it works more like Mozilla's save entire web page. Then you
could do something
Noèl Köthe [EMAIL PROTECTED] writes:
I configure passive-ftp as default in the Debian packages, too. Is
it possible to make it the default for wget or is there a reason
against it, which I dont see?
IIRC passive FTP is not documented by RFC 959, so it wasn't the
default. I don't have a
Thierry Vignaud [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
i'm maintaining wget in mandrake linux distribution.
here're some patches we apply on top of wget:
[...]
Thanks for sharing the patches. The file names imply that they
apply to different versions
I think this is a very good idea. People often ask for matching based
on query strings and such, which is currently not possible. This way
it would be fairly simple to match based on various parts of the URL,
such as host name, path, query string, etc.
Note, though, that your example won't do
wei ye [EMAIL PROTECTED] writes:
I'm thinking of reusing your code in my C++ application,
Wget is a C program that has never supported compilation under C++.
If it were a library, it might still make sense to allow its header
files to be included from C++. But since it's not, its C API is not
Grant Giddens [EMAIL PROTECTED] writes:
I am new to wget and running the windows version
1.9.1 that I downloaded from:
http://xoomer.virgilio.it/hherold/
I am trying to grab the Best Buy site (www.bestbuy.com) and I think
I'm having a problem with cookies. I have tried all sorts of
[ Please keep the bug address in the Cc, unless you explicitly wish to
exclude it. ]
Grant Giddens [EMAIL PROTECTED] writes:
The first thing I tried was:
wget -vr -o bestbuy.log http://www.bestbuy.com
That grabbed a few files, but the main index page basically said
that the site couldn't
Herold Heiko [EMAIL PROTECTED] writes:
Content-Length: Content-Length: 35
[...]
The line
Content-Length: Content-Length: 35
certainly seems strange.
Yup, that's where the bug is. This should fix it:
2003-12-16 Hrvoje Niksic [EMAIL PROTECTED]
* http.c (gethttp): Fix generation
What options are you using to download the file? As far as I'm aware,
Wget will not touch the contents of the files it downloads by default.
Marco Correia [EMAIL PROTECTED] writes:
On Wednesday 17 December 2003 14:59, you wrote:
What options are you using to download the file? As far as I'm
aware, Wget will not touch the contents of the files it downloads
by default.
I've tried with no options at all and with several options,
Marco Correia [EMAIL PROTECTED] writes:
no luck, doing this
wget -U Mozilla/5.0 --referer=http://www.cotonete.iol.pt/
http://www.cotonete.iol.pt/listen/radio_playlist.asp?audio_sub_type_id=31
also doesn't work, and my browser is identifying itself as
Mozilla/5.0
Then I must assume that the
Gisle Vanem [EMAIL PROTECTED] writes:
It could be a useful feature if we attach to the console of calling
process (the shell in most cases) at startup. Then when ^Break is
pressed (or '-b' specified), we free that console and continue
running in the background. That way we will get the shell
Kazu Yamamoto [EMAIL PROTECTED] writes:
Thank you for supporting IPv6 in wget v 1.9.1. Unfortunately, wget v
1.9.1 does not work well, at least, on NetBSD.
NetBSD does not allow the use of IPv4-mapped IPv6 addresses
[...]
When wget is compiled with
Thomas Lussnig [EMAIL PROTECTED] writes:
your page is not really very useful. First, yes, it describes how to
code protocol-independently, but the big problem is that it is ONLY
valid for platforms which are fully compatible with the newest
functions.
In the first change of wget I also had reduced the
Zhehong Ying [EMAIL PROTECTED] writes:
Then I moved the wget directory (sub-directory bin etc info man
share) into a staging box (AIX 5.1.0.0). There is no gcc installed
in the staging box because we can not install a C compiler in this
box for security.
Then by running bin/wget as a
[EMAIL PROTECTED] writes:
I've noticed that spaces in directories/files are automatically converted
to '@' character. Is there any way to turn this option off?
e.g. template directory = [EMAIL PROTECTED]
I don't see this behavior. Which version of Wget are you running, and
under what
Kairos [EMAIL PROTECTED] writes:
$ cat wget.exe.stackdump
[...]
What were you doing with Wget when it crashed? Which version of Wget
are you running? Was it compiled for Cygwin or natively for Windows?
Juhana Sadeharju [EMAIL PROTECTED] writes:
I was not able to download the following URLs. I will read
possible replies from the mail archives as I'm not subscribed.
[By the way, Mailman list server with webpage interface makes
temporary subscriptions very easy. Digest mode can be turned
[EMAIL PROTECTED] writes:
I'm trying to download the documentation for the Zope application
server, at http://zope.org/Documentation/Books/ZopeBook/2_6Edition/,
and I'm having problems getting the images. For example, when I run
the following command:
/usr/bin/wget -kp \
Juergen Schliessmann [EMAIL PROTECTED] writes:
Wget 1.9.1 fails if a http-proxy and the secure https protocol is
used.
Packet sniffing shows that Wget does not initiate a ssl connection
to the proxy but instead connects directly to the target host
(obvious by a DNS-query) then after that
[EMAIL PROTECTED] writes:
This mail is related to my prior post...
here is a clipboard copy of an wget-1.9.1 output:
ftp.nai.com/pub/antivirus/datfiles/4.x/index.html: Invalid URL <A
HREF=42984299.upd><IMG BORDER=0
SRC=http://ns1.boschrexroth.de:3128/squid-internal-static/icons/anthony-unk
Margaret Adam [EMAIL PROTECTED] writes:
Wget doesn't seem to mirror data-driven sites that well. I've found
that it follows the first level navigation on data driven sites but
doesn't give me 'flat-page versions' of deeper level pages.
Can you please explain what you mean by data-driven
I'm not sure what might be causing this. Do you see the same behavior
with 1.9.1?
Examining the `.listing' file in question might prove useful.
William McKee [EMAIL PROTECTED] writes:
On Fri, Jan 16, 2004 at 10:49:08PM +0100, Hrvoje Niksic wrote:
I'm not sure what might be causing this. Do you see the same behavior
with 1.9.1?
Debian is shipping with 1.8.1 right now. I'll try installing 1.9.1.
Examining the `.listing' file
You're right, it is a race condition. You could avoid it by
specifying explicit log file names which somehow depend on Wget's PID.
Avoiding race conditions in the general case is non-trivial (there is
O_EXCL, but it doesn't always work under NFS). Maybe a better
solution would be to change the
Hans Werner Strube [EMAIL PROTECTED] writes:
There is a name clash in src/connect.c for IRIX 6.2: /usr/include/sys/socket.h
contains a #define sa_len ...
Thanks for the report; this is already fixed in CVS (both in the main
trunk and in the 1.9 branch).
Simons, Rick [EMAIL PROTECTED] writes:
Greetings all.
I've posted in the past, but never really have gotten connectivity to a
https server I support using the wget application. I've looked in the
manual, on the website and searched the Internet but am not getting very
far.
wget -V
Simons, Rick [EMAIL PROTECTED] writes:
I got wget compiled with ssl support now, and have a followup question ...
I'm getting the local file created but populated with a server response, not
the actual contents of the remote file. See example:
wget -d -S https://server/testfile
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 21 Jan 2004, Hrvoje Niksic wrote:
Thanks for the report. Wget still doesn't handle large files (it doesn't
use off_t yet). I plan to fix this for the next release.
Speaking of this subject, I'm currently doing just about the same
=?Windows-1250?B?VuFjbGF2IEtycGVj?= [EMAIL PROTECTED] writes:
I'm having trouble using wget on [EMAIL PROTECTED]
While trying to do FTP connection wget doesn't understand
wildcards, for example:
$ wget ftp://ftp.fit.vutbr.cz/pub/XFree86/4.3.0/*
Warning: wildcards not supported in HTTP.
don [EMAIL PROTECTED] writes:
I did not specify the passive option, yet it appears to have been used
anyway. Here's a short transcript:
[EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.zip
--21:05:21-- ftp://musicm.mcgill.ca/sim390/sim390dm.zip
Thanks for persisting with this. It doesn't look like a mishandled
redirection -- the response headers exist and they don't request a
redirection or any kind of refresh.
access_log shows that 30 bytes have been transmitted. As it happens,
the string Virtual user ricks logged in.\n is exactly
Daniel Stenberg [EMAIL PROTECTED] writes:
On Thu, 22 Jan 2004, Simons, Rick wrote:
curl https://server/file -uuser:pass
Virtual user user logged in.
[...]
In my eyes, this looks like the correct output from curl. Wasn't it?
I think that Rick expects to see a complete HTML page rather than
Peter Mikeska [EMAIL PROTECTED] writes:
Hi,
I'm trying to get everything from
wget -r -l 0 ftp://19.24.24.24/some/datase/C#Tool/
Replace the # with %23 and it should work.
Post, Mark K [EMAIL PROTECTED] writes:
It's more likely your system/shell that is doing it, if you're using
Linux or UNIX.
The shell will typically only consider pound signs comments if they
are preceded with whitespace (and appear outside quotes, etc.) In
this case, the culprit is Wget's URL
Thanks for the patch. A similar fix is already in CVS.
Václav Krpec [EMAIL PROTECTED] writes:
$ wget -e ftp_proxy=ftp://192.168.35.1:3128/ ftp://ftp.fit.vutbr.cz/pub/XFree86/4.3.0/fixes/*
Error in proxy URL ftp://192.168.35.1:3128/: Must be HTTP.
so, it seems that the proxy is not really ftp proxy, but http that
deals with ftp requests as
Jesper Louis Andersen [EMAIL PROTECTED] writes:
There are a couple of problems with this approach. First, we have
from the NetBSD man-page of usleep(3):
The microseconds argument must be less than 1,000,000. This renders
the sleep impossible for other values of opt.wait than 1. NetBSD
Hrvoje Niksic [EMAIL PROTECTED] writes:
The latest source (available in CVS) has a better sleeping function
that uses nanosleep where available, and that handles usleep's
wraparound for long sleeps. But it still can call usleep with values
larger than 1,000,000. I've attached a patch
Hrvoje Niksic [EMAIL PROTECTED] writes:
May I suggest using sleep(3) instead. It is used in the code in other
places and has the semantics you want.
sleep(3) cannot sleep for less than a second. I like the idea of
being able to specify `--wait 0.5' for Wget to wait for half a
second