Robert Lupton the Good [EMAIL PROTECTED] writes:
This appears to be an over-enthusiastic interpretation of %26 == '&'
in wget.
I submit a URL (which is in fact a SQL query) with some embedded &s
(logical ORs). These are encoded as %26, and the URL works just fine
with netscape and lynx. It
Jamie Zawinski [EMAIL PROTECTED] writes:
Please also set an exit alarm around your calls to connect() based
on the -T option.
This is requested frequently. I'll include it in the next release.
The reason why it's not already there is simply that I was lucky never
to be bitten by that
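For what it's worth, a minimal sketch of such a signal-based timeout,
assuming SIGALRM interrupts a blocking connect() with EINTR (this is
illustrative only, not Wget's actual code):

#include <errno.h>
#include <signal.h>
#include <unistd.h>
#include <sys/socket.h>

static void
alarm_handler (int sig)
{
  (void) sig;                    /* nothing to do; just interrupt */
}

int
connect_with_alarm (int fd, const struct sockaddr *sa, socklen_t len,
                    unsigned timeout)
{
  struct sigaction sa_new, sa_old;
  int ret;

  sa_new.sa_handler = alarm_handler;
  sigemptyset (&sa_new.sa_mask);
  sa_new.sa_flags = 0;           /* no SA_RESTART: let connect fail */
  sigaction (SIGALRM, &sa_new, &sa_old);

  alarm (timeout);
  ret = connect (fd, sa, len);
  if (ret < 0 && errno == EINTR)
    errno = ETIMEDOUT;           /* distinguish the timeout case */
  alarm (0);
  sigaction (SIGALRM, &sa_old, NULL);
  return ret;
}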
Noel Koethe [EMAIL PROTECTED] writes:
wget 1.8.1 is shipped with the files in doc/
wget.info
wget.info-1
wget.info-2
wget.info-3
wget.info-4
Yes.
As Ian said, this is so that people without `makeinfo' installed can
still read the documentation. (In fact, Info pages can even be read
Noel Koethe [EMAIL PROTECTED] writes:
OK. No problem for me. I just wrote this because the more
interesting doc, the manpage, is not shipped with the source.
I don't know how the man page is more interesting since it's a mere
subset of the Info documentation. All the GNU programs are shipped
is not an https* url.
Thanks for the report. The problem you described should be fixed by
this patch:
2002-02-19 Hrvoje Niksic [EMAIL PROTECTED]
* url.c (url_parse): Don't treat '?' as query string separator
when parsing FTP URLs.
Index: src/url.c
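The diff itself is cut off above, but the rule the ChangeLog entry
describes can be illustrated with a hedged sketch (the helper name is
hypothetical, not the actual url.c code):

#include <stddef.h>
#include <string.h>

/* In HTTP URLs '?' starts the query string; in FTP URLs it is an
   ordinary file-name character, so we must not split on it. */
const char *
find_query_start (const char *path, int scheme_is_ftp)
{
  if (scheme_is_ftp)
    return NULL;                 /* FTP: '?' belongs to the file name */
  return strchr (path, '?');     /* HTTP: '?' separates the query */
}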
Peteris Krumins [EMAIL PROTECTED] writes:
GNU Wget 1.8
wget: progress.c:673: create_image: Assertion `p - bp->buffer <= bp->width' failed.
This problem has been fixed in Wget 1.8.1. Please upgrade.
John A Ogren [EMAIL PROTECTED] writes:
I'd like to use 'wget' to mirror a remote ftp directory, but it
requires a username and password to access the server. I don't see
any mention of command-line options for supplying this information
for an FTP server, only for an HTTP server. Is this a
It's a known problem. Timestamping doesn't work with FTP URLs over
proxy because the HEAD request is not honored by the proxy for FTP.
Note that your Wget is very old and you should upgrade -- though not
because of this, since this particular problem remains.
Currently this is a known problem. Wget doesn't span hosts or
schemes with -p, although it probably should.
It's a known issue. Wget's wildcard magic only works when using the
FTP protocol. HTTP is used for communication with proxies, so
wildcarding doesn't work. But you should be able to simulate it
using:
wget -nd -rl1 -A foo*bar ftp://server/dir/
It's not elegant, but it works for me.
Again, thanks for taking the time to research this. Next time
someone asks this question, we'll forward him this email.
Thanks for the report, Paul. This patch, which I'm about to apply to
CVS, should fix it.
2002-02-19 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (retrieve_tree): Handle the case when start_url doesn't
parse.
Index: src/recur.c
Thanks for looking into this. I had written a slightly different fix
before I saw the one from you.
Your patch was *almost* correct -- one minor detail is that you don't
take care to free QUEUE and BLACKLIST before exiting, therefore
technically creating a (small) memory leak.
My patch avoids
Samuel Hargis [EMAIL PROTECTED] writes:
I've read through the documentation and it says that (if a name
shows up more than once, the filenames will get extensions '.n')
Would that be like: a duplicate of index.html would be named
index.n.html or index.html.n?
The latter.
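For example, with default options, repeated retrievals of the same
name would be saved as:
index.html
index.html.1
index.html.2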
Also, how does it
Ian Abbott [EMAIL PROTECTED] writes:
I'd suggest either leaving them alone or adopting the IEC standards
that Henrik referred to, i.e. KiB = kibibyte = 2^10 bytes
Ugh! Never!
Let them keep their kibibytes to themselves. :-)
Michael Dodwell [EMAIL PROTECTED] writes:
Just noticed that wget 1.7 errors with the subject line if you pass
it a protocol, port and username but not a password.
Please upgrade to Wget 1.8.1. I believe this problem has gone away.
' function.
Thanks for the report; this patch should fix the bug:
2002-02-01 Hrvoje Niksic [EMAIL PROTECTED]
* html-url.c (tag_handle_meta): Don't crash on meta
http-equiv=refresh where content is missing.
Index: src/html-url.c
Andre Majorel [EMAIL PROTECTED] writes:
I respectfully disagree. If we can spend the time to read and
answer the poster's question, the poster can spend five minutes
to subscribe/unsubscribe.
For reference, see the netiquette item on posting to newsgroups
and asking for replies by email.
Brent Morgan [EMAIL PROTECTED] writes:
What's CVS, and what is the significance of this version?
CVS stands for Concurrent Versions System, and is the version
control system where the master sources for Wget are kept. I would
not advise the download of the CVS version because it is likely to
be
Jens Röder [EMAIL PROTECTED] writes:
for wget I would suggest a switch that allows sending the output
directly to stdout. It would be easier to use it in pipes.
Are you talking about the log output or the text of the documents Wget
downloads?
* Log output goes to stderr by default, and can
Way, Trevor [EMAIL PROTECTED] writes:
Using the -T, -t and -w parameters but cannot get it to time out in
less than 3 minutes.
/usr/bin/wget --output-document=/tmp/performance.html -T5 --wait=2
--waitretry=2 --tries=2
Should this time out after 5 secs, retry twice, waiting 2 secs between
Thanos Siaperas [EMAIL PROTECTED] writes:
Shouldn't wget first get the .listing, find the files needed by the
wildcard, and then request the files from the proxy? This looks like
a bug.
No, when using a proxy, you get HTTP behavior. So to do that, you
have to do it the HTTP way:
wget -rl1
Jan Hnila [EMAIL PROTECTED] writes:
Hello,
please try this (it should work):
wget -r -l2 -A=htm,html,phtml http://www.tunedport.com
(the change is the equals sign. The same for -R. If you take a look
at the output of wget --help, you may notice the equality signs
there (in the longer
Sacha Mallais [EMAIL PROTECTED] writes:
Unable to establish SSL connection.
--
Also note that it does _not_ appear to be retrying the connection. I
have explicitly set --tries=5, and with a non-ssl connection, the
above stuff appears 5 times when it cannot
wget Admin [EMAIL PROTECTED] writes:
I am using wget version 1.5.3 under Solaris and 1.5.2 under IRIX.
Please upgrade. This problem is fixed in Wget 1.8.1.
Do you have any ideas to solve the problem? (Possibly without
having to recompile wget since I am not sysadmin.)
You do not have to
Thomas Lussnig [EMAIL PROTECTED] writes:
I'm building an IPv6 patch for wget, and I'm worried about the point
that I have to add 12 in the sockaddr.
Perhaps it would help if you created a minimal test case for the
problem you're witnessing. For example:
#include <stdio.h>
#include
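For instance, a self-contained check of the structure sizes -- my
guess at where the "12" comes from, since on typical systems
sizeof(struct sockaddr_in6) is 12 bytes larger than the generic
sizeof(struct sockaddr):

#include <stdio.h>
#include <sys/socket.h>
#include <netinet/in.h>

int
main (void)
{
  printf ("sockaddr:     %u\n", (unsigned) sizeof (struct sockaddr));
  printf ("sockaddr_in:  %u\n", (unsigned) sizeof (struct sockaddr_in));
  printf ("sockaddr_in6: %u\n", (unsigned) sizeof (struct sockaddr_in6));
  return 0;
}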
Lauri Mägi [EMAIL PROTECTED] writes:
I'm using WGet 1.8.1 for downloading files over FTP protocol.
when a filename contains spaces, the url is like ftp://server.name/file%20name
and it saves the files also with %20 in the file names.
Previously I was using WGet 1.7 and it saved spaces as they should be.
Michael Jennings [EMAIL PROTECTED] writes:
The issue centers on the documentation. Philosophically, in my
opinion, a program should be written so the documentation is easy to
read. In this case a hidden stripping of useless characters means
that there is one less thing to explain in the
Alexey Aphanasyev [EMAIL PROTECTED] writes:
It works for me. I hope the patch is included in the next release.
Thanks for the confirmation. The patch is already in CVS.
Herold Heiko [EMAIL PROTECTED] writes:
My personal idea is:
As a matter of fact, no *windows* text editor I know of, even the
supplied Windows ones (Notepad, WordPad), will AFAIK add the ^Z at the
end of file.txt. Wget is a *windows* program (although running in
console mode), not a *Dos*
Ian Abbott [EMAIL PROTECTED] writes:
Most (all?) of the escape sequences within URLs should be decoded
before transforming to an external file-name.
All, I'd say. Even now u->file and u->dir are not URL-encoded. They
get reencoded later, by url_filename.
The point between the two is my
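To make the decode-before-transform point concrete, here is a hedged
sketch of an in-place %XX decoder (illustrative only, not the actual
url.c routine):

#include <ctype.h>

static int
hexval (int c)
{
  return isdigit (c) ? c - '0' : tolower (c) - 'a' + 10;
}

/* Decode %XX escapes in S, in place. */
void
decode_escapes (char *s)
{
  char *w = s;
  for (; *s; s++, w++)
    {
      if (*s == '%' && isxdigit ((unsigned char) s[1])
          && isxdigit ((unsigned char) s[2]))
        {
          *w = (char) (hexval ((unsigned char) s[1]) * 16
                       + hexval ((unsigned char) s[2]));
          s += 2;                /* skip the two hex digits */
        }
      else
        *w = *s;
    }
  *w = '\0';
}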
Ian Abbott [EMAIL PROTECTED] writes:
-          asctime (localtime ((time_t *)&cookie->expiry_time)),
+          (cookie->expiry_time != ~0UL ?
+           asctime (localtime ((time_t *)&cookie->expiry_time))
+           : "UNKNOWN"),
           cookie->attr, cookie->value));
}
Yes, except for any other
Jeff Bailey [EMAIL PROTECTED] writes:
wget 1.8 fails to link on i686-pc-sco3.2v5.0.6
Does the compiler on that machine really not have alloca()? I'm
usually wary of attempts to compile `alloca.c' because they usually
point out a mistake in the configuration process.
Daniel Stenberg [EMAIL PROTECTED] writes:
`struct addrinfo' contains a `struct sockaddr', which carries the
necessary scoping information (I think). The question at the time
was whether I could extract only the address(es) and ignore
everything else, as it was possible with IPv4. Itojun
[EMAIL PROTECTED] writes:
Funny you mention this. When I first heard about -p (1.7?) I
thought exactly that it would default to [spanning hosts to retrieve
page requisites]. I think it would be really useful if the page
requisites could be wherever they want. I mean, -p is already
Thomas Lussnig [EMAIL PROTECTED] writes:
Now the socket part should work fine.
inet_pton and gethostbyname2 only get used if IPV6 is defined
Please don't use gethostbyname2. It's apparently a GNU extension, and
I don't think it will work anywhere except on Linux.
Now it leaves
Thomas Lussnig [EMAIL PROTECTED] writes:
1. Without IPv6, the new syscalls (gethostbyname2, inet_ntop,
inet_pton) are no longer used.
2. It can downgrade to IPv4 at runtime.
3. In IPv6 mode it can handle IPv4 addresses.
4. Checked with the following input: www.ix.de , 217.110.115.160 ,
Ian Abbott [EMAIL PROTECTED] writes:
I came across this extract from a table on a website:
<td ALIGN=CENTER VALIGN=CENTER WIDTH=120 HEIGHT=120><a
href=66B27885.htm msover1('Pic1','thumbnails/MO66B27885.jpg');
onMouseOut=msout1('Pic1','thumbnails/66B27885.jpg');><img
SRC=thumbnails/66B27885.jpg>
[EMAIL PROTECTED] writes:
That sounds like they wanted onMouseOver=msover1(...)
Which Wget would, by the way, have handled perfectly.
Ian Abbott [EMAIL PROTECTED] writes:
Here is a patch to deal with the -P C:\temp (and similar) problems
on Windows.
This looks good. I'll apply it as soon as CVS becomes operational
again.
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 16 Jan 2002, Hrvoje Niksic wrote:
The so-called scope in IPv6 is embedded in the address, so you can't use
IPv6 addresses without getting the scope too.
Are you sure? Here is what itojun said in
[EMAIL PROTECTED]:
due
[EMAIL PROTECTED] writes:
Until there's an ESP package that can guess what the author
intended, I doubt wget has any choice but to ignore the defective
tag.
Seriously, I think you guys are too strict.
Similar discussions have come up numerous times.
If the HTML code says
<a href="URL"
to be
terminated at empty lines.
Thanks for the report. This patch should fix the problems; please let
me know if it works for you.
2002-01-17 Hrvoje Niksic [EMAIL PROTECTED]
* netrc.c (parse_netrc): Skip leading whitespace before testing
whether the line is empty. Empty lines still
Hrvoje Niksic [EMAIL PROTECTED] writes:
Ian Abbott [EMAIL PROTECTED] writes:
Here is a patch to deal with the -P C:\temp (and similar) problems
on Windows.
This looks good. I'll apply it as soon as CVS becomes operational
again.
Applied now.
Tay Ngak San [EMAIL PROTECTED] writes:
I have downloaded your source code for wget and tried to make it but
failed due to a va_list parameter conflict in stdarg.h and stdio.h.
Please advise.
What OS and compiler are you using to compile Wget?
Markus Buchhorn [EMAIL PROTECTED] writes:
Reading back, that was itojun's proposal, and I suspect probably a
good choice, even if it seems less clean. Itojun is one of the leading
lights in IPv6 development, along with the whole WIDE group in Japan,
and heavily involved in the v6 stacks for
Daniel Stenberg [EMAIL PROTECTED] writes:
I'd suggest that you instead pass around a 'struct hostent *' on
IPv4 only platforms
Why? The rest of the code never needs anything from `struct hostent'
except the list of addresses, and this is what my code extracts. By
extension, the idea was for
Daniel Stenberg [EMAIL PROTECTED] writes:
On Tue, 15 Jan 2002, Hrvoje Niksic wrote:
I'd suggest that you instead pass around a 'struct hostent *' on
IPv4 only platforms
Why? The rest of the code never needs anything from `struct hostent'
except the list of addresses, and this is what
Daniel Stenberg [EMAIL PROTECTED] writes:
On Tue, 15 Jan 2002, Hrvoje Niksic wrote:
Well, why extract the addresses when you can just leave them in the
struct and pass a pointer to that?
Because I'm caching the result of the lookup, and making a deep
copy of `struct hostent
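To sketch the extraction idea under discussion (illustrative; this
struct address_list is a stand-in and error handling is omitted): copy
only the addresses out of h_addr_list, because the hostent returned by
gethostbyname() lives in static storage that the next lookup
overwrites.

#include <stdlib.h>
#include <string.h>
#include <netdb.h>
#include <netinet/in.h>

struct address_list {
  int count;
  struct in_addr *addrs;         /* our own copy, safe to cache */
};

struct address_list *
extract_addresses (const struct hostent *h)
{
  struct address_list *al = malloc (sizeof *al);
  int i, n = 0;

  while (h->h_addr_list[n])      /* h_addr_list is NULL-terminated */
    n++;
  al->count = n;
  al->addrs = malloc (n * sizeof *al->addrs);
  for (i = 0; i < n; i++)
    memcpy (&al->addrs[i], h->h_addr_list[i], sizeof (struct in_addr));
  return al;
}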
Thomas Lussnig [EMAIL PROTECTED] writes:
OK, first, we don't need this difference. I think it's not as easy as
it first seems.
Because IPv6 is a superset of IPv4, there is a representation of IPv4
addresses.
But is it desirable to use it in preference to native IPv4 calls?
I apologize if
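The mapped representation is easy to demonstrate, assuming inet_pton()
and the IPv6 headers are available:

#include <stdio.h>
#include <arpa/inet.h>
#include <netinet/in.h>

int
main (void)
{
  struct in6_addr a6;
  /* The IPv4 address 217.110.115.160 written as an IPv6 address. */
  if (inet_pton (AF_INET6, "::ffff:217.110.115.160", &a6) == 1)
    printf ("IPv4-mapped: %s\n",
            IN6_IS_ADDR_V4MAPPED (&a6) ? "yes" : "no");
  return 0;
}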
Rami Lehti [EMAIL PROTECTED] writes:
Wget should try to honor the
Content-Disposition: filename=foobar
HTTP response header.
It is really a pain to try to download a file that is created by a script.
Usually the server gives the Content-disposition: header
You would have to save the server
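For reference, such a header typically looks like this (the file name
here is illustrative):
Content-Disposition: attachment; filename=report.csv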
Jonathan Davis [EMAIL PROTECTED] writes:
I recently successfully compiled and installed wget 1.8.1 on my box.
The new OS and architecture reads as follows: Mac OS X
(powerpc-apple-darwin5.2)
Thanks for the report; I've now updated MACHINES.
Boris [EMAIL PROTECTED] writes:
As proposed by Hrvoje, I have tried the retry option, but no change; every
time I get 'read error'.
I also tested with the new release for Windows (1.8.1), but same thing
:(
I have no idea what could be going on. Perhaps a Windows person might
help? On
Dan Lavie [EMAIL PROTECTED] writes:
I have just downloaded and installed WGET on my OS-X.
You didn't say where you downloaded it from or how you installed it,
so I'll assume you're using the standard build process.
1- I can't find any documentation.
The documentation is in Info format,
praveen sirivolu [EMAIL PROTECTED] writes:
I have a doubt. When we use wget to recursively retrieve pages from
the internet, it's not bringing files with shtml and jhtml
extensions. Is this feature not implemented, or if it is there, could
somebody explain to me how to get those HTML pages?
They should
Robin B. Lake [EMAIL PROTECTED] writes:
Someone kindly suggested the -k switch. Here's what I've done:
wget -nH -p -k -E -O OEX
'http://bigcharts.marketwatch.com/quickchart/quickchart.asp?symb=%24oex&sid=0&o_symb=%24oex&x=33&y=24'
Please note that `-O' does not work with `-p'.
Why is the
Jens Rösner [EMAIL PROTECTED] writes:
Can I use -P (Directory prefix) to save files in a user-determined
folder on another drive under Windows?
You should be able to do that. Try `-P C:/temp/'. Wget doesn't know
anything about windows backslashes, so maybe that's what made it fail.
If it
Bastiaan Stougie [EMAIL PROTECTED] writes:
Executing rpm -ta --clean wget-1.8.1.tgz gives an error. After some
searching I discovered this is because the version in util/wget.spec is
incorrect: Version: 1.7 should be Version: 1.8.1.
Furthermore, executing rpm -Fvh wget-1.8.1-1.i686.rpm
John Levon [EMAIL PROTECTED] writes:
moz wget-1.7 188 wget http://www.movementarian.org/oprofile-0.0.8.tar.gz
--20:35:51-- http://www.movementarian.org/oprofile-0.0.8.tar.gz
=> `oprofile-0.0.8.tar.gz'
Connecting to www.movementarian.org:80... connected!
HTTP request sent,
Ivan Buttinoni [EMAIL PROTECTED] writes:
- for recursive retrieval, multiple simultaneous gets
This is very hard to do, not easy at all.
- last but not least: javascript support (eheheh)
And this is even harder. JavaScript is a full programming language
which, as used by the sites,
Fred Holmes [EMAIL PROTECTED] writes:
Is there a syntax such that I can connect to the host once, transfer
the four files, and then disconnect?
Unfortunately, no, not yet.
LWS MAY be
removed without changing the semantics of the field value. Any LWS
that occurs between field-content MAY be replaced with a single SP
before interpreting the field value or forwarding the message
downstream.
Ok, how about this patch:
2002-01-14 Hrvoje Niksic [EMAIL
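The patch itself is cut off above, but the LWS rule quoted from the
RFC can be sketched like this (a hedged illustration, not the actual
change): collapse runs of space/tab, including a CRLF followed by
space/tab (a folded continuation line), into a single SP.

void
collapse_lws (char *h)
{
  char *w = h;
  while (*h)
    {
      if (*h == ' ' || *h == '\t'
          || (h[0] == '\r' && h[1] == '\n'
              && (h[2] == ' ' || h[2] == '\t')))
        {
          /* Skip the whole run of LWS, folds included... */
          for (;;)
            {
              if (*h == ' ' || *h == '\t')
                h++;
              else if (h[0] == '\r' && h[1] == '\n'
                       && (h[2] == ' ' || h[2] == '\t'))
                h += 3;
              else
                break;
            }
          *w++ = ' ';            /* ...and emit a single SP. */
        }
      else
        *w++ = *h++;
    }
  *w = '\0';
}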
Brent Morgan [EMAIL PROTECTED] writes:
But I have a problem. I upgraded to 1.8.1 for win9x. I found the
cookie file for netscape 4 and 6 which are different from one
another. I made sure that each had the correct cookie set for the
website in question. I tried both and got the same error
Brent Morgan [EMAIL PROTECTED] writes:
The -d debug option crashes wget just after it reads the input file.
Huh? Ouch! Wget on Windows is much less stable than I imagined. Can
you run it under a debugger and see what causes the crash?
John Levon [EMAIL PROTECTED] writes:
Thanks very much (wouldn't it be good to refer to the clause in the
RFC in the comments ?)
Uh, I suppose so. But it doesn't matter that much -- someone looking
for it will find it anyway. Besides, it's not clear which RFC Wget
conforms to. Web standards
Thomas Lussnig [EMAIL PROTECTED] writes:
1. Now, if IPv6 is enabled, it only fetches IPv6; IPv4 sites fail.
This is a problem, and part of the reason why the patch is so simple
in its current form. A correct patch must modify struct address_list
to hold a list of IP addresses, each of which can be
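A hedged sketch of the fallback behavior such a struct address_list
would enable (illustrative names; assumes the sockaddrs are already
built):

#include <unistd.h>
#include <sys/socket.h>

int
connect_to_any (struct sockaddr **addrs, socklen_t *lens, int n)
{
  int i;
  for (i = 0; i < n; i++)
    {
      int fd = socket (addrs[i]->sa_family, SOCK_STREAM, 0);
      if (fd < 0)
        continue;                /* family unsupported; try the next */
      if (connect (fd, addrs[i], lens[i]) == 0)
        return fd;               /* connected */
      close (fd);                /* failed; fall back to the next */
    }
  return -1;
}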
Ryan Daniels [EMAIL PROTECTED] writes:
The following command line causes a Segfault on my system:
wget -spider http://www.yahoo.com
Note that the correct syntax is `--spider', and that this (currently
defunct) option does not accept arguments.
But the bug you've uncovered is real: you can
Thomas Reinke [EMAIL PROTECTED] writes:
Ok, either I've completely misread wget, or it has a problem
mirroring SSL sites. It appears that it is deciding that the
https:// scheme is something that is not to be followed.
That's a bug. Your patch is close to how it should be fixed, with two
Herold Heiko [EMAIL PROTECTED] writes:
But that's not the real issue here - why -i for input but not for the
others? A consistent interface should allow something like --file-char=@
-@Rfilename -@Aotherfilename etc., i.e. accept a filename everywhere an
option is allowed.
This is a neat idea,
Peter Gucwa @ IIS-RTP [EMAIL PROTECTED] writes:
option -k does not work in the following call:
wget -k -r -l 1 http://www.softcomputer.com/cgi/jobs.cgi
What version of Wget are you using?
How exactly does it not work? What did you expect to happen, and what
happened instead?
Robin B. Lake [EMAIL PROTECTED] writes:
In a prior posting, I asked about saving an image from a Web page
instead of just saving the information necessary to re-retrieve that
image. I was advised to try -p -k --html-extension
Using wget-1.8.1-pre2, I still don't see the image data saved
Brendan Ragan [EMAIL PROTECTED] writes:
This is the problem i'm having with an older wget (1.5.3) when i
enter the url
'http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'
it goes
Connecting to www.tranceaddict.com:80... connected!
HTTP request sent,
Ian Abbott [EMAIL PROTECTED] writes:
On 4 Jan 2002 at 12:22, Bastiaan Stougie wrote:
wget -P $LOCALDIR -m -np -nH -p --cut-dirs=2
http://host/dir1/dir2/
This works fine, except that wget does not follow all the urls. It
skips urls like:
<A HREF="//host/dir1/dir2/file">text</A>
Wow, I
Jochen Roderburg [EMAIL PROTECTED] writes:
This release 1.8.1 still has the problem (bug/feature?) that *unsafe*
characters are hex-encoded in local filenames.
Yes.
Any plans to repair this ?
For 1.9, hopefully.
Robin B. Lake [EMAIL PROTECTED] writes:
I'm using wget to save a tick chart of a stock index each night.
wget -nH -q -O /QoI/working/CHARTS/$myday+OEX.html
'http://bigcharts.marketwatch.com/quickchart/quickchart.asp?symb=%24OEX&sid=0&o_symb=%24OEX&x=60&y=15&freq=9&time=1'
[...]
What is saved
[ Please mail bug reports to [EMAIL PROTECTED], not to me directly. ]
Nuno Ponte [EMAIL PROTECTED] writes:
I get a segmentation fault when invoking:
wget -r
http://java.sun.com/docs/books/performance/1st_edition/html/JPTOC.fm.html
My Wget version is 1.7-3, the one which is
Edward Manukovsky [EMAIL PROTECTED] writes:
Excuse me, please, but I've got a question.
I cannot set the retry timeout to 30 seconds by doing:
wget -w30 -T600 -c -b -t0 -S -alist.log -iurl_list
For me, Wget waits for 30 seconds between each retrieval. What
version are you using?
Jean-Edouard BABIN [EMAIL PROTECTED] writes:
I found a little bug when we download from a deleted directory:
[...]
Thanks for the report.
I wouldn't consider it a real bug. Downloading things into a deleted
directory is bound to produce all kinds of problems.
The diagnostic message could
Jiang Wei [EMAIL PROTECTED] writes:
I tried to download a whole directory on an FTP site by using the `-r
-np' options, and I have to go through a firewall via
http_proxy/ftp_proxy. But I failed: wget-1.8.1 only retrieved the
first indexed ftp file list and stopped working, while wget-1.5.3 can
Thomas Reinke [EMAIL PROTECTED] writes:
Neat... not sure that I really know enough to start digging to easily
figure out what went wrong, but it can be reproduced by running the
following:
$ wget -d -r -l 5 -t 1 -T 30 -o x.lg -p -s -P dir -Q 500
--limit-rate=256000 -R mpg,mpeg
Jens Rösner [EMAIL PROTECTED] writes:
I noticed that -nh (no host look-up) seems to be gone in 1.8.1.
Is that right?
That is correct.
At first I thought, "Oh, you fool, it is -nH, you mixed it up."
But, obviously, these are two different options.
Again, correct.
I read the news file and
Jens Rösner [EMAIL PROTECTED] writes:
I already posted this on the normal wget list, to which I am subscribed.
Problem:
-nh does not work in 1.8 latest windows binary.
By not working I mean that it is not recognized as a valid parameter.
(-nh is no-host look-up and with it on,
two domain
gfa2c [EMAIL PROTECTED] writes:
So wget 1.7.1 is inserting an error in my URL (it is dropping a
slash). How can I convince it to stop?
I'm afraid you can't.
Please do not tell me to put http_proxy in wgetrc:
You can also use the environment variable of the same name.
it does not work.
Jens Rösner [EMAIL PROTECTED] writes:
1. Is there then now a way to turn off -nh?
So that wget does not distinguish between domain names of the same
IP?
No; there is no longer a way to do that.
Or is this option irrelevant given the net's current structure?
I don't think that option was
FORSAGE [EMAIL PROTECTED] writes:
For some strange reason recent wget win32 compiles (1.6 and up) ignore -w
and -t keys in command line for me :((
ie: wget18.exe -w 60 -t 0 URL acts like wget18.exe URL,
waits and retries are left at default.
That's weird, because it seems to work for me
Wget 1.8.1 is released. As usual, it should appear on ftp.gnu.org after
a while; until it does, you can get it from:
ftp://ftp.srk.fer.hr/pub/unix/util/wget/wget-1.8.1.tar.gz
MD5 checksum of the archive is:
6ca8e939476e840f0ce69a3b31c13060
Please send bug reports to [EMAIL PROTECTED].
Alexey Aphanasyev [EMAIL PROTECTED] writes:
I got an error (see attachment) during latest CVS Wget
1.8.1-pre2+cvs compilation.
[...]
gcc -O2 -Wall -Wno-implicit -o wget cmpt.o connect.o cookies.o fnmatch.o ftp.o
ftp-basic.o ftp-ls.o ftp-opie.o getopt.o hash.o headers.o host.o html-parse.o
Alan Eldridge [EMAIL PROTECTED] writes:
There's a garbage newline output in http.c. A noticeable effect of
this is that when updating a directory using -N, you get a blank line for
each file that is considered for download.
I don't think that's a garbage newline; that newline is intentional,
at
then it
correctly downloads a.html only once.
This is informative; thanks. Does this patch fix the problem:
2001-12-19 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (retrieve_tree): Enqueue the canonical representation of
start_url, so that the test against dl_url_file_map works.
Index: src
Zvi Har'El [EMAIL PROTECTED] writes:
Even so, adding support for CONNECT might be non-trivial in Wget's
hairy old HTTP code. I think it will have to wait for a cleanup of
the HTTP backend.
This is your decision, of course, but it should be understood that
right now you cannot use
is
there.)
ChangeLog since 1.8.1-pre2:
2001-12-19 Hrvoje Niksic [EMAIL PROTECTED]
* version.c: Wget 1.8.1-pre3 is released.
2001-12-19 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (retrieve_tree): Enqueue the canonical representation of
start_url, so that the test against
Thomas Reinke [EMAIL PROTECTED] writes:
We've noted in a few cases that wget can hang on connect() due to a
lack of any form of timeout management. We've made a change to the
routine connect_to_one in connect.c that will implement a
timeout mechanism on connect without the use of signals or
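A minimal sketch of that signal-free approach (assuming FIONBIO is
available, per the portability discussion elsewhere in this thread;
error handling abbreviated):

#include <errno.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/select.h>
#include <sys/socket.h>
#include <sys/time.h>

int
connect_with_timeout (int fd, const struct sockaddr *sa, socklen_t len,
                      int timeout)
{
  int on = 1, off = 0, err = 0;
  socklen_t elen = sizeof err;
  fd_set wset;
  struct timeval tv;

  ioctl (fd, FIONBIO, &on);      /* make the socket non-blocking */
  if (connect (fd, sa, len) < 0 && errno != EINPROGRESS)
    return -1;

  FD_ZERO (&wset);
  FD_SET (fd, &wset);
  tv.tv_sec = timeout;
  tv.tv_usec = 0;
  if (select (fd + 1, NULL, &wset, NULL, &tv) <= 0)
    return -1;                   /* timed out, or select() failed */

  /* Writable: fetch the deferred connect() result. */
  getsockopt (fd, SOL_SOCKET, SO_ERROR, &err, &elen);
  if (err)
    return -1;
  ioctl (fd, FIONBIO, &off);     /* restore blocking mode */
  return 0;
}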
Mike [EMAIL PROTECTED] writes:
Ok thanks, so the full command sequence to
get all the files which have an extension of '.txt' from
http://www.domain.com/subdir1/subdir2 and place them
in my current directory is:-
wget -A *.txt -r -l 1 -nd http://www.domain.com/subdir1/subdir2
Vladimir Volovich [EMAIL PROTECTED] writes:
this is not strictly speaking a bug, but it is an inconsistency.
When I run
wget -x http://some.host/path%20to%20file/file%20name.html
wget saves the result in some.host/path%20to%20file/file name.html,
i.e. it decodes %-characters in
Alexey Aphanasyev [EMAIL PROTECTED] writes:
Something is very wrong here. Almost every single line of configure
output is cached. What version of Autoconf are you using?
autoconf-2.13
That version should work. Have you performed `make distclean' before
configuring? It sounds like some
Vladimir Volovich [EMAIL PROTECTED] writes:
Hrvoje> The inconsistency is a bug. It is intended that Wget encodes
Hrvoje> all the unsafe characters, both in files and directories.
Hrvoje> (It is debatable whether that is a bug.) This patch makes it
Hrvoje> consistent, but I will not apply it
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 19 Dec 2001, Hrvoje Niksic wrote:
But one problem with this implementation is portability -- I'm pretty sure
that some systems don't support FIONBIO.
Correct. Ancient ones, it seems; I couldn't find a single modern
(eh, no, don't ask me
Thomas Reinke [EMAIL PROTECTED] writes:
Again, I just never saw the point.
FWIW, as I mentioned to Hrvoje earlier off-line, it can be a reliability
issue. Without it, wget can hang and require some form of intervention
to terminate properly,
I guess I was just lucky never to encounter
Holger Klawitter [EMAIL PROTECTED] writes:
I am using wget 1.5.3 under Linux (SuSE 7.1) and I discovered that
wget fails to parse netrc files if some words contain whitespace.
Wget 1.5.3 is old. I've now tried putting a quoted password in my
`.netrc', and it works for me with the latest