John Levon [EMAIL PROTECTED] writes:
Thanks very much (wouldn't it be good to refer to the clause in the
RFC in the comments?)
Uh, I suppose so. But it doesn't matter that much -- someone looking
for it will find it anyway. Besides, it's not clear which RFC Wget
conforms to. Web standards
Thomas Lussnig [EMAIL PROTECTED] writes:
1. Now if IPv6 is enabled, it only fetches IPv6; IPv4 sites fail
This is a problem, and part of the reason why the patch is so simple
in its current form. A correct patch must modify struct address_list
to hold a list of IP addresses, each of which can be
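For illustration only, a minimal sketch of what such a structure might
look like; the names and layout are hypothetical, not Wget's actual
code. The connect code would then walk the list in order, trying each
family in turn:

#include <netinet/in.h>

/* Hypothetical: one entry per resolved address, so the list can mix
   families and the connect loop can fall back from IPv6 to IPv4.  */
struct wget_address {
  int family;                   /* AF_INET or AF_INET6 */
  union {
    struct in_addr  in4;
    struct in6_addr in6;
  } addr;
};

struct address_list {
  int count;                    /* number of addresses in the list */
  struct wget_address *addresses;
  int refcount;                 /* shared between the cache and callers */
};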
Markus Buchhorn [EMAIL PROTECTED] writes:
Reading back, that was itojun's proposal, and I suspect probably a
good choice, even if it seems less clean. Itojun is one of the leading
lights in IPv6 development, along with the whole WIDE group in Japan,
and heavily involved in the v6 stacks for
Daniel Stenberg [EMAIL PROTECTED] writes:
I'd suggest that you instead pass around a 'struct hostent *' on
IPv4 only platforms
Why? The rest of the code never needs anything from `struct hostent'
except the list of addresses, and this is what my code extracts. By
extension, the idea was for
Daniel Stenberg [EMAIL PROTECTED] writes:
On Tue, 15 Jan 2002, Hrvoje Niksic wrote:
I'd suggest that you instead pass around a 'struct hostent *' on
IPv4 only platforms
Why? The rest of the code never needs anything from `struct hostent'
except the list of addresses, and this is what
Daniel Stenberg [EMAIL PROTECTED] writes:
On Tue, 15 Jan 2002, Hrvoje Niksic wrote:
Well, why extract the addresses when you can just leave them in the
struct and pass a pointer to that?
Because I'm caching the result of the lookup, and making a deep
copy of `struct hostent
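A sketch of the deep-copy idea (my own illustration, not the actual
patch): only the IPv4 address bytes are copied out of the resolver's
`struct hostent', so the cached copy survives the next lookup, which
may overwrite the library's static buffer:

#include <netdb.h>
#include <stdlib.h>
#include <string.h>

static struct in_addr *
copy_addresses (const struct hostent *host, int *count_out)
{
  int count = 0, i;
  struct in_addr *copy;

  while (host->h_addr_list[count])
    ++count;
  copy = malloc (count * sizeof *copy);
  if (!copy)
    return NULL;
  for (i = 0; i < count; i++)
    memcpy (&copy[i], host->h_addr_list[i], sizeof copy[i]);
  *count_out = count;
  return copy;
}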
Thomas Lussnig [EMAIL PROTECTED] writes:
Ok, first, we don't need this difference. I think it's not as easy as
it first seems.
Because IPv6 is a superset of IPv4, there is a representation for IPv4
addresses.
But is it desirable to use it in preference to native IPv4 calls?
I apologize if
Rami Lehti [EMAIL PROTECTED] writes:
Wget should try to honor the
Content-Disposition: filename=foobar
HTTP response header.
It is really a pain to try to download a file that is created by a script.
Usually the server gives the Content-disposition: header
You would have to save the server
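Roughly what honoring the header would involve, as an illustrative
sketch (not Wget code); a real implementation must also handle
quoting, parameters, and hostile values such as "../../etc/passwd":

#include <stdio.h>
#include <string.h>

static char *
content_disposition_filename (const char *header, char *buf, size_t bufsize)
{
  const char *p = strstr (header, "filename=");
  size_t i = 0;

  if (!p)
    return NULL;
  p += strlen ("filename=");
  if (*p == '"')
    ++p;
  while (*p && *p != '"' && *p != ';' && i + 1 < bufsize)
    buf[i++] = *p++;
  buf[i] = '\0';
  return buf;
}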
Jonathan Davis [EMAIL PROTECTED] writes:
I recently successfully compiled and installed wget 1.8.1 on my box.
The new OS and architecture reads as follows: Mac OS X
(powerpc-apple-darwin5.2)
Thanks for the report; I've now updated MACHINES.
Boris [EMAIL PROTECTED] writes:
As proposed by Hrvoje, I have tried the retry option, but no change; every
time I get 'read error'.
I also tested with the new release for Windows (1.8.1), but same thing
:(
I have no idea what could be going on. Perhaps a Windows person might
help? On
Dan Lavie [EMAIL PROTECTED] writes:
I have just downloaded and installed WGET on my OS-X.
You didn't say where you downloaded it from or how you installed it,
so I'll assume you're using the standard build process.
1- I can't find any documentation.
The documentation is in Info format,
praveen sirivolu [EMAIL PROTECTED] writes:
I have a doubt. When we use wget to recursively retrieve pages from
the internet, it's not bringing files with shtml and jhtml
extensions. Is this feature not implemented, or if it is there, could
somebody explain to me how to get those HTML pages?
They should
Daniel Stenberg [EMAIL PROTECTED] writes:
`struct addrinfo' contains a `struct sockaddr', which carries the
necessary scoping information (I think). The question at the time
was whether I could extract only the address(es) and ignore
everything else, as it was possible with IPv4. Itojune
[EMAIL PROTECTED] writes:
Funny you mention this. When I first heard about -p (1.7?) I
thought exactly that it would default to [spanning hosts to retrieve
page requisites]. I think it would be really useful if the page
requisites could be wherever they want. I mean, -p is already
Thomas Lussnig [EMAIL PROTECTED] writes:
Now the socket part should work fine.
inet_pton and gethostbyname2 only get used if IPV6 is defined
Please don't use gethostbyname2. It's apparently a GNU extension, and
I don't think it will work anywhere except on Linux.
Now it leaves
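For reference, the portable way to get dual-family lookups without GNU
extensions is getaddrinfo(); a minimal sketch (error handling
abbreviated):

#include <sys/types.h>
#include <sys/socket.h>
#include <netdb.h>
#include <string.h>
#include <stdio.h>

static int
resolve_any (const char *host, const char *port)
{
  struct addrinfo hints, *res, *ai;

  memset (&hints, 0, sizeof hints);
  hints.ai_family = AF_UNSPEC;       /* IPv4 or IPv6, whichever resolves */
  hints.ai_socktype = SOCK_STREAM;
  if (getaddrinfo (host, port, &hints, &res) != 0)
    return -1;
  for (ai = res; ai; ai = ai->ai_next)
    printf ("family %d, addrlen %d\n", ai->ai_family, (int) ai->ai_addrlen);
  freeaddrinfo (res);
  return 0;
}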
Thomas Lussnig [EMAIL PROTECTED] writes:
1. Without IPv6, no new syscalls are used
(gethostbyname2, inet_ntop, inet_pton)
2. At runtime it can downgrade to IPv4
3. In IPv6 mode it can handle IPv4 addresses
4. Checked with the following input: www.ix.de , 217.110.115.160 ,
Ian Abbott [EMAIL PROTECTED] writes:
I came across this extract from a table on a website:
<td ALIGN=CENTER VALIGN=CENTER WIDTH=120 HEIGHT=120><a
href=66B27885.htm msover1('Pic1','thumbnails/MO66B27885.jpg');
onMouseOut=msout1('Pic1','thumbnails/66B27885.jpg');><img
SRC=thumbnails/66B27885.jpg
[EMAIL PROTECTED] writes:
That sounds like they wanted onMouseOver=msover1(...)
Which Wget would, by the way, have handled perfectly.
Ian Abbott [EMAIL PROTECTED] writes:
Here is a patch to deal with the -P C:\temp (and similar) problems
on Windows.
This looks good. I'll apply it as soon as CVS becomes operational
again.
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 16 Jan 2002, Hrvoje Niksic wrote:
The so-called scope in IPv6 is embedded in the address, so you can't use
IPv6 addresses without getting the scope too.
Are you sure? Here is what itojun said in
[EMAIL PROTECTED]:
due
[EMAIL PROTECTED] writes:
Until there's an ESP package that can guess what the author
intended, I doubt wget has any choice but to ignore the defective
tag.
Seriously, I think you guys are too strict.
Similar discussions have spawned numerous times.
If the HTML code says
<a href=URL
to be
terminated at empty lines.
Thanks for the report. This patch should fix the problems; please let
me know if it works for you.
2002-01-17 Hrvoje Niksic [EMAIL PROTECTED]
* netrc.c (parse_netrc): Skip leading whitespace before testing
whether the line is empty. Empty lines still
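The gist of the fix, paraphrased as a one-liner (my wording, not the
actual diff):

#include <string.h>

/* A line counts as empty if it contains nothing but blanks,
   not merely if its first character is '\0'.  */
static int
line_is_empty (const char *line)
{
  return line[strspn (line, " \t\r\n")] == '\0';
}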
Hrvoje Niksic [EMAIL PROTECTED] writes:
Ian Abbott [EMAIL PROTECTED] writes:
Here is a patch to deal with the -P C:\temp (and similar) problems
on Windows.
This looks good. I'll apply it as soon as CVS becomes operational
again.
Applied now.
Tay Ngak San [EMAIL PROTECTED] writes:
I have downloaded your source code for wget and tried to make it, but
it failed due to a va_list parameter conflict in stdarg.h and stdio.h.
Please advise.
What OS and compiler are you using to compile Wget?
Alexey Aphanasyev [EMAIL PROTECTED] writes:
It works for me. I hope the patch will be included in the next release.
Thanks for the confirmation. The patch is already in CVS.
Herold Heiko [EMAIL PROTECTED] writes:
My personal idea is:
As a matter of fact, no *Windows* text editor I know of, not even the
supplied ones (Notepad, WordPad), will add the ^Z at the
end of file.txt. Wget is a *Windows* program (although running in
console mode), not a *DOS*
Ian Abbott [EMAIL PROTECTED] writes:
Most (all?) of the escape sequences within URLs should be decoded
before transforming to an external file-name.
All, I'd say. Even now u->file and u->dir are not URL-encoded. They
get reencoded later, by url_filename.
The point between the two is my
Ian Abbott [EMAIL PROTECTED] writes:
- asctime (localtime ((time_t *)&cookie->expiry_time)),
+ (cookie->expiry_time != ~0UL ?
+    asctime (localtime ((time_t *)&cookie->expiry_time))
+    : UNKNOWN),
cookie->attr, cookie->value));
}
Yes, except for any other
Jeff Bailey [EMAIL PROTECTED] writes:
wget 1.8 fails to link on i686-pc-sco3.2v5.0.6
Does the compiler on that machine really not have alloca()? I'm
usually wary of attempts to compile `alloca.c' because they usually
point out a mistake in the configuration process.
Lauri Mägi [EMAIL PROTECTED] writes:
I'm using Wget 1.8.1 for downloading files over the FTP protocol.
When a filename contains spaces the URL looks like ftp://server.name/file%20name,
and it saves the files with %20 in the file names as well.
Previously I was using Wget 1.7 and it saved the spaces as they should be.
Michael Jennings [EMAIL PROTECTED] writes:
The issue centers on the documentation. Philosophically, in my
opinion, a program should be written so the documentation is easy to
read. In this case a hidden stripping of useless characters means
that there is one less thing to explain in the
Thomas Lussnig [EMAIL PROTECTED] writes:
I'm building an IPv6 patch for wget, and I'm worried about the point
that I have to add 12 in the sockaddr.
Perhaps it would help if you created a minimal test case for the
problem you're witnessing. For example:
#include <stdio.h>
#include
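Something along these lines, perhaps; I am assuming the 12 bytes in
question is the difference between the two sockaddr sizes (e.g. 28
versus 16 on many systems):

#include <stdio.h>
#include <netinet/in.h>

int
main (void)
{
  printf ("sizeof (struct sockaddr_in):  %d\n",
          (int) sizeof (struct sockaddr_in));
  printf ("sizeof (struct sockaddr_in6): %d\n",
          (int) sizeof (struct sockaddr_in6));
  return 0;
}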
Jan Hnila [EMAIL PROTECTED] writes:
Hello,
please try this (it should work):
wget -r -l2 -A=htm,html,phtml http://www.tunedport.com
(the change is the equals sign. The same for -R. If you take a look
at the output of wget --help, you may notice the equals signs
there (in the longer
Sacha Mallais [EMAIL PROTECTED] writes:
Unable to establish SSL connection.
--
Also note that it does _not_ appear to be retrying the connection. I
have explicitly set --tries=5, and with a non-SSL connection, the
above stuff appears 5 times when it cannot
wget Admin [EMAIL PROTECTED] writes:
I am using wget version 1.5.3 under Solaris and 1.5.2 under IRIX.
Please upgrade. This problem is fixed in Wget 1.8.1.
Do you have any ideas to solve the problem? (Possibly without
having to recompile wget, since I am not a sysadmin.)
You do not have to
Way, Trevor [EMAIL PROTECTED] writes:
I am using the -T, -t and -w parameters but cannot get it to time out in
less than 3 minutes.
/usr/bin/wget --output-document=/tmp/performance.html -T5 --wait=2
--waitretry=2 --tries=2
Should this time out after 5 secs, retry twice, waiting 2 secs between
Thanos Siaperas [EMAIL PROTECTED] writes:
Shouldn't wget first get the .listing, find the files needed by the
wildcard, and then request the files from the proxy? This looks like
a bug.
No, when using a proxy, you get HTTP behavior. So to do that, you
have to do it the HTTP way:
wget -rl1
Jens Röder [EMAIL PROTECTED] writes:
for wget I would suggest a switch that allows sending the output
directly to stdout. It would be easier to use it in pipes.
Are you talking about the log output or the text of the documents Wget
downloads?
* Log output goes to stderr by default, and can
Andre Majorel [EMAIL PROTECTED] writes:
I respectfully disagree. If we can spend the time to read and
answer the poster's question, the poster can spend five minutes
to subscribe/unsubscribe.
For reference, see the netiquette item on posting to newsgroups
and asking for replies by email.
Brent Morgan [EMAIL PROTECTED] writes:
What's CVS, and what is the significance of this version?
CVS stands for Concurrent Versions System, and is the version
control system where the master sources for Wget are kept. I would
not advise the download of the CVS version because it is likely to
be
' function.
Thanks for the report; this patch should fix the bug:
2002-02-01 Hrvoje Niksic [EMAIL PROTECTED]
* html-url.c (tag_handle_meta): Don't crash on <meta
http-equiv=refresh> where content is missing.
Index: src/html-url.c
Michael Dodwell [EMAIL PROTECTED] writes:
Just noticed that wget 1.7 errors with the subject line if you pass
it a protocol, port and username but not a password.
Please upgrade to Wget 1.8.1. I believe this problem has gone away.
Ian Abbott [EMAIL PROTECTED] writes:
I'd suggest either leaving them alone or adopting the IEC standards
that Henrik referred to, i.e. KiB = kibibyte = 2^10 bytes
Ugh! Never!
Let them keep their kibibytes to themselves. :-)
is not an https* url.
Thanks for the report. The problem you described should be fixed by
this patch:
2002-02-19 Hrvoje Niksic [EMAIL PROTECTED]
* url.c (url_parse): Don't treat '?' as query string separator
when parsing FTP URLs.
Index: src/url.c
Peteris Krumins [EMAIL PROTECTED] writes:
GNU Wget 1.8
wget: progress.c:673: create_image: Assertion `p - bp->buffer <= bp->width' failed.
This problem has been fixed in Wget 1.8.1. Please upgrade.
John A Ogren [EMAIL PROTECTED] writes:
I'd like to use 'wget' to mirror a remote ftp directory, but it
requires a username and password to access the server. I don't see
any mention of command-line options for supplying this information
for an FTP server, only for an HTTP server. Is this a
It's a known problem. Timestamping doesn't work with FTP URLs over a
proxy because the HEAD request is not honored by the proxy for FTP.
Note that your Wget is very old and you should upgrade -- though not
because of this, since this problem remains.
Currently this is a known problem. Wget doesn't span hosts or
schemes with -p, although it probably should.
It's a known issue. Wget's wildcard magic only works when using the
FTP protocol. HTTP is used for communication with proxies, so
wildcarding doesn't work. But you should be able to simulate it
using:
wget -nd -rl1 -A foo*bar ftp://server/dir/
It's not elegant, but it works for me.
Again, thanks for taking the time to research this. Next time
someone asks this question, we'll forward them this email.
Thanks for the report, Paul. This patch, which I'm about to apply to
CVS, should fix it.
2002-02-19 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (retrieve_tree): Handle the case when start_url doesn't
parse.
Index: src/recur.c
Thanks for looking into this. I've written a slightly different fix
before I saw the one from you.
Your patch was *almost* correct -- one minor detail is that you don't
take care to free QUEUE and BLACKLIST before exiting, therefore
technically creating a (small) memory leak.
My patch avoids
Samuel Hargis [EMAIL PROTECTED] writes:
I've read through the documentation and it says that (if a name
shows up more than once, the filenames will get extensions '.n').
Would a duplicate of index.html be named
index.n.html or index.html.n?
The latter.
Also, how does it
Jamie Zawinski [EMAIL PROTECTED] writes:
Please also set an exit alarm around your calls to connect() based
on the -T option.
This is requested frequently. I'll include it in the next release.
The reason why it's not already there is simply that I was lucky never
to be bitten by that
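One possible shape of the fix, sketched with a non-blocking socket and
select(); an alarm()/SIGALRM approach would work too. The names and
the abbreviated error handling are illustrative, not Wget's code:

#include <sys/socket.h>
#include <sys/select.h>
#include <fcntl.h>
#include <errno.h>

static int
connect_with_timeout (int fd, const struct sockaddr *sa, socklen_t salen,
                      int seconds)
{
  int flags = fcntl (fd, F_GETFL, 0);
  int err = 0;
  socklen_t errlen = sizeof err;
  fd_set wset;
  struct timeval tv;

  fcntl (fd, F_SETFL, flags | O_NONBLOCK);
  if (connect (fd, sa, salen) == 0)
    ;                              /* connected immediately */
  else if (errno != EINPROGRESS)
    return -1;
  else
    {
      FD_ZERO (&wset);
      FD_SET (fd, &wset);
      tv.tv_sec = seconds;
      tv.tv_usec = 0;
      if (select (fd + 1, NULL, &wset, NULL, &tv) <= 0)
        return -1;                 /* timed out or select error */
      getsockopt (fd, SOL_SOCKET, SO_ERROR, &err, &errlen);
    }
  fcntl (fd, F_SETFL, flags);      /* restore blocking mode */
  return err ? -1 : 0;
}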
Noel Koethe [EMAIL PROTECTED] writes:
wget 1.8.1 is shipped with the files in doc/
wget.info
wget.info-1
wget.info-2
wget.info-3
wget.info-4
Yes.
As Ian said, this is so that people without `makeinfo' installed can
still read the documentation. (In fact, Info pages can even be read
Noel Koethe [EMAIL PROTECTED] writes:
OK. No problem for me. I just wrote this because the more
interesting doc, the manpage, is not shipped with the source.
I don't know how the man page is more interesting since it's a mere
subset of the Info documentation. All the GNU programs are shipped
[ Adding the development list to Cc, to facilitate discussion. ]
David F. Newman [EMAIL PROTECTED] writes:
First of all, I think this new behaviour needs an option to enable
it, rather than be on by default. The option could be called
rfc1806, or rather, rfc2183 now, unless anyone can
Robert Lupton the Good [EMAIL PROTECTED] writes:
This appears to be an over-enthusiastic interpretation of %26 == '&'
in wget.
I submit a URL (which is in fact a SQL query) with some embedded &s
(logical ORs). These are encoded as %26, and the URL works just fine
with netscape and lynx. It
Doug Kearns [EMAIL PROTECTED] writes:
On Fri, Mar 22, 2002 at 04:08:36AM +0100, Hrvoje Niksic wrote:
snip
I think I agree with this. The amount of spam is staggering. I have
no explanation as to why this happens on this list, and not on other
lists which are *also* open to non
Tomislav Goles [EMAIL PROTECTED] writes:
Now I need to add the twist where username account info
resides on another machine (i.e. machine2 which by the way
is on the same network as machine1). So I need to do something
like the following:
$ wget ftp://username:[EMAIL
David McCabe [EMAIL PROTECTED] writes:
I am having a hell of a time to get the reg-ex stuff to work with the -A or -R
options. If I supply this option to my wget command:
-R 1*
Everything works as expected. Same with this:
-R 2*
Now, if I do this:
-R 1*,2*
I get all the files
Guillaume Morin [EMAIL PROTECTED] writes:
if I use 'wget ftp://site.com/file1.txt ftp://site.com/file2.txt',
wget will not reuse the FTP connection, but will open one for each
document downloaded from the same site...
Yes, that's how Wget currently behaves. But that's not a bug, or at
least
Good point there. I wonder... is there a legitimate reason to require
atime to be set to the mtime time? If not, we could just make the
change without the new option. In general I'm careful not to add new
options unless they're really necessary.
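For context, a sketch of how the timestamps interact (illustrative,
not Wget's code): utime() sets both at once, so preserving atime would
mean reading it back first:

#include <sys/types.h>
#include <sys/stat.h>
#include <utime.h>

static int
set_mtime_keep_atime (const char *file, time_t mtime)
{
  struct stat st;
  struct utimbuf times;

  if (stat (file, &st) != 0)
    return -1;
  times.actime = st.st_atime;    /* keep the current access time */
  times.modtime = mtime;         /* set the new modification time */
  return utime (file, &times);
}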
Guillaume Morin [EMAIL PROTECTED] writes:
If wget fetches a url which redirects to another host, wget
retrieves the file, and there's nothing that can be done to turn
that off.
So, if you do wget -r on a machine that happens to have a redirect to
www.yahoo.com you'll wind up trying to pull
Ivan Buttinoni [EMAIL PROTECTED] writes:
Again I send a suggestion, this time quite easy. I hope it's not
already implemented, else I'm sorry in advance. It would be nice if
wget could use regexps to evaluate what to accept/refuse to download.
The regexps would have to work on the whole URL and/or
Guillaume Morin [EMAIL PROTECTED] writes:
For example if a link to the URL /foo?bar is seen then the correct
file is downloaded and saved with the name foo?bar. When viewing
the pages with Netscape the '?' character is seen to separate the
URL and the arguments. This makes the link fail.
Martin Tsachev [EMAIL PROTECTED] writes:
it compiles on i386-unknown-netbsdelf1.5.2 without any modifications
I think that wget isn't parsing the @import CSS declaration; it should
save those files when run with -p and convert the links if so configured
That is true. Parsing @import would require
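A rough sketch of what that parsing might involve (my illustration,
assuming the common `@import url("...")' and `@import "..."' forms; a
real implementation would queue the URL rather than print it):

#include <stdio.h>
#include <string.h>

static void
scan_css_imports (const char *css)
{
  const char *p = css;

  while ((p = strstr (p, "@import")) != NULL)
    {
      p += strlen ("@import");
      while (*p == ' ' || *p == '\t')
        ++p;
      if (!strncmp (p, "url(", 4))
        p += 4;
      if (*p == '"' || *p == '\'')
        {
          char quote = *p++;
          const char *end = strchr (p, quote);
          if (!end)
            break;
          printf ("import: %.*s\n", (int) (end - p), p);
          p = end;
        }
    }
}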
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Mon, 8 Apr 2002, Hrvoje Niksic wrote:
I was also thinking about checking for `Wget' in the body, and things
like that.
That might be annoying (although it is certainly an option to consider
anyway) as someone sending a mail legitimately may
Ian Abbott [EMAIL PROTECTED] writes:
On 5 Apr 2002 at 18:17, Noel Koethe wrote:
Will this be changed so the user could use -nv with /dev/null
and get only errors or warnings displayed?
So what I think you want is for any log message tagged as
LOG_VERBOSE (verbose information) or
Justin Piszcz [EMAIL PROTECTED] writes:
--12:12:21-- ftp://war:*password*@0.0.0.0:21//iso/file.iso
= `iso/file.iso'
== CWD not required.
== PASV ... done.== RETR file.iso ... done.
Length: 737,402,880
24% [] 180,231,952 37.40K/s ETA
Torsten Fellhauer -iXpoint- #429 [EMAIL PROTECTED] writes:
when connecting to a FTP-Server using a TrendMicro Viruswall Proxy,
we get the error "Malformed status line",
Unfortunately, Wget is right; that status line is quite different from
what HTTP mandates. The status line should be
Matthias Jim Knopf [EMAIL PROTECTED] writes:
there is a bug (or a feature...) in the version 1.5.3
Note that the latest version of Wget is 1.8.1. I suggest you
upgrade because the new version handles URLs much better.
I discovered that every doubled slash (//) is converted to a single
to the average speed for ETA, or is there a smarter
way to handle it? What are other downloaders doing?
2002-04-09 Hrvoje Niksic [EMAIL PROTECTED]
* progress.c (bar_update): Maintain an array of the time it took
to perform previous 30 network reads.
(create_image
Maurice Cinquini [EMAIL PROTECTED] writes:
I don't think using only a fraction of a second is a reliable method
for estimating current bandwidth. Here are some factors that can
make for wildly varying ETAs when just looking at the last fraction
of a second.
- TCP slow start.
-
Daniel Stenberg [EMAIL PROTECTED] writes:
On Tue, 9 Apr 2002, Hrvoje Niksic wrote:
Should we revert to the average speed for ETA, or is there a smarter way to
handle it? What are other downloaders doing?
I'll grab the other part and explain what curl does. It shows a current
speed based
Tony Lewis [EMAIL PROTECTED] writes:
I'm often annoyed by ETA estimates that make no sense. How about showing two
values -- something like:
ETA at average speed: 1:05:17
ETA at current speed: 15:05
The problem is that Wget is limited by what fits in one line. I'd
like to keep enough space
Tony Lewis [EMAIL PROTECTED] writes:
Could you keep an array of speeds that is updated once a second such that
the value from six seconds ago is discarded and when the value for the
second that just ended is recorded?
Right now I'm doing that kind of trick, but for the last N reads from
the
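A sketch of that per-second sliding window (names hypothetical, window
size matching the six seconds suggested above). Averaging over the
whole window keeps the displayed speed, and hence the ETA, from
jumping with every read while still tracking recent throughput:

#define SPEED_WINDOW 6

struct speed_ring {
  long bytes[SPEED_WINDOW];  /* bytes read during each recent second */
  int pos;                   /* slot belonging to the current second */
};

/* Call once per second with the byte count for that second;
   the oldest slot is overwritten.  */
static void
speed_ring_push (struct speed_ring *r, long bytes_this_second)
{
  r->pos = (r->pos + 1) % SPEED_WINDOW;
  r->bytes[r->pos] = bytes_this_second;
}

static double
speed_ring_bps (const struct speed_ring *r)
{
  long total = 0;
  int i;

  for (i = 0; i < SPEED_WINDOW; i++)
    total += r->bytes[i];
  return (double) total / SPEED_WINDOW;
}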
Andre Majorel [EMAIL PROTECTED] writes:
I find it very annoying when a downloader plays yoyo with the
remaining time. IMHO, remaining time is by nature a long term thing
and short term jitter should not cause it to go up and down.
Agreed wholeheartedly, but how would you *implement* a
Daniel Stenberg [EMAIL PROTECTED] writes:
The meter is updated at most once per second; I don't think it makes
sense to update the screen more often than that.
Maybe not, but I sort of like it. Wget's progress bar refreshes the
screen (not more than) five times per second, and I like the idea of
Roger L. Beeman [EMAIL PROTECTED] writes:
On Wed, 10 Apr 2002, Hrvoje Niksic wrote:
Agreed wholeheartedly, but how would you *implement* a non-jittering
ETA? Do you think it makes sense the way 1.8.1 does it, i.e. to
calculate the ETA from the average speed?
One common programming
Unfortunately, this bug is not easy to fix. The problem is that `-O'
was originally invented for streaming, i.e. for `-O -'. As a result,
many places in Wget's code assume that they can freely operate on the
file names, and -O seems more like an afterthought.
On the other hand, many people
Format::).
wget ftp://hniksic:[EMAIL PROTECTED]/.emacs
This would let other users on the system see your password using
ps. It should have a big disclaimer.
You're right. I'll apply this patch, which I think should add enough
warnings to educate the unwary.
2002-04-10 Hrvoje Niksic
Guillaume Morin [EMAIL PROTECTED] writes:
When getting a file in a non-root directory from FTP with wget, wget
always tries CWD to that directory before getting the
file. Unfortunately sometimes you're not allowed to CWD to a
directory, but you're still allowed to list or download files from
I believe this is already on the todo list. However, this is made
harder by the fact that, to implement this kind of reject, you have to
start downloading the file. This is very different from the
filename-based rejection, where the decision can be made at a very
early point in the download
Loic Le Loarer [EMAIL PROTECTED] writes:
When I fetch a whole subtree with wget and the directories contain
spaces or some other special characters, these characters are
urlencoded in the local version, while this is not the case for files.
For example, if I mirror with wget -m the directory to
Noel Koethe [EMAIL PROTECTED] writes:
Ok, got it. But is it possible to get this option as a switch for
using it on the command line?
Yes, like this:
wget -erobots=off ...
Antonis Sidiropoulos [EMAIL PROTECTED] writes:
But when the password contains characters such as '^' or space,
these chars are converted to the form %{hex code},
e.g. a passwd like ^12 34 is translated to %5E12%2034, so the
login fails.
Is this a bug?
Thanks for the report. It is
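Presumably the fix is to percent-decode the userinfo components before
sending them to the server; a sketch of such a decoder (illustrative,
not the actual patch):

#include <ctype.h>

static int
hexval (int c)
{
  return isdigit (c) ? c - '0' : tolower (c) - 'a' + 10;
}

static void
url_unescape (char *s)
{
  char *out = s;

  for (; *s; s++)
    if (*s == '%' && isxdigit ((unsigned char) s[1])
        && isxdigit ((unsigned char) s[2]))
      {
        *out++ = (char) (hexval (s[1]) * 16 + hexval (s[2]));
        s += 2;                  /* skip the two hex digits */
      }
    else
      *out++ = *s;
  *out = '\0';
}

With that, %5E12%2034 would decode back to "^12 34" before the login
is attempted.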
/11 17:06:02
@@ -1,5 +1,29 @@
2002-04-11 Hrvoje Niksic [EMAIL PROTECTED]
+ * progress.c (struct progress_implementation): Use PARAMS when
+ declaring the parameters of *create, *update, *finish, and
+ *set_params.
+
+ * netrc.c: Ditto.
+
+ * http.c: Reformat some
Nelson H. F. Beebe [EMAIL PROTECTED] writes:
c89 -I. -I. -I/opt/include -DHAVE_CONFIG_H
-DSYSTEM_WGETRC=\/usr/local/etc/wgetrc\ -DLOCALEDIR=\/usr/local/share/locale\ -O
-c connect.c
cc-1164 c89: ERROR File = connect.c, Line = 94
Argument of type int is incompatible with parameter of
to contain URLs. When Wget is taught to rummage through
JavaScript looking for URLs, `-k' will become aware of them as well.
Here is the patch:
2002-04-11 Hrvoje Niksic [EMAIL PROTECTED]
* html-url.c (tag_handle_form): New function. Pick up form
actions and mark them
John Poltorak [EMAIL PROTECTED] writes:
Can anyone confirm that WGET allows the use of wildcards through a proxy
server?
It doesn't. Use a substitute:
wget -rl1 -A wildcard URL...
Warwick Poole [EMAIL PROTECTED] writes:
I want to set a timeout of 5 seconds on a wget http fetch. I have
tried -T --timeout etc in the command line and in a .wgetrc
file. wget does not seem to obey these directives.
You have probably encountered the problem that Wget's timeout is not
Marcus - Videomoviehouse.com [EMAIL PROTECTED] writes:
I am trying to get wget to work with a URL with characters that it
doesn't seem to like. I tried putting the URL in quotes and it still
gave me similar results. Works fine if it is a simple URL like wget
www.something.com/index.html. Any
is that the same guard is not implemented in bar_create()
and bar_finish(), which also call create_image(). In the FTP case,
the crash comes from bar_create. This patch should fix it.
2002-04-11 Hrvoje Niksic [EMAIL PROTECTED]
* progress.c (bar_create): If INITIAL is larger than TOTAL, fix
Christopher Scott [EMAIL PROTECTED] writes:
The attached file contains a link which causes wget 1.8.1 to crash
on Solaris i386 and sparc, on both Solaris 7 and 8 on both
platforms. However, I downloaded the latest version for Windows, and
it ran correctly!?!
I'm afraid I cannot get Wget to
This change is fine with me. I vaguely remember that this test is
performed in two places; you might want to create a function.
Christopher H. Taylor [EMAIL PROTECTED] writes:
Any ETA on when you're going to add a timeout alarm to the connect()
function? I'm running 1.8.1 and still have the same problem. Many of
my applications that utilize wget are time critical and I'm
anxiously awaiting this fix. Thanks for your
Ian Abbott [EMAIL PROTECTED] writes:
On 11 Apr 2002 at 21:00, Hrvoje Niksic wrote:
This change is fine with me. I vaguely remember that this test is
performed in two places; you might want to create a function.
Certainly. Where's the best place for it? utils.c?
As good a place as any.
Kevin Rodgers [EMAIL PROTECTED] writes:
1. Don't #define _XOPEN_SOURCE 500 (by commenting it out).
2. Do #define _VA_ALIST.
I can confirm that (1) works. I didn't try (2).
Could you please try (2) and see if it works out?
I'm reluctant to withdraw the _XOPEN_SOURCE definition because
James C. McMaster (Jim) [EMAIL PROTECTED] writes:
This could be a great resource, but (I hate to say this) it has been
rendered more trouble than it is worth by the stubbornness and
stupidity of the owner. He has turned a deaf ear to all pleas to do
something, ANYTHING, to stop the flood of