Tony Lewis [EMAIL PROTECTED] writes:
Mauro Tortonesi wrote:
no. i was talking about regexps. they are more expressive
and powerful than simple globs. i don't see what's the
point in supporting both.
The problem is that users who are expecting globs will try things like
Tony Lewis [EMAIL PROTECTED] writes:
I didn't miss the point at all. I'm trying to make a completely different
one, which is that regular expressions will confuse most users (even if you
tell them that the argument to --filter is a regular expression).
Well, most users will probably not use
Zembower, Kevin [EMAIL PROTECTED] writes:
[EMAIL PROTECTED]:/tmp$ wget --timestamping --no-host-directories --glob=on
--recursive --cut-dirs=4
'ftp://xxx:[EMAIL PROTECTED]/%2Fccp1/data/shared/news/motd/qotd.txt'
If you need a double slash, you must spell it explicitly:
wget [...]
Maybe you should file a bug with the Fedora people. I don't think
Wget is doing anything wrong in the IPv6 department. It basically
calls getaddrinfo and accepts both types of addresses (preferring IPv4
addresses for connecting, unless specified otherwise). That
getaddrinfo should fail means
cliff [EMAIL PROTECTED] writes:
Thanks
$ gcc a.c
$ ./a.out yahoo.com
success
$ wget yahoo.com
--12:27:32-- http://yahoo.com/
=> `index.html'
Resolving yahoo.com... failed: No such file or directory.
That is not good because it means that either Wget has a so far
unencountered
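[ For reference: the a.c mentioned above is not shown in this thread, so
the following is only a hypothetical reconstruction of such a resolver
test. It resolves argv[1] with getaddrinfo() and reports success or
failure:

  #include <stdio.h>
  #include <string.h>
  #include <sys/types.h>
  #include <sys/socket.h>
  #include <netdb.h>

  int
  main (int argc, char **argv)
  {
    struct addrinfo hints, *res;
    int err;

    if (argc != 2)
      {
        fprintf (stderr, "usage: %s HOSTNAME\n", argv[0]);
        return 2;
      }
    memset (&hints, 0, sizeof (hints));
    hints.ai_family = AF_UNSPEC;      /* accept both IPv4 and IPv6 */
    hints.ai_socktype = SOCK_STREAM;
    err = getaddrinfo (argv[1], NULL, &hints, &res);
    if (err != 0)
      {
        fprintf (stderr, "failed: %s\n", gai_strerror (err));
        return 1;
      }
    printf ("success\n");
    freeaddrinfo (res);
    return 0;
  } ]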
Vladimir Volovich [EMAIL PROTECTED] writes:
MT == Mauro Tortonesi writes:
are there any news on the wget update?
MT hrvoje fixed this problem more than one month ago. from the
MT ChangeLog:
i don't see the official source at ftp.gnu.org/gnu/wget/
that's what i'm asking about.
The
Is there any interest in finishing the GnuTLS support in Wget? The
support currently available in the repository can be tested using
`./configure --with-ssl=gnutls'. It should enable you to download
from SSL servers using --no-check-certificate, but it is not yet
finished. Specifically, and in
Thomas Braby [EMAIL PROTECTED] writes:
eta_hrs = (int) (eta / 3600), eta %= 3600;
Yes, that also works. The cast is needed on Windows x64 because eta is
a wgint (which is 64-bit) but a regular int is 32-bit, so otherwise a
warning is issued.
The same is the case on 32-bit Windows, and also
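[ For illustration, a sketch of the narrowing conversion being discussed;
the wgint typedef below is only a stand-in for the 64-bit type defined in
Wget's wget.h:

  typedef long long wgint;   /* stand-in for Wget's 64-bit wgint */

  void
  eta_split (wgint eta, int *hrs, int *min, int *sec)
  {
    *hrs = (int) (eta / 3600), eta %= 3600;  /* cast avoids the 64->32-bit
                                                truncation warning */
    *min = (int) (eta / 60),   eta %= 60;
    *sec = (int) eta;
  } ]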
cliff [EMAIL PROTECTED] writes:
Just an FYI since wget exposes this bug, you may see more questions about
it. The solution to my problem was
https://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=186592
Specifically, removing nisplus from the host line in /etc/nsswitch.conf
resolved the
Gregory Stark [EMAIL PROTECTED] writes:
However wget includes an internal copy of GNU getopt. In that case
it would be reasonable for wget to rip out this feature so that the
behaviour always matches the documentation.
You have a point, but it is not entirely true that Wget includes an
Greg Stark [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
What Wget could do to ensure consistency is unset the variable
POSIXLY_CORRECT during option processing. All other effects of
POSIXLY_CORRECT on getopt (such as use of "illegal" rather than
"invalid" in the error
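[ A minimal sketch of the idea, assuming unsetenv() is available; this is
not Wget's actual code, just an illustration of clearing POSIXLY_CORRECT
before handing the arguments to getopt_long:

  #include <stdlib.h>
  #include <getopt.h>

  static void
  parse_options (int argc, char **argv, const char *optstring,
                 const struct option *longopts)
  {
    int opt;

    /* Make getopt behave the same regardless of the environment.  */
    unsetenv ("POSIXLY_CORRECT");

    while ((opt = getopt_long (argc, argv, optstring, longopts, NULL)) != -1)
      {
        /* ... dispatch on opt ... */
      }
  } ]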
Jesse Cantara [EMAIL PROTECTED] writes:
A quick resolution to the problem is to use the -nH command line
argument, so that wget doesn't attempt to create that particular
directory. It appears as if the problem is with the creation of a
directory with a ':' in the name, which I cannot do
Maxim Brandwajn [EMAIL PROTECTED] writes:
Hi guys, I keep getting this error at random files/times:
[...]
What version of Wget are you using?
18 mao [EMAIL PROTECTED] writes:
[...]
To answer the question in your subject line: no, Wget does not
auto-convert downloaded text files. It strives to download all files
unchanged byte-for-byte.
i use wget to download some html files, and use vim to edit them. i
found that the newline has been
[ Please leave the Cc to [EMAIL PROTECTED] ]
18 mao [EMAIL PROTECTED] writes:
To answer the question in your subject line: no, Wget does not
auto-convert downloaded text files. It strives to download all files
unchanged byte-for-byte.
maybe that's the reason for the character
www.mail [EMAIL PROTECTED] writes:
Something like ...
wget --title="News Server #1" http://www.etc.com/latest_news.html
So that News Server #1 appears as the console title rather than the URL
(or its possible redirect).
I think the standard Windows console application (cmd.exe) always
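[ For context, the proposed --title option does not exist in Wget; on
Windows, the effect the poster describes would come down to something
like the Win32 SetConsoleTitle call, sketched here purely for
illustration:

  #ifdef _WIN32
  # include <windows.h>

  /* Retitle the console window; cmd.exe may restore its own title
     when the program exits. */
  static void
  set_window_title (const char *title)
  {
    SetConsoleTitleA (title);
  }
  #endif ]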
[ Moving the discussion to the Wget mailing list.
Jerry's patch implements a --random option that shuffles the list
of addresses returned by getaddrinfo. ]
Jerry Lundström [EMAIL PROTECTED] writes:
A user scenario could be that wget with ipv6 enabled always picks a
broken website since RFC
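[ Jerry's actual patch is not reproduced here; the core idea, shuffling
the address list returned by getaddrinfo, amounts to something like this
illustrative Fisher-Yates sketch over an array of addrinfo pointers:

  #include <stdlib.h>
  #include <netdb.h>

  static void
  shuffle_addresses (struct addrinfo **addr, int count)
  {
    int i;
    for (i = count - 1; i > 0; i--)
      {
        int j = rand () % (i + 1);       /* pick a random earlier slot */
        struct addrinfo *tmp = addr[i];
        addr[i] = addr[j];
        addr[j] = tmp;
      }
  } ]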
www.mail [EMAIL PROTECTED] writes:
Changing console title was IMHO a mistaken feature to implement in the
first place
I agree.
In the case of Windows, I believe that no console application should
alter the window title, as such applications will overwrite the title
which was specified at
Herold Heiko [EMAIL PROTECTED] writes:
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
The idea behind the feature is that you can see which URL is
*currently* being downloaded (you can specify several). That's
somewhat different than just seeing the command line. I still
consider
Kat [EMAIL PROTECTED] writes:
Something like ...
wget --title="News Server #1" http://www.etc.com/latest_news.html
Doesn't work.
I believe that was intended as a suggestion, not as a description of
an existing feature.
ks [EMAIL PROTECTED] writes:
How do i define inside .wgetrc that it should not follow any links (<a
href="abc">) inside html files, only download the files that are present in
the directory?
The only way to find which files are present *is* to follow <a
href=...> (and other elements). If you don't
Jeff Dickey [EMAIL PROTECTED] writes:
It would be very nice if there were a complement to the --limit-rate
parameter to specify a minimum allowable transfer rate, such as -M
or --minimum-rate; this would abort a transfer and cause wget to
terminate with a nonzero exit code when the transfer
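[ To make the request concrete: a hypothetical --minimum-rate check could
look roughly like the sketch below, with invented names and thresholds;
after a grace period, the transfer would be considered failed if the
average rate drops below the configured floor:

  #include <stdbool.h>

  #define MIN_RATE_GRACE_SECS 10.0   /* invented grace period */

  static bool
  below_minimum_rate (double bytes_so_far, double secs_elapsed,
                      double minimum_rate /* bytes per second */)
  {
    if (secs_elapsed < MIN_RATE_GRACE_SECS)
      return false;                  /* too early to judge */
    return (bytes_so_far / secs_elapsed) < minimum_rate;
  } ]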
J. Grant [EMAIL PROTECTED] writes:
I think I may have found a bug: the ETA is listed, but not the
K/s rate. The ETA must have been calculated using a K/s rate
determined by the current time into the download.
The --.-- current download rate means that the download is currently
not
J. Grant [EMAIL PROTECTED] writes:
The --.-- current download rate means that the download is
currently
not progressing. The ETA calculation is based on the average download
rate, and is always available.
is the K/s rate not the average rate ?
No, it's the current rate.
J. Grant [EMAIL PROTECTED] writes:
Could an extra value be added which lists the average rate? average
rate: xx.xx K/s ?
Unfortunately it would have problems fitting on the line.
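[ The distinction being discussed, in rough code form (illustrative only):
the displayed K/s figure covers a recent window, while the ETA is derived
from the average rate over the whole download:

  static double
  current_rate (double window_bytes, double window_secs)
  {
    return window_secs > 0 ? window_bytes / window_secs : 0.0;
  }

  static double
  average_rate (double total_bytes, double total_secs)
  {
    return total_secs > 0 ? total_bytes / total_secs : 0.0;
  }

  static double
  eta_seconds (double bytes_remaining, double avg_rate)
  {
    return avg_rate > 0 ? bytes_remaining / avg_rate : 0.0;
  } ]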
yy :) [EMAIL PROTECTED] writes:
I ran "wget -P /tmp/.test http://192.168.1.10" on a SUSE system (SLES 9)
and found that it saved the file in /tmp/_test.
This command works fine in RedHat; is it a bug?
I believe the bug is introduced by SuSE in an attempt to protect the
user. Try reporting it
Don Armstrong [EMAIL PROTECTED] writes:
Summary: The issue with wget.texi is that the GNU GPL is an Invariant
Section; since the GNU GPL cannot be modified anyway, this just forces
gpl.texi to always be distributed with wget.texi, even when you're
just distributing the manual.
The GPL text
Don Armstrong [EMAIL PROTECTED] writes:
On Thu, 18 May 2006, Hrvoje Niksic wrote:
If the point you're making is that someone might want to remove the
GPL text from the manual, for example to make it shorter, I guess
that's a valid concern.
Yes, that's the issue.
I see. But that's still
Mauro Tortonesi [EMAIL PROTECTED] writes:
Noèl Köthe wrote:
Hello,
a forwarded report from http://bugs.debian.org/366434
could this behaviour be added to the doc/manpage?
i wonder if it makes sense to add generic support for multiple headers
in wget, for instance by extending the --header
Mauro Tortonesi [EMAIL PROTECTED] writes:
Toni Casueps wrote:
I use Wget 1.10 for Linux. If I use -O and there was already a file
in the current directory with the same name it overwrites it, even
if I use -nc. Is this a bug or intentional?
IMVHO, this is a bug. if hrvoje does not provide a
Robert Nicholson [EMAIL PROTECTED] writes:
When I set my acceptlist to
viewtopic.php?t=5098
I notice that when it actually does the check it only sends
thru viewtopic.php leaving off the rest of the query string
Is that intentional?
It's intentional -- the acceptlist and friends check
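[ In rough terms, and not as Wget's actual code: the accept/reject check
matches only the file name, which behaves as if the query string were cut
off at the first '?', along these lines:

  #include <stdlib.h>
  #include <string.h>

  static char *
  filename_without_query (const char *name)
  {
    const char *q = strchr (name, '?');
    size_t len = q ? (size_t) (q - name) : strlen (name);
    char *copy = malloc (len + 1);
    if (!copy)
      return NULL;
    memcpy (copy, name, len);          /* keep only "viewtopic.php" */
    copy[len] = '\0';
    return copy;
  } ]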
Robert Nicholson [EMAIL PROTECTED] writes:
When wget is traversing a url what stops it visiting that url again?
It keeps a table of visited URLs.
and assuming it checks the url is it only checking for the exact
string?
It is.
ie. different url but same response because the url it's
Robert Nicholson [EMAIL PROTECTED] writes:
It looks like some modules are going to send back this, and in some
cases I'd like it to retry rather than continue just because it
thinks it got the file.
How do you propose for Wget to differentiate between 200 OK and 200
Service Temporarily
I understand and agree with the reasoning behind removing the GPL as
the invariant section; but why also remove the GFDL as an invariant
section?
long.
This means that only the last 1 or 2 bytes should be used in the
base64 algorithm.
You're right; thanks for reporting this. I have now installed this fix:
2006-06-19 Hrvoje Niksic [EMAIL PROTECTED]
* utils.c (base64_encode): Would read past end of STR.
Reported
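[ The fix itself is not quoted here; for illustration, correct handling of
the final base64 input group, assuming a remainder of one or two bytes,
reads only those bytes and pads the output with '=':

  static const char b64[] =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

  static void
  encode_tail (const unsigned char *p, int remaining, char *out)
  {
    unsigned int v;
    if (remaining == 1)                /* one byte left: two chars + "==" */
      {
        v = p[0] << 16;
        out[0] = b64[(v >> 18) & 0x3f];
        out[1] = b64[(v >> 12) & 0x3f];
        out[2] = '=';
        out[3] = '=';
        out[4] = '\0';
      }
    else if (remaining == 2)           /* two bytes left: three chars + "=" */
      {
        v = (p[0] << 16) | (p[1] << 8);
        out[0] = b64[(v >> 18) & 0x3f];
        out[1] = b64[(v >> 12) & 0x3f];
        out[2] = b64[(v >> 6) & 0x3f];
        out[3] = '=';
        out[4] = '\0';
      }
    else
      out[0] = '\0';                   /* nothing left over */
  } ]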
Note that, even if you don't import 1.11 into the current Fedora, you
can always take a look at the bugfixes in the 1.10 branch. For
example:
$ svn diff http://svn.dotsrc.org/repo/wget/tags/WGET_1_10_2 \
http://svn.dotsrc.org/repo/wget/branches/1.10
bruce [EMAIL PROTECTED] writes:
any idea as to who's working on this feature?
No one, as far as I know.
bruce [EMAIL PROTECTED] writes:
as you guys create/go forth in dealing with windows.. are you
focused on XP, or 2000 as well... keep in mind, there are a lot of
2000 users still around!!
Historically Wget was supposed to compile and run on Windows 98 as
well. I haven't been able to test
Herold Heiko [EMAIL PROTECTED] writes:
The --ignore-case option (and the corresponding wgetrc option) doesn't
seem to be documented in the texi.
Fixed now, thanks for the report!
bruce [EMAIL PROTECTED] writes:
i tried to follow the instructions provided with the 1.10 source on
the site...
I don't think that has anything to do with Windows 2000 vs. Windows
XP. You need to make sure that the correct windows/config*.h is being
picked for your compiler. The fact that
bruce [EMAIL PROTECTED] writes:
add to 'struct cmdline_option option_data[]'
.
.
{ "version", 'V', OPT_FUNCALL, (void *) print_version, no_argument },
{ "wait", 'w', OPT_VALUE, "wait", -1 },
{ "college-file", 'C', OPT_VALUE, "collegefile", -1 },
{ "waitretry", 0,
bruce [EMAIL PROTECTED] writes:
btw, i just saw the comment you mentioned in the init.c file. is
it possible to add this same comment to main.c?
The comment there would be misleading because options in main.c are in
fact not alphabetically sorted. They don't need to be because they
are
Jamie Zawinski [EMAIL PROTECTED] writes:
If I specify -O, it is able to download the data; but if wget is
picking the file name itself, it is unable to write the file
(invalid argument). Neither --restrict-file-names=unix nor
--restrict-file-names=windows affects it.
It could be that your
Tony Lewis [EMAIL PROTECTED] writes:
I don't think that's valid HTML. According to RFC 1866: "An HTML user
agent should treat end of line in any of its variations as a word
space in all contexts except preformatted text." I don't see any
provision for end of line within the HREF attribute of
Jochen Roderburg [EMAIL PROTECTED] writes:
E.g., a file which was supposed to have the name B&W.txt came with the header:
Content-Disposition: attachment; filename="B&W.txt";
All programs I tried (the new wget and several browsers and my own script ;-)
seemed to stop parsing at the first
Mauro Tortonesi [EMAIL PROTECTED] writes:
Hrvoje Niksic ha scritto:
Gisle Vanem [EMAIL PROTECTED] writes:
Kinda misleading that wget prints "login incorrect" here. Why
couldn't it just print the 530 message?
You're completely right. It was an ancient design decision made by me
when I wasn't
Christopher G. Lewis [EMAIL PROTECTED] writes:
For some reason, a change that was made in log.c between 1.8 and 1.9
has broken the ability to do a build without debug enabled.
Basically, in config.h if you change ENABLE_DEBUG to 0, wget will no
longer build.
That's not how it works, you're
Christopher G. Lewis [EMAIL PROTECTED] writes:
Let's edit that and comment out ENABLE_DEBUG:
config.h
/* Define if you want the debug output support compiled in. */
/* #define ENABLE_DEBUG 1 */
After futzing around for a little while, I got it to work.
That should work without any
www.mail [EMAIL PROTECTED] writes:
The following command crashes wget 1.11-alpha-1 on Windows 2000 SP4:
wget --output-document=- --no-content-disposition "http://www.google.com/"
Fixed now, thanks for the report.
Matthias Kuehn [EMAIL PROTECTED] writes:
it seems wget sorts the v4/v6 IPs before creating a table of the
returned IPs.
Wget doesn't really do that. It does prefer IPv4 addresses to IPv6
addresses, and it caches the addresses resolved (but only during a
single Wget run), but it doesn't sort
Mauro Tortonesi [EMAIL PROTECTED] writes:
you're right, of course. the patch included in attachment should fix
the problem. since the new HTTP code supports Content-Disposition
and delays the decision of the destination filename until it
receives the response header, the best solution i could
Jochen Roderburg [EMAIL PROTECTED] writes:
Petr Kras schrieb:
When a transfer is broken and restoration is required,
it doesn't work for files greater than 4GB (not checked for 2GB)
when the break is beyond the 4GB (2GB) limit.
--13:58:54-- ftp://streamlib.pan.eu/Streams/TVDC_SS_01100.ts
=
Nicle Yang [EMAIL PROTECTED] writes:
Does wget support HTTP/1.1 chunked data?
It doesn't. Wget sends out an HTTP/1.0 request, so it shouldn't (in
theory) ever encounter chunked data.
[EMAIL PROTECTED] writes:
Oh, I understand; I only hoped it was known that the accepted form is a
bad spelling.
It is known. The documentation says:
referer = STRING
Set HTTP `Referer:' header just like `--referer=STRING'. (Note it
was the folks who wrote the HTTP spec who got the
Axel Boldt [EMAIL PROTECTED] writes:
I would like to advocate for a multithreading/parallel download
feature and I believe the last quoted sentence above is simply
false; parallel downloading provides considerable speedups in almost
all settings.
The most noticeable speedup in the
Tony Lewis [EMAIL PROTECTED] writes:
A) This is the list for reporting bugs. Questions should go to
wget@sunsite.dk
For what it's worth, [EMAIL PROTECTED] is simply redirected to
[EMAIL PROTECTED] It is still useful to have a separate address for
bug reports, for at least two reasons. One,
Ed [EMAIL PROTECTED] writes:
Sorry for the off-topic post, but as you can see I can't get anything else to work.
If I try to join the list at sunsite.dk I get a message in Danish
asking me to do something to verify myself, I tried translating it
but it made no sense.
How exactly are you trying to
Lars Hamren [EMAIL PROTECTED] writes:
Download speeds are reported as K/s, where, I assume, K is short
for kilobytes.
It's meant to stand for what is now known as kibibyte.
The correct SI prefix for thousand is k, not K:
The prefix doesn't refer to thousands.
Ted Mielczarek [EMAIL PROTECTED] writes:
Is there any interest in this?
Sorry for answering this late. I, for one, find it very interesting.
Fetching CSS would be a very welcome feature.
Thanks for the report and the (correct) analysis. This patch fixes
the problem in the trunk.
2007-01-23 Hrvoje Niksic [EMAIL PROTECTED]
* cookies.c (parse_set_cookie): Would erroneously discard cookies
with unparsable expiry time.
Index: src/cookies.c
Thanks for the report. This patch fixes the problem:
2007-01-23 Hrvoje Niksic [EMAIL PROTECTED]
* progress.c (create_image): Check for ETA overflow.
(print_row_stats): Ditto.
Index: src/progress.c
===
--- src
Nejc Škoberne [EMAIL PROTECTED] writes:
[EMAIL PROTECTED]:~# wget -O /dev/null http://10.0.0.2/testsmall.dat 2>&1 |
grep saved
10:38:13 (86,22 MB/s) - `/dev/null' saved [21954560/21954560]
[EMAIL PROTECTED]:~# wget -O /dev/null ftp://testuser:[EMAIL
PROTECTED]/testsmall.dat 2>&1 | grep saved
Vladimir Volovich [EMAIL PROTECTED] writes:
when using the -S option, wget dies apparently because the server
returns 8-bit characters in the WWW-Authenticate header:
[...]
Thanks for the report and the test case. This patch fixes the problem:
2007-02-02 Hrvoje Niksic [EMAIL PROTECTED
Applied, thanks. Sorry about the delay.
Thanks for the report. Please note that your patch sets the thousands
separator to C, which is probably not what you had in mind. I'm
about to apply a slightly different patch to deal with the problem you
describe:
2007-02-11 Hrvoje Niksic [EMAIL PROTECTED]
* utils.c
[EMAIL PROTECTED] (Steven M. Schweda) writes:
It's starting to look like a consensus. A Google search for:
wget DONE_CWD
finds:
http://www.mail-archive.com/wget@sunsite.dk/msg08741.html
That bug is fixed in subversion, revision 2194.
Hrvoje Niksic [EMAIL PROTECTED] writes:
[EMAIL PROTECTED] (Steven M. Schweda) writes:
It's starting to look like a consensus. A Google search for:
wget DONE_CWD
finds:
http://www.mail-archive.com/wget@sunsite.dk/msg08741.html
That bug is fixed in subversion, revision 2194.
I
Eugene Homyakov [EMAIL PROTECTED] writes:
Could you please make -i option accept URL? This is useful when
downloading m3u's
Note that you can easily chain Wget invocations, e.g.
wget -qO- URL | wget -i-
Robert Millan [EMAIL PROTECTED] writes:
-AC_CHECK_FUNCS(strtoll usleep ftello sigblock sigsetjmp memrchr)
+AC_CHECK_FUNCS(strtoll usleep ftello sigblock sigsetjmp memrchr strcasecmp strncasecmp strdup isatty symlink)
-dnl We expect to have these functions on Unix-like systems configure
Greg Lindahl [EMAIL PROTECTED] writes:
"Host: kpic1" is an HTTP/1.1 feature, so this is nonsensical.
The `Host' header was widely used with HTTP/1.0, which is how it
entered the HTTP/1.1 spec.
For other reasons, Wget should really upgrade to using HTTP/1.1.
Poppa Pump [EMAIL PROTECTED] writes:
Now I also need to load 2 more cookie values, but these are set
using Javascript. Does anyone know how to set those cookies. I can't
seem to find any info on this. Thanks for your help.
Wget doesn't really distinguish the cookies set by Javascript from
George Pavlov [EMAIL PROTECTED] writes:
Permanent cookies are supposed to be present in cookies.txt, and
Wget will use them. Session cookies will be missing (regardless
of how they were set) from the file and therefore will not be
picked up by Wget.
This is not entirely true. You can
Adrian Sandor [EMAIL PROTECTED] writes:
Thanks a lot Steven,
Apparently there's more than a little code in src/cookies.c which is
not ready for NULL values in the attr and value members of the
cookie structure.
Does that mean wget is buggy or does brinkster break the cookie
specification?
Micah Cowan [EMAIL PROTECTED] writes:
The GNU Project has appointed me as the new maintainer for wget,
Welcome!
If you need assistance regarding the workings of the internals or
design decisions, please let me know and I'll gladly help. I haven't
had much time to participate lately, but
Tony Lewis [EMAIL PROTECTED] writes:
There is a buffer overflow in the following line of the proposed code:
sprintf(filecopy, "\"%.2047s\"", file);
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and arbitrary limits on file
name
Rich Cook [EMAIL PROTECTED] writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was supposed to indicate how to free the string.
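[ A usage sketch of the pattern described above, with the signature assumed
from the discussion rather than quoted from utils.h:

  #include <stdlib.h>

  extern char *aprintf (const char *fmt, ...);  /* assumed declaration */

  static void
  example (const char *file)
  {
    char *quoted = aprintf ("\"%s\"", file);  /* no fixed-size buffer,
                                                 no 2047-byte limit */
    if (quoted)
      {
        /* ... use quoted ... */
        free (quoted);                          /* freed with plain free() */
      }
  } ]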
Virden, Larry W. [EMAIL PROTECTED] writes:
Tony Lewis [EMAIL PROTECTED] writes:
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and
arbitrary limits on file name length.
If it uses the heap, then doesn't that open a hole where a
Rich Cook [EMAIL PROTECTED] writes:
On Jul 5, 2007, at 11:08 AM, Hrvoje Niksic wrote:
Rich Cook [EMAIL PROTECTED] writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
Micah Cowan [EMAIL PROTECTED] writes:
What is the status of the wget-patches list: is it being actively
used/monitored? Does it still serve its original purpose?
Mauro and I are subscribed to it. The list served its purpose while
Wget was actively maintained. It's up to you whether to
Micah Cowan [EMAIL PROTECTED] writes:
Mauro and I are subscribed to it. The list served its purpose while
Wget was actively maintained. It's up to you whether to preserve it
or replace it with a bug tracker patch submission process.
Given the low incidence of patch submission, is there any
Micah Cowan [EMAIL PROTECTED] writes:
I would like for devs to be able to avoid the hassle of posting
non-trivial changes they make to the wget-patches list. To my mind,
there are two ways of accomplishing this:
1. Make wget-patches a list _only_ for submitting patches for
consideration by
Micah Cowan [EMAIL PROTECTED] writes:
Someone just asked on the #wget IRC channel if there was a way to
exclude files with certain names, and I recommended -X, without
realizing that that option excludes directories, not files.
My question is: why do we allow users to exclude directories,
Micah Cowan [EMAIL PROTECTED] writes:
Yes, but -R has a lesser degree of control over the sorts of
pathnames that it can constrain: for instance, if one uses
-Rmyprefix*, it will match files myprefix-foo.html and
myprefix-bar.mp3; but it will also match notmyprefix.js, which is
probably not
Micah Cowan [EMAIL PROTECTED] writes:
I think we should either be a stub, or a fairly complete manual
(and agree that the latter seems preferable); nothing half-way
between: what we have now is a fairly incomplete manual.
Converting from Info to man is harder than it may seem. The script
Micah Cowan [EMAIL PROTECTED] writes:
Converting from Info to man is harder than it may seem. The script
that does it now is basically a hack that doesn't really work well
even for the small part of the manual that it tries to cover.
I'd noticed. :)
I haven't looked at the script that
Micah Cowan [EMAIL PROTECTED] writes:
I don't know. The reason directories are matched separately from
files is because files often *don't* match the pattern you've chosen
for directories. For example, -X/etc should exclude anything under
/etc, such as /etc/passwd, but also
Micah Cowan [EMAIL PROTECTED] writes:
I agree that it's probably a good idea to move HTML parsing to a model
that doesn't require slurping everything into memory;
Note that Wget mmaps the file whenever possible, so it's not actually
allocated on the heap (slurped). You need some memory to
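[ The "mmap when possible, read into memory otherwise" approach being
described, as an illustrative sketch rather than Wget's actual
file-loading code:

  #include <sys/mman.h>
  #include <sys/stat.h>
  #include <fcntl.h>
  #include <unistd.h>
  #include <stdlib.h>

  static char *
  load_file (const char *name, size_t *size, int *mapped)
  {
    int fd = open (name, O_RDONLY);
    struct stat st;
    char *buf;

    if (fd < 0 || fstat (fd, &st) < 0)
      {
        if (fd >= 0)
          close (fd);
        return NULL;
      }
    *size = st.st_size;

    buf = mmap (NULL, *size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (buf != MAP_FAILED)
      {
        *mapped = 1;
        close (fd);
        return buf;
      }

    /* Fallback: slurp the whole file into a heap buffer. */
    *mapped = 0;
    buf = malloc (*size);
    if (buf && read (fd, buf, *size) != (ssize_t) *size)
      {
        free (buf);
        buf = NULL;
      }
    close (fd);
    return buf;
  } ]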
Micah Cowan [EMAIL PROTECTED] writes:
Yes, but when mmap()ping with MAP_PRIVATE, once you actually start
_using_ the mapped space, is there much of a difference?
As long as you don't write to the mapped region, there should be no
difference between shared and private mapped space -- that's
Micah Cowan [EMAIL PROTECTED] writes:
Actually, I was wrong though: sometimes mmap() _is_ failing for me
(did just now), which of course means that everything is in resident
memory.
I don't understand why mmapping a regular file would fail on Linux. What
error code are you getting?
(Wget tries
Micah Cowan [EMAIL PROTECTED] writes:
I have a question: why do we attempt to generate absolute paths and
such and CWD to those, instead of just doing the portable
string-of-CWDs to get where we need to be?
I think the original reason was that absolute paths allow crossing
from any directory
Esin Andrey [EMAIL PROTECTED] writes:
Hi!
I have downloaded the wget-1.10.2 sources and tried to compile them.
I get some warnings:
init.c: In function ‘cmd_spec_prefer_family’:
init.c:1193: warning: dereferencing type-punned pointer will break
strict-aliasing rules
I have
control H [EMAIL PROTECTED] writes:
After a few hours of headache I found out my --post-data option
didn't work as I expected because the data I send has to be
URL-escaped. This is not mentioned in either the manpage or the inline
help. A remark would be helpful.
Note that, in general, it
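[ For reference, the URL-escaping (percent-encoding) the poster had to
apply to the --post-data value can be done outside Wget; the helper below
is only an illustration, not part of Wget:

  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>
  #include <ctype.h>

  static char *
  url_escape (const char *s)
  {
    size_t len = strlen (s);
    char *out = malloc (3 * len + 1);  /* worst case: every byte escaped */
    char *p = out;

    if (!out)
      return NULL;
    for (; *s; s++)
      {
        unsigned char c = (unsigned char) *s;
        if (isalnum (c) || c == '-' || c == '.' || c == '_' || c == '~')
          *p++ = c;                    /* unreserved characters pass through */
        else
          p += sprintf (p, "%%%02X", c);
      }
    *p = '\0';
    return out;
  } ]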
---BeginMessage---
Hi, I am using wget 1.10.2 on Windows 2003, and have the same problem as Cantara.
The file system is NTFS.
Well, I found that my problem is that I wrote the command in the scheduled task like this:
wget -N -i D:\virus.update\scripts\kavurl.txt -r -nH -P
d:\virus.update\kaspersky
well, after
Micah Cowan [EMAIL PROTECTED] writes:
It is actually illegal to specify byte values outside the range of
ASCII characters in a URL, but it has long been historical practice
to do so anyway. In most cases, the intended meaning was one of the
latin character sets (usually latin1), so Wget was
Jim Wright [EMAIL PROTECTED] writes:
- --limit-rate will find your version handy, but I want to hear from
them. :)
I would appreciate and have use for such an option. We often access
instruments in remote locations (think a tiny island in the Aleutians)
where we share bandwidth with other
Jim Wright [EMAIL PROTECTED] writes:
I think there is still a case for attempting percent limiting. I
agree with your point that we can not discover the full bandwidth of
the link and adjust to that. The approach discovers the current
available bandwidth and adjusts to that. The
Micah Cowan [EMAIL PROTECTED] writes:
Among other things, version.c is now generated (rather than
parsed) every time "make all" is run, which also means that "make
all" will always relink the wget binary, even if there haven't been
any changes.
I personally find that quite annoying. :-( I hope
Tony Godshall [EMAIL PROTECTED] writes:
available bandwidth and adjusts to that. The usefulness is in
trying to be unobtrusive to other users.
The problem is that Wget simply doesn't have enough information to be
unobtrusive. Currently available bandwidth can and does change as new
Tony Godshall [EMAIL PROTECTED] writes:
My point remains that the maximum initial rate (however you define
initial in a protocol as unreliable as TCP/IP) can and will be
wrong in a large number of cases, especially on shared connections.
Again, would an algorithm where the rate is
Micah Cowan [EMAIL PROTECTED] writes:
FYI, I've removed the PATCHES file. Not because I don't think it's
useful, but because the information needed updating (now that we're
using Mercurial rather than Subversion), I expect it to be updated
again from time to time, and the Wgiki seems to be