to build on all compilers, without warnings.
This patch applies on top of the previous one:
2002-04-12 Hrvoje Niksic [EMAIL PROTECTED]
* config.h.in: Only define _VA_LIST when compiled with gcc.
Index: src/config.h.in
===
RCS
Boaz Yahav [EMAIL PROTECTED] writes:
Is there any way to make Wget use HTTP/1.1 ?
Unfortunately, no.
Uncle [EMAIL PROTECTED] writes:
I have a problem with the subject. Wget tells me Continued download failed
on this file, which conflicts with `-c'. Refusing to truncate existing
file. How do I solve this problem?
System environment: Windows98 SE, VC++ 6.0, Wingate 4.2.0 proxy.
That message
Uncle [EMAIL PROTECTED] writes:
As you know, ftp proxy is not supported by wget 1.8.1. I have added
some simple code that allows wget to work through this type of
proxy server, using a login scheme of this form:
[EMAIL PROTECTED] I will send the resulting patch
to the wget developers, if
Jochen Roderburg [EMAIL PROTECTED] writes:
That might have been the intended reason for this error message, but
I have also seen it several times with HTTP servers that *did*
support partial downloads. In my case it was on Unix platforms with
wget versions <= 1.7. I usually then went back to
Paul Eggert [EMAIL PROTECTED] writes:
I'm using wget 1.6 on Solaris 8 (sparc), and am connected to the
Internet via a FWTK FTP proxy http://www.fwtk.org/main.html.
If I want to retrieve a file via the standard Solaris 'ftp' command,
without using 'wget', I do something like this:
$
Newer versions of Wget check the server type and adjust the directory
listing parser accordingly. If I remember correctly, NT directory
listing is now supported.
Alan E [EMAIL PROTECTED] writes:
It would be a lot easier to report this shit to SpamCop if the
mailing list software didn't strip the incoming headers.
I don't know if ezmlm can be made not to strip the headers. Perhaps
one of the admins knows if that's possible?
Karsten Thygesen [EMAIL PROTECTED] writes:
Hrvoje == Hrvoje Niksic [EMAIL PROTECTED] writes:
Hrvoje Alan E [EMAIL PROTECTED] writes:
It would be a lot easier to report this shit to SpamCop if the
mailing list software didn't strip the incoming headers.
Hrvoje I don't know if ezmlm
Daniel Stenberg [EMAIL PROTECTED] writes:
The web interface is not necessary. Listar, for instance, just
forwards the dubious mails to the moderator. Approving the message
is done by replying to listar (actually forwarding to
somelist-repost@somedomain, but you get the idea).
The benefit
Mika Tuupola [EMAIL PROTECTED] writes:
I have a site which has relative links like this:
<a href="jump?dest=bar&lang=foo">link</a>
I have been trying different switches to make wget -r follow
those links but have been unsuccessful. Is this possible with
the current
things for you:
2002-04-15 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (download_child_p): Don't ignore rejection of HTML
documents that are themselves leaves of recursion.
Index: src/recur.c
===
RCS file: /pack/anoncvs
You're probably right; there should be an option to disable DNS
caching. As a stop-gap measure, you can simply stop `lookup_host'
from caching the information it retrieves, by commenting the call to
`cache_host_lookup' at the end of `lookup_host'.
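The stop-gap above can be sketched as a guard around the cache write. This is an illustrative toy, not Wget's actual host.c: the names `lookup_host' and `cache_host_lookup' come from the message, but the cache structure and the `dns_cache_enabled' switch are assumptions of this sketch.

```c
#include <assert.h>
#include <string.h>

/* Toy stand-ins for Wget's resolver cache; the real lookup_host /
   cache_host_lookup differ in detail. */

#define CACHE_SIZE 16

struct cache_entry { char host[64]; char addr[64]; };
static struct cache_entry cache[CACHE_SIZE];
static int cache_len = 0;

/* Runtime switch: a --no-dns-cache option would simply clear this. */
static int dns_cache_enabled = 1;

static const char *cache_find (const char *host)
{
  for (int i = 0; i < cache_len; i++)
    if (!strcmp (cache[i].host, host))
      return cache[i].addr;
  return 0;
}

static void cache_host_lookup (const char *host, const char *addr)
{
  if (!dns_cache_enabled || cache_len == CACHE_SIZE)
    return;                  /* skipping the store == commenting out the call */
  strcpy (cache[cache_len].host, host);
  strcpy (cache[cache_len].addr, addr);
  cache_len++;
}

/* Pretend resolver: returns a canned address, caching it unless disabled. */
static const char *lookup_host (const char *host)
{
  const char *hit = cache_find (host);
  if (hit)
    return hit;
  const char *addr = "127.0.0.1";  /* stand-in for a real resolver call */
  cache_host_lookup (host, addr);
  return addr;
}
```

With `dns_cache_enabled` cleared, every lookup goes back to the resolver, which is exactly the effect of commenting out the `cache_host_lookup' call.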
[EMAIL PROTECTED] writes:
Unfinished sentence...
Another way to specify username and password is in the
URL itself. For more information about security
issues with Wget,
If only that were a typo. It's a bug in the ugly script that converts
the Texinfo
I'm afraid that downloading files larger than 2G is not supported by
Wget at the moment.
Löfstrand Thomas [EMAIL PROTECTED] writes:
I have used wget with -d option to see what is going on, and it seems
like the proxyserver returns the following response: X-PLEASE_WAIT.
After reading the source code in http.c it seems like wget expects
the answer from the proxy to be HTTP/ and a
the retrieval by printing dots on the screen, each dot
representing a fixed amount of downloaded data.
But it looks like the default is bar.
Yes. Thanks for the report; I'm about to apply this fix.
2002-04-15 Hrvoje Niksic [EMAIL PROTECTED]
* wget.texi (Download
Velimir Kalik [EMAIL PROTECTED] writes:
Is it posible to specify for wget not to use proxy for some IPs or
domains? E.g. not to use proxy for www.nba.com, but use it for
everything else.
Thanks and please cc replies to my email address too!
Yes, that should work with the `no_proxy'
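For illustration, the core of a `no_proxy' check is a domain-suffix match: the proxy is skipped when the host name ends in one of the listed suffixes. A minimal sketch of that test (Wget's real matcher is more thorough, e.g. about leading dots, label boundaries, and ports):

```c
#include <assert.h>
#include <string.h>

/* Return nonzero if HOST ends with SUFFIX, e.g. "www.nba.com" matches
   a no_proxy entry of "nba.com". Deliberately minimal: it does not
   check that the match starts at a label boundary. */
static int host_matches_suffix (const char *host, const char *suffix)
{
  size_t hl = strlen (host), sl = strlen (suffix);
  if (sl > hl)
    return 0;
  return strcmp (host + (hl - sl), suffix) == 0;
}
```

So with no_proxy=nba.com, a request for www.nba.com would bypass the proxy while everything else still goes through it.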
Jeroen W. Pluimers \(mailings\) [EMAIL PROTECTED] writes:
I wonder if anyone is maintaining RedHat 6.x RPM's for wget.
I have no idea. But, Wget is fairly easy to build from source, so I
never really bothered to find out.
I could not find a 1.8.1 RPM on the net using Google nor using
that you describe. Correctly written pages
will not be affected adversely, and that's what truly matters.
Here is a patch that should implement what you need. Please let me
know if it works for you.
2002-04-16 Hrvoje Niksic [EMAIL PROTECTED]
* http.c (gethttp): If Content-Type is not given
[EMAIL PROTECTED] writes:
Good evening. I'm trying to do an FTP transfer with Wget 1.7. My problem is
that my PC is behind a proxy, and there is no way to make an FTP
connection unless you first FTP to that proxy, and the proxy then opens
an ftp session to the final machine you want to connect to. You
Thanks for the report. The thing I don't quite understand is, how come
you are the only one to experience this? My `msgfmt --version' says
0.10.40, so I'm not sure what your 1.3 refers to.
Maybe you should upgrade gettext?
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Tue, 16 Apr 2002, Hrvoje Niksic wrote:
Thanks for the report. The thing I don't quite understand is, how come
you are the only one to experience this? My `msgfmt --version' says
0.10.40, so I'm not sure what your 1.3 refers to.
Maybe you
Thomas Lussnig [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
You're probably right; there should be an option to disable DNS
caching. As a stop-gap measure, you can simply stop `lookup_host'
from caching the information it retrieves, by commenting the call to
`cache_host_lookup' at the end
Maciej Rozycki offered to contribute patches to convert Wget's
configure.in to work with Autoconf 2.53. I think the move makes
sense; all the new Autoconf development will be on that release, and
it makes sense to stay in touch with that.
Does anyone think this might be a bad idea? Are there
Roger, thanks for this patch, but that's not quite what I had in
mind. Specifically, I'd like to keep calculating the download speed
exactly as now -- based only on current timings.
However, I would like the ETA to be based on the smarter model of
predicting overall speed. I'll try to modify
R.I.P. Deaddog [EMAIL PROTECTED] writes:
Apparently several pieces of email were detected to contain suspicious
attachments, but they still passed through the filter.
Yes. :-(
I will lobby for the inclusion of list-specific rules.
Please take over the fix if you can. Christian told me that he would
be busy for the next several weeks. He is unsure how much time he can
spend on fixing Wget bugs.
Daniel Stenberg [EMAIL PROTECTED] writes:
Can the admin of the wget list please prevent his mails from showing
up here?
*sigh*
I'm doing all in my power to get the sunsite people to accept the
proposed changes to SpamAssassin configuration. It takes time because
it requires changes to
Thanks. The problem is that I don't personally control the mailing
list. Also, whatever scheme you use, there is the question of false
positives -- the detected spams have to at least be forwarded to the
moderators.
Take a look at SpamAssassin, which we are using now. It is a
very good
What algorithm is recommended for validating cookie domains? I
originally tried to implement a check compliant with rfc2109, but that
proved not to work with a bunch of web sites. So I implemented the
recommendation from the bogosity under
Thomas Lussnig [EMAIL PROTECTED] writes:
if you do not respect this, then you can set cookies for a whole
country and read them, like co.uk etc. So I would prefer to
handle it like Mozilla: do not allow cookie generalization!!!
Thomas, please think this through before responding. *Of
Ian Abbott [EMAIL PROTECTED] writes:
To quote from there:
[...] Only hosts within the specified domain can set a cookie for
a domain and domains must have at least two (2) or three (3)
periods in them to prevent domains of the form: .com, .edu,
and va.us. Any domain
Yes -- Wget versions up to the latest one in CVS can get stuck in
connect() if the remote server is sufficiently catatonic. In the
latest CVS sources the `-T' timeout also covers the timeout of
`connect'.
monstru [EMAIL PROTECTED] writes:
this letter is simply to tell you that the new option in version 1.8,
--limit-rate, doesn't appear in the man page... it seems
that someone forgot to put it in the page... or not! :)
Yes; it's a simple oversight.
It will be documented in the next
OK, I've now implemented a very silly check, but at least it will
cover most cases of usage. Setting the cookie for a second-level
domain (e.g. .foo.bar) is allowed if:
+ The top-level domain is one of the several recognized ones
OR
+ Its subdomain is more than three characters long.
This
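The heuristic described above can be sketched as follows; the list of recognized top-level domains and the function name are illustrative assumptions, not Wget's actual code.

```c
#include <assert.h>
#include <string.h>

/* For a two-component cookie domain ".SLD.TLD", permit it when the
   TLD is a known generic one, or when the second-level label is
   longer than three characters. This blocks country-wide domains
   like .co.uk (label "co" is too short and "uk" is not generic). */
static const char *known_tlds[] =
  { "com", "edu", "net", "org", "gov", "mil", "int", 0 };

static int cookie_domain_permitted (const char *sld, const char *tld)
{
  for (int i = 0; known_tlds[i]; i++)
    if (!strcmp (tld, known_tlds[i]))
      return 1;
  return strlen (sld) > 3;   /* e.g. "yahoo" in .yahoo.de is allowed */
}
```

As the thread notes, this is deliberately "silly": it covers the common abuse cases without needing a full table of country-code domain structures.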
[ Added the list to Cc ]
Thomas Lussnig [EMAIL PROTECTED] writes:
i can write some documentation for the SSL options and what they are
used for. and i think his point is right: --ssl-cakey and --ssl-cafile
would be nicer, but i used the naming convention that exists from the
older ssl options.
I know,
Ian Abbott [EMAIL PROTECTED] writes:
I realized it was stupid after I posted it (I was about to leave!)
when I remembered cc domains like .de don't need an extra period. I
thought maybe a table of exceptions would sort that out
The problem is that new domains appear all the time, and
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Mon, 22 Apr 2002, Jamie Zawinski wrote:
I know this would be somewhat evil, but can we have a special case in
wget to assume that files named ?N=D and index.html?N=D are the same
as index.html? I'm tired of those dumb apache sorting directives
Tony Lewis [EMAIL PROTECTED] writes:
Maciej W. Rozycki wrote:
Hmm, it's too fragile in my opinion. What if a new version of Apache
defines a new format?
I think all of the expressions proposed thus far are too fragile. Consider
the following URL:
Max Waterman [EMAIL PROTECTED] writes:
Someone (rudely) suggested it was unacceptable to ask for a 'cc'
rather than joining the email list.
That is not the case -- it is perfectly acceptable to post a question
and ask for `Cc'. Especially so when you're posting to
[EMAIL PROTECTED], an
Noel Koethe [EMAIL PROTECTED] writes:
If the http content-length header differs from actual data length,
wget disregards the http specification as follows:
It doesn't disregard the HTTP specification. As far as I'm aware,
HTTP simply specifies that the information provided by Content-Length
Herold Heiko [EMAIL PROTECTED] writes:
I think wget sometimes (often) needs to reread what it wrote to the
disk (html conversion). This means something like that wouldn't
work, or rather, would be too specialized.
In the long run, I hope to fix that. The first step has already been
done --
Caddell, Travis [EMAIL PROTECTED] writes:
I'm stuck with Windows at my office :(
But what option offered by wget would allow the user to specify the name of
the folder that the web site would be saved in?
For example if I were to wget -cdr www.cnn.com the folder would be
named
Herold Heiko [EMAIL PROTECTED] writes:
But if I understand this correctly (sorry, sources not checked, foot
in mouth etc.) with -k wget still needs to correct the html files
later, when it knows what has been downloaded and what not. So it
can't print the file as soon as downloaded, only at
Noel Koethe [EMAIL PROTECTED] writes:
I got the following bug report [http://bugs.debian.org/144242]:
As always, thanks for forwarding the report.
This is to say, wget claims to be nearly finished downloading the
file, but does nothing if left for long periods. The file has been
completely
The FSF people have agreed that we add the following exception to
Wget's license to allow linking with OpenSSL. The lawyer blurb at the
beginning of every source file that explains about the GPL now
contains the following appendix:
In addition, as a special exception, the Free Software
Since we need to have a release because of the OpenSSL legalese, we
can as well fix the most important (crashing) bugs in 1.8.1. I have
opened a branch named `branch-1_8_2' where the 1.8.2-specific changes
will be applied.
Note that only bug fixes will be accepted for 1.8.2. No new features.
I agree that there is use to such an option. Please read the
`PATCHES' file for instructions how to submit changes to Wget.
Yup. That bug has been fixed in the CVS, and will be fixed in the
next release. Thanks for the report.
Peter L. Ashford [EMAIL PROTECTED] writes:
Of course, this list admin also feels that it's OK to tell someone
that you don't care what they have to say on anything, except one
question that you're asking (list open to non-subscribers).
I'm trying to understand the message behind the sarcasm
Ian Abbott [EMAIL PROTECTED] writes:
Windows versions will still have problems saving filenames with the
query character '?' in them. Should we introduce a temporary change
to remap this to something else (e.g. '@') in the Windows version of
Wget 1.8.2?
Sure; we can do that. But then
Herold Heiko [EMAIL PROTECTED] writes:
Personally, if at all possible, I try not to change my compiles in any way -- I'd
have to maintain them, and usually I don't have the time for that. Also I'd
prefer not to inflict my C skills on unwilling users :)
As most users (I suppose) I'd welcome even a
Herold Heiko [EMAIL PROTECTED] writes:
Why only to the 1_8_2 branch ?
Because 1.8.2 will be released in a matter of days. 1.9 is under no
such pressure, so there's less need for temporary solutions.
There's a lot of people out there using cvs code.
True -- and I like the CVS code being
Noel Koethe [EMAIL PROTECTED] writes:
shouldn't the OpenSSL exception also be added to the COPYING file?
Good question. Should it be appended, after the entire text of the
GPL? Or should it be at the beginning?
Ian Abbott [EMAIL PROTECTED] writes:
I'm not sure where the ChangeLog entry goes. Is it src/ChangeLog
src/ChangeLog on the branch, that's for sure.
or src/ChangeLog-branches/1.8.2_branch.ChangeLog, or even
src/ChangeLog-branches/1.8_branch.ChangeLog?
Good question. Two points:
* I screwed
[EMAIL PROTECTED] writes:
I do the following:
wget http://killefiz.de/zaurus/showdetail.php?app=221
but the file is saved as http://killefiz.de/zaurus/showdetail.php@app=221
(*.php?app gets translated to *.php@app)
Are you running Wget on Windows?
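On Windows, '?' is not a legal file-name character, hence the remapping mentioned above: a query string like showdetail.php?app=221 gets saved as showdetail.php@app=221. A sketch of the idea (the function name is hypothetical, not Wget's):

```c
#include <assert.h>
#include <string.h>

/* Rewrite NAME in place, replacing the '?' that separates the query
   string, since Windows forbids '?' in file names. */
static void windows_sanitize_filename (char *name)
{
  for (char *p = name; *p; p++)
    if (*p == '?')
      *p = '@';
}
```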
Jamie Zawinski [EMAIL PROTECTED] writes:
...since ^M and ^H tricks don't work in emacs shell buffers,
they just make a mess.
Yup. This patch should fix things.
2002-05-24 Hrvoje Niksic [EMAIL PROTECTED]
* progress.c (bar_set_params): Fall back to dot progress
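The idea behind the fix: the bar display repaints its line with carriage returns (^M), which turns into a mess when the output is not a real terminal, so in that case the dot display is used instead. A sketch of the tty test (illustrative, not the actual bar_set_params code):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

/* Pick "bar" only when stderr is a terminal that can handle \r
   repainting; otherwise (pipes, Emacs shell buffers, log files)
   fall back to "dot". */
static const char *choose_progress_style (void)
{
  return isatty (fileno (stderr)) ? "bar" : "dot";
}
```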
Jamie Zawinski [EMAIL PROTECTED] writes:
Roger L. Beeman wrote:
On Fri, 24 May 2002, Hrvoje Niksic wrote:
Jamie Zawinski [EMAIL PROTECTED] writes:
...since ^M and ^H tricks don't work in emacs shell buffers,
they just make a mess.
The tricks seem to be working fine in emacs
Thomas Lussnig [EMAIL PROTECTED] writes:
i think adding it to all source files is a bit of overkill.
But I was told that it was necessary for legal reasons, therefore it's
already been done.
Because it only matters when using openssl on some
operating systems. And on the other
Ian Abbott [EMAIL PROTECTED] writes:
The 1.8.2 branch is pretty similar to 1.8.1 at the moment and
doesn't compile with any version of Borland C++.
Should we care to fix that before the release? I'm not sure how
important error-free compilation under Borland is.
Henrik van Ginhoven [EMAIL PROTECTED] writes:
problem, I agree. On large networks some evil-minded person could
write a tiny cron-script that ran once every 5 minutes or so to
parse ps-output looking for nothing but passwords,
Note that the standard workaround for this problem, which is now
Here is the final (ha ha) pre-test for the 1.8.2 bugfix release. Get
it from:
http://fly.srk.fer.hr/~hniksic/wget/wget-1.8.2-pre3.tar.gz
The difference from -pre1 is that the often-reported Windows file
name problem has now been fixed.
Please try to compile it and see if it works for
Thanks; I've now applied it to the 1.8.2 branch.
Herold Heiko [EMAIL PROTECTED] writes:
Relevant windows binary at http://space.tin.it/computer/hherold .
One (1, sorry, no time) test done with something like
wget -vkKHp http://ars.userfriendly.org/cartoons/?id=20020527
Seems to work correctly.
Heh, that's a nice test. Glad it works. :-)
Here is another pre-test of what will become Wget 1.8.2 RSN unless a
major problem is discovered.
http://fly.srk.fer.hr/~hniksic/wget/wget-1.8.2-pre4.tar.gz
The difference from -pre3 is that an overlooked https recursion fix
has been imported from the main trunk, thanks to a reminder
Jacques Beigbeder [EMAIL PROTECTED] writes:
I ran into trouble with:
wget -m http://some/site
because of a line like:
<img src="a.gif" v:shapes=...>
v:shapes contains a character ':', so a.gif isn't mirrored.
Correction for wget 1.8.1:
(line 340 of src/html-parse.c)
#define
Hrvoje Niksic [EMAIL PROTECTED] writes:
Jacques Beigbeder [EMAIL PROTECTED] writes:
I ran into trouble with:
wget -m http://some/site
because of a line like:
<img src="a.gif" v:shapes=...>
v:shapes contains a character ':', so a.gif isn't mirrored.
Correction for wget 1.8.1
Ian Abbott [EMAIL PROTECTED] writes:
Should we care to fix that before the release? I'm not sure how
important error-free compilation under Borland is.
I could attempt to get it to compile on Borland C++ 4.5. I'm not sure
which previous releases compiled okay with that compiler, though.
Doug Kaufman [EMAIL PROTECTED] writes:
On Sat, 25 May 2002, Hrvoje Niksic wrote:
Here is the final (ha ha) pre-test for the 1.8.2 bugfix release. Get
...
Please try to compile it and see if it works for you. It should
This doesn't work out of the box for DJGPP or for Cygwin. Appended
Thomas Lussnig [EMAIL PROTECTED] writes:
i have been thinking about the tls/ssl/input topic, with an eye to
how to make such changes easier in the future. and i think one good
idea may be something object-oriented. For the input part
that would be:
[...]
In short: yes, a thing
Thomas Lussnig [EMAIL PROTECTED] writes:
For example, it's wrong to call it input stream if it also allows
writing. Second, if we allow read and write, we should also think
about operations such as seek and rewind. Thirdly, we should think
about peeking, buffered reads, etc. Also, do we allow
Wget 1.8.2, a bugfix release of Wget, has been released, and is now
available from the GNU ftp site:
ftp://ftp.gnu.org/pub/gnu/wget/wget-1.8.2.tar.gz
As anyone who has followed this list knows, 1.8.2 is a bugfix release
that fixes many bugs reported for 1.8.1. An important legal change is
Ian Abbott [EMAIL PROTECTED] writes:
On Wed, 29 May 2002 05:14:14 +0200, Hrvoje Niksic [EMAIL PROTECTED] wrote:
Wget 1.8.2, a bugfix release of Wget, has been released, and is now
available from the GNU ftp site:
ftp://ftp.gnu.org/pub/gnu/wget/wget-1.8.2.tar.gz
This is a bit late
Noel Koethe [EMAIL PROTECTED] writes:
On Mit, 29 Mai 2002, Hrvoje Niksic wrote:
Wget 1.8.2, a bugfix release of Wget, has been released, and is now
available from the GNU ftp site:
Please correct the websites:
http://www.gnu.org/software/wget/
http://wget.sunsite.dk/
News:
The latest
Ian Abbott [EMAIL PROTECTED] writes:
Would wt-wintime.u.HighPart work under both compilers? I'm just
asking as someone who would like to see the number of #ifdefs
decrease rather than increase.
Microsoft only document the anonymous form in their Win32 SDK, which
is why I'm hesitant to take
Hack Kampbjørn [EMAIL PROTECTED] writes:
But I'm not sure wget should do [HTML de-quoting] for URLs on the
cmd line or in a non-HTML file.
I'm pretty sure that it shouldn't. HTML unquoting only makes sense in
the context of HTML. That's how the browsers behave, as well --
typing &amp; in the
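To illustrate the distinction: inside an HTML document, an href like a=1&amp;b=2 must be decoded to a=1&b=2 before fetching, but no such decoding should be applied to a URL typed on the command line. A toy decoder handling only the &amp; entity:

```c
#include <assert.h>
#include <string.h>

/* Copy IN to OUT, turning each "&amp;" into a literal '&'.
   Illustrative only: real HTML parsing handles many more entities. */
static void decode_amp (const char *in, char *out)
{
  while (*in)
    {
      if (!strncmp (in, "&amp;", 5))
        {
          *out++ = '&';
          in += 5;
        }
      else
        *out++ = *in++;
    }
  *out = '\0';
}
```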
I don't know why Wget dumps core on startup. Perhaps a gettext
problem? I have seen reports of failure on startup on Solaris, and it
strikes me that Wget could have picked up wrong or inconsistent
gettext.
Try unsetting the locale-related environment variables and seeing if
Wget works then.
This crash seems to be gettext-related. What does `ldd wget' say?
Noel Koethe [EMAIL PROTECTED] writes:
The internationalization of wget seems incomplete. It is translated, but not
properly localized, which can easily be seen here:
Längd: 34,885,632 [audio/mpeg]
100%[] 34,885,632
True. I intended the thousand
Radomir Tomis [EMAIL PROTECTED] writes:
Does anyone know how to download a page that yields the message
in subject ?
For example, how to download the following page ?
http://groups.yahoo.com/group/fabia/message/?source=1
Browser gets the page at this URL correctly (either
Thomas Lussnig [EMAIL PROTECTED] writes:
wget 'http://groups.yahoo.com/group/fabia/message/?source=1'
It downloads the right URL and follows the redirection.
Because you're using the source from CVS where the bug has been
fixed. See my response to Radomir.
Daniel Stenberg [EMAIL PROTECTED] writes:
On Fri, 7 Jun 2002, Martin Trautmann wrote:
I'm afraid that this is only of little help, since the current virus fakes
the From: address using any other address from the address book. But maybe
it will help to reduce the garbage from the list.
Refusing mails larger
Daniel Stenberg [EMAIL PROTECTED] writes:
Yes
If the big bad company wants to use wget and modify it for their own purposes
without giving the source code back, the recently added exception to the GPL
gives them every means:
- Just make your extension a library and name it OpenSSL.
-
Sorry about the extended absence. I've been extremely busy at work
and will continue to be so for a while.
If someone wants to get write access to apply bug fixes and do
development, please let me know.
It would probably be nice to release 1.9 before doing destabilizing
changes. However, I
George Prekas [EMAIL PROTECTED] writes:
Give some details about file name sanity.
Currently Wget encodes unsafe characters in file names according to
the rules defined for URLs: by replacing unsafe characters with a %hh
representation. The original rationale for this was to prevent
creation
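The %hh escaping described above can be sketched like this; the set of unsafe characters used here is a small illustrative sample, not Wget's actual table.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Copy IN to OUT, replacing each unsafe byte with '%' plus two hex
   digits, e.g. '?' (0x3F) becomes "%3F". OUT must have room for up
   to three bytes per input byte, plus the terminator. */
static void encode_unsafe (const char *in, char *out)
{
  static const char *unsafe = "?*\"<>|";   /* illustrative sample set */
  for (; *in; in++)
    {
      if (strchr (unsafe, *in))
        {
          sprintf (out, "%%%02X", (unsigned char) *in);
          out += 3;
        }
      else
        *out++ = *in;
    }
  *out = '\0';
}
```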
Sorry about the silence. I've been inactive for the past several
months, and so have most of other developers. The patches are being
accumulated, but not lost.
The anonymous CVS repository exists and is located on sunsite's CVS.
Please note http://wget.sunsite.dk/.
Herold Heiko [EMAIL PROTECTED] writes:
This seems to be a bad timeframe for most developers or skilled
individuals following wget, mostly due to real-world time
constraints as far as I understand.
Yes. To make things worse, I am going to be serving in the civil service
for the next eight months.
I
Mauro Tortonesi [EMAIL PROTECTED] writes:
On Tue, 2 Sep 2003, Herold Heiko wrote:
Did you know wget currently lacks a maintainer?
the GNU website says (http://www.gnu.org/help/help.html):
We are looking for new maintainers for these GNU packages (contact
[EMAIL PROTECTED] if
Mauro Tortonesi [EMAIL PROTECTED] writes:
it makes the sources __MUCH__ easier to maintain, believe me.
How so? It adds complexity to the build process, and it makes a
crucial build component (Makefiles) almost impossible to understand,
debug, and modify.
that's a point i didn't take
Mauro Tortonesi [EMAIL PROTECTED] writes:
On Tue, 2 Sep 2003, Jeremy Reeve wrote:
I've written a trivial patch to implement the --disable-dns-cache feature
as described in the TODO contained in the CVS tree. I need to write the
Changelog entry which I'll do and post to the patches list
Mauro Tortonesi [EMAIL PROTECTED] writes:
I almost can't believe this is serious -- ansi2knr and the PARAMS
macro are concession enough to pre-ANSI compilers; I wouldn't want to
encumber the program with K&R-style function definitions too.
i agree that supporting K&R compilers is a good
[EMAIL PROTECTED] (Georg Bauhaus) writes:
Hrvoje Niksic, Tue, 02 Sep 2003 15:58:02 +0200:
Mauro Tortonesi [EMAIL PROTECTED] writes:
2. Care to elaborate on why you introduced automake in wget?
it makes the sources __MUCH__ easier to maintain, believe me.
How so? It adds
Mauro Tortonesi [EMAIL PROTECTED] writes:
what about the patch for src/url.c?
It looks fine, but I'm not sure I understand why it's necessary to
install full IP address understanding at the URL parsing level.
please, take a look at RFC 2732. it is just 6 pages long and it is
an IETF
Mauro Tortonesi [EMAIL PROTECTED] writes:
Outside Unix this is needlessly complex and completely unnecessary,
not to mention that it doesn't work outside a shell requirement.
Wget tries to help by allowing you to completely avoid Autoconf and
simply provide your own Makefile and config.h.
Ahmon Dancy [EMAIL PROTECTED] writes:
Please consider this patch:
[...]
It looks good to me. Cheating about whether the connection was
really refused looks slightly wrong, but on the other hand, changing
every single place that looks at the result is tedious.
Please note that patches should
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Wed, 3 Sep 2003, Hrvoje Niksic wrote:
If you have patches for Wget's configure.in to work with Autoconf
2.5x, I'll gladly accept those, too.
Please note that the changes to support lone autoconf 2.5x are minimal
and almost exclusively m4
[EMAIL PROTECTED] writes:
I found a workaround for the problem described below.
Using option -nh does the job for me.
As the subdomains mentioned below are on the same IP
as the main domain wget seems not to compare their
names but the IP only.
I believe newer versions of Wget don't do
Hrvoje Niksic [EMAIL PROTECTED] writes:
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Wed, 3 Sep 2003, Hrvoje Niksic wrote:
If you have patches for Wget's configure.in to work with Autoconf
2.5x, I'll gladly accept those, too.
Please note that the changes to support lone autoconf 2.5x
Mauro Tortonesi [EMAIL PROTECTED] writes:
here is my rfc2732 patch for wget. here is a brief summary of the
changes i've made:
[...]
Thanks for the patch. I'm about to apply this, with several (minor)
changes:
* I modified the functions to not require a zero-terminated string,
but to
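The RFC 2732 notation wraps IPv6 literals in brackets, e.g. http://[::1]:8080/, so that the colons inside the address are not mistaken for the port separator. A sketch of the host-extraction step under that rule (illustrative, not the actual patch):

```c
#include <assert.h>
#include <string.h>

/* Copy the host part of AUTHORITY (e.g. "[::1]:8080" or "host:80")
   into OUT, which must be large enough. A bracketed IPv6 literal is
   consumed as one token before looking for the port ':'.
   Returns 0 on success, -1 on a malformed literal. */
static int extract_host (const char *authority, char *out)
{
  const char *end;
  if (*authority == '[')
    {
      end = strchr (authority, ']');
      if (!end)
        return -1;              /* unterminated IPv6 literal */
      memcpy (out, authority + 1, end - authority - 1);
      out[end - authority - 1] = '\0';
    }
  else
    {
      end = strchr (authority, ':');
      size_t len = end ? (size_t) (end - authority) : strlen (authority);
      memcpy (out, authority, len);
      out[len] = '\0';
    }
  return 0;
}
```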