Maciej W. Rozycki [EMAIL PROTECTED] writes:
I couldn't send the patches earlier, sorry. Besides what you have
already done, I have the following bits within my changes.
Thanks, I never would have caught those myself. Do you have
suggestions for Autoconf 2.5x features Wget could put to good use?
Jeremy Reeve [EMAIL PROTECTED] writes:
Please consider this, my trivial --disable-dns-cache patch for wget.
ChangeLog should read something like:
2003-09-07  Jeremy S. Reeve [EMAIL PROTECTED]
* host.c, init.c, main.c, options.h: Added --disable-dns-cache
option to turn off DNS caching.
Ahmon Dancy [EMAIL PROTECTED] writes:
I'll apply it shortly.
Thanks.
Applied now.
Is there a wget-announce mailing list?
No.
Newman, David [EMAIL PROTECTED] writes:
This is my third attempt at a Content-Disposition patch and if it
isn't acceptable yet, I'm sure it is pretty close.
Thanks. Note that I and other (co-)maintainers have been away for
some time, so if your previous attempts have been ignored, it might not
Ahmon Dancy [EMAIL PROTECTED] writes:
Is there a wget-announce mailing list?
No.
Alright. Is there a rough estimate for the next release date?
I'm thinking of releasing 1.9 with the accumulated features in the
current CVS. The code base is IMHO stable enough for that. The only
major
Daniel Stenberg [EMAIL PROTECTED] writes:
These are two snippets that can be used to detect IPv6 support and a
working getaddrinfo(). Adjust as you see fit!
Thanks a bunch! I'll try it out later today.
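For readers curious what such a detection snippet boils down to: a
configure-time check typically compiles and runs a tiny C program
along these lines (an illustrative sketch, not the actual macro from
the patch):

  #include <string.h>
  #include <sys/types.h>
  #include <sys/socket.h>
  #include <netdb.h>

  int
  main (void)
  {
    struct addrinfo hints, *res;

    memset (&hints, 0, sizeof (hints));
    hints.ai_family = AF_INET6;          /* require IPv6 support */
    hints.ai_socktype = SOCK_STREAM;

    /* If the resolver library and the kernel both understand IPv6,
       this succeeds; exit status 0 tells configure the check passed.  */
    if (getaddrinfo ("::1", "80", &hints, &res) != 0)
      return 1;
    freeaddrinfo (res);
    return 0;
  }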
Jochen Roderburg [EMAIL PROTECTED] writes:
Question: Has the often-discussed *feature* of version 1.8.x, where
special characters in local filenames are url-encoded, been fixed in
the meantime?
Hmm, that was another thing scheduled to be fixed for 1.9.
Herold Heiko [EMAIL PROTECTED] writes:
could you please check the thread Windows filename patch for 1.8.2
from around 24-05-2002 (Hack Kampbjørn, Ian Abbott) ? That patch
(url.c) got committed to the 1.8 branch but not to the 1.9 branch.
Also, it consists of two parts, the first one:
of wasted time.
i agree here.
OK then. Here is an additional patch:
2003-09-09 Hrvoje Niksic [EMAIL PROTECTED]
* url.c (url_parse): Return an error if the URL contains a [...]
IPv6 numeric address and we don't support IPv6.
Index: src/url.c
Thanks to Daniel Stenberg who has either been reading my mind or has
had the exact same needs, here is a patch that brings configure
(auto-)detection for IPv6.
Please test it out on various configurations where IPv6 is or is not
enabled.
ChangeLog:
2003-09-09 Hrvoje Niksic [EMAIL PROTECTED]
Mauro Tortonesi [EMAIL PROTECTED] writes:
Thanks for the patch. I'm curious, in what circumstances would one
want to use this option? (I'm also asking because of the manual in
which I'd like to explain why the option is useful.)
e.g., with RFC 3041 temporary IPv6 addresses.
Do they really
[ I'm Cc-ing the list because this might be interesting to others. ]
Mauro Tortonesi [EMAIL PROTECTED] writes:
ok, i agree here. but, in order to help me with my work on wget, could
you please tell me:
* how do you generate a wget tarball for a new release
With the script `dist-wget' in
Mauro Tortonesi [EMAIL PROTECTED] writes:
* how do you generate/maintain gettext-related files (e.g. the files in
the po directory)?
The `.po' files are from the translation project. POTFILES.in is
updated by hand when a new `.c' file is added.
ok, but what about Makefile.in.in and
Mauro Tortonesi [EMAIL PROTECTED] writes:
AFAIR wget.pot is generated by Makefile. (It should probably not be
in CVS, though.) Makefile.in.in is not generated, it was originally
adapted from the original Makefile.in.in from the gettext
distribution. It has served well for years in the
Patrick Cernko [EMAIL PROTECTED] writes:
I discovered a small problem with the increasing number of servers with
changing IPs but constant names (provided by nameservers like
dyndns.org). If the download with wget is interrupted by an IP change
(e.g. a dialup host whose provider killed the
Mauro Tortonesi [EMAIL PROTECTED] writes:
On Wed, 10 Sep 2003, Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
Shouldn't we simply check for libinet6 in the usual fashion?
this could be another solution. but i think it would be much better
to do it only for kame
Mauro Tortonesi [EMAIL PROTECTED] writes:
Isn't the second check a matter of running a small test program, as in
the check that Daniel provided (but more sophisticated)?
sure. but what was the problem with stack detection? it's simply a couple
of AC_EGREP_CPP macros after all...
The problem
back the system clock by ~6 milliseconds, to which Wget reacted
badly.
Even so, Wget shouldn't crash. The correct fix is to prevent the
timer code from ever returning decreasing or negative time intervals.
Please let me know if this patch fixes the problem:
2003-09-14 Hrvoje Niksic [EMAIL
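The idea behind such a fix, as a hedged sketch with invented names
(Wget's real timer code is structured differently): remember the
largest interval reported so far and never return anything smaller.

  #include <stddef.h>
  #include <sys/time.h>

  /* Largest value returned so far.  A single global is a
     simplification; real code would keep this per-timer.  */
  static double last_elapsed;

  /* Return seconds elapsed since START, never letting the result
     decrease even if the system clock is stepped backwards
     (e.g. by NTP).  */
  double
  elapsed_since (const struct timeval *start)
  {
    struct timeval now;
    double diff;

    gettimeofday (&now, NULL);
    diff = (now.tv_sec - start->tv_sec)
           + (now.tv_usec - start->tv_usec) / 1000000.0;

    /* Clamp: a clock stepped backwards must not yield a negative
       or decreasing interval.  */
    if (diff < last_elapsed)
      diff = last_elapsed;
    else
      last_elapsed = diff;
    return diff;
  }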
is needed.
2003-09-14 Hrvoje Niksic [EMAIL PROTECTED]
* url.c (append_uri_pathel): Use opt.restrict_file_names when
calling file_unsafe_char.
* init.c: New command restrict_file_names.
* main.c (main): New option --restrict-file-names[=windows,unix
Nicolas, thanks for the patch; I'm about to apply it to Wget CVS.
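To illustrate the mechanism named in the ChangeLog entry above, here
is a hedged sketch with invented names (the real file_unsafe_char and
option plumbing differ): which characters get quoted in local file
names depends on a user-selected restriction mode.

  /* Illustrative sketch of an OS-dependent filename quoting predicate.  */

  enum restrict_mode { RESTRICT_UNIX, RESTRICT_WINDOWS };

  static int
  file_char_unsafe (char c, enum restrict_mode mode)
  {
    /* Control characters and '/' are unsafe everywhere.  */
    if ((unsigned char) c < 32 || c == '/')
      return 1;
    if (mode == RESTRICT_WINDOWS)
      /* Windows additionally forbids these in file names.  */
      switch (c)
        {
        case '\\': case '|': case '<': case '>':
        case ':': case '"': case '?': case '*':
          return 1;
        }
    return 0;
  }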
Hrvoje Niksic [EMAIL PROTECTED] writes:
Jochen Roderburg [EMAIL PROTECTED] writes:
Question: Has the often-discussed *feature* of version 1.8.x, where
special characters in local filenames are url-encoded, been fixed in
the meantime?
Hmm, that was another thing scheduled to be fixed for 1.9.
I
Noèl Köthe [EMAIL PROTECTED] writes:
On Wed, 2003-09-10 at 22:21, Hrvoje Niksic wrote:
Just a small patch for the documentation:
--- wget-1.8.2.orig/doc/wget.texi
+++ wget-1.8.2/doc/wget.texi
@@ -507,7 +507,7 @@
@item -t @var{number}
@itemx [EMAIL PROTECTED]
Set number
from unsigned __int64 to double not
implemented, use signed __int64
The culprit seems to be (in wtimer_sys_diff):
#ifdef WINDOWS
return (double)(wst1->QuadPart - wst2->QuadPart) / 1;
#endif
Does this patch help?
2003-09-16 Hrvoje Niksic [EMAIL PROTECTED]
* utils.c
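The essence of such a fix, sketched for illustration (assuming the
counter values are FILETIME-style 100 ns units; the function name is
invented): cast through the signed 64-bit type, which MSVC 6 *can*
convert to double.

  #include <windows.h>

  /* MSVC 6 cannot convert unsigned __int64 to double, but it can
     convert the signed __int64.  The difference of two close timer
     readings fits in the signed type, so cast the operands before
     the division.  */
  static double
  timer_diff_ms (const ULARGE_INTEGER *t1, const ULARGE_INTEGER *t2)
  {
    __int64 diff = (__int64) t1->QuadPart - (__int64) t2->QuadPart;
    return (double) diff / 10000.0;   /* 100 ns units -> milliseconds */
  }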
Dieter Drossmann [EMAIL PROTECTED] writes:
I use an extra file with a long list of http entries. I included this
file with the -i option. After 154 downloads I got an error
message: Segmentation fault.
With wget 1.7.1 everything works well.
Is there a new limit on the number of lines?
No, there's no
Mitra [EMAIL PROTECTED] writes:
Hi,
Thanks for the response.
I've never used Info before, except for the documentation of emacs, and
very few things are documented there. I suggest it should be
presumed that people will look at man wget or wget --help and
make sure the documentation is
Herold Heiko [EMAIL PROTECTED] writes:
It does compile now, but I managed to produce an application error during a
test run on an https site.
I produced a debug build with /DDEBUG /Zi /Od /Fd /FR and produced the
wget.bsc by running bscmake on all the sbr files, but I didn't yet
understand how
Christopher G. Lewis [EMAIL PROTECTED] writes:
Here's a small change to print out the OpenSSL version with the -V and
--help parameters.
[...]
I think that "GNU Wget <something>" should always stand for Wget's
version, regardless of the libraries it has been compiled with. But
if you want to see the
Stefan Eissing [EMAIL PROTECTED] writes:
Of course this is only noticeable with HTTP/1.1 servers which leave
the connection open and do not apply Transfer-Encoding: chunked for
empty response bodies.
They may not apply chunked transfer because Wget doesn't know how to
handle it. And leaving the
Noèl Köthe [EMAIL PROTECTED] writes:
-infinite retrying.
+infinite retrying. Default (no command-line switch) is to retry
+20 times but fatal errors like connection refused or not found
+(404) are not being retried.
Thanks. I've now committed this:
Index: doc/wget.texi
Herold Heiko [EMAIL PROTECTED] writes:
Repeatable, and it seems to appear with this:
2003-09-15 Hrvoje Niksic [EMAIL PROTECTED]
* retr.c (get_contents): Reduce the buffer size to the amount of
data that may pass through for one second. This prevents long
sleeps when
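The reasoning behind that entry, as an illustrative sketch (names
invented): with a bandwidth limit in effect, cap each read at roughly
one second's worth of data, so the compensating sleeps stay short.

  #include <stddef.h>

  /* Choose a read size for a rate-limited download so that no single
     read pulls in more than ~1 second of data.  Large reads would
     force correspondingly long sleeps to keep the average rate.  */
  static size_t
  choose_bufsize (size_t default_bufsize, long limit_bytes_per_sec)
  {
    if (limit_bytes_per_sec > 0
        && (size_t) limit_bytes_per_sec < default_bufsize)
      return (size_t) limit_bytes_per_sec;
    return default_bufsize;
  }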
Stefan Eissing [EMAIL PROTECTED] writes:
Please excuse if this bug has already been reported:
In wget 1.8.1 (OS X) and 1.8.2 (cygwin) the handling of resources
with content-length 0 is wrong. wget tries to read the empty content
and hangs until the socket read timeout fires. (I set the
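A sketch of the guard such a fix needs (illustrative, not the actual
retr.c code): bound the body read by Content-Length, so a zero-length
body returns immediately instead of blocking until the read timeout
fires.

  #include <unistd.h>

  /* Read up to CONTENT_LENGTH body bytes from FD.  With
     "Content-Length: 0" the loop body never runs, so we return
     immediately instead of waiting on a socket that will never
     deliver data.  */
  static long
  read_body (int fd, char *buf, long bufsize, long content_length)
  {
    long total = 0;
    while (total < content_length)
      {
        long n = content_length - total;
        if (n > bufsize)
          n = bufsize;
        n = read (fd, buf, n);
        if (n <= 0)
          break;                /* EOF or error */
        total += n;
        /* ... write BUF to the output file here ... */
      }
    return total;
  }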
Noèl Köthe [EMAIL PROTECTED] writes:
at the end of the description of the option --http-passwd=password:
For more information about security issues with Wget,
The sentence is incomplete.
wget.texi shows:
For more information about security issues with Wget, @xref{Security
Herold Heiko [EMAIL PROTECTED] writes:
Found it.
Using the 23:00 connect.c and the 23:59 retr.c does produce the bug.
Using the 23:59 connect.c and the 23:00 retr.c works ok.
This means the problem must be in retr.c .
OK, that narrows it down. Two further questions:
1) If you comment out
I noticed the mistake as soon as I compiled with SSL (and saw the
warnings):
2003-09-18 Hrvoje Niksic [EMAIL PROTECTED]
* retr.c (get_contents): Pass the correct argument to ssl_iread.
Index: src/retr.c
===
RCS file
Herold Heiko [EMAIL PROTECTED] writes:
Solution 1: have a switch like --use-protocol-dir = [no|most|all]
no would be the current state:
./www.some.site/index.html
all would be: always add a directory level for the protocol:
Ilya N. Golubev [EMAIL PROTECTED] writes:
Duplicating my [EMAIL PROTECTED] sent on Wed, 10 Sep 2003
19:48:56 +0400 since mailer reports that [EMAIL PROTECTED] does not
work.
wget -mLd http://www.hro.org/docs/rlex/uk/index.htm
does not follow `<A HREF="uk1.htm#1">' links contained in the
Lucuk, Pete [EMAIL PROTECTED] writes:
as we can see above, wget has raznoe.shtml.html as the main file,
this is *not* what I want, I *always* want the main file to be named
index.html.
Wget doesn't really have the concept of a main file. As a
workaround, you could simply `ln -s
Doug Kaufman [EMAIL PROTECTED] writes:
On Thu, 18 Sep 2003, Hrvoje Niksic wrote:
modifying advance_declaration() in html-parse.c. A future version of
Wget will probably parse comments in a non-compliant fashion, by
considering everything between <!-- and --> to be a comment, which is
what
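A minimal sketch of that non-compliant strategy (not Wget's actual
parser): once "<!--" has been seen, simply scan for the literal
terminator and treat everything in between as comment text.

  #include <string.h>

  /* Given P pointing just past "<!--", return a pointer past the
     next "-->", ignoring SGML comment rules entirely.  Returns NULL
     if the comment is unterminated.  */
  static const char *
  skip_comment (const char *p)
  {
    const char *end = strstr (p, "-->");
    return end ? end + 3 : NULL;
  }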
Dimitri Ars [EMAIL PROTECTED] writes:
I'm having trouble connecting with wget to a site using SSL:
[...]
I can repeat this, but currently I don't understand enough about SSL
to fix it. Christian, could you please help?
wget https://145.222.135.165/index.htm
--13:46:36--
Sorry about the lack of response. Your feature requests are quite
reasonable, but I have no idea of the timeframe when I'll work on them
(they're not a priority for me). Perhaps someone else is interested
in helping implement them.
The things I planned to tackle for a post-1.9 release are
Mark Veltzer [EMAIL PROTECTED] writes:
On Monday 22 September 2003 00:20, you wrote:
Sorry about the lack of response. Your feature requests are quite
reasonable, but I have no idea of the timeframe when I'll work on
them (they're not a priority for me). Perhaps someone else is
interested
Mark Veltzer [EMAIL PROTECTED] writes:
In addition I would add a flag that makes the URL method work like
the explicit method and vice versa. This would cover all bases.
The semantics of that flag aren't as obvious as it may seem. For
example, it's completely legal to do this:
wget -r
In these enlightened times when 2G+ files are no longer
considered large even in the third world, more and more people ask for
the ability to download huge files with Wget.
Wget carefully uses `long' for potentially large values, such as
file sizes and offsets, but that has no effect on
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Mon, 22 Sep 2003, Hrvoje Niksic wrote:
Well, using off_t and AC_SYS_LARGEFILE seems to be the recommended
practice.
Recommended for POSIX systems, perhaps, but not really portable to
older machines. And it doesn't solve the portable
Note that this is not an alias, it's a mailing list you must have
subscribed to before. (We're not in the spam business just yet,
despite certain unfortunate events in the past.) To unsubscribe,
please send mail to [EMAIL PROTECTED].
Daniel Stenberg [EMAIL PROTECTED] writes:
On Mon, 22 Sep 2003, Hrvoje Niksic wrote:
The bottom line is, I really don't know how to solve this
portably. Does anyone know how widely ported software deals with
large files?
In curl, we provide our own *printf() code that works as expected
DervishD [EMAIL PROTECTED] writes:
Yes, you're right, but... How about using C99 large integer types
(intmax_t and family)?
But then I can use `long long' just as well, which is supported by C99
and (I think) required to be at least 64 bits wide. Portability is
the whole problem, so
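For illustration, the C99 route under discussion looks like this; it
presumes a C99 environment, which is precisely the portability
sticking point:

  #include <stdio.h>
  #include <inttypes.h>

  int
  main (void)
  {
    /* intmax_t is at least 64 bits under C99, so it can represent
       file sizes beyond 2 GB; PRIdMAX supplies the matching printf
       format.  Pre-C99 systems lack both, which is the portability
       problem discussed above.  */
    intmax_t size = (intmax_t) 5 * 1024 * 1024 * 1024;  /* 5 GB */
    printf ("%" PRIdMAX " bytes\n", size);
    return 0;
  }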
Randy Paries [EMAIL PROTECTED] writes:
Not sure if this is a bug or not.
I guess it could be called a bug, although it's no simple oversight.
Wget currently doesn't support large files.
After a long time sitting in CVS, a beta of Wget 1.9 is
available. To see what's new since 1.8, check the `NEWS' file in the
distribution. Get it from:
http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta1.tar.gz
Please test it on as many different platforms as possible and in the
places
failed to enforce
the exception.) This patch should fix it:
2003-09-24 Hrvoje Niksic [EMAIL PROTECTED]
* url.c (url_escape_1): Revert unintentional change to lowercase
xdigit escapes.
(url_escape_dir): Document that this function depends on the
output
Could the person who sent me the patch for Windows compilers support
please resend it? Amidst all the viruses, I accidentally deleted the
message before I had a chance to apply it. Sorry about the
mistake.
Jack Pavlovsky [EMAIL PROTECTED] writes:
It's probably a bug: when downloading wget --mirror
ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is,
but when downloading wget ftp://somehost.org/somepath/3*, wget saves
the files as 3acv14%7Eanivcd.mpg
Thanks for the report.
Thanks for the patch, I've now applied it using the following
ChangeLog entry:
2003-09-26 Gisle Vanem [EMAIL PROTECTED]
* mswindows.c (read_registry): Removed.
(set_sleep_mode): New function.
(windows_main_junk): Call it.
BTW, unless you want your patch to be reviewed
jayme [EMAIL PROTECTED] writes:
[...]
Before anything else, note that the patch originally written for 1.8.2
will need changes for 1.9. The changes are not hard to make, but they're
still needed.
The patch didn't make it to canonical sources because it assumes `long
long', which is not available on
This beta includes several important bug fixes since 1.9-beta1, most
notably the fix for correct file name quoting with recursive FTP
downloads. Important Windows fixes by Gisle Vanem and Herold Heiko
are also present.
Get it from:
http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta2.tar.gz
[ Added Cc to [EMAIL PROTECTED] ]
Tony Lewis [EMAIL PROTECTED] writes:
The following patch adds a command line option to save any links
that are not followed by wget. For example:
wget http://www.mysite.com --mirror --unfollowed-links=mysite.links
will result in mysite.links containing all
Does anyone know the current procedure for submitting the `.pot' file
to the GNU Translation Project? At the moment, the project home page
at http://www.iro.umontreal.ca/contrib/po/HTML/ appears dead.
Tony Lewis [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
I'm curious: what is the use case for this? Why would you want to
save the unfollowed links to an external file?
I use this to determine what other websites a given website refers to.
For example:
wget
http
Tony Lewis [EMAIL PROTECTED] writes:
Would something like the following be what you had in mind?
301 http://www.mysite.com/
200 http://www.mysite.com/index.html
200 http://www.mysite.com/followed.html
401 http://www.mysite.com/needpw.html
--- http://www.othersite.com/notfollowed.html
Yes,
Payal Rathod [EMAIL PROTECTED] writes:
I have 5-7 user accounts in /home whose data is important. Every day at
12:00 I want to back up their data to a different backup machine.
The remote machine has an ftp server.
Can I use wget for this? If yes, how do I proceed?
The way to do it with Wget
Not many changes from the previous beta. This is for the purposes of
the Translation Project, to which I've submitted `wget.pot', and which
might wonder where to get the source of wget-1.9-beta3.
Get it from:
http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta3.tar.gz
Mauro's IPv6
Payal Rathod [EMAIL PROTECTED] writes:
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
The way to do it with Wget would be something like:
wget --mirror --no-host-directories ftp://username:[EMAIL PROTECTED]
But if I run in thru' crontab, where will it store
This problem is not specific to timeouts, but to recursive download (-r).
When downloading recursively, Wget expects some of the specified
downloads to fail and does not propagate that failure to the code that
sets the exit status. This unfortunately includes the first download,
which should
The home page is back, but it says that the TP Robot is dead. I've
contacted Martin Loewis, perhaps he'll be able to provide more info.
Payal Rathod [EMAIL PROTECTED] writes:
On Thu, Oct 02, 2003 at 12:03:34PM +0200, Hrvoje Niksic wrote:
Payal Rathod [EMAIL PROTECTED] writes:
On Wed, Oct 01, 2003 at 09:26:47PM +0200, Hrvoje Niksic wrote:
The way to do it with Wget would be something like:
wget --mirror --no-host
Gisle Vanem [EMAIL PROTECTED] writes:
I've patched util.c to make run_with_timeout() work on Windows
(better than it does with alarm()!).
Cool, thanks! Note that, to save the honor of Unix, I've added
support for setitimer on systems that support it (virtually everything
these days), so
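A simplified sketch of a setitimer-based run_with_timeout
(illustrative; the real function saves and restores handler and timer
state): arm a one-shot real-time timer and longjmp out of the SIGALRM
handler if it fires.

  #include <signal.h>
  #include <setjmp.h>
  #include <sys/time.h>

  static sigjmp_buf timeout_env;

  static void
  alarm_handler (int sig)
  {
    siglongjmp (timeout_env, 1);
  }

  /* Run FUN, giving up after TIMEOUT seconds.  Returns 1 if FUN
     timed out, 0 if it completed.  Simplified: no saving/restoring
     of the previous handler or timer.  */
  static int
  run_with_timeout (double timeout, void (*fun) (void *), void *arg)
  {
    struct itimerval itv;

    if (sigsetjmp (timeout_env, 1))
      return 1;                   /* timer fired */

    signal (SIGALRM, alarm_handler);
    itv.it_interval.tv_sec = itv.it_interval.tv_usec = 0;
    itv.it_value.tv_sec = (long) timeout;
    itv.it_value.tv_usec = (long) ((timeout - itv.it_value.tv_sec) * 1000000);
    setitimer (ITIMER_REAL, &itv, NULL);

    fun (arg);

    /* Disarm the timer.  */
    itv.it_value.tv_sec = itv.it_value.tv_usec = 0;
    setitimer (ITIMER_REAL, &itv, NULL);
    return 0;
  }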
I've committed this patch, with minor changes, such as moving the code
to mswindows.c. Since I don't have MSVC, someone else will need to
check that the code compiles. Please let me know how it goes.
It's a feature. `-A zip' means `-A zip', not `-A zip,html'. Wget
downloads the HTML files only because it absolutely has to, in order
to recurse through them. After it finds the links in them, it deletes
them.
Thanks for the contribution. Note that a slightly more correct place
to send the patch is the [EMAIL PROTECTED] list, which is followed by
people with a keener interest in development.
Also, you should send at least a short explanation of what each patch
is supposed to do and why one should apply it.
Thanks for the patch, I've now applied it with the following ChangeLog
entry:
2003-10-03 Gisle Vanem [EMAIL PROTECTED]
* mswindows.h: Include winsock headers here.
* connect.c: And don't include them here.
However, I've postponed applying the part that changes `-d'. I agree
Jochen Roderburg [EMAIL PROTECTED] writes:
Quoting Hrvoje Niksic [EMAIL PROTECTED]:
It's a feature. `-A zip' means `-A zip', not `-A zip,html'. Wget
downloads the HTML files only because it absolutely has to, in order
to recurse through them. After it finds the links in them, it deletes
Gisle Vanem [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] said:
I've committed this patch, with minor changes, such as moving the code
to mswindows.c. Since I don't have MSVC, someone else will need to
check that the code compiles. Please let me know how it goes.
It compiled
Gisle Vanem [EMAIL PROTECTED] writes:
--- mswindows.c.org Mon Sep 29 11:46:06 2003
+++ mswindows.c Sun Oct 05 17:34:48 2003
@@ -306,7 +306,7 @@
DWORD set_sleep_mode (DWORD mode)
{
HMODULE mod = LoadLibrary ("kernel32.dll");
- DWORD (*_SetThreadExecutionState) (DWORD) = NULL;
+
There is currently no way to disable following redirects. A patch to
do so has been submitted recently, but I didn't see a good reason why
one would need it, so I didn't add the option. Your mail is a good
argument, but I don't know how prevalent that behavior is.
What is it with servers that
Tony Lewis [EMAIL PROTECTED] writes:
wget
http://www.custsite.com/some/page.html --http-user=USER --http-passwd=PASS
If you supply your user ID and password via a web form, it will be
tricky (if not impossible) because wget doesn't POST forms (unless
someone added that option while I wasn't
Suhas Tembe [EMAIL PROTECTED] writes:
Hello Everyone,
I am new to this wget utility, so pardon my ignorance. Here is a
brief explanation of what I am currently doing:
1). I go to our customer's website every day and log in using a user name and password.
2). I click on 3 links before I get to
Several bugs fixed since beta3, including a fatal one on Windows.
Includes a working Windows implementation of run_with_timeout.
Get it from:
http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta4.tar.gz
Dan Jacobson [EMAIL PROTECTED] writes:
-q and -S are incompatible and should perhaps produce errors and be
noted thus in the docs.
They seem to work as I'd expect -- `-q' tells Wget to print *nothing*,
and that's what happens. The output Wget would have generated does
contain HTTP headers,
Karl Eichwalder [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
As for the Polish translation, translations are normally handled
through the Translation Project. The TP robot is currently down, but
I assume it will be back up soon, and then we'll submit the POT file
Karl Eichwalder [EMAIL PROTECTED] writes:
Also, my Croatian translation of 1.9 doesn't seem to have made it
in. Is that expected?
Unfortunately, yes. Will you please resubmit it with the subject line
updated (IIRC, it's now):
TP-Robot wget-1.9-b3.hr.po
I'm not sure what b3 is, but
Karl Eichwalder [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
I'm not sure what b3 is, but the version in the POT file was
supposed to be beta3. Was there a misunderstanding somewhere along
the line?
Yes, the robot does not like beta3 as part of the version
string. b3
Karl Eichwalder [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
Ouch. Why does the robot care about version names at all?
It must know about the sequences; this is important for merging
issues. IIRC, we have at least these sequences supported by the
robot:
1.2
Tony Lewis [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
Please be aware that Wget needs to know the size of the POST
data in advance. Therefore the argument to @code{--post-file}
must be a regular file; specifying a FIFO or something like
@file{/dev/stdin} won't work
Karl Eichwalder [EMAIL PROTECTED] writes:
I guess, you as the wget maintainer switched from something
supported to the unsupported betaX scheme and now we have
something to talk about ;)
I had no idea that something as common as betaX was unsupported. In
fact, I believe that bX was added when
Stefan Eissing [EMAIL PROTECTED] writes:
On Tuesday, 07.10.03, at 16:36 (Europe/Berlin), Hrvoje Niksic wrote:
What the current code does is: determine the file size, send
Content-Length, read the file in chunks (up to the promised size) and
send those chunks to the server
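In outline, that sequence might look like the following hedged sketch
(send_data is an invented stand-in for the connection-write routine):
the size must be known before any data is sent, because Content-Length
precedes the body.

  #include <stdio.h>
  #include <sys/stat.h>

  /* Stream FILE NAME as a POST body: stat for the size, then send
     fixed-size chunks, never more than promised.  This is why
     --post-file requires a regular file.  */
  static int
  post_file (const char *name, int (*send_data) (const char *, size_t))
  {
    struct stat st;
    FILE *fp;
    char buf[8192];
    long remaining;

    if (stat (name, &st) < 0 || !(fp = fopen (name, "rb")))
      return -1;
    /* ... send "Content-Length: st.st_size" header here ... */
    remaining = (long) st.st_size;
    while (remaining > 0)
      {
        size_t n = fread (buf, 1,
                          remaining > (long) sizeof buf
                          ? sizeof buf : (size_t) remaining, fp);
        if (n == 0)
          break;                /* premature EOF: file shrank */
        if (send_data (buf, n) < 0)
          break;
        remaining -= (long) n;
      }
    fclose (fp);
    return remaining == 0 ? 0 : -1;
  }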
Martin, thanks for the patch and the detailed report. Note that it
might have made more sense to apply the patch to the latest CVS
version, which is somewhat different from 1.8.2.
I'm really not sure whether to add this patch. On the one hand, it's
nice to support as many architectures as
Tony Lewis [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
I don't understand what you're proposing. Reading the whole file in
memory is too memory-intensive for large files (one could presumably
POST really huge files, CD images or whatever).
I was proposing that you read the file
Josh Brooks [EMAIL PROTECTED] writes:
I have noticed very unpredictable behavior from wget 1.8.2 -
specifically I have noticed two things:
a) sometimes it does not follow all of the links it should
b) sometimes wget will follow links to other sites and URLs - when the
command line used
Suhas Tembe [EMAIL PROTECTED] writes:
Thanks everyone for the replies so far.
The problem I am having is that the customer is using ASP and
JavaScript. The URL stays the same as I click through the links.
The URL staying the same is usually a sign of frames, not of ASP
and
Suhas Tembe [EMAIL PROTECTED] writes:
this page contains a drop-down list of our customer's locations.
At present, I choose one location from the drop-down list and click
submit to get the data, which is displayed in a report format. I
right-click, then choose view source and save the source to a file.
Suhas Tembe [EMAIL PROTECTED] writes:
It does look a little complicated. This is how it looks:
<form action="InventoryStatus.asp" method="post"> [...]
[...]
<select name="cboSupplier">
<option value="4541-134289">454A</option>
<option value="4542-134289" selected>454B</option>
</select>
Those are the
[EMAIL PROTECTED] (Martin v. Löwis) writes:
Why do you think the scheme is narrow-minded?
Because 1.9-beta3 seems to be a problem.
VERSION = ('[.0-9]+-?b[0-9]+'
'|[.0-9]+-?dev[0-9]+'
'|[.0-9]+-?pre[0-9]+'
'|[.0-9]+-?rel[0-9]+'
'|[.0-9]+[a-z]?'
Mauro Tortonesi [EMAIL PROTECTED] writes:
so, i am asking you: what do you think of these changes?
Overall they look very good! Judging from the patch, a large part of
the work seems to be in an unexpected place: the FTP code.
Here are some remarks I have after looking at the patch.
It
Mauro Tortonesi [EMAIL PROTECTED] writes:
I still don't understand the choice to use sockaddr and
sockaddr_storage in application code.
They result in needless casts and (to me) incomprehensible code.
well, using sockaddr_storage is the right way (TM) to write IPv6 enabled
code ;-)
Not
[EMAIL PROTECTED] (Martin v. Löwis) writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
VERSION = ('[.0-9]+-?b[0-9]+'
'|[.0-9]+-?dev[0-9]+'
'|[.0-9]+-?pre[0-9]+'
'|[.0-9]+-?rel[0-9]+'
'|[.0-9]+[a-z]?'
'|[0-9][0-9][0-9][0-9]-[0-9][0
Thanks for the patch, Herold. I've applied and also added similar
fixes for Borland's and Watcom's Makefiles. I've used the following
ChangeLog entry:
2003-10-09 Herold Heiko [EMAIL PROTECTED]
* windows/Makefile.watcom (OBJS): Ditto.
* windows/Makefile.src.bor: Ditto.
It's a bug. -O currently doesn't work everywhere it should. If you
just want to change the directory where Wget operates, the workaround
is to use `-P'. E.g.:
wget -N ftp://ftp.pld-linux.org/dists/ac/PLD/athlon/PLD/RPMS/packages.dir.mdd -P
Mauro Tortonesi [EMAIL PROTECTED] writes:
and i'm saying that for this task the ideal structure is
sockaddr_storage. notice that my code uses sockaddr_storage
(typedef'd as wget_sockaddr) only when dealing with socket
addresses, not for ip address caching.
Now I see. Thanks for clearing it up.
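For readers following this exchange, a sketch of the
protocol-independent pattern being debated (illustrative, not Wget's
code): getaddrinfo returns whatever families are available, and
sockaddr_storage is the buffer type guaranteed large and aligned
enough to hold any of them.

  #include <string.h>
  #include <sys/types.h>
  #include <sys/socket.h>
  #include <netdb.h>
  #include <unistd.h>

  /* Connect to HOST:PORT over whatever address family the resolver
     offers, IPv4 or IPv6.  */
  static int
  open_connection (const char *host, const char *port)
  {
    struct addrinfo hints, *res, *ai;
    int fd = -1;

    memset (&hints, 0, sizeof (hints));
    hints.ai_family = AF_UNSPEC;       /* IPv4 or IPv6 */
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo (host, port, &hints, &res) != 0)
      return -1;
    for (ai = res; ai; ai = ai->ai_next)
      {
        fd = socket (ai->ai_family, ai->ai_socktype, ai->ai_protocol);
        if (fd < 0)
          continue;
        if (connect (fd, ai->ai_addr, ai->ai_addrlen) == 0)
          break;                       /* connected */
        /* An address cached for later reuse would be copied into a
           struct sockaddr_storage, which is large enough for both
           sockaddr_in and sockaddr_in6:
             struct sockaddr_storage ss;
             memcpy (&ss, ai->ai_addr, ai->ai_addrlen);  */
        close (fd);
        fd = -1;
      }
    freeaddrinfo (res);
    return fd;
  }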