-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
vinothkumar raman wrote:
We need to send the local file's timestamp in the request
header; for that we need to pass the local file's timestamp from
http_loop() to get_http(). The only way to pass this on without
altering the signature
Yes, that's what it means.
I'm not yet committed to doing this. I'd like to see first how many
mainstream servers will respect If-Modified-Since when given as part of
an HTTP/1.0 request (in comparison to how they respond when it's part of
an
This means we should remove the previous HEAD-request code, use
If-Modified-Since by default, have it handle all requests, and
store pages when the server does not return a 304 response.
Is that so?
On Fri, Aug 29, 2008 at 11:06 PM, Micah Cowan [EMAIL PROTECTED] wrote:
Follow-up Comment #4, bug
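The If-Modified-Since approach discussed in this thread amounts to a conditional GET: format the local file's mtime as an RFC 1123 date, send it in the request, and keep the local copy when the server answers 304 Not Modified. A minimal sketch of the header formatting (format_ims is a hypothetical helper, not Wget's API):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>
#include <time.h>

/* Format a local file's mtime as an If-Modified-Since header line.
   HTTP dates are always expressed in GMT (RFC 1123 format).
   Returns the number of characters written, or 0 on failure. */
static size_t format_ims(time_t mtime, char *buf, size_t len)
{
    struct tm *gmt = gmtime(&mtime);
    if (gmt == NULL)
        return 0;
    return strftime(buf, len,
                    "If-Modified-Since: %a, %d %b %Y %H:%M:%S GMT", gmt);
}
```

A server that honors the header replies 304 Not Modified when the resource is unchanged, so nothing needs to be stored; a 200 reply carries the new body.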
Sir Vision wrote:
Hello,
entering the following command results in an error:
--- command start ---
c:\Downloads\wget_v1.11.3b>wget
ftp://ftp.mozilla.org/pub/mozilla.org/thunderbird/nightly/latest-mozilla1.8-l10n/
-P c:\Downloads\
--- command
ok, thanks for your reply
We have a work-around in place now, but it doesn't scale very well.
Anyway, I'll start looking for another solution.
Thanks!
Mark
On Sat, Mar 1, 2008 at 10:15 PM, Micah Cowan [EMAIL PROTECTED] wrote:
Mark Pors wrote:
Micah Cowan [EMAIL PROTECTED] writes:
The new Wget flags empty Set-Cookie as a syntax error (but only
displays it in -d mode; possibly a bug).
I'm not clear on exactly what's possibly a bug: do you mean the fact
that Wget only calls attention to it in -d mode?
That's what I meant.
I
Hrvoje Niksic wrote:
Generally, if Wget considers a header to be in error (and hence
ignores it), the user probably needs to know about that. After all,
it could be the symptom of a Wget bug, or of an unimplemented
extension the server
Diego Campo wrote:
Hi,
I got a bug on wget when executing:
wget -a log -x -O search/search-1.html --verbose --wait 3
--limit-rate=20K --tries=3
http://www.nepremicnine.net/nepremicninske_agencije.html?id_regije=1
Segmentation fault (core
Micah Cowan [EMAIL PROTECTED] writes:
I was able to reproduce the problem above in the release version of
Wget; however, it appears to be working fine in the current
development version of Wget, which is expected to release soon as
version 1.11.*
I think the old Wget crashed on empty
Hrvoje Niksic wrote:
Micah Cowan [EMAIL PROTECTED] writes:
I was able to reproduce the problem above in the release version of
Wget; however, it appears to be working fine in the current
development version of Wget, which is expected to
On 10/4/07, Brian Keck [EMAIL PROTECTED] wrote:
I would have sent a fix too, but after finding my way through http.c
retr.c I got lost in url.c.
You and me both. A lot of the code needs to be rewritten... there's a lot of
spaghetti code in there. I hope Micah chooses to do a complete
rewrite for
Josh Williams wrote:
On 10/4/07, Brian Keck [EMAIL PROTECTED] wrote:
I would have sent a fix too, but after finding my way through http.c
retr.c I got lost in url.c.
You and me both. A lot of the code needs to be rewritten... there's a lot of
Brian Keck wrote:
Hello,
I'm wondering if I've found a bug in the excellent wget.
I'm not asking for help, because it turned out not to be the reason
one of my scripts was failing.
The possible bug is in the derivation of the filename from
Micah Cowan [EMAIL PROTECTED] writes:
It is actually illegal to specify byte values outside the range of
ASCII characters in a URL, but it has long been historical practice
to do so anyway. In most cases, the intended meaning was one of the
latin character sets (usually latin1), so Wget was
On Jul 13, 2007, at 12:29 PM, Micah Cowan wrote:
sprintf(filecopy, "\"%.2047s\"", file);
This fix breaks the FTP protocol, making wget instantly stop working
with many conforming servers, but apparently start working with yours;
the RFCs are very clear that the file name argument starts
On 7/15/07, Rich Cook [EMAIL PROTECTED] wrote:
I think you may well be correct. I am now unable to reproduce the
problem where the server does not recognize a filename unless I give
it quotes. In fact, as you say, the server ONLY recognizes filenames
WITHOUT quotes and quoting breaks it. I
Rich Cook wrote:
On Jul 13, 2007, at 12:29 PM, Micah Cowan wrote:
sprintf(filecopy, "\"%.2047s\"", file);
This fix breaks the FTP protocol, making wget instantly stop working
with many conforming servers, but apparently start working with
Rich Cook wrote:
On OS X, if a filename on the FTP server contains spaces, and the remote
copy of the file is newer than the local, then wget gets thrown into a
loop of "No such file or directory" endlessly. I have changed the
following in
Mauro Tortonesi wrote:
Micah Cowan ha scritto:
Update of bug #20323 (project wget):
Status: Ready For Test => In Progress
Follow-up Comment #3:
Matthew Woehlke wrote:
Micah Cowan wrote:
The wget-notify mailing list
(http://addictivecode.org/mailman/listinfo/wget-notify) will now also be
receiving notifications of bug updates from GNU Savannah, in addition to
subversion commits.
Micah Cowan wrote:
Matthew Woehlke wrote:
Micah Cowan wrote:
...any reason to not CC bug updates here also/instead? That's how e.g.
kwrite does things (also several other lists AFAIK), and seems to make
sense. This is 'bug-wget' after all :-).
It is; but it's also 'wget'.
Hmm, so it is; my
From various:
[...]
char filecopy[2048];
if (file[0] != '"') {
  sprintf(filecopy, "\"%.2047s\"", file);
} else {
  strncpy(filecopy, file, 2047);
}
[...]
It should be:
sprintf(filecopy, "\"%.2045s\"", file);
[...]
I'll admit to being old and grumpy, but am I the
Steven M. Schweda wrote:
From various:
[...]
char filecopy[2048];
if (file[0] != '"') {
  sprintf(filecopy, "\"%.2047s\"", file);
} else {
  strncpy(filecopy, file, 2047);
}
[...]
It should be:
sprintf(filecopy,
There is a buffer overflow in the following line of the proposed code:
sprintf(filecopy, "\"%.2047s\"", file);
It should be:
sprintf(filecopy, "\"%.2045s\"", file);
in order to leave room for the two quotes.
Tony
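Combining the two corrections in this thread (the %.2045s width and bounds safety), one way to make the overflow impossible regardless of the width specifier is to let snprintf enforce the total buffer size. A sketch only, assuming the 2048-byte buffer from the quoted patch, not the code that was actually committed:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

#define FILECOPY_LEN 2048

/* Copy `file` into filecopy, adding surrounding quotes unless it is
   already quoted.  snprintf bounds the whole result, quotes included,
   so no precision-width arithmetic can overflow the buffer. */
static void quote_filename(char *filecopy, const char *file)
{
    if (file[0] != '"')
        snprintf(filecopy, FILECOPY_LEN, "\"%.2045s\"", file);
    else
        snprintf(filecopy, FILECOPY_LEN, "%.2047s", file);
}
```

With %.2045s the worst case is 2045 name bytes plus two quotes plus the terminating NUL, exactly filling the 2048-byte buffer.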
-Original Message-
From: Rich Cook [mailto:[EMAIL PROTECTED]
Sent:
Good point, although it's only a POTENTIAL buffer overflow, and it's
limited to 2 bytes, so at least it's not exploitable. :-)
On Jul 5, 2007, at 9:05 AM, Tony Lewis wrote:
There is a buffer overflow in the following line of the proposed code:
sprintf(filecopy, "\"%.2047s\"", file);
It
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Tony Lewis [EMAIL PROTECTED] writes:
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and
arbitrary limits on file name length.
If it uses the heap, then
Tony Lewis [EMAIL PROTECTED] writes:
There is a buffer overflow in the following line of the proposed code:
sprintf(filecopy, "\"%.2047s\"", file);
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and arbitrary limits on file
name
Rich Cook [EMAIL PROTECTED] writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was supposed to indicate how to free the string.
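For reference, the allocating-printf idea can be sketched portably on top of C99 vsnprintf: call it once with a NULL buffer to measure, then allocate exactly. This is a generic sketch, not Wget's actual aprintf; as discussed above, the caller releases the result with free():

```c
#include <assert.h>
#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Like sprintf, but returns a freshly malloc'd string (free() it).
   Returns NULL on allocation or formatting failure. */
static char *aprintf_sketch(const char *fmt, ...)
{
    va_list ap;
    int n;
    char *buf;

    va_start(ap, fmt);
    n = vsnprintf(NULL, 0, fmt, ap);   /* measure the needed length */
    va_end(ap);
    if (n < 0 || (buf = malloc((size_t)n + 1)) == NULL)
        return NULL;

    va_start(ap, fmt);
    vsnprintf(buf, (size_t)n + 1, fmt, ap);
    va_end(ap);
    return buf;
}
```

This avoids both buffer overruns and any arbitrary cap on file-name length, at the cost of one extra formatting pass.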
Virden, Larry W. [EMAIL PROTECTED] writes:
Tony Lewis [EMAIL PROTECTED] writes:
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and
arbitrary limits on file name length.
If it uses the heap, then doesn't that open a hole where a
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it? I'd use asprintf, but I'm afraid to
suggest that here as it may not be portable.
On Jul 5, 2007, at 10:45 AM, Hrvoje Niksic wrote:
Tony Lewis [EMAIL PROTECTED] writes:
There is a buffer overflow
On Jul 5, 2007, at 11:08 AM, Hrvoje Niksic wrote:
Rich Cook [EMAIL PROTECTED] writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was supposed to indicate how to free the
Please remove me from this list. Thanks,
John Bruso
From: Rich Cook [mailto:[EMAIL PROTECTED]
Sent: Thu 7/5/2007 12:30 PM
To: Hrvoje Niksic
Cc: Tony Lewis; [EMAIL PROTECTED]
Subject: Re: bug and patch: blank spaces in filenames causes looping
On Jul 5, 2007
Rich Cook [EMAIL PROTECTED] writes:
On Jul 5, 2007, at 11:08 AM, Hrvoje Niksic wrote:
Rich Cook [EMAIL PROTECTED] writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was
So forgive me for a newbie-never-even-lurked kind of question: will
this fix make it into wget for other users (and for me in the
future)? Or do I need to do more to make that happen, or...? Thanks!
On Jul 5, 2007, at 12:52 PM, Hrvoje Niksic wrote:
Rich Cook [EMAIL PROTECTED] writes:
Rich Cook wrote:
So forgive me for a newbie-never-even-lurked kind of question: will
this fix make it into wget for other users (and for me in the future)?
Or do I need to do more to make that happen, or...? Thanks!
Well, I need a chance to
Thanks for the follow up. :-)
On Jul 5, 2007, at 3:52 PM, Micah Cowan wrote:
Rich Cook wrote:
So forgive me for a newbie-never-even-lurked kind of question: will
this fix make it into wget for other users (and for me in the
future)?
Or do
Mario Ander schrieb:
Hi everybody,
I think there is a bug storing cookies with wget.
See this command line:
C:\Programme\wget\wget --user-agent="Opera/8.5 (X11;
U; en)" --no-check-certificate --keep-session-cookies
--save-cookies=cookie.txt --output-document=-
--debug
Matthias Vill schrieb:
Mario Ander schrieb:
Hi everybody,
I think there is a bug storing cookies with wget.
See this command line:
C:\Programme\wget\wget --user-agent="Opera/8.5 (X11;
U; en)" --no-check-certificate --keep-session-cookies
--save-cookies=cookie.txt --output-document=-
A quick search at http://www.mail-archive.com/wget@sunsite.dk/ for
"-O" found:
http://www.mail-archive.com/wget@sunsite.dk/msg08746.html
http://www.mail-archive.com/wget@sunsite.dk/msg08748.html
The way -O is implemented, there are all kinds of things which are
incompatible with
Juhana Sadeharju wrote:
Hello. Wget 1.10.2 has the following bug compared to version 1.9.1.
First, the bin/wgetdir is defined as
wget -p -E -k --proxy=off -e robots=off --passive-ftp
-o zlogwget`date +%Y%m%d%H%M%S` -r -l 0 -np -U Mozilla --tries=50
--waitretry=10 "$@"
The download command
From: Sebastian
"Doctor, it hurts when I do this."
"Don't do that."
Steven M. Schweda [EMAIL PROTECTED]
382 South Warwick Street        (+1) 651-699-9818
Saint Paul MN 55105-2547
Reece ha scritto:
Found a bug (sort of).
When trying to get all the images in the directory below:
http://www.netstate.com/states/maps/images/
It gives 403 Forbidden errors for most of the images even after
setting the agent string to firefox's, and setting -e robots=off
After a packet
Hi !
Maybe you can add this patch to your mainline-tree:
http://www.mail-archive.com/wget%40sunsite.dk/msg09142.html
Best regards
Marc Schoechlin
On Wed, Jul 26, 2006 at 07:26:45AM +0200, Marc Schoechlin wrote:
Date: Wed, 26 Jul 2006 07:26:45 +0200
From: Marc Schoechlin [EMAIL PROTECTED]
Daniel Richard G. ha scritto:
Hello,
The MAKEDEFS value in the top-level Makefile.in also needs to include
DESTDIR='$(DESTDIR)'.
fixed, thanks.
--
Aequam memento rebus in arduis servare mentem...
Mauro Tortonesi http://www.tortonesi.com
University of Ferrara -
Tony Lewis ha scritto:
Run the command with -d and post the output here.
in this case, -S can provide more useful information than -d. be careful to
obfuscate passwords, though!!!
--
Aequam memento rebus in arduis servare mentem...
Mauro Tortonesi
Run the command with -d and post the output here.
Tony
From: Junior + Suporte [mailto:[EMAIL PROTECTED]]
Sent: Monday, July 03, 2006 2:00 PM
To: [EMAIL PROTECTED]
Subject: BUG
Dear,
I am using wget to send a login request
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] Behalf Of Þröstur
Sent: Wednesday, June 21, 2006 4:35 PM
There have been some reports in the past but I don't think it has been acted
upon; one of the problems is that the list of names can be extended at will
(besides the standard COMx, LPTx,
From: Eduardo M KALINOWSKI
wget http://www.somehost.com/nonexistant.html -O localfile.html
then file localfile.html will always be created, and will have length
of zero even if the remote file does not exist.
Because with -O, Wget opens the output file before it does any
network
yy :) [EMAIL PROTECTED] writes:
I ran wget -P /tmp/.test http://192.168.1.10 in a SUSE system (SLES 9)
and found that it saved the file in /tmp/_test.
This command works fine in RedHat; is it a bug?
I believe the bug is introduced by SuSE in an attempt to protect the
user. Try reporting it
- Original Message -
From: Hrvoje Niksic [EMAIL PROTECTED]
Date: Tuesday, March 28, 2006 7:23 pm
in progress.c line 880:
eta_hrs = (int)(eta / 3600, eta %= 3600);
eta_min = (int)(eta / 60, eta %= 60);
eta_sec = (int)(eta);
This is weird. Did you compile the code
Thomas Braby [EMAIL PROTECTED] writes:
eta_hrs = (int) (eta / 3600), eta %= 3600;
Yes that also works. The cast is needed on Windows x64 because eta is
a wgint (which is 64-bit) but a regular int is 32-bit so otherwise a
warning is issued.
The same is the case on 32-bit Windows, and also
Gary Reysa wrote:
Hi,
I don't really know if this is a Wget bug, or some problem with my
website, but, either way, maybe you can help.
I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred
pages (260MB of storage total). Someone did a Wget on my site, and
managed to log
El 28/03/2006, a las 20:43, Tony Lewis escribió:
Hrvoje Niksic wrote:
The cast to int looks like someone was trying to remove a warning and
botched operator precedence in the process.
I can't see any good reason to use , here. Why not write the line
as:
eta_hrs = eta / 3600; eta %=
Greg Hurrell [EMAIL PROTECTED] writes:
El 28/03/2006, a las 20:43, Tony Lewis escribió:
Hrvoje Niksic wrote:
The cast to int looks like someone was trying to remove a warning and
botched operator precedence in the process.
I can't see any good reason to use , here. Why not write the line
Thomas Braby [EMAIL PROTECTED] writes:
With wget 1.10.2 compiled using Visual Studio 2005 for Windows XP x64
I was getting no ETA until late in the transfer, when I'd get things
like:
49:49:49 then 48:48:48 then 47:47:47 etc.
So I checked the eta value in seconds and it was correct, so
Hrvoje Niksic wrote:
The cast to int looks like someone was trying to remove a warning and
botched operator precedence in the process.
I can't see any good reason to use , here. Why not write the line as:
eta_hrs = eta / 3600; eta %= 3600;
This makes it much less likely that someone
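The comma-free rewrite suggested above, spelled out as a complete unit: the divisions happen on the 64-bit eta, and each cast narrows only after the modulo has bounded the value, so nothing is lost on 32-bit int platforms (int64_t stands in for Wget's wgint here):

```c
#include <assert.h>
#include <stdint.h>

typedef int64_t wgint;   /* stand-in for Wget's 64-bit wgint */

/* Split an ETA in seconds into hours, minutes, seconds. */
static void split_eta(wgint eta, int *hrs, int *min, int *sec)
{
    *hrs = (int)(eta / 3600); eta %= 3600;
    *min = (int)(eta / 60);   eta %= 60;
    *sec = (int)eta;
}
```

Unlike the comma-operator version, each statement has exactly one side effect, so there is no operator-precedence trap to botch.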
Beni Serfaty [EMAIL PROTECTED] writes:
I think I found a bug when STANDALONE is defined in hash.c
I hope I'm not missing something here...
Good catch, thanks. I've applied a slightly different fix, appended
below.
By the way, are you using hash.c in a project? I'd like to hear if
you're
Tony Lewis wrote:
The --convert-links option changes the website path to a local file
system path. That is, it changes the directory, not the file name.
Thanks, I didn't understand it that way.
IMO, your suggestion has merit, but it would require wget to maintain
a list of MIME types and
Tobias Koeck wrote:
done.
== PORT ... done.    == RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done.
[ <=> ] -673,009,664  113,23K/s
Assertion failed: bytes >= 0, file retr.c, line 292
This application has requested the Runtime to terminate it in an unusual
way.
HonzaCh [EMAIL PROTECTED] writes:
My localeconv()->thousands_sep (as well as many other struct
members) turns out to be the empty string ("") (MSVC6.0).
How do you know? I mean, what program did you use to check this?
My quick'n'dirty one. See the source below.
Your source neglects to
HonzaCh [EMAIL PROTECTED] writes:
Latest version (1.10.1) has a UI bug: the thousands separator
(a space according to my locale settings) displays as á (character
code 0xA0, see attch.)
Although it does not affect the primary function of WGET, it looks
quite ugly.
Env.: Win2k Pro/Czech
Mark Street [EMAIL PROTECTED] writes:
I'm not sure why this [catering for paths without a leading /] is
done in the code.
RFC 1808 declared that the leading / is not really part of path, but
merely a separator, presumably to be consistent with its treatment
of ;params, ?queries, and #fragments.
Hrvoje,
Many thanks for the explanation and the patch.
Yes, this patch successfully resolves the problem for my particular test
case.
Best regards,
Mark Street.
Mark Street [EMAIL PROTECTED] writes:
Many thanks for the explanation and the patch. Yes, this patch
successfully resolves the problem for my particular test case.
Thanks for testing it. It has been applied to the code and will be in
Wget 1.10.1 and later.
Will Kuhn [EMAIL PROTECTED] writes:
Apparently wget does not handle single quotes or double quotes very well.
wget with the following arguments gives an error.
wget
--user-agent='Mozilla/5.0' --cookies=off --header
'Cookie: testbounce=testing;
Hi
wget ftp://someuser:[EMAIL PROTECTED]@www.somedomain.com/some_file.tgz
is splitting on the first @, not the second.
Is this a problem with the URL standard or a wget issue?
Regards
Andrew Gargan
Andrew Gargan [EMAIL PROTECTED] writes:
wget ftp://someuser:[EMAIL PROTECTED]@www.somedomain.com/some_file.tgz
is splitting on the first @, not the second.
Encode the '@' as %40 and this will work. For example:
wget ftp://someuser:[EMAIL PROTECTED]/some_file.tgz
Is this a problem
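The %40 workaround can be automated by percent-encoding '@' inside the userinfo before building the URL, since only the final '@' separates userinfo from host. A hypothetical helper, not Wget code:

```c
#include <assert.h>
#include <string.h>

/* Copy src into dst, replacing every '@' with "%40".
   dst must hold at least 3 * strlen(src) + 1 bytes. */
static void encode_userinfo(char *dst, const char *src)
{
    for (; *src; src++) {
        if (*src == '@') {
            *dst++ = '%'; *dst++ = '4'; *dst++ = '0';
        } else {
            *dst++ = *src;
        }
    }
    *dst = '\0';
}
```

Applied to the user:password part only; the host and path are left alone, so the one remaining '@' is unambiguous.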
Seemant Kulleen [EMAIL PROTECTED] writes:
I wanted to alert you all to a bug in wget, reported by one of our
(gentoo) users at:
https://bugs.gentoo.org/show_bug.cgi?id=69827
I am the maintainer for the Gentoo ebuild for wget.
If someone would be willing to look at and help us with that
Seemant Kulleen [EMAIL PROTECTED] writes:
Since I don't use Gentoo, I'll need more details to fix this.
For one, I haven't tried Wget with socks for a while now. Older
versions of Wget supported of --with-socks option, but the procedure
for linking a program with socks changed since then,
This problem has been fixed for the upcoming 1.10 release. If you
want to try it, it's available at
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha2.tar.bz2
Hi Jorge!
Current wget versions do not support large files (> 2 GB).
However, the CVS version does and the fix will be introduced
to the normal wget source.
Jens
(just another user)
When downloading a file of 2 GB or more, the counter goes crazy; probably
it should use a long instead of an int.
I don't know why you say that. I see bug reports and discussion of fixes
flowing through here on a fairly regular basis.
Mark Post
-Original Message-
From: Dan Jacobson [mailto:[EMAIL PROTECTED]
Sent: Tuesday, March 15, 2005 3:04 PM
To: [EMAIL PROTECTED]
Subject: bug-wget still
Dan Jacobson [EMAIL PROTECTED] writes:
Is it still useful to mail to [EMAIL PROTECTED]? I don't think
anybody's home. Shall the address be closed?
If you're referring to Mauro being busy, I don't see it as a reason to
close the bug reporting address.
P I don't know why you say that. I see bug reports and discussion of fixes
P flowing through here on a fairly regular basis.
All I know is my reports for the last few months didn't get the usual (any!)
cheery replies. However, I saw them on Gmane, yes.
Quoting Alan Robinson [EMAIL PROTECTED]:
When downloading a 4.2 gig file (such as from
ftp://movies06.archive.org/2/movies/abe_lincoln_of_the_4th_ave/abe_lincoln_o
f_the_4th_ave.mpeg) causes the status text (i.e.
100%[+===] 38,641,328  213.92K/s  ETA
00:00)
Hi Jason!
If I understood you correctly, this quote from the manual should help you:
***
Note that these two options [accept and reject based on filenames] do not
affect the downloading of HTML files; Wget must load all the HTMLs to know
where to go at all--recursive retrieval would make no
On Sun, Aug 22, 2004 at 08:02:54PM +0200, Jan Minar wrote:
+/* vasprintf() requires _GNU_SOURCE. Which is OK with Debian. */
+#ifndef _GNU_SOURCE
+#define _GNU_SOURCE
This must be done before stdio.h is included.
+#endif
+#include <ctype.h>
+
#ifndef errno
extern int errno;
#endif
@@
tags 261755 +patch
thanks
On Sun, Aug 22, 2004 at 11:39:07AM +0200, Thomas Hood wrote:
The changes contemplated look very invasive. How quickly can this
bug be fixed?
Here we go: hacky, non-portable, but pretty slick "non-invasive",
whatever that means. Now I'm going to check whether it is
Tristan Miller [EMAIL PROTECTED] writes:
There appears to be a bug in the documentation (man page, etc.) for
wget 1.9.1.
I think this is a bug in the man page generation process.
Yup; 1.9.1 cannot download large files. I hope to fix this by the
next release.
Juhana Sadeharju [EMAIL PROTECTED] writes:
Command: wgetdir "http://liarliar.sourceforge.net".
Problem: Files are named as
content.php?content.2
content.php?content.3
content.php?content.4
which are interpreted, e.g., by Nautilus as manual pages and are
displayed as plain texts. Could
D Richard Felker III [EMAIL PROTECTED] writes:
The request log shows that the slashes are apparently respected.
I retried a test case and found the same thing -- the slashes were
respected.
OK.
Then I remembered that I was using -i. Wget seems to work fine with
the url on the command
The whole matter of conversion of "/" to "/index.html" on the file
system is a hack. But I really don't know how to better represent an
empty trailing file name on the file system.
Hrvoje Niksic wrote:
The whole matter of conversion of "/" to "/index.html" on the file
system is a hack. But I really don't know how to better represent an
empty trailing file name on the file system.
Another, for now rather limited, hack: on file systems which support some
sort of file attributes
On Mon, Mar 01, 2004 at 07:25:52PM +0100, Hrvoje Niksic wrote:
Removing the offending code fixes the problem, but I'm not sure if
this is the correct solution. I expect it would be more correct to
remove multiple slashes only before the first occurrence of "?", but
not afterwards.
D Richard Felker III [EMAIL PROTECTED] writes:
The following code in url.c makes it impossible to request urls that
contain multiple slashes in a row in their query string:
[...]
That code is removed in CVS, so multiple slashes now work correctly.
Think of something like
On Mon, Mar 01, 2004 at 03:36:55PM +0100, Hrvoje Niksic wrote:
D Richard Felker III [EMAIL PROTECTED] writes:
The following code in url.c makes it impossible to request urls that
contain multiple slashes in a row in their query string:
[...]
That code is removed in CVS, so multiple
D Richard Felker III [EMAIL PROTECTED] writes:
Think of something like http://foo/bar/redirect.cgi?http://...
wget translates this into: [...]
Which version of Wget are you using? I think even Wget 1.8.2 didn't
collapse multiple slashes in query strings, only in paths.
I was using
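The behavior described (collapsing duplicate slashes only in the path, never in the query string) can be sketched as an in-place pass that stops rewriting at the first '?'. A hypothetical helper operating on the part after the host, not the CVS code:

```c
#include <assert.h>
#include <string.h>

/* Collapse "//" runs in the path part of a URL suffix in place,
   leaving everything from the first '?' (the query) untouched. */
static void collapse_path_slashes(char *s)
{
    char *r = s, *w = s;
    int in_query = 0;

    while (*r) {
        if (*r == '?')
            in_query = 1;
        if (!in_query && *r == '/' && w != s && w[-1] == '/') {
            r++;               /* skip duplicate slash in the path */
            continue;
        }
        *w++ = *r++;
    }
    *w = '\0';
}
```

Run on a redirect-style URL like /bar//redirect.cgi?http://..., the path loses its double slash while the query string keeps its "//" intact.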
Interesting. Is it really necessary to zero out sockaddr/sockaddr_in
before using it? I see that some sources do it, and some don't. I
was always under the impression that, as long as you fill the relevant
members (sin_family, sin_addr, sin_port), other initialization is not
necessary. Was I
Manfred Schwarb [EMAIL PROTECTED] writes:
Interesting. Is it really necessary to zero out sockaddr/sockaddr_in
before using it? I see that some sources do it, and some don't. I
was always under the impression that, as long as you fill the relevant
members (sin_family, sin_addr, sin_port),
francois eric [EMAIL PROTECTED] writes:
after some tests:
bug is when: ftp, with username and password, with bind address specified
bug is not when: http, ftp without username and password
Looks like a memory leak, so I made some modifications before bind:
src/connect.c:
--
...
/*
Frank Klemm [EMAIL PROTECTED] writes:
Wget doesn't work properly when the URL contains characters which are
not allowed in file names on the file system which is currently
used. These are often '\', '?', '*' and ':'.
Affected are at least:
- Windows and related OS
- Linux when using FAT or
You're right -- that code was broken. Thanks for the patch; I've now
applied it to CVS with the following ChangeLog entry:
2003-10-15 Philip Stadermann [EMAIL PROTECTED]
* ftp.c (ftp_retrieve_glob): Correctly loop through the list whose
elements might have been deleted.
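The class of bug fixed by that patch (deleting elements from a list while looping over it) is commonly avoided by iterating through a pointer-to-pointer, so the link being rewritten is always the one in hand. A generic sketch, not the actual ftp_retrieve_glob code:

```c
#include <assert.h>
#include <stdlib.h>

struct fileinfo {
    int keep;                  /* stand-in for the real matching logic */
    struct fileinfo *next;
};

/* Remove every element with keep == 0, freeing it, and return the
   new head.  pp always points at the link we may rewrite, so a
   deletion never invalidates the iterator. */
static struct fileinfo *filter_list(struct fileinfo *head)
{
    struct fileinfo **pp = &head;
    while (*pp) {
        if (!(*pp)->keep) {
            struct fileinfo *dead = *pp;
            *pp = dead->next;  /* unlink before the node disappears */
            free(dead);
        } else {
            pp = &(*pp)->next;
        }
    }
    return head;
}
```

Because the previous node is never dereferenced after a removal, deleting the head, the tail, or several adjacent elements all work without special cases.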
From: Gisle Vanem [mailto:[EMAIL PROTECTED]
Jens Rösner [EMAIL PROTECTED] said:
...
I assume Heiko didn't notice it because he doesn't have that function
in his kernel32.dll. Heiko and Hrvoje, will you correct this ASAP?
--gv
Probably.
Currently I'm compiling and testing on NT 4.0
Jens Rösner [EMAIL PROTECTED] said:
I downloaded
wget 1.9 beta 2003/09/29 from Heiko
http://xoomer.virgilio.it/hherold/
...
wget -d http://www.google.com
DEBUG output created by Wget 1.9-beta on Windows.
set_sleep_mode(): mode 0x8001, rc 0x8000
I disabled my wgetrc as well and the
Gisle Vanem [EMAIL PROTECTED] writes:
--- mswindows.c.org Mon Sep 29 11:46:06 2003
+++ mswindows.c Sun Oct 05 17:34:48 2003
@@ -306,7 +306,7 @@
DWORD set_sleep_mode (DWORD mode)
{
HMODULE mod = LoadLibrary ("kernel32.dll");
- DWORD (*_SetThreadExecutionState) (DWORD) = NULL;
+
This problem is not specific to timeouts, but to recursive download (-r).
When downloading recursively, Wget expects some of the specified
downloads to fail and does not propagate that failure to the code that
sets the exit status. This unfortunately includes the first download,
which should
OK, I see.
But I do not agree.
And I don't think it is a good idea to treat the first download special.
In my opinion, exit status 0 means everything during the whole
retrieval went OK.
My preferred solution would be to set the final exit status to the highest
exit status of all individual
Randy Paries [EMAIL PROTECTED] writes:
Not sure if this is a bug or not.
I guess it could be called a bug, although it's no simple oversight.
Wget currently doesn't support large files.
how do I get off this list? I tried a few times before but
got no response from the server.
thank you-
Matt
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Tuesday, September 23, 2003 8:53 PM
To: Randy Paries
Cc: [EMAIL PROTECTED]
Subject: Re: bug maybe