Thanks everybody for the kind welcome. I hope Anthony and I will do a
good job here.
Now, to be concrete: I guess the first step is to move the revision
control to Savannah. I have only one question: would you like to continue
using Mercurial, or migrate to Git? The latter is becoming a de-facto
Micah Cowan mi...@cowan.name writes:
I think the main thing with migrating to Savannah is that there are a
lot of developers who would have commit access, where there's really
only a couple people in there who are active developers. So you may want
to prune that list (see who's actually in
Jens Schleusener jens.schleuse...@t-systems-sfr.com writes:
(for the Germans: Giuseppe spoken Tschuseppe ?)
It is more like: Jewseppee
This can help you better:
http://www.pronounceitright.com/pronuncia.php?id_pronuncia=3631
OK, that is what I was afraid of. Maybe that should be briefly mentioned
Hi Micah,
Micah Cowan mi...@cowan.name writes:
I think it will be cleaner to use gnulib in the same way as other
projects are doing it, not checking in the results but using a
bootstrap script. To force a specific revision of gnulib, a git
submodule can be used.
Yeah, but then you need to
Hi,
Jeff, thanks for the patch; and also thanks to Hrvoje for the
review. I'll apply it as soon as we move to Savannah.
Cheers,
Giuseppe
Hrvoje Niksic hnik...@xemacs.org writes:
I am not the maintainer, but if you agree with my reasoning it certainly
won't hurt to resubmit the patch.
Hello Linda,
to contribute your changes back to GNU wget, you need to sign copyright
assignments to the FSF; that is the only overhead, and of course the
changes must be accepted.
Cheers,
Giuseppe
Linda Walsh w...@tlinx.org writes:
If we wanted to modify wget and check back in changes
Hello,
Jill Brandmeir jill.brandm...@oracle.com writes:
Hi,
The following feedback came into a Sun feedback system from
mike.irv...@atosorigin.com.
I get error 403 Forbidden when I try wget
The comment came in 4/26/10.
Can you please provide more information?
Cheers,
Giuseppe
Hello wget hackers,
I have migrated the GNU Wget repository from Mercurial to Bazaar.
The new repository is accessible here:
bzr branch http://bzr.savannah.gnu.org/r/wget/trunk
Are there pending patches that should be applied?
Cheers,
Giuseppe
Thanks for your bug report!
I don't have a Solaris 10 system to test my patch on, but I have looked at
the generated `configure' file and it seems correct.
Would you mind trying this patch? To get a new `configure', you need to
run `autoreconf'.
Cheers,
Giuseppe
=== modified file
Hi Micah,
I have already committed a patch fixing it. I found the same problem on
some other projects as well :-)
Cheers,
Giuseppe
Micah Cowan mi...@cowan.name writes:
Douglas E. Engert wrote:
wget-1.12 configure on Solaris 10 would fail trying to look
at .. for a number of files. The
I don't see any problem: you wrote it, so you choose the license.
Anyway, if you don't have a specific reason, wouldn't it be better to use
the GPL instead of the LGPL? :-)
Cheers,
Giuseppe
Crazy Pete crazypet...@yahoo.com writes:
Hi everyone!
I have created the following project:
Hello,
A new alpha version is here:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2359.tar.bz2
and the detached gpg signature:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2359.tar.bz2.sig
It contains other changes to the build system. It should work fine on
MinGW/MSYS now.
Please report here any
Thanks for your patch. I have recently made many changes to the build
system; would you please test this alpha version on your system?
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2359.tar.bz2
Cheers,
Giuseppe
Rainer Orth r...@cebitec.uni-bielefeld.de writes:
I've just tried to build wget 1.12
what web sites are you trying to access, and what wget version are you
using?
It smells like the server is sending chunked transfer-encoding data
regardless of the HTTP version specified by wget. You can try building
wget from the source repository, or using a recent alpha tarball where
HTTP/1.1 is
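For reference, a chunked body carries its own lengths on the wire, which is why a client that never advertised HTTP/1.1 cannot parse it. A minimal sketch of what such a body looks like, printed locally with no server involved:

```shell
# A chunked HTTP/1.1 body: each chunk is a hex length, CRLF, the
# data, CRLF; a zero-length chunk terminates the body.
printf '5\r\nhello\r\n0\r\n\r\n' > chunked-body.txt
wc -c < chunked-body.txt   # 15 bytes on the wire for 5 bytes of payload
```

An HTTP/1.0-only client sees the framing bytes as part of the payload, which matches the corruption described above.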
Hi Guillaume,
Guillaume Turri guillaume.tu...@ecl2008.ec-lyon.fr writes:
Indeed, according to this page
http://wget.addictivecode.org/RepositoryAccess I thought the current
repository was the Mercurial one.
How could I have found that out if I hadn't read this mailing list?
Have I made a
Hello,
thanks for your report.
Can you please rebuild wget after applying this patch?
Cheers,
Giuseppe
=== modified file 'src/css-tokens.h'
--- src/css-tokens.h	2010-05-08 19:56:15 +
+++ src/css-tokens.h	2010-05-24 10:04:21 +
@@ -61,6 +61,6 @@
NUMBER,
URI,
FUNCTION
-}
Alexander Lane onthesp...@gmail.com writes:
I've encountered a website that does not put the at the end of
some of its img tags. Wget skips downloading those images as a result,
but I checked several web browsers they were all able to cope with
it.
I don't know whether this was done in an
Hello,
thanks for your report. I am not sure that the URL normalisation should
collapse multiple consecutive forward slashes, I don't see anything
about it in RFC 1808. We can't assume that foo//bar is the same as
foo/bar, it could be handled differently by the server, for example it
may be
Thanks!
I have removed wsock32, now the wget executable on Windows links to
ws2_32.
Cheers,
Giuseppe
Keisial keis...@gmail.com writes:
If you use ws2_32 you don't need to link with wsock32. In fact wsock32
is mostly forwarded functions.
Linking to wsock32 and not ws2_32 has the advantage
Keisial keis...@gmail.com writes:
SciFi wrote:
Another point is that all of wget's perl shell procs are hard-coded with
#!/usr/bin/perl which again points to Apple's and not the newer one we
installed from ActiveState.com. The version mismatch causes symbols to be
missed during make's
Thanks for the report. It should be fixed now, as -I ../md5 is not
present anymore.
Can you please test this alpha tarball?
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2359.tar.bz2
Cheers,
Giuseppe
Jay K jay.kr...@cornell.edu writes:
% uname
OSF1
% uname -r
V5.1
% cc -V
Compaq C V6.3-025
...@cowan.name.\n),
+ fputs (_(Currently maintained by Giuseppe Scrivano
gscriv...@gnu.org.\n),
stdout);
fputs (_(Please send bug reports and questions to bug-wget@gnu.org.\n),
stdout);
I would rather drop this line altogether.
Thanks!
Giuseppe
Thanks for your patch, but I can't apply it, since this fix is already
present in the source repository.
Cheers,
Giuseppe
tho t...@koanlogic.com writes:
--- http.c.orig 2010-06-10 14:01:43.0 +0200
+++ http.c	2010-06-10 14:01:55.0 +0200
@@ -1829,6 +1829,13 @@
/*
Muthu Subramanian K muthus...@gmail.com writes:
oh...I missed it :) just checked if it works for me...sorry about that...
Btw, would other protocols work as well (say, multiple http and ftp downloads
with resume)?
I have checked multiple FTP URLs with resume and it seems to work as
expected.
Hi,
I would like to drop the windows/ subdirectory completely. The build
under Windows can be done easily with Cygwin or MinGW in the GNU way:
./configure && make.
My feeling is that the Makefiles present under windows/ need more
effort to keep updated than the real benefit we can get from
Hrvoje Niksic hnik...@xemacs.org writes:
v...@mage.me.uk v...@mage.me.uk writes:
Thanks for the encouragement! I've attached a patch which should tell
the user there is a problem with the system wgetrc file and exit. Seems
suspiciously simple, can anyone spot any problems with it?
[...]
+
Daniel Stenberg dan...@haxx.se writes:
Please change the type of the variable `ok' to `bool' and include
this change in your patch, also include stdbool.h.
Are you then dropping everything pre C99? I'm just curious as I
thought wget traditionally aimed to work fine even with older
Jochen Roderburg roderb...@uni-koeln.de writes:
With a filename coming over Content-Disposition?;-)
I see now. Sorry for the misunderstanding.
I have pushed a fix.
Thanks,
Giuseppe
Jochen Roderburg roderb...@uni-koeln.de writes:
I have to admit of course, that the combination of a usable
Modify-Date and a Content-Disposition filename will be very rare in
the wild, but nevertheless possible. I have seen the
Content-Disposition headers mostly when the reply data is
Hello Christopher,
I have spent some time in the past weeks fixing the Windows build;
with the last alpha tarball, ./configure && make should be enough to
build wget. Do you know of other problems under Windows (besides IPv6,
SSL and NTLMv1)?
What I can say is that they should be fixed before the next
Micah Cowan mi...@cowan.name writes:
On 06/14/2010 08:32 AM, Giuseppe Scrivano wrote:
By the way, I see that currently OpenSSL is preferred over GNU TLS (not
only under Windows), I would invert this.
The current GNU TLS support is broken: that needs to be fixed first. My
understanding
Thanks for your report! I have updated the help string for
--random-wait.
Cheers,
Giuseppe
Tom Mizutani gombei1...@gmail.com writes:
I recently looked into the Wget documentation and found that the
specification of the --random-wait option seems to have been
changed, but not explicitly announced in
Solar Designer so...@openwall.com writes:
I think Florian should have replied to you by now. Please confirm.
(I just want to ensure that a possible loss of an e-mail message doesn't
result in duplicate work or whatever.)
I haven't received any reply yet.
As an alternative to copyright
oh...@cox.net writes:
I have been using wget, together with 'time', with the following command line
parameters:
time wget --page-requisites --secure-protocol=SSLV3 --load-cookies
cookies.txt --keep-session-cookies
https://portal.foo.com/test/appmanager/portal/desktop
However, when I do
Hello Timothy,
this bug is fixed by commit 2363. The fix is present in the latest alpha
tarball: ftp://alpha.gnu.org/gnu/wget/wget-1.12-2392.tar.bz2.
Or you can apply this small patch.
=== modified file 'src/host.c'
--- src/host.c 2010-05-08 19:56:15 +
+++ src/host.c 2010-05-25 16:36:02
Paul pk...@finestplanet.com writes:
When wget encounters a URL that ends in a slash, since there's no file
name, wget has no idea what it should name it, but does have to name it
something. So it goes with the traditional index.html filename.
I would like to prevent the downloading of this
Hi Nguyen,
there was a similar discussion on this mailing list not long ago:
http://lists.gnu.org/archive/html/bug-wget/2010-06/msg00108.html
Cheers,
Giuseppe
Nguyen Kim Son nguyenk...@gmail.com writes:
Hello,
I'd like to know if there exists an API for wget? I am trying to write a
We are still waiting for the FSF to receive the copyright papers.
In the meantime, Florian, can you please send me a cleaned-up copy of your
patch that works with the latest Bazaar revision, and include an entry for
the ChangeLog file?
Thanks,
Giuseppe
Doruk Fisek dfi...@fisek.com.tr writes:
So what
Hi,
thanks for the report. I have changed this function's name so it doesn't
clash with the gnulib module (which is used by gnutls).
I have attached a patch; can you try it?
Cheers,
Giuseppe
Ploni Almoni pl...@hotmail.com writes:
Hello,
I am not convinced it is desirable to implement this feature in wget,
as it can easily be done with sha1sum or md5sum (and a small script).
These tools can also read checksums from files and verify them.
It is quite meaningless, I think, to print the checksum to the screen, as
it can't easily
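The point about sha1sum reading and verifying sums from a file can be sketched in a couple of lines of shell; the filenames here are illustrative stand-ins for a wget download:

```shell
# Verify a downloaded file against a stored checksum list; sha1sum
# already reads sums from a file, so no wget feature is needed.
printf 'hello\n' > downloaded.txt      # stand-in for a wget download
sha1sum downloaded.txt > sums.sha1     # record the checksum
sha1sum -c sums.sha1                   # prints "downloaded.txt: OK"
```

The same works with md5sum or sha256sum; only the tool name changes.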
Hi Hrvoje,
we can relax the gettext version, but the bootstrap script is used only
when the code is compiled from the source repository; gettext is needed
there to generate files that are then distributed in source releases. It
is a dependency only in the bootstrap phase. If
Jochen Roderburg roderb...@uni-koeln.de writes:
With the brand-new autoconf v2.66 the wget build process does not work
any longer with error message:
configure.ac:172: error: AC_CHECK_SIZEOF: requires literal arguments
../../lib/autoconf/types.m4:765: AC_CHECK_SIZEOF is expanded from...
Jozua jo...@sparky.za.net writes:
Hi.
When continuing a ftp download, an incorrect value is used for the
total file size.
The SIZE command returns the correct size, but the value used comes
from the response to the RETR command, which (at least in this case)
is the number of bytes
Hi Daniel,
Daniel Stenberg dan...@haxx.se writes:
I consider the size from SIZE to be much more reliable than the size
you need to guess from the RETR response - based on the fact that
SIZE has a documented way to return the exact size while RETR has
not. No matter which happens to be the
Hi Minato,
I wasn't able to reproduce this problem; I tried with a directory
containing 5 files.
What do you get using this command (be sure that website.com is the real
one)?
strace -e open,socket wget -q -nc -r -l inf --no-remove-listing \
http://website.com/ 2>&1 | tail
Can
Hello Minato,
Minato Namikaze camusen...@hotmail.com writes:
socket(PF_INET, SOCK_DGRAM|SOCK_NONBLOCK, IPPROTO_IP) = -1 EMFILE (Too many
open files)
socket(PF_INET, SOCK_DGRAM|SOCK_NONBLOCK, IPPROTO_IP) = -1 EMFILE (Too many
open files)
thanks for your further investigation. Can you
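EMFILE is the errno returned when a process hits its open-file-descriptor limit, so one quick check on the reporter's side would be the following (assuming a Bourne-style shell; the limit value itself is system-dependent):

```shell
# Show the per-process file-descriptor limit; a wget run that ends
# in EMFILE has either hit this limit or is leaking descriptors.
ulimit -n
```

If the printed limit is reasonable (say, 1024) and wget still fails on a small mirror job, that points at a descriptor leak rather than a mis-tuned system.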
you can be interested in this previous discussion:
http://lists.gnu.org/archive/html/bug-wget/2010-07/msg7.html
Does it solve the problem for you?
Cheers,
Giuseppe
Chaim Millet aicit...@gmail.com writes:
Why doesn't wget use the system proxy settings?
(ubuntu 10.04)
Would it be
Florian Weimer f...@deneb.enyo.de writes:
I wonder if it makes any sense to work on this patch. Perhaps you
should find someone with an assignment on file who can do a clean-room
reimplementation?
sorry for the huge delay; there were some problems at the FSF with the
copyright assignments
Caleb,
Caleb Cushing xenoterrac...@gmail.com writes:
I thought it was working because sometimes I seemed to get the files
updated but the last couple of days no updates came in, so I turned
continue off and by timestamp check alone it downloaded new files. So
I guess it must not work all the
I couldn't reproduce this problem, can you please give me more
information?
Can you use the -d flag to wget and see what happens?
Thanks,
Giuseppe
Avinash pavin...@gmail.com writes:
Hi All,
I am trying to use --spider option with an URL ending with '/'.
The error I am getting is,
What wget version are you using? It works well for me using wget 1.12.
Cheers,
Giuseppe
gary jefferson garyjefferson...@gmail.com writes:
I'm using 'wget -nc -p -k -r --default-page=index.html http://webpy.org/'
This works fine for most of the site, but fails on pages such as
Thanks for your contribution! It looks good, but I want to check the
patch more carefully before using it.
Cheers,
Giuseppe
John Trengrove jtrengr...@gmail.com writes:
This is a patch to change the behaviour for FTP directory listing.
Currently the hours are printed only if the hour is non-zero and
Jochen Roderburg roderb...@uni-koeln.de writes:
OTOH I also saw that the patch as such is not yet complete and does
not yet cover all aspects of the underlying problem.
It seems that setting contentdisposition=on (what I also have
permanently in my wget configuration) circumvents the patch.
Hi Hrvoje,
Hrvoje Niksic hnik...@xemacs.org writes:
That thread doesn't really address the question of Wget using the
system proxy setting that the OP is asking for. I've just tried the
following sequence of steps:
1. configure a proxy in (ubuntu/gnome) system-preferences-network
proxy
Hello Inaki,
can you also specify -d and post here the last lines you get (or any
other message that you consider useful for us)?
As another test, can you use valgrind to trace memory usage of wget?
What does ulimit -m tell you?
Cheers,
Giuseppe
Inaki San Vicente sanbirenpo...@gmail.com
Johnny yggdra...@gmx.co.uk writes:
I am trying to fetch a complete set of pdf docs, whereof some are
hidden in a collapsible list; if you visit the site you must expand
the list to get the docs. Using wget, I cannot get all the files (the
top level files downloads, but not the rest).
This
Keisial keis...@gmail.com writes:
That scss file contains several entries like:
background-image:url( );
background-image:url();
I have applied the following patch. It should fix the issue reported.
Cheers,
Giuseppe
=== modified file 'src/css-url.c'
--- src/css-url.c 2010-07-29
I have just uploaded a new alpha version of wget.
Since the last alpha release, various bugs have been fixed and the GNU
TLS backend has been improved.
For more details you can look at the ChangeLog, or run
bzr log -r2392..2416 after you check out a fresh copy of Wget from the
Bazaar repository
Hello,
David sk.ran...@gmail.com writes:
Hi all,
I'm having issues with Wget 1.12 (on Ubuntu 10.04, 32 bit) and Wget
skipping converting links with spaces in them. Wget downloads the
linked files fine if they have spaces in the filenames, but skips
converting the link to relative when
Manuel Reinhardt reinha...@syslab.com writes:
When downloading an html document with
wget -E -k -p http://...
I noticed that sometimes some of the Stylesheets are not found when
opening the local copy. This happens when the html uses a CSS @import
statement that includes a URL containing
Charles Kozierok charles...@gmail.com writes:
Giuseppe, here's the output requested. Thanks.
Can you attach the working version too?
Thanks,
Giuseppe
Charles Kozierok charles...@gmail.com writes:
But I guess the bigger question is why a Windows version of wget
wouldn't gracefully handle removal of the standard end-of-line
characters. I'm not even sure I can figure out how to properly write
the data out to the file without the 0A0D at the
Dennis, CHENG Renquan crq...@fedoraproject.org writes:
On Thu, Sep 16, 2010 at 4:55 PM, Giuseppe Scrivano gscriv...@gnu.org wrote:
Hi Cheng,
the development version is hosted on the bazaar repository.
If you have problems accessing the bazaar repository then you can try
with this alpha
Thanks!
I have made some small changes and pushed the result as commit
2428.
Giuseppe
Merinov Nikolay kim.roa...@gmail.com writes:
Giuseppe Scrivano gscriv...@gnu.org writes:
Can you please write some documentation for users, so I can commit your
patch?
I am fix conflict with 2409
Hello,
I have just uploaded a new alpha tarball containing all the recent
changes done to wget.
If no problems are found, hopefully a release can be done in the next
2-3 weeks.
These are the noteworthy changes since wget 1.12:
** Support HTTP/1.1
** Fix some portability issues.
** Handle
Hello Daniel,
it seems that a clean gnulib bootstrap has fixed these warnings.
Probably some old files were left in my workspace.
I am going to take a look at the problems reported by clang.
Thanks,
Giuseppe
Daniel Stenberg dan...@haxx.se writes:
On Sun, 3 Oct 2010, Giuseppe Scrivano wrote:
I
Thanks, the patch looks good. I have made some minor adjustments to
follow the GNU coding standards; the adjusted patch is attached.
Can you please provide a ChangeLog entry?
Cheers,
Giuseppe
=== modified file 'src/html-url.c'
--- src/html-url.c 2010-07-29 23:00:26 +
+++ src/html-url.c
committed!
Thanks,
Giuseppe
Steven Schubiger s...@member.fsf.org writes:
ftp.c: In function 'getftp':
ftp.c:506: warning: 'targ' may be used uninitialized in this function
=== modified file 'src/ftp.c'
--- src/ftp.c 2010-09-29 11:34:09 +
+++ src/ftp.c 2010-10-11 13:29:34 +
@@
oh...@cox.net writes:
I was wondering if wget is able to follow the URL in a page that contains a
META-REFRESH tag?
It does.
Cheers,
Giuseppe
oh...@cox.net writes:
Does it do that automatically (default), or is there a command line parameter
to enable that functionality?
you don't need to specify it. META-REFRESH URLs are followed like any
other URL found in the page.
Giuseppe
Rahul Prasad rahul.pa...@gmail.com writes:
What if someone does not know Shell Script ?
Scripting is not a feature of Wget; it's a feature of the shell. I want to
add this feature to wget so that end users won't have to write a script.
The user can learn how to write a basic shell script.
IMO, it is a
Rahul Prasad rahul.pa...@gmail.com writes:
2010/10/15 Giuseppe Scrivano gscriv...@gnu.org
The user can learn how write a basic shell script.
So, you mean we should remove all the features which can be achieved
by scripting.
Only those that can be _easily_ achieved by scripting
Rahul Prasad rahul.pa...@gmail.com writes:
We are not going to impose anything, there is nothing complicated in a
shell script expansion.
Not complicated for whom? Ask a noob and you will get your answer.
Isn't your proposed solution an expansion?
I don't think that wget -batch
Daniel Stenberg dan...@haxx.se writes:
I pulled the latest with bzr just now and ran clang-analyzer on the
code base. This is the report:
http://daniel.haxx.se/wget/2010-10-03-1/
I have fixed all of these warnings reported by clang-analyzer.
If there are no other surprises, hopefully I will
Hello Micah,
Micah Cowan mi...@cowan.name writes:
One could always save away timestamp and length information (and
possibly a checksum) from the original, and then compare afterwards to
see if there was a change. Not ideal, I agree, but personally I dislike
the idea of adding command-line
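The idea of saving timestamp, length and a checksum and comparing afterwards can be sketched in a few lines of shell; the filename and the simulated re-download are illustrative, and `stat -c` assumes GNU coreutils:

```shell
# Record size, mtime and checksum before a re-download, then compare
# afterwards to detect whether the file actually changed.
f=page.html
printf 'old contents\n' > "$f"
before=$(stat -c '%s %Y' "$f"; sha1sum "$f")   # GNU stat: size, mtime
printf 'new contents!\n' > "$f"                # simulate the re-download
after=$(stat -c '%s %Y' "$f"; sha1sum "$f")
[ "$before" != "$after" ] && echo 'file changed'
```

Comparing all three fields catches both content changes (checksum) and same-content metadata changes (mtime), which is what makes the idea workable without extra command-line options.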
Have you already tried specifying -e robots=off?
Cheers,
Giuseppe
Daum matt...@gmail.com writes:
wget --span-hosts --convert-links --page-requisites
http://boston.craigslist.org/gbs/abo/2028061114.html doesn't seem to be
bringing down the images. I've included the source of that page
because the server returns a different page.
It seems that the following command works:
wget --header='Accept-Language: it-it' http://www.tecut2.it/spinetoli/
Cheers,
Giuseppe
Massimiliano Ciancio mcian...@gmail.com writes:
Hi all,
I'm trying to wget the following page:
Hello,
the file size should be considered as well. Can you please give us more
information about the FTP server and the wget version you are using?
Thanks,
Giuseppe
Tim Lam t@epuron.com.au writes:
Is there a way I can make wget retrieve the file based purely on filesize and
not on
thanks for your patch! I have looked at it and it seems fine, but in
order to accept it into wget, we will have to complete the FSF
copyright-assignment process.
I will send you more details in a separate e-mail.
Cheers,
Giuseppe
Filipe Brandenburger filbran...@gmail.com writes:
Thanks, the patch looks fine. I am going to apply it with some
adjustments.
Giuseppe
Reza v...@mage.me.uk writes:
Hi all
I was thinking of fixing this: http://savannah.gnu.org/bugs/?20370 . I
remember the main concern being that the user wget would be unable to
recover if the system
Thanks for your report! Wget translations are handled by the Translation
Project; can you please report this to the proper translation team:
http://translationproject.org/domain/wget.html
Thanks,
Giuseppe
wks1986 wks1...@gmail.com writes:
In wget-1.12, the string eta (estimated time of
Thanks for your report, but this problem is already fixed in the
development version by commit #2317.
Giuseppe
Orion Poplawski or...@cora.nwra.com writes:
--2010-12-01 10:27:05--
https://tn123.org/mod_xsendfile/mod_xsendfile-0.12.tar.bz2
Connecting to
Hello Petr,
Petr Pisar petr.pi...@atlas.cz writes:
On Thu, Dec 02, 2010 at 10:21:29PM +0100, Giuseppe Scrivano wrote:
I am not sure yet about the next release, I can't apply a patch because
the author hasn't assigned copyright to the FSF yet. I don't think
there will be a release before 2
Hello,
I get -I ../lib as argument to the compiler using the development
version of wget. What version have you tried?
Thanks,
Giuseppe
Perry Smith pedz...@gmail.com writes:
On a system that does not have getopt.h in its /usr/include, the build will
stop when src/main.c is compiled for
writes:
Hi,
I only tried the released 1.12.
I notice that the SCM being used is Bazaar (which is yet another SCM
-- and I don't have it). Plus, it appears I need a password to access
it.
Perry
On Dec 9, 2010, at 6:32 PM, Giuseppe Scrivano wrote:
Hello,
I get -I ../lib as argument
I am not sure how this can be achieved under Windows. If you don't need
a GET request and a HEAD request is enough, then you can use --spider.
IIRC, the shell under Windows provides NUL, which can somewhat emulate
/dev/null. I don't have a Windows system here to try it, but you can.
Does:
wget
Wines Joe IM joewi...@tfl.gov.uk writes:
Giuseppe,
Thanks for your response. Unfortunately it doesn't work with NUL or
nul (tried both). The trigger file contains nothing so I have no
idea why it is creating it?!
an empty file is still a valid file. What about removing it after the
Hi Jonas,
wget 1.12 doesn't support HTTP/1.1, so the Transfer-Encoding: chunked
header is not handled. The development version[1] supports
HTTP/1.1. If you are already using it, can you please attach the wget
-d output?
In any case, wget specifies HTTP/1.0 in the request; your web server
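To make the point concrete, this is roughly the request an HTTP/1.0 client like wget 1.12 sends (host name illustrative, headers abridged); a server answering it with a chunked body is misbehaving:

```shell
# The request line advertises HTTP/1.0, so the server must not reply
# with Transfer-Encoding: chunked, which is an HTTP/1.1 mechanism.
printf 'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n' > request.txt
head -n1 request.txt   # prints "GET / HTTP/1.0" (plus a trailing CR)
```

Checking the request line in a `wget -d` trace is the quickest way to confirm which protocol version was actually negotiated.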
Michelle Konzack linux4miche...@tamay-dogan.net writes:
I get such results, if I am on the target server and try to test
something while using U320 SCSI drives which make up to 230 MByte/sec.
Downloading of 12 MByte from a Harddisk which make more then 120 MByte/s
confuse wget.
Can
Hello,
Bernd ml...@gmx.de writes:
When spidering for a document on the web I get a :
HTTP request sent, awaiting response... 416 Requested Range Not Satisfiable
The file is already fully retrieved; nothing to do.
when the same file is in my working directory.
are you using --continue?
Sylvain Paré sylvain.p...@gmail.com writes:
Hi thanks for this work!
When will those patches be released? (i.e. a new release of wget)
Regards,
Sylvain
There is a pending patch before I can prepare a release. Unfortunately
I can't apply it until the FSF has received the copyright papers
Hi Leonard,
Leonard Ehrenfried leonard.ehrenfr...@web.de writes:
I'd be grateful for a ruthless code review - it's the only way to learn!
I have some comments about your patch:
=== modified file 'src/main.c'
--- src/main.c	2011-01-01 12:19:37 +
+++ src/main.c	2011-01-25
Hello Gilles,
thanks for your patch. I am not sure it is a good idea to use stderr
to prompt a message to the user. I would just inhibit the message when
-O- is used.
Cheers,
Giuseppe
Gilles Carry gilles.ca...@st.com writes:
Hello,
Here is a small patch to change the ask-password
Thanks for your contribution. I have just applied your patch.
Giuseppe
Steven Schubiger s...@member.fsf.org writes:
Patch attached.
=== modified file 'src/ChangeLog'
--- src/ChangeLog 2010-12-10 22:55:54 +
+++ src/ChangeLog 2011-02-22 12:43:23 +
@@ -1,3 +1,9 @@
Micah Cowan mi...@cowan.name writes:
On 02/23/2011 01:23 AM, Giuseppe Scrivano wrote:
Hello Gilles,
thanks for your patch. I am not sure it is a good idea to use stderr
to prompt a message to the user. I would just inhibit the message when
-O- is used.
Personally, I agree with Gilles
Micah Cowan mi...@cowan.name writes:
Changing the prompt to stderr seems like a simple, single step forward
towards proper usage. It's not perfect, but it strikes me as a good
sight better than using stdout, which really ought to be reserved for
program results-type output, IMO.
I have
Hi Zhenbo,
thanks for reporting them. I have committed a patch (commit #2460)
which should fix these memory leaks.
Cheers,
Giuseppe
Zhenbo Xu zhenbo1...@gmail.com writes:
Hi,everybody!
I found some memory leaks in the wget-1.12 source code. The following
lists the bugs:
bug 1:
Hello Ethan,
can you please try again using the latest development version?
You can fetch it from the Bazaar repository as explained here:
https://savannah.gnu.org/bzr/?group=wget
The branch is trunk.
Thanks,
Giuseppe
Ethan Zheng legen...@hotmail.com writes:
Absolutely newbie,
Could
Hello,
I have prepared a new alpha release containing the last changes:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2460.tar.bz2
To verify it, here the detached GPG signature using the key C03363F4:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2460.tar.bz2.sig
Hopefully the next release is close now.
Steven M. Schweda s...@antinode.info writes:
I know that all the serious folks in the world have all the GNU
infrastructure in place, but wouldn't a clever repository-access system
be able to grind out a ready-to-use distribution kit upon user request?
Just a thought.
we make a