Ángel González keis...@gmail.com writes:
You would also need
#define _FILE_OFFSET_BITS 64
but that seems already handled by configure.
I'm not sure if that would work for 32bit Windows, though.
_FILE_OFFSET_BITS is defined by the AC_SYS_LARGEFILE macro in the
configure.ac file so we haven't
hey,
thanks for your patches. I have pushed them.
Cheers,
Giuseppe
Gijs van Tulder gvtul...@gmail.com writes:
Hi,
Here are two small patches. I hope they will be useful.
First, a patch that fixes a memory leak in fd_read_body (src/retr.c)
and skip_short_body (src/http.c) when it
hi,
I will take a deeper look after your copyright assignment process is
completed. As a suggestion for the future: it would be better if you ask
on the mailing list before starting work on a task next time (unless it
is a bug). Not all new feature requests can be accepted into wget.
Cheers,
Gijs van Tulder gvtul...@gmail.com writes:
Hi all,
The attached patch should hopefully fix Evgenii's problem.
The patch changes the configure script to always use libz, unless it
is explicitly disabled. In that case, the patch makes sure that the
WARC functions do not use gzip but write to
Thanks for the patch. Apart from some minor aesthetic changes, like an
empty space between the function name and '(', which I can fix before
applying it, the patch seems OK.
Before I can apply it though, you need to get copyright assignments with
the FSF. I am going to send more information in private to
Micah Cowan mi...@micah.cowan.name writes:
I believe hh's suggestion is to have the format reflect the way it would look
in a URL; so [ and ] around ipv6, and nothing around ipv4 (since ipv4 format
isn't ambiguous in the way ipv6 is).
I agree. Please rework your patch to use
thanks. The patch is not complete yet, it doesn't fix the other message
I have reported before. Can you please check it as well? Can you
provide a ChangeLog file entry?
Cheers,
Giuseppe
Sasikanth sasikanth@gmail.com writes:
I had modified the patch as you guys suggested.
For ipv6 the
Naxa anaxagra...@gmail.com writes:
I suggest a feature for limiting the recursion depth level separately
per host when spanning hosts.
This way I wouldn't need to know and list the different hosts when,
for example, a page links to multiple image hosting sites.
An option like
Eli Zaretskii e...@gnu.org writes:
Sorry, I don't understand this comment. fd is indeed a file
descriptor, but ioctlsocket's first argument is a SOCKET object, which
is an unsigned int, and we get it from a call to `socket' or some
such. So where do you see a potential problem?
And
,
Giuseppe
=== modified file 'ChangeLog'
--- ChangeLog 2011-12-11 14:18:11 +
+++ ChangeLog 2011-12-12 20:24:25 +
@@ -1,3 +1,8 @@
+2011-12-12 Giuseppe Scrivano gscriv...@gnu.org
+
+ * Makefile.am (EXTRA_DIST): Add build-aux/bzr-version-gen.
+ Reported by: Elan Ruusamäe g...@pld
Paul Wratt paul.wr...@gmail.com writes:
this works but no size in output:
wget -nv --output-file=wget.txt _url_
I found a reference to a 2007 post asking for:
3) add support for turning off the progress bar with
--progress=none
I think I am going to add this support by myself.
I have
david painter ddpain...@bigpond.com writes:
Help. After installing and trying to get my DVD and CD drives to work I
now have an error message stating E:Type '2011-12-04' is not known on
line 1 in Source list/etc/apt/source.list.d/medibuntu.list
you have reached the GNU wget mailing list. Your
Paul Wratt paul.wr...@gmail.com writes:
if it does not obey - server admins will ban it
the work around:
1) get single html file first - edit out meta tag - re-get with
--no-clobber (usually only in landing pages)
2) empty robots.txt (or allow all - search net)
possible solutions:
A)
hello Alex,
sorry for the late reply. Correct, when you specify
--content-disposition, the destination file name is not known. You can
see it by specifying the destination file using -O, as:
wget -c --content-disposition --debug
http://www.dubovskoy.net/CANTER/01.mp3 -O 01.mp3
that command is
Jochen Roderburg roderb...@uni-koeln.de writes:
I have some problems compiling recent development versions (with the
WARC additions) on my Linux.
First it was missing a tmpdir.h. Looking around I saw some tmpdir
files in the gnulib directories, but obviously they were not where the
build
what happens if you specify -H?
Cheers,
Giuseppe
Randy Kramer rhkra...@gmail.com writes:
I just joined the list and I'm jumping the gun a little bit (because I
usually
lurk on a list for a little while before posting), but...
I'm trying to save a local copy of this page with all the
hi Vishwanath,
is it possible to use the last released version of wget? You can find
it here: ftp://ftp.gnu.org/gnu/wget/wget-1.13.4.tar.xz
I have no clue what changes are contained in the so-called Red Hat
modified version of wget; I highly suggest using the latest upstream
version in order to
the
attached patch. One of the WARC functions uses the basename function,
which causes problems on OS X. Including libgen.h and strdup-ing the
output of basename seems to solve this problem.
Thanks,
Gijs
Op 04-11-11 22:27 schreef Giuseppe Scrivano:
Gijs van Tuldergvtul...@gmail.com writes:
Hi
Gijs van Tulder gvtul...@gmail.com writes:
Hi Giuseppe,
* I've changed the configure.ac and src/Makefile.am.
* I've added a ChangeLog entry.
lovely. I am going to push it soon with some small adjustments.
Thanks for the great work. Whenever it happens to be in the same place,
I'll buy you
Committed with a ChangeLog entry and a small change. Another beer? :-)
Thanks!
Giuseppe
Gijs van Tulder gvtul...@gmail.com writes:
Hi,
I think there is a memory leak in the GnuTLS part of wget. When
downloading multiple files from an HTTPS server, wget with GnuTLS uses
a lot of memory.
Gijs van Tulder gvtul...@gmail.com writes:
=== modified file 'bootstrap.conf'
--- bootstrap.conf 2011-08-11 12:23:39 +
+++ bootstrap.conf 2011-10-21 19:24:18 +
@@ -28,6 +28,7 @@
accept
alloca
announce-gen
+base32
bind
c-ctype
clock-time
@@ -49,6 +50,7 @@
mbtowc
Hrvoje Niksic hnik...@xemacs.org writes:
I expect the biggest changes to be required in progress.c. :)
anyone has some ideas? :-) How should it look?
Cheers,
Giuseppe
Gijs van Tulder gvtul...@gmail.com writes:
Hi all,
Based on the comments by Giuseppe and Ángel I've revised the
implementation of the wget WARC extension. I've attached a patch.
1. It's no longer based on the warctools library. Instead, I've
written a couple of new WARC-writing functions,
Thanks. Pushed.
Cheers,
Giuseppe
Steven Schubiger s...@member.fsf.org writes:
=== modified file 'ChangeLog'
--- ChangeLog 2011-09-04 12:19:12 +
+++ ChangeLog 2011-10-16 18:18:34 +
@@ -1,3 +1,8 @@
+2011-10-16 Steven Schubiger s...@member.fsf.org
+
+ * util/paramcheck.pl:
hello,
The winter is coming, not much to do outside and I have spent the day
working on something I had in mind already for too long.
Unfortunately I couldn't start the implementation the way I had thought
would be possible; there are too many nested `select' points in the code,
and implementing an
Hi Gijs,
Gijs van Tulder gvtul...@gmail.com writes:
can you please send a complete diff against the current development
tree version?
Here's the diff of the WARC additions (1.9MB zipped) to revision 2565:
http://dl.dropbox.com/u/365100/wget_warc-20110926-complete.patch.bz2
the patch is
Steven M. Schweda s...@antinode.info writes:
[Various other changes/fixes affecting VMS]
Still wondering.
For the curious, a set of patches should be available at:
http://antinode.info/ftp/wget/wget-1_13_4/1_13_4_1.dru
can you please include a ChangeLog entry for each of
Henrik Holst henrik.ho...@millistream.com writes:
No problem, I'll give it a try, yell at me if I do something wrong:
Good job! I have applied the patch and pushed it.
Cheers,
Giuseppe
Hi Henrik,
Henrik Holst henrik.ho...@millistream.com writes:
This patch adds an option to not skip the content sent by the HTTP server
when the server responds with a status code in the 4xx and 5xx range.
thanks for the patch, I am quite inclined to include it. Can you please
provide the
Hi Micah,
Micah Cowan mi...@cowan.name writes:
So, from where I'm sitting, it looks like --preserve-permissions was an
implemented feature for two major releases (1.10 and 1.11 series), and
has now been missing from the last two major releases (1.12 and 1.13).
Probably, it should be
k...@freefriends.org (Karl Berry) writes:
Tiny change for the manual to make its dir entry consistent with others,
ok?
Ok. Pushed.
Thanks,
Giuseppe
Gijs van Tulder gvtul...@gmail.com writes:
Hi.
It's been a while since we've discussed the WARC addition to Wget. Is
there anything I can help with?
can you please send a complete diff against the current development tree
version?
I'll take a look at it ASAP.
Thanks,
Giuseppe
Manuel José Muñoz Calero manuelj.mu...@gmail.com writes:
These days I've been reading as much as I could: manual, wiki, code
and bazaar usage.
If you agree, I'm beginning with...
#21439: Support for FTP proxy authentication
It sounds great!
... planned release 1.15, status
Daniel Stenberg dan...@haxx.se writes:
On Mon, 26 Sep 2011, Giuseppe Scrivano wrote:
#21439: Support for FTP proxy authentication
It sounds great!
Since there's no FTP proxy standard or spec, how exactly is this going
to work?
Oops, thanks for pointing it out. I wasn't aware
k...@freefriends.org (Karl Berry) writes:
Hi Giuseppe,
The copyright year in the wget --version output should be 2011, not 2009.
As seen in 1.13.4.
Thanks for reporting it; this patch fixes it:
=== modified file 'src/main.c'
--- src/main.c 2011-09-06 13:53:39 +
+++ src/main.c
Hello,
I am pleased to announce the new version of GNU wget.
It fixes some bugs reported in the recent wget 1.13.3 release.
It is available for download here:
ftp://ftp.gnu.org/gnu/wget/wget-1.13.4.tar.gz
ftp://ftp.gnu.org/gnu/wget/wget-1.13.4.tar.xz
and the GPG detached signatures using the
ma...@inbox.com writes:
In these specific tests, I am using GNU Wget 1.11.4 on a Windows platform.
CSS support was added in wget 1.12.
Cheers,
Giuseppe
ma...@inbox.com writes:
I wonder if Wget needs an option like --resetdefaults=yes to reset any
changes that may have been made in the .wgetrc file.
I think you can get the same behaviour by using --config=/dev/null. The
parameter --config is supported since wget 1.13.
Cheers,
Giuseppe
matt...@creativegraphicsolutions.biz writes:
I've tried several work-arounds for this, all with no success. Wget
simply refuses to follow quota specifications for single files no
matter how Wget is invoked.
Respecting quotas for single files would be useful in other situations
where Wget is
Christian Jullien eli...@orange.fr writes:
When compiling gnutls.c on solaris 10 sparc with gcc 4.6.1
I get an error on:
ret = ioctl (fd, FIONBIO, one);
because FIONBIO is undefined.
Adding:
#include <sys/fcntl.h>
Let:
#ifdef F_GETFL
ret = fcntl (fd,
Hi Vladimir,
thanks, it has been fixed in the source repository.
Cheers,
Giuseppe
Vladimir Lomov lomov...@gmail.com writes:
Hello,
I'm on Archlinux x86_64. After updating the system with the help of
package manager wget aborts on simple `wget --version' with exit code
3.
Seems I found
I am pleased to announce the new version of GNU wget.
It is available for download here:
ftp://ftp.gnu.org/gnu/wget/wget-1.13.3.tar.gz
ftp://ftp.gnu.org/gnu/wget/wget-1.13.3.tar.xz
and the GPG detached signatures using the key C03363F4:
ftp://ftp.gnu.org/gnu/wget/wget-1.13.3.tar.gz.sig
Hello Denis,
this bug will be fixed in the next release of wget. It hasn't been
officially released yet, but you can find newer tarballs here:
ftp://ftp.gnu.org/gnu/wget
If it still doesn't work for you with 1.13, please report it.
Cheers,
Giuseppe
Denis Laplante denis.lapla...@ubc.ca writes:
Ray Satiro raysat...@yahoo.com writes:
Calling utime() works. You could also use SetFileTime(). 2489 changed utime
to utimes but the CRT doesn't have utimes.
Thanks for checking it.
I am going to apply the patch below.
Cheers,
Giuseppe
=== modified file 'configure.ac'
---
David H. Lipman dlip...@verizon.net writes:
I don't know when it happened, probably when I upgraded WGET, but when I
download files
they inherit the date and time of when they were downloaded.
It used to be that when the file was downloaded, it retained the date and
time of
David H. Lipman dlip...@verizon.net writes:
WinXP/Vista -- Win32
Y:\wget --version
GNU Wget 1.12-2504 built on mingw32.
the change introduced by the revision
gscriv...@gnu.org-20110419103346-cctazi0zxt2770wt could be the reason for
the problem you have reported.
If it is possible for you to
can you please
H.Merijn Brand h.m.br...@xs4all.nl writes:
That is bad. Why? GNU TLS /might/ be safer than OpenSSL in some
aspects, but it is for sure not available on (older) versions of AIX
and/or HP-UX. It is already quite a bit of work to get OpenSSL and
OpenSSH to be rather
Hello,
The following bug report was sent to the wget mailing list. I am not
sure why it happens; it seems related to gnulib. Does anyone have an
idea about it?
I don't have access to any HP-UX box to test it by myself.
Thanks,
Giuseppe
With HP-UX 11.00 and HP C-ANSI-C it doesn't even *compile*
Oops...
Thanks for reporting it. I am sure it stems from a fix for a
similar error Perry had on AIX.
At this point, it seems the only way to fix the problem is to include
config.h at the very beginning of css.c. I have looked at the flex
documentation but I can't find anything useful to
Hello,
I have tried the command you suggested but I wasn't able to make it hang.
Are you able to reproduce this problem every time? If so, can you
please include the debug information generated by --debug?
Thanks,
Giuseppe
Axel Reinhold a...@freakout.de writes:
Hi,
wget 1.13.1 hangs on
going to remove the include of wget.h ?
On Aug 17, 2011, at 9:09 AM, Giuseppe Scrivano wrote:
Oops...
Thanks for reporting it. I am sure it stems from a fix for a
similar error Perry had on AIX.
At this point, it seems the only way to fix the problem is to include
config.h
...@gmail.com writes:
Do I need all the autoconf stuff for this? I made the change but the
Makefile didn't reflect the changes.
On Aug 17, 2011, at 9:29 AM, Giuseppe Scrivano wrote:
Yes, but it seems to create another problem under Mac OS X 10.6.8.
In any case, this is the hack I was talking about
Perry Smith pedz...@gmail.com writes:
I took a stab at installing GNUTLS and gave up. The beauty of wget is
I can get it going with very few things needed. I compiled without ssl
at all but getting openssl going is fairly easy too. GNUTLS is asking
for nettle, zlib, and something else
Jochen Roderburg roderb...@uni-koeln.de writes:
And in general they seem to want to steer users away from openssl
to gnutls, and in order to do that the configure script doesn't even
mention this option any longer. :-(
And in the same vein the option --with-libssl-prefix has completely
Hello Perry,
thanks for reporting it. Does it work correctly if you drop the
#include "wget.h" line from css.l?
=== modified file 'src/css.l'
--- src/css.l 2011-01-01 12:19:37 +
+++ src/css.l 2011-08-12 15:18:23 +
@@ -36,7 +36,6 @@
#define YY_NO_INPUT
-#include "wget.h"
Gijs van Tulder gvtul...@gmail.com writes:
It would be cool if Wget could become one of these tools. Already the
Swiss army knife for mirroring websites, the one thing that Wget is
missing is a good way to store these mirrors. The current output of
--mirror is not sufficient for archival
Hello Karl,
thanks for reporting it. It looks like a very ugly one; I think it
stems from the last change:
revno: 2517
committer: Giuseppe Scrivano gscriv...@gnu.org
branch nick: wget
timestamp: Fri 2011-08-05 21:36:08 +0200
message:
gnutls: do not use a deprecated function.
I'll rollback
Peng Yu pengyu...@gmail.com writes:
I was looking at the patched version. (See the patch posted in bug
#31147.) I think the bug is in the patch (see the relevant code
below, where full_file has the query string). I guess for full_file a
different 'acceptable' function should be used.
Hello Peng,
AFAICS, `s' is a path, so '/' in the query string is escaped and
`acceptable' doesn't see it.
As for your example:
http://xxx.org/somescript?arg1=/xxy
`s' in this case will be something like:
xxx.org/somescript?arg1=%2Fxxy
Do you have any example where it doesn't work?
Cheers,
Noël Köthe n...@debian.org writes:
I don't want to pester with this question, but when is the next wget
release planned? 1.12 was released 2009-09-22, and since then there have
been some bugfixes and patches integrated into the VCS, but they do not
reach the user.
I have just uploaded another test
Jochen Roderburg roderb...@uni-koeln.de writes:
--- ./src/host.c.orig 2011-08-06 16:45:59.0 +
+++ ./src/host.c 2011-08-06 19:49:41.0 +
@@ -829,7 +829,7 @@
int printmax = al-count;
if (! opt.show_all_dns_entries)
-printmax = 3;
+
Hello Nirgal,
thanks for reporting it. I am not sure it is really wrong to omit
quotes, but in any case I am going to apply this patch:
=== modified file 'src/cookies.c'
--- src/cookies.c 2011-01-01 12:19:37 +
+++ src/cookies.c 2011-08-02 20:53:42 +
@@ -350,6 +350,13 @@
Peng Yu pengyu...@gmail.com writes:
Hi,
I use the following code to download the cookies. But it will always
download some_page. Is there a way to just download the cookies?
wget --post-data='something' --directory-prefix=/tmp
--save-cookies=cookies_file --keep-session-cookies
Peng Yu pengyu...@gmail.com writes:
Suppose I want to download www.xxx.org/somefile/aaa.sfx and the links
therein (but restricted to the directory www.xxx.org/somefile/aaa/).
I tried the option '--mirror -I /somefile/aaa', but it only downloads
www.xxx.org/somefile/aaa.sfx. I'm wondering what
,
Jan G Thomas
jatho...@redhat.com
- Original Message -
From: Giuseppe Scrivano gscriv...@gnu.org
To: Jan Thomas jatho...@redhat.com
Cc: bug-wget@gnu.org
Sent: Monday, July 25, 2011 12:24:44 PM
Subject: Re: [Bug-wget] next wget release?
hey Jan,
this is what I get using the last
Patrick Steil patr...@churchbuzz.org writes:
Also, if I use wget in spider mode, it will at the end of the log
output tell me about all the broken links... but I also need to know
what page those broken links appear on (if the broken link is on
the site I am getting)... this will help me
Hello,
Patrick Steil patr...@churchbuzz.org writes:
If I run this command:
wget www.domain.org/news?page=1 options= -r --no-clobber --html-extension
--convert-links -np --include-directories=news
Here is what it does today:
1. When --html-extension is turned on, the --noclobber is not
Hello,
I couldn't reproduce the problem here; I get the same content I get with
the browser.
Does it behave differently if you use a recursive download or if you
request a single page? Does it happen every time?
If you are able to reproduce it, can you please post the output you get
running
Hello,
how are you invoking wget? Do you see something different in the http
headers when you use --debug?
Thanks,
Giuseppe
Richard van Katwijk rich...@three6five.com writes:
Hi,
I am using the firefox plugin 'httpfox' to trace the sending and receiving
of cookies between my browser and
Can it be that the server allows GET but not HEAD?
Can you attach the debug log without --spider as well? You can drop the
payload if it is confidential :-) The request and the response headers
matter.
Thanks,
Giuseppe
Avinash pavin...@gmail.com writes:
Hi ,
I am getting 'Authorization
are you executing wget from the c:\Windows\system32 directory?
To prevent the file from being written to the disk, you can specify
-O NUL on the command line. I have never tried it myself, but I remember
it works under Windows.
Giuseppe
Itay Levin itay.le...@onsettechnology.com writes:
I'm using
Thanks for reporting these problems. I'll take a look at them in the
next few days.
Cheers,
Giuseppe
Merinov Nikolay kim.roa...@gmail.com writes:
The current implementation of IDN support in wget does not work when the
system uses a UTF-8 locale.
The current implementation of the function `url_parse' from
David H. Lipman dlip...@verizon.net writes:
If you are using Mapped Drives, there is NO NEED to use WGET as there are
plenty of OS
utilities from XXCOPY to RoboCopy.
Those tools have two problems, though: first of all, they are not free.
Second, as already reported, they don't follow HTML
Itay Levin sit...@gmail.com writes:
No, I didn't specify any output dir, so by default it created the
files in c:\windows\system32
but still it could be the working directory where wget is executed.
Giuseppe
Hello,
d113803_0-m m...@rtinlochner.de writes:
150 Opened data connection.
done.
113803.webhosting42.1blu.de/www/demos/.listing: Permission denied
can you check your permissions on the /www/demos/ directory? Can you
browse it?
Cheers,
Giuseppe
what version are you using?
It seems to work well here:
$ wget -q -P testdir ftp://alpha.gnu.org/gnu/wget/wget-1.12-2504.tar.bz2
$ ls testdir/
wget-1.12-2504.tar.bz2
Giuseppe
Michele Prendin mich...@micheleprendin.com writes:
Hello there,
I'm facing issues to use wget -P to download to a
Michele Prendin mich...@micheleprendin.com writes:
Thanks Giuseppe for the help,
I fixed the issue by upgrading wget.
Although now I can save in the folder I want, I have another issue.
With the older wget, when I was using
wget www.google.com/popupfile.php
the phpfile
Hi Volker,
I see it now, thanks. This small patch makes sure the url is parsed in
any case.
Cheers,
Giuseppe
=== modified file 'src/retr.c'
--- src/retr.c 2011-06-05 12:31:24 +
+++ src/retr.c 2011-06-08 09:29:20 +
@@ -1005,9 +1005,7 @@
break;
}
- /* Need
message
On Wed, Jun 8, 2011 at 7:08 AM, Giuseppe Scrivano gscriv...@gnu.org
wrote:
brad bruggemann bradley.bruggem...@gmail.com writes:
Use wget to grab file:
wget --secure-protocol=TLSv1 --certificate-type=PEM \
  --certificate=/path.to/cert.pem --password
Hi Volker,
thanks for reporting this bug, but it was already fixed in the
development version of wget, and the fix will be included in the next release.
Can you please confirm if it works for you?
You can fetch a source tarball here:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2504.tar.bz2
Thanks,
I doubt it will work with a recent version of wget. Anyway, I suggest
you take an older version (something like 1.10.2) and apply the patch
using GNU patch, once the source code is patched you can build it and
get the wget executable.
Cheers,
Giuseppe
Dale Egan d...@leemyles.com writes:
Yang Zhang yanghates...@gmail.com writes:
I mentioned --include-directories in my original email. I couldn't
figure out how to use it to this effect. Could you demonstrate?
have you already tried the following one?
wget -r -I /host/foo/ http://host/foo/bar/baz/index.cgi?page=1
Giuseppe
Micah Cowan mi...@cowan.name writes:
have you already tried the following one?
wget -r -I /host/foo/ http://host/foo/bar/baz/index.cgi?page=1
Shouldn't that be just -I /foo/ ?
Yeah, sure :-)
Thanks,
Giuseppe
On Sat, Apr 23, 2011 at 10:45 AM, Giuseppe Scrivano gscriv...@gnu.org wrote:
Thanks for the patch. It looks OK, but in order to apply it, you need to
complete the copyright assignment process with the FSF. We are
quite close to a wget release, and I doubt the FSF will receive your
hello,
the character in the url is interpreted by your shell.
Try using something like:
wget URL
Cheers,
Giuseppe
Jeff Givens j...@sds.net writes:
Hello, I am having an issue downloading files via download links from
CNET. It appears to locate some of the URL but stops at the first
Hi Mojca,
it was already reported here:
http://savannah.gnu.org/bugs/index.php?20519
On the same page you can find an explanation why it behaves this way.
Cheers,
Giuseppe
Mojca Miklavec mojca.miklavec.li...@gmail.com writes:
Dear list,
when I try to run
wget -np --mirror
Thanks for the patch. Committed and pushed.
Cheers,
Giuseppe
Cristian Rodríguez crrodrig...@opensuse.org writes:
Hi:
the attached patch adds support for an OpenSSL library compiled without
SSLv2, in which case wget will behave as if it were using
the GNUTLS backend, that is, doing
thanks for the bug report, it is already fixed in the development
version. The fix will be included in the next wget release.
Cheers,
Giuseppe
Vitaly Minko vitaly.mi...@gmail.com writes:
Hi all,
I get segmentation fault when HTTP server returns malformed status line
(without a status
David Skalinder da...@skalinder.net writes:
I want to mirror part of a website that contains two links pages, each of
which contains links to many root-level directories and also to the other
links page. I want to download recursively all the links from one links
page, but not from the
Ray Satiro raysat...@yahoo.com writes:
Hi,
It is still an issue that wget/openssl combo is broken in windows.
I have uploaded a new tarball:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2474.tar.bz2
Can you please check if it works well for you now? OpenSSL should work
well now under Windows,
Ray Satiro raysat...@yahoo.com writes:
Anything in OpenSSL that tries to write to a socket will fail because it's
passed a fd and not a socket. For example sock_write() in openssl's
crypto/bio/bss_sock.c:153 calling send() and passing a fd will cause an error
of
WSAENOTSOCK.
It
Micah Cowan mi...@cowan.name writes:
So it looks like wget is correctly blocking the http URL, but
incorrectly permitting the https URL.
We check if the two schemes are similar but at the same time we require
the port to be identical.
I have relaxed this condition, now the two ports must be
Micah Cowan mi...@cowan.name writes:
Since the manpage is automatically generated from the info manual, this
needs to be fixed in wget.texinfo, too.
thanks, I am going to fix it in the documentation too.
Giuseppe
Steven M. Schweda s...@antinode.info writes:
I know that all the serious folks in the world have all the GNU
infrastructure in place, but wouldn't a clever repository-access system
be able to grind out a ready-to-use distribution kit upon user request?
Just a thought.
we make a
Hello,
I have prepared a new alpha release containing the last changes:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2460.tar.bz2
To verify it, here the detached GPG signature using the key C03363F4:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2460.tar.bz2.sig
Hopefully the next release is close now.
Hello Ethan,
can you please try again using the last development version?
You can fetch it from the Bazaar repository as explained here:
https://savannah.gnu.org/bzr/?group=wget
The branch is trunk.
Thanks,
Giuseppe
Ethan Zheng legen...@hotmail.com writes:
Absolutely newbie,
Could
Hi Zhenbo,
thanks for reporting them. I have committed a patch (commit #2460)
which should fix these memory leaks.
Cheers,
Giuseppe
Zhenbo Xu zhenbo1...@gmail.com writes:
Hi,everybody!
I found some memory leaks in the wget-1.12 source code. The following
lists the bugs:
bug 1:
Micah Cowan mi...@cowan.name writes:
Changing the prompt to stderr seems like a simple, single step forward
towards proper usage. It's not perfect, but it strikes me as a good
sight better than using stdout, which really ought to be reserved for
program results-type output, IMO.
I have
Hello Gilles,
thanks for your patch. I am not sure it is a good idea to use stderr
to prompt a message to the user. I would just inhibit the message when
-O- is used.
Cheers,
Giuseppe
Gilles Carry gilles.ca...@st.com writes:
Hello,
Here is a small patch to change the ask-password
Thanks for your contribution. I have just applied your patch.
Giuseppe
Steven Schubiger s...@member.fsf.org writes:
Patch attached.
=== modified file 'src/ChangeLog'
--- src/ChangeLog 2010-12-10 22:55:54 +
+++ src/ChangeLog 2011-02-22 12:43:23 +
@@ -1,3 +1,9 @@