Mauro Tortonesi wrote:
this is a very interesting point, but the patch you mentioned above
uses the LIST -a FTP command, which AFAIK is not supported by all FTP
servers.
As I recall, that's why the patch was not accepted. However, it would be
useful if there were some command line option to
Behdad Esfahbod [EMAIL PROTECTED] writes:
I happened to unintentionally run two commands:
wget -b -c http://some/file.tar.gz
and hours later I figured out that the 1GB that I've downloaded
is useless since two wget processes have been downloading the
same data twice and appending
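A defensive guard is possible outside of wget itself: serialize on a
lock file so a second invocation for the same output file refuses to
start. This is only a sketch, assuming flock(1) from util-linux is
available; the lock path is made up:

  # Hold a per-file lock for the whole download; a second copy of this
  # command exits immediately instead of appending to the same file.
  # The command is backgrounded with & rather than wget's own -b, so
  # the lock stays held until wget finishes:
  flock -n /tmp/file.tar.gz.lock wget -c http://some/file.tar.gz &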
Linda Walsh [EMAIL PROTECTED] writes:
I noticed after my post in the archives that this bug is fixed in
1.10.
Now if I can just get the server-ops to fix their CVS server, that'd
be great -- I've checked out CVS projects from other sites and not
had inbound TCP attempts to some 'auth'
On Wednesday 03 August 2005 08:14 am, dan1 wrote:
Hello.
I have been using wget for a long time now. I like it very much.
However, I have 2 enhancement requests that I think are important and
very useful:
1. There should be a 'download acceleration' mode that triggers several
downloads at
Thanks for the report. The problem seems to come from Wget's use of
AI_ADDRCONFIG hint to getaddrinfo. Wget 1.10.1 will not use that
hint.
PoWah Wong wrote:
The login page is:
http://safari.informit.com/?FPI=&uicode=
How do I figure out the login command?
These two commands do not work:
wget --save-cookies cookies.txt http://safari.informit.com/?FPI= [snip]
wget --save-cookies cookies.txt
I can save cookies but still get a blank web page when wgetting. The
web page url is copied from the url displayed in the web browser.
These are the logs.
C:\Program Files\wget\wget --save-cookies
cookies.txt http://safari.informit.com/JVXSL.asp
This sounds like a difficult page to download because they may be using
cookies or session variables. I'm not sure the best way to proceed, but
I would look at the wget documentation about cookies. I think you may
have to save the cookies that are generated by the login page and use
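A hedged sketch of that approach; the login path and form field names
below are guesses, not taken from the site. --post-data submits the
login form, --keep-session-cookies saves cookies that would otherwise
expire with the browser session, and --load-cookies replays them:

  # Sketch only -- login URL and field names are assumptions:
  wget --save-cookies cookies.txt --keep-session-cookies \
       --post-data "user=USERNAME&password=PASSWORD" \
       "http://safari.informit.com/login"
  wget --load-cookies cookies.txt "http://safari.informit.com/JVXSL.asp"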
Putting quotes around the url got rid of your Invalid parameter errors.
I just tried accessing the url you are trying to wget and received an
http 500 response. I also tried accessing
http://proquest.booksonline.com/ and never got a response.
According to your output, wget got back a 0
I put quotes around the url, but it still does not
work.
C:\book>"C:\Program Files\wget\wget.exe"
"http://proquest.booksonline.com/?x=1&mode=section&sortKey=title&sortOrder=asc&view=&xmlid=0-321-16076-2/ch03lev1sec1&g=&catid=&s=1&b=1&f=1&t=1&c=1&u=1&r=&o=1&n=1&d=1&p=1&a=0&page=0"
--22:45:26--
Windows MSVC test binary at
http://xoomer.virgilio.it/hherold/
Heiko
--
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax
-Original Message-
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent:
from Hrvoje Niksic:
[...] Unfortunately EOL conversions break
automatic downloads resumption (REST in FTP),
Could be true.
manual resumption (wget -c),
Could be true. (I never use wget -c.)
break timestamping,
How so?
and probably would break checksums if we added them.
[EMAIL PROTECTED] (Steven M. Schweda) writes:
from Hrvoje Niksic:
[...] Unfortunately EOL conversions break
automatic downloads resumption (REST in FTP),
Could be true.
manual resumption (wget -c),
Could be true. (I never use wget -c.)
It's the consequence of EOL conversion
[EMAIL PROTECTED] (Steven M. Schweda) writes:
It does seem a bit odd that no one has noticed this fundamental
problem until now, but then I missed it, too.
Long ago I intentionally made Wget use binary mode by default and not
muck with line endings because I believed exact data transfer was
John Haymaker [EMAIL PROTECTED] writes:
I am trying to download all pages in my site except secure pages that
require login.
Problem: when wget encounters a secure page requiring the user to log in,
it hangs there for up to an hour. Then miraculously, it moves on.
By secure pages do you
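The thread is cut off here, but one common workaround for a fetch that
stalls on a page which never answers is to tighten wget's timeout and
retry settings; a sketch with illustrative values and a stand-in URL:

  # Cap how long wget waits on any single connection/read, and how
  # many times it retries, so a dead page can't stall the crawl:
  wget -r --timeout=30 --tries=2 http://www.example.com/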
[...] (The new code does make one potentially risky assumption,
but it's explained in the comments.)
The latest code in my patches and in my new 1.9.1d kit (for VMS,
primarily, but not exclusively) removes the potentially risky assumption
(CR and LF in the same buffer), so it should be
Thanks.
From: Mauro Tortonesi [EMAIL PROTECTED]
Organization: University of Ferrara
To: [EMAIL PROTECTED]
Subject: Re: WGET return status codes
Date sent: Sat, 18 Jun 2005 15:33:26 -0500
Copies to: [EMAIL
On Tuesday 14 June 2005 07:06 am, Zinovy Malkin wrote:
Dear all,
I'm not sure the address I'm sending this message is appropriate, sorry.
Could anybody please advise me where I can find the list of wget
return status codes?
at the moment wget status codes are not completely standardized,
Hi Hrvoje,
Thanks for the detailed report!
Thanks for your detailed answer ;-)
Jens Schleusener [EMAIL PROTECTED] writes:
1) Only when using the configure option --disable-nls and the C
compiler gcc 4.0.0 does the wget binary build successfully
I'd be interested in seeing the error log without
Gabor Z. Papp [EMAIL PROTECTED] writes:
* Hrvoje Niksic [EMAIL PROTECTED]:
| new configure script coming with wget 1.10 does not honour
| --with-ssl=/path/to/ssl because when linking conftest only
| -I/path/to/ssl/include is used, and no -L/path/to/ssl/lib
|
| That is not supposed to
Jens Schleusener [EMAIL PROTECTED] writes:
--12:36:51-- http://www.example.com/
=> `index.html'
Resolving www.example.com... failed: Invalid flags in hints.
This is really bad. Apparently your version of getaddrinfo is broken
or Wget is using it incorrectly. Can you intuit
Gabor Z. Papp [EMAIL PROTECTED] writes:
* Hrvoje Niksic [EMAIL PROTECTED]:
| According to config.log, it seems your SSL includes are not in
| /pkg/include after all:
Sure, they are in /pkg/include/openssl.
You're right. The Autoconf-generated test is wrong, and I'm trying to
figure out
Hi Hrvoje,
Jens Schleusener [EMAIL PROTECTED] writes:
--12:36:51-- http://www.example.com/
=> `index.html'
Resolving www.example.com... failed: Invalid flags in hints.
This is really bad. Apparently your version of getaddrinfo is broken
or Wget is using it incorrectly. Can
Jens Schleusener [EMAIL PROTECTED] writes:
The reason for the above error is as already written - at least in
my case using the self-compiled libtool version 1.5
I don't think the libtool version used on the system makes any
difference (except for a developer at the point of libtoolizing his
Hi,
The above line
configure:25756: /bin/sh ./libtool gcc -c -O2 -Wall -Wno-implicit
-I/usr/local/contrib/include conftest.c >&5
gcc -c -O2 -Wall -Wno-implicit -I/usr/local/contrib/include conftest.c
-DPIC -o .libs/conftest.o
looks a little bit strange to me (as a configure layman)
Thanks for the detailed report!
Jens Schleusener [EMAIL PROTECTED] writes:
1) Only when using the configure option --disable-nls and the C
compiler gcc 4.0.0 does the wget binary build successfully
I'd be interested in seeing the error log without --disable-nls and/or
with the system compiler.
This patch should take care of the problems with compiling Wget 1.10
with the native IBM cc.
2005-06-15 Hrvoje Niksic [EMAIL PROTECTED]
* host.h (ip_address): Remove the trailing comma from the type
enum in the no-IPv6 case.
* main.c (struct cmdline_option): Remove the
Nagy Ferenc László [EMAIL PROTECTED] writes:
If the ftp server returns invalid data (for example '221 Bye.') in
response to PWD, wget segfaults because in ftp_pwd (ftp-basic.c)
request will be NULL after the line 'request = strtok (NULL,
"\"");', and this NULL will be passed to xstrdup.
Thanks
Windows MSVC binary at http://xoomer.virgilio.it/hherold/
Heiko
--
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax
-Original Message-
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: Friday, June
Thank you.
I appreciate this.
Will keep you posted on how it turns out.
Regards,
Kiran
-Original Message-
From: Steven M. Schweda [mailto:[EMAIL PROTECTED]
Sent: Saturday, June 04, 2005 8:39 AM
To: WGET@sunsite.dk
Cc: Kiran Atlluri
Subject: Re: wget and ASCII mode
From: Kiran Atlluri
Zitat von Oliver Schulze L. [EMAIL PROTECTED]:
Hi Mauro,
do you know if the regex patch from Tobias was applied to this release?
Thanks
Oliver
The last words on this topic that I remember were here:
http://www.mail-archive.com/wget@sunsite.dk/msg07436.html
Regards,
J.Roderburg
Thanks Jochen,
I'm downloading both now
Oliver
Jochen Roderburg wrote:
Zitat von "Oliver Schulze L." [EMAIL PROTECTED]:
Hi Mauro,
do you know if the regex patch from Tobias was applied to this release?
Thanks
Oliver
The last words on this topic that I remember were
From: Kiran Atlluri
[...]
I am trying to retrieve a ".csv" file on a unix system using wget (ftp
mode). I
When I retrieve a file using normal FTP and specify ASCII mode, I
successfully get the file and there are no "^M" at the end of lines in
this file.
But when I use wget all the
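For context: wget transfers FTP files in binary (image) mode by
default, which preserves the server's line endings. A hedged sketch of
asking the server for an ASCII-mode transfer instead, using the
documented ;type=a URL suffix (host and path are hypothetical):

  # Request ASCII mode so the FTP server converts line endings itself:
  wget "ftp://ftp.example.com/pub/data/report.csv;type=a"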
Hi,
Neither rc1 nor alpha2 has the pcre patch included.
I think that pcre is a very useful patch, and it should be
added to CVS and not enabled by default in the ./configure script.
So, if you want to use pcre, just ./configure --with-pcre
and everybody is happy.
Just my 2c
Oliver
Jochen
Zitat von Oliver Schulze L. [EMAIL PROTECTED]:
Neither rc1 nor alpha2 has the pcre patch included.
I think that pcre is a very useful patch, and it should be
added to CVS and not enabled by default in the ./configure script.
So, if you want to use pcre, just ./configure --with-pcre
and
Hi Jochen,
yes, I read it.
That's why I suggested using an option to ./configure in order to
enable it.
And, it should be disabled by default.
It's a nice option for all, because, if you don't have pcre, you won't
receive any warning and it won't hurt anybody.
HTH
Oliver
Jochen Roderburg
Hi Mauro,
do you know if the regex patch from Tobias was applied to this release?
Thanks
Oliver
Mauro Tortonesi wrote:
dear friends,
i have just released the first release candidate of wget 1.10:
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-rc1.tar.gz
i have just released the first release candidate of wget 1.10:
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-rc1.tar.gz
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-rc1.tar.bz2
you are encouraged to download the tarballs, test if the code works
properly and report any
Mark Anderson [EMAIL PROTECTED] writes:
Is there an option, or could you add one if there isn't, to specify
that I want wget to write the downloaded html file, or whatever, to
stdout so I can pipe it into some filters in a script?
Yes, use `-O -'.
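A small usage sketch (URL and filter are placeholders): with -O - the
document goes to stdout, and -q keeps wget's progress messages off the
pipe:

  wget -q -O - http://www.example.com/ | grep -i '<title>'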
Jim Peterson [EMAIL PROTECTED] writes:
Using Fedora Core 3, when I wget "http://www.studylight.org/", it prints
out:
--02:52:30-- http://www.studylight.org/
=> `index.html'
Resolving www.studylight.org... 63.164.18.58
Connecting to www.studylight.org[63.164.18.58]:80...
On Tuesday 17 May 2005 01:56 am, Jim Peterson wrote:
Using Fedora Core 3, when I wget "http://www.studylight.org/", it prints
out:
--02:52:30-- http://www.studylight.org/
=> `index.html'
Resolving www.studylight.org... 63.164.18.58
Connecting to
Windows MSVC6 binary for testing purposes here:
http://xoomer.virgilio.it/hherold/
Heiko
--
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax
-Original Message-
From: Mauro Tortonesi [mailto:[EMAIL
Joerg Ottermann [EMAIL PROTECTED] writes:
I try to archive some pages using wget, but it seems that I have some
problems when TE: chunked is used.
The server must not use Transfer-Encoding: chunked in response to an
HTTP/1.0 request. Are you sure that is the problem?
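One way to check, sketched here under the assumption that nc(1) is
available (the host is a stand-in): send a raw HTTP/1.0 request by
hand and inspect the response headers; a conforming server must not
answer it with Transfer-Encoding: chunked.

  printf 'GET / HTTP/1.0\r\nHost: example.com\r\n\r\n' \
    | nc example.com 80 | sed -n '1,15p'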
Vitaly Lomov [EMAIL PROTECTED] writes:
Hello
I am trying to get a site http://www.cro.ie/index.asp with the following flags
-r -l2
or
-kr -l2
or
-Er -l2
or
-Ekr -l2
In all cases, the linked files are saved with '@' instead of '?' in
the name, but in the index.asp the link still refers
Vitaly Lomov [EMAIL PROTECTED] writes:
Maybe you're not letting Wget finish the mirroring. The links are
converted only after everything has been downloaded. I've now tried
`wget -Ekrl2 http://www.cro.ie/index.asp --restrict-file-names=windows'
(the last argument being to emulate
Andrzej [EMAIL PROTECTED] writes:
Will the patches be included in the stable 1.10?
Probably. 1.10 is in feature freeze, but this really is a bug fix.
I'd like to check with others if that change is deemed safe for
mirroring of other sites.
Clicking on that link redirects to that page:
https://lists.man.lodz.pl/mailman/listinfo
and from all the links on that page, files are unnecessarily
downloaded (I do not want that page or its subpages).
So how can I block it?
Could you use -X /mailman/listinfo?
I
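For the record, a sketch of that suggestion; -X (--exclude-directories)
takes a comma-separated list of directory prefixes to skip during a
recursive fetch (the start URL is assumed from the thread):

  wget -r -X /mailman/listinfo https://lists.man.lodz.pl/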
Andrzej [EMAIL PROTECTED] writes:
Clicking on that link redirects to that page:
https://lists.man.lodz.pl/mailman/listinfo
and from all the links on that page, files are unnecessarily
downloaded (I do not want that page or its subpages).
So how can I block it?
Could
I believe 1.9.1 had a bug in this area when -m (which implies -l0) was
used. Could you try specifying -l50 along with the other options, and
after -m?
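A sketch of the suggested invocation with a hypothetical start URL;
because later options override earlier ones, a -l50 given after -m
overrides the infinite depth (-l0) that -m implies:

  wget -m -l50 http://site.example.com/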
It still downloaded everything.
a.
Yup. So I assume that the problem you see is not that of wget mirroring, but
a combination of saving to a custom dir (with --cut-dirs and the like) and
conversion of the links. Obviously, the link to
http://znik.wbc.lublin.pl/Mineraly/Ftp/UpLoad/index.html which would be
correct for a
Andrzej [EMAIL PROTECTED] writes:
It's not the end of troubles though!
It works correctly *only* the first time!
When I (or cron) run the same mirroring commands again over already
mirrored files to renew the mirror, then the correctly converted link of
the gif file (on the main
With that patch the mirror seems correct in the 2nd run. Please let
me know if it works for you.
*After* I deleted the files with the wrong URLs, the patched wget 1.9.1
retrieved the files correctly, and after the second run did not change
the URLs to the wrong ones. So it worked on the
Windows (MSVC) test binary available at http://xoomer.virgilio.it/hherold/
Notes:
windows/wget.dep needs an attached patch (change gen_sslfunc to openssl.c,
change gen_sslfunc.h to ssl.h).
src/Makefile.in doesn't contain dependencies for http-ntlm$o
(windows/wget.dep either).
INSTALL should
Herold Heiko [EMAIL PROTECTED] writes:
windows/wget.dep needs an attached patch (change gen_sslfunc to openssl.c,
change gen_sslfunc.h to ssl.h).
Applied, thanks.
src/Makefile.in doesn't contain dependencies for http-ntlm$o
(windows/wget.dep either).
I don't have the dependency-generating
Cannot compile if ./configure --without-ssl :
===cut on===
gcc -I. -I. -DHAVE_CONFIG_H -DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\"
-DLOCALEDIR=\"/usr/local/share/locale\" -O2 -Wall -Wno-implicit -c init.c
init.c:214: structure has no member named `random_file'
init.c:214: initializer element is not
Thanks for the report; this problem is fixed in CVS. The workaround
is to wrap the appropriate init.c line in #ifdef HAVE_SSL.
[EMAIL PROTECTED] writes:
Is there a publicly accessible site that exhibits this problem?
I've set up a small example which illustrates the problem. Files can
be found at http://dev.mesca.net/wget/ (using demo:test as login).
Thanks for setting up this test case. It has uncovered at least
The obvious problem is that this command lacks --keep-session-cookies,
and the cookie it gets is session-based.
I tried to reproduce the bug in a more generic way.
But there are other problems
as well: if you examine the cookie.txt produced by (the amended
version of) the first command, you'll
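For concreteness, a sketch of the amended pair of commands against the
dev.mesca.net test case from this thread (the demo:test login comes
from the thread; the exact HTTP-auth option spellings for this wget
era are assumed):

  wget --http-user=demo --http-passwd=test \
       --save-cookies cookies.txt --keep-session-cookies \
       http://dev.mesca.net/wget/setcookie.php
  wget --http-user=demo --http-passwd=test \
       --load-cookies cookies.txt \
       http://dev.mesca.net/wget/getcookie.php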
Arndt Humpert [EMAIL PROTECTED] writes:
wget, win32 rel. crashes with huge files.
Thanks for the report. This problem has been fixed in the latest
version, available at http://xoomer.virgilio.it/hherold/ .
Thus it seems that it should not matter what the sequence of the
options is. If it does, I suggest that the developers of wget place
appropriate info in the manual.
Yes, you're right. Anyway, I have often found that it's quite tricky
setting up your command line to get exactly
Andrzej [EMAIL PROTECTED] writes:
Thus it seems that it should not matter what the sequence of the
options is. If it does, I suggest that the developers of wget place
appropriate info in the manual.
Yes, you're right. Anyway, I have often found that it's quite tricky
setting up
Andrzej [EMAIL PROTECTED] writes:
The multitude of options in Wget is just an illusion. In real life
Wget cannot cope with mirroring sites.
I agree with your criticism, if not with your tone. We are working on
improving Wget, and I believe that the problems you have seen will be
fixed in the
I agree with your criticism, if not with your tone. We are working on
improving Wget, and I believe that the problems you have seen will be
fixed in the versions to come. (I plan to look into some of them for
the 1.11 release.)
OK. Thanks. Good to hear that. Looking forward impatiently for
Is there a publicly accessible site that exhibits this problem?
I've set up a small example which illustrates the problem. Files can be
found at http://dev.mesca.net/wget/ (using demo:test as login).
Three files:
setcookie.php:
--
<? setcookie("wget", "I love it!"); ?>
getcookie.php:
Thanks, Patrick, for the reply.
AFAICS your command line is somehow completely mixed up.
Usually I call wget and first give it the path where it should save all
files, followed by more options, and at last the url from where to get
them (usually in quotation marks, to be sure).
According to man
On Wed, 20 Apr 2005, Hrvoje Niksic wrote:
Herold Heiko [EMAIL PROTECTED] writes:
I am greatly surprised. Do you really believe that Windows users
outside an academic environment are proficient in using the compiler?
I have never seen a home Windows installation that even contained a
Doug Kaufman [EMAIL PROTECTED] writes:
On Wed, 20 Apr 2005, Hrvoje Niksic wrote:
Herold Heiko [EMAIL PROTECTED] writes:
I am greatly surprised. Do you really believe that Windows users
outside an academic environment are proficient in using the compiler?
I have never seen a home Windows
Mauro Tortonesi [EMAIL PROTECTED] writes:
i totally agree with hrvoje here. in the worst case, we can add an
entry in the FAQ explaining how to compile wget with those buggy
versions of microsoft cc.
Umm. What FAQ? :-)
(sorry for the late answer, three days of 16+ hours/day migration aren't
fun, UPS battery exploding inside the UPS almost in my face even less)
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Herold Heiko [EMAIL PROTECTED] writes:
do have a compiler but aren't
Herold Heiko [EMAIL PROTECTED] writes:
From my impressions of the Windows world, non-developers won't touch
source code anyway -- they will simply use the binary.
I feel I must dissent.
I am greatly surprised. Do you really believe that Windows users
outside an academic environment are
On Wednesday 20 April 2005 04:58 am, Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
i totally agree with hrvoje here. in the worst case, we can add an
entry in the FAQ explaining how to compile wget with those buggy
versions of microsoft cc.
Umm. What FAQ? :-)
the
On Wednesday 20 April 2005 05:55 am, Herold Heiko wrote:
(sorry for the late answer, three days of 16+ hours/day migration aren't
fun, UPS battery exploding inside the UPS almost in my face even less)
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Herold Heiko
hi alexander,
this is a known problem which is already fixed in cvs. perhaps you may want to
try using wget 1.10-alpha2:
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha2.tar.gz
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha2.tar.bz2
--
Aequam memento rebus in
Mauro Tortonesi [EMAIL PROTECTED] writes:
On Wednesday 20 April 2005 04:58 am, Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
i totally agree with hrvoje here. in the worst case, we can add an
entry in the FAQ explaining how to compile wget with those buggy
versions of
On Wednesday 20 April 2005 02:42 pm, Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
On Wednesday 20 April 2005 04:58 am, Hrvoje Niksic wrote:
Mauro Tortonesi [EMAIL PROTECTED] writes:
i totally agree with hrvoje here. in the worst case, we can add an
entry in the FAQ
On Friday 15 April 2005 07:24 am, Hrvoje Niksic wrote:
Herold Heiko [EMAIL PROTECTED] writes:
However there are still lots of people using Windows NT 4 or even
win95/win98, with old compilers, where the compilation won't work
without the patch. Even if we place a comment in the source file
hi wgetters !
a while ago, i wrote:
[1]
wget spans hosts when it shouldn't:
it looks like this behaviour is by design, but it should be documented.
[2]
wget seems to choke on directories that start with a dot. i guess it
thinks they are references to external pages and does not download
links
Jörn Nettingsmeier [EMAIL PROTECTED] writes:
[3]
wget does not parse css stylesheets and consequently does not
retrieve url() references, which leads to missing background
graphics on some sites.
this feature request has not been commented on yet. do you think it
might be useful?
I think
Hrvoje Niksic wrote:
Jörn Nettingsmeier [EMAIL PROTECTED] writes:
[3]
wget does not parse css stylesheets and consequently does not
retrieve url() references, which leads to missing background
graphics on some sites.
this feature request has not been commented on yet. do you think it
might be useful
Jörn Nettingsmeier [EMAIL PROTECTED] writes:
wget does not parse css stylesheets and consequently does not
retrieve url() references, which leads to missing background
graphics on some sites.
this feature request has not been commented on yet. do you think it
might be useful?
I think it's very
Hrvoje Niksic wrote:
Jörn Nettingsmeier [EMAIL PROTECTED] writes:
wget does not parse css stylesheets and consequently does not
retrieve url() references, which leads to missing background
graphics on some sites.
this feature request has not been commented on yet. do you think it
might be useful?
I
Jörn Nettingsmeier [EMAIL PROTECTED] writes:
the same parser code might also work for urls in javascript. as it
is now, mouse-over effects with overlay images don't work, because
the second file is not retrieved. if we can come up with a good
heuristic to guess urls, it should work in both
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
the patch you've posted is really such an ugly workaround
(shame on microsoft
Exactly the same opinion here.
Please don't misunderstand me, personally for most of my work on windows I
use cygnus (including wget) anyway.
However there are still
Herold Heiko [EMAIL PROTECTED] writes:
However there are still lots of people using Windows NT 4 or even
win95/win98, with old compilers, where the compilation won't work
without the patch. Even if we place a comment in the source file or
the windows/Readme many of those will be discouraged,
Hi,
Does anybody know if the security vulnerabilities CAN-2004-1487 and
CAN-2004-1488 will be fixed in the new version?
There seems to be at least some truth in the reports (ignore the insulting
tone of the reports).
http://cve.mitre.org/cgi-bin/cvename.cgi?name=CAN-2004-1487
Karsten Hopp [EMAIL PROTECTED] writes:
Does anybody know if the security vulnerabilities CAN-2004-1487 and
CAN-2004-1488 will be fixed in the new version?
Yes on both counts.
There seems to be at least some truth in the reports (ignore the
insulting tone of the reports).
Hrvoje Niksic [EMAIL PROTECTED] writes:
[EMAIL PROTECTED] writes:
If possible, it seems preferable to me to use the platform's C
library regex support rather than make wget dependent on another
library...
Note that some platforms don't have library support for regexps, so
we'd have to
On Wednesday 13 April 2005 07:39 am, Herold Heiko wrote:
With MS Visual Studio 6, the attached patch is still needed in order
to compile (disable optimization for part of http.c and retr.c if the
cl.exe version is <= 12).
Windows msvc test binary at http://xoomer.virgilio.it/hherold/
hi herold,
the patch
[EMAIL PROTECTED] (Steven M. Schweda) writes:
#define VERSION_STRING "1.10-alpha1_sms1"
Was there any reason to do this with a source module instead of a
simple macro in a simple header file?
At some point that approach made it easy to read or change the
version, as the script dist-wget
With MS Visual Studio 6, the attached patch is still needed in order
to compile (disable optimization for part of http.c and retr.c if the
cl.exe version is <= 12).
Windows msvc test binary at http://xoomer.virgilio.it/hherold/
Heiko
--
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL
[EMAIL PROTECTED] writes:
If possible, it seems preferable to me to use the platform's C
library regex support rather than make wget dependent on another
library...
Note that some platforms don't have library support for regexps, so
we'd have to bundle anyway.
From: Mauro Tortonesi [EMAIL PROTECTED]
[...] i think
that if you want your patches to be merged in our CVS, you should follow the
official patch submission procedure (that is, posting your patches to the
wget-patches AT sunsite DOT dk mailing list. each post should include a brief
On Tue, 12 Apr 2005, Steven M. Schweda wrote:
Also, am I missing something obvious, or should the configure script
(as in, "To configure Wget, run the configure script provided with the
distribution.") be somewhere in the CVS source? I see many of its
relatives, but not the script itself.
[EMAIL PROTECTED] (Steven M. Schweda) writes:
Also, am I missing something obvious, or should the configure script
(as in, "To configure Wget, run the configure script provided with
the distribution.") be somewhere in the CVS source?
The configure script is auto-generated and is therefore not
On Tuesday 12 April 2005 06:17 pm, Jeanne McIlvain wrote:
Hi!
I attempted to download wget onto my mac. I was disappointed to find
that it would not work. I thought that I read it was applicable to
macs, but am I wrong? Please let me know. Thank you so much.
- please respond to [EMAIL
From: Hrvoje Niksic [EMAIL PROTECTED]
Also, am I missing something obvious, or should the configure script
(as in, "To configure Wget, run the configure script provided with
the distribution.") be somewhere in the CVS source?
The configure script is auto-generated and is therefore not in
Sanjay Madhavan [EMAIL PROTECTED] writes:
wget 1.9.1 fails when trying to download a very large file.
The download stopped partway through, and attempting to resume shows a
negative remaining size to be downloaded.
e.g. ftp://ftp.solnet.ch/mirror/SuSE/i386/9.2/iso/SUSE-Linux-9.2-FTP-DVD.iso
I may run into this in the future. What is the threshold for large
files failing on the current version of wget? I'm not expecting to
d/l anything over 200MB, but is that even too large for it?
Sorry to threadjack, but it seemed an appropriate question...
Bryan
On Apr 11, 2005 2:46 AM,
Bryan [EMAIL PROTECTED] writes:
I may run into this in the future. What is the threshold for large
files failing on the current version of wget?
The threshold is 2G (2147483648 bytes).
I'm not expecting to d/l anything over 200MB, but is that even too
large for it?
That's not too
Tobias Tiederle [EMAIL PROTECTED] writes:
let's say you have the following structure:
index.html
|-cool.html
| |-page1.html
| |-page2.html
| |- ...
|
|-crap.html
| |-page1.html
| |-page2.html
now you want to download the whole structure, but you want to
exclude the crap (with