Stepan Kasal <[EMAIL PROTECTED]> writes:
>> On Fri, Aug 26, 2005 at 02:07:16PM +0200, Hrvoje Niksic wrote:
>> > I've applied a slightly modified version of this patch, thanks.
>
> you also changed "OpenSSL" to "SSL" in one of the error messages.
Daniel Stenberg <[EMAIL PROTECTED]> writes:
> On Fri, 26 Aug 2005, Hrvoje Niksic wrote:
>
>> + /* The OpenSSL library can handle renegotiations automatically, so
>> + tell it to do so. */
>> + SSL_CTX_set_mode (ssl_ctx, SSL_MODE_AUTO_RETRY);
>> +
>
I've applied a slightly modified version of this patch, thanks. Note
that in Bourne shell, "else if" opens a nested "if" and therefore
requires an additional "fi", the absence of which caused a parse
error. I used "elif" instead.
"Jonathan" <[EMAIL PROTECTED]> writes:
> Would it be possible (and is anyone else interested) to have the
> subject line of messages posted to this list prefixed with '[wget]'?
I am against munging subject lines of mail messages. The mailing list
software provides headers such as `Mailing-List'
Thanks for the report; I've applied this patch:
2005-08-26 Jeremy Shapiro <[EMAIL PROTECTED]>
* openssl.c (ssl_init): Set SSL_MODE_AUTO_RETRY.
Index: openssl.c
===================================================================
--- openssl.c (revision 2063)
+++ openssl.c (working copy)
This should now be fixed in the repository, in a slightly different
manner (by setting SSL_MODE_AUTO_RETRY on the SSL context).
Thanks for the report.
Albert Chin <[EMAIL PROTECTED]> writes:
> +AC_CHECK_FUNCS(strtoll, , [
> + dnl The following taken from gnulib:
> + dnl Copyright (C) 2002, 2003, 2004 Free Software Foundation, Inc.
> + AC_MSG_CHECKING([whether <inttypes.h> defines strtoimax as a macro])
> + AC_EGREP_CPP([inttypes_h_defines_strtoimax],
Stepan Kasal <[EMAIL PROTECTED]> writes:
> 1) I removed the AC_DEFINEs of symbols HAVE_GNUTLS, and HAVE_OPENSSL.
> AC_LIB_HAVE_LINKFLAGS defines HAVE_LIBGNUTLS and HAVE_LIBSSL, which
> can be used instead. wget.h was fixed to expect these symbols.
> (You might think your defines are more aptly n
You're right, this should be documented. I was under the impression
that Subversion's checkout protocol (at least) was
backward-compatible, but I've never actually *tried* it.
Would someone be willing to host an issue tracker for Wget? Of the
ones I've seen bugzilla seems to be the best, but "trac" is also quite
promising. The "Roundup" instance currently installed at
wget-bugs.ferrara.linux.it is nice, but it lacks some features: for
example, you can't set the type of resolution a
ed <[EMAIL PROTECTED]> writes:
> Yes, on the computer with the FTP client it is. Not sure about the
> server, though. So I was hoping there was a workaround on the client
> end.
>
> Or say my client *isn't* set for UTF-8, would there be a switch or
> something to work around it?
Your bug report d
Peter Skye <[EMAIL PROTECTED]> writes:
> Thanks for the -p suggestion, that might be the cure. I think I've
> figured out the basic problem -- the HTML page is at
> http://www.ucomics.com/ but the image is at
> http://images.ucomics.com/.
Then you need to use something like -H -Ducomics.com to t
Herold Heiko <[EMAIL PROTECTED]> writes:
> Windows MSVC binary for 1.10.1 at
> http://xoomer.virgilio.it/hherold/
>
> Likely I won't be able to follow wget development and compile
> anything with reasonable delay after a given source release for at
> least several months, due to personal constrain
"Carl G. Ponder" <[EMAIL PROTECTED]> writes:
> This would also be okay for my purposes. When I first ran "wget" I
> was assuming it would get everything, and was surprised and
> disappointed when it didn't. Would you be able to make a
> modification like this any time soon?
I wouldn't count on it
[EMAIL PROTECTED] (Steven M. Schweda) writes:
>> [...] I for one would prefer Wget to be smarter and try to download
>> dot files by default, without the user's intervention.
>
>Given the variability in FTP servers (even among UNIX FTP servers) I
> don't see how this could be done reliably.
I
"Carl G. Ponder" <[EMAIL PROTECTED]> writes:
> Hrvoje Niksic wrote:
>> If nothing else works out, we can add something like that. I for one
>> would prefer Wget to be smarter and try to download dot files by
>> default, without the user's intervention.
&
Carl Ponder <[EMAIL PROTECTED]> writes:
> How about this, then document "wget" as follows:
>
> By default, for wildcard and recursive operations, "wget"
> *ignores* invisible files (like ".profile", ".rhosts", etc.)
> that begin with '.'.
But that's the catch, it really doe
Carl Ponder <[EMAIL PROTECTED]> writes:
> Hey -- how about making the "-a" the default, then add a command-line
> switch that suppresses "-a" for servers it won't work with?
That would mean using a non-standard extension by default, and putting
the burden to the user to disable it when it misfires
"Tony Lewis" <[EMAIL PROTECTED]> writes:
> Mauro Tortonesi wrote:
>
>> this is a very interesting point, but the patch you mentioned above uses
> the
>> LIST -a FTP command, which AFAIK is not supported by all FTP servers.
>
> As I recall, that's why the patch was not accepted. However, it would
Behdad Esfahbod <[EMAIL PROTECTED]> writes:
> It happened to me to unintentionally run two commands:
>
> wget -b -c http://some/file.tar.gz
>
> and hours later I figured out that the 1GB that I've downloaded
> is useless since two wget processes have been downloading the
> same data twice and ap
Linda Walsh <[EMAIL PROTECTED]> writes:
> I noticed after my post in the archives that this bug is fixed in
> 1.10.
>
> Now if I can just get the server-ops to fix their CVS server, that'd
> be great -- I've checked out CVS projects from other sites and not
> had inbound TCP attempts to some 'auth
Wget's source code repository was migrated from CVS to Subversion. To
check out the latest code base, use the subversion client:
svn co http://svn.dotsrc.org/repo/wget/trunk/ wget
Albert Chin <[EMAIL PROTECTED]> writes:
>> [1]
>> Both C and POSIX require standard functions to also be defined as
>> functions, even those that are normally invoked as macros of the
>> same name. The 2004 edition of POSIX explicitly speaks of strtoimax
>> and friends as "functions". See htt
Albert Chin <[EMAIL PROTECTED]> writes:
> On Thu, Aug 11, 2005 at 11:17:25PM +0200, Hrvoje Niksic wrote:
>> OK, in presence of LFS, Wget will use either strtoll, strtoimax, or
>> its own strtoll implementation if none are available.
>
> I looked at your configure.
OK, in presence of LFS, Wget will use either strtoll, strtoimax, or
its own strtoll implementation if none are available.
Albert Chin <[EMAIL PROTECTED]> writes:
> None of the following platforms have strtoll():
> HP-UX 10.20, 11.00, 11.11
Do those platforms have 64-bit off_t, i.e. large file support? If so,
do they have another strtoll-like function, such as strtoq?
> There is a replacement strtoll() in gnulib
Mauro Tortonesi <[EMAIL PROTECTED]> writes:
> oops, my fault. i was in a hurry and i misunderstood what
> Abdurrahman was asking. what i wanted to say is that we talked about
> supporting the same html file download mode of firefox, in which you
> save all the related files in a directory with the
Mauro Tortonesi <[EMAIL PROTECTED]> writes:
> On Saturday 09 July 2005 10:34 am, Abdurrahman ÇARKACIOĞLU wrote:
>> MS Internet Explorer can save a web page as a whole. That means all the
>> images,
>>
>> Tables, can be saved as a file. It is called as "Web Archive, single file
>> (*.mht)".
>>
>
Thanks for the report; I believe this bug is fixed in Wget's
subversion repository.
Robin Laurén <[EMAIL PROTECTED]> writes:
> My question is about the number on one of the last lines of the
> logged output, the reported download speed. What exactly does
> wget's download speed report? Is this the speed of just the data
> downloaded, or does the value include the lag time betwe
Mauro Tortonesi <[EMAIL PROTECTED]> writes:
> i agree with hrvoje. but this is just a side-effect of the real
> problem: the semantics of -O with a multiple files download is not
> well defined.
-O with multiple URLs concatenates all content to the given file.
This is intentional and supported: f
"Phill Bertolus" <[EMAIL PROTECTED]> writes:
> Hi guys,
>
> in http.c there are a couple of lines that say
>
> if (opt.save_headers)
> fwrite(head,1,strlen(head), fp);
>
> Unfortunately head has been deallocated by this time in 1.10. The
> 1.9.1 version correctly saved the head info in to
Jeroen Demeyer <[EMAIL PROTECTED]> writes:
> I am a big fan of wget, but I discovered a minor annoyance (not sure
> if it even is a bug):
>
> When downloading multiple files with wget to a single output
> (e.g. wget -Oout http://file1 http://file2 http://file3), the
> timestamp of the resulting fi
Greg Ramos <[EMAIL PROTECTED]> writes:
> I have downloaded two versions of wget, and both give me this error:
This problem is caused by Apache installing a buggy fnmatch.h in the
compiler's default include path. As a workaround, remove the
definition of SYSTEM_FNMATCH in sysdep.h.
Thanks for the report. The problem seems to come from Wget's use of
AI_ADDRCONFIG hint to getaddrinfo. Wget 1.10.1 will not use that
hint.
Gisle Vanem <[EMAIL PROTECTED]> writes:
> "Hrvoje Niksic" wrote:
>
>> Wouldn't you need to have separate targets for linking as well?
>
> Sure. That target would simply depend on $(MSVC_OBJECTS) etc.:
>
> wget-msvc.exe: $(MSVC_OBJECTS)
>
Gisle Vanem <[EMAIL PROTECTED]> writes:
> If you adopt this style, I urge you to reconsider the "#undef
> HAVE_OPENSSL" in config.h.
You're right; I never thought through the effect of the #undef lines
on symbols defined via Makefile! configure-generated config.h has the
undefs commented out, pr
Herold Heiko <[EMAIL PROTECTED]> writes:
> What about the doc directory and Makefile.doc ?
I don't see much use for Info files on Windows. Furthermore, I don't
think many Windows builders have makeinfo lying around on their hard
disk...
Those who know about Info (or, for that matter, Texinfo) d
Another step towards easier support for multiple Windows compilers
(such as reintroduction of Watcom support) would be to use a single
Makefile. It really sucks that adding a new object file requires
changing src/Makefile.in and *three* different Windows Makefiles.
To simplify the infrastructure,
Gisle Vanem <[EMAIL PROTECTED]> writes:
> Adding yet another define seems to obfuscate the code more, IMO.
It does, but having #ifdef __COMPILER__ all over the place makes the
code just as obfuscated, only in another way. And it makes
introducing another compiler that much harder because you ha
Jogchum Reitsma <[EMAIL PROTECTED]> writes:
> I'm not sure it's a bug, but the behaviour described below seems strange
> to me, so I thought it was wise to report it:
Upgrade to Wget 1.10 and the problem should go away. Earlier versions
don't handle files larger than 2GB properly.
. */
Index: src/ChangeLog
===================================================================
--- src/ChangeLog (revision 2008)
+++ src/ChangeLog (working copy)
@@ -1,3 +1,10 @@
+2005-07-07 Hrvoje Niksic <[EMAIL PROTECTED]>
+
+ * mswindows.h: Define an alias for stat and fstat,
Rodrigo Botafogo <[EMAIL PROTECTED]> writes:
> [EMAIL PROTECTED]:~/Download/Linux> wget -c
> ftp://chuck.ucs.indiana.edu/linux/suse/suse/i386/9.3/iso/SUSE-9.3-Eval-DVD.iso
> --09:55:03--
> ftp://chuck.ucs.indiana.edu/linux/suse/suse/i386/9.3/iso/SUSE-9.3-Eval-DVD.iso
> => `SUSE-9.3-Eval-DVD.iso
Gisle Vanem <[EMAIL PROTECTED]> writes:
> - Definition of gai_strerror() needs to be put after the one in
>So either include that in config-compiler.h or
> move the whole ENABLE_IPV6 section to mswindows.h.
Both solutions kind of defeat the purpose of config-compiler.h, which
is supposed to
[ Moving discussion to the Wget list ]
Herold Heiko <[EMAIL PROTECTED]> writes:
>> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
>
>> Yup. Do you think we could require GNU make so we don't have to
>> support multiple flavors of Makefiles?
>
> FWIW, gnu
tch fixes the crash, but such responses still don't
get downloaded. I'm not sure what would be the best way to handle a
response whose status cannot be determined.
Thanks for the report.
2005-07-06 Hrvoje Niksic <[EMAIL PROTECTED]>
* http.c (gethttp): When freeing M
Alain Guibert <[EMAIL PROTECTED]> writes:
> (1) Libc 5.4.33 own mktime() produces wrong by some minutes results for
> all summer dates when tm_isdst is forced to false 0. Wget's
> mktime_from_utc() forces tm_isdst=0 at a stage, and produces wrong by
> some minutes result only for one hour, beginni
I invite you to try out the GnuTLS support in the Wget repository. It
is still very rudimentary (no fancy SSL options), but the basics seem
to work.
Patches that enhance this would be very welcome, as my experience with
SSL in general and GnuTLS in particular is very limited.
Abdurrahman ÇARKACIOĞLU <[EMAIL PROTECTED]> writes:
>>It's already in the repository.
>
> I think you forget to put -DHAVE_SELECT statement
> into makefile.src.mingw at
> http://svn.dotsrc.org/repo/wget/branches/1.10/windows/.
>
> Am I right ?
That was published in a separate patch -- specificall
Abdurrahman ÇARKACIOĞLU <[EMAIL PROTECTED]> writes:
> Now, it works. Thanks a lot.
>
> But I want to understand what is going on ? Was it a bug ?
It was a combination of two Wget bugs, one in the actual code and the
other in the MinGW configuration.
Wget 1.9.1 and earlier used to close connections to the se
I believe this patch should fix the problem. Could you apply it and
let me know if it fixes things for you?
2005-07-02 Hrvoje Niksic <[EMAIL PROTECTED]>
* http.c (gethttp): Except for head_only, use skip_short_body to
skip the non-20x error message before leaving g
Alain Bench <[EMAIL PROTECTED]> writes:
> Not here. This seems to be locale dependent, requiring exact
> localized input. Here MS Calculator accepts pasted "123 456 789,01"
> as correct 123456789.01, but when pasted wget's English
> "123,456,789.01" it fails, interpreting this as 123.456789 and
>
Abdurrahman ÇARKACIOĞLU <[EMAIL PROTECTED]> writes:
> Here are the results..
> ---request begin---
> GET /images/spk.ico HTTP/1.0
> Referer: http://www.spk.gov.tr/
> User-Agent: Wget/1.10
> Accept: */*
> Host: www.spk.gov.tr
> Connection: Keep-Alive
>
> ---request end---
> HTTP request sent, await
"A. Carkaci" <[EMAIL PROTECTED]> writes:
> ---request begin---
> GET /images/spk.ico HTTP/1.0
> Referer: http://www.spk.gov.tr/
> User-Agent: Wget/1.10
> Accept: */*
> Host: www.spk.gov.tr
> Connection: Keep-Alive
> ---request end---
> HTTP request sent, awaiting response...
> ---response begin--
Abdurrahman ÇARKACIOĞLU <[EMAIL PROTECTED]> writes:
> I successfully compiled Wget 1.10 using mingw. Although Heiko
> Herold's wget 1.10 (original wget.exe I mean) (from
> http://space.tin.it/computer/hherold/) successfully downloads the
> following site, my compiled wget (produced by mingw32-make) h
James Wiebe <[EMAIL PROTECTED]> writes:
> How do you get past an https login screen (as opposed to a plain
> http (non-secure) one)?
The procedure is, as far as I know, exactly the same for both.
> Using an idea from msg "Login string" Richard Emanilov Wed, 16 Mar 2005
> 13:38:09 -0800
> I trie
James Wiebe <[EMAIL PROTECTED]> writes:
> I'm writing to report my unsuccessful compile
> of WGet ver 1.10 on Windows 2000 and also XP with MSVC++ 6.0.
This is a known bug in MSVC++ 6.0. You can work around it by
compiling retr.c and http.c with no (or at least less) optimization.
Marc Niederwieser <[EMAIL PROTECTED]> writes:
> option --mirror is described as
> shortcut option equivalent to -r -N -l inf -nr.
> but option "-nr" is not implemented.
> I think you mean "--no-remove-listing".
Thanks for the report, I've now fixed
Borland has warned me about unused variable row_qty in this code:
{
wgint row_qty = ROW_BYTES;
if (dp->rows == dp->initial_length / ROW_BYTES)
row_qty -= dp->initial_length % ROW_BYTES;
... code that doesn't use row_qty ...
}
Does GCC have an option to produce such a warning? -Wall doe
tdated, thanks for the report. (At one point
-B really only worked with -F, in fact it was designed to help deal
with relative links and -F.)
2005-06-28 Hrvoje Niksic <[EMAIL PROTECTED]>
* wget.texi (Logging and Input File Options): Don't claim that
--base requir
I plan to add such an option, but it will require some reorganization
of how Wget does the download internally.
How do you propose that the option would work? Only in conjunction
with -B or also with -r, -p and other options that generate URLs? If
the latter is the case, printing the URIs would
"Tony Lewis" <[EMAIL PROTECTED]> writes:
> Hrvoje Niksic wrote:
>
>> 1. Does wget -4 http://... work?
>
> Yes
Then, as a workaround you can put inet4_only=yes in your ~/.wgetrc.
>> What OS are you running this on?
>
> Red Hat Linux release 6.2 (Z
"Tony Lewis" <[EMAIL PROTECTED]> writes:
> I got a "Name or service not known" error from wget 1.10 running on
> Linux. When I installed an earlier version of wget, it worked just
> fine. It also works just fine on version 1.10 running on Windows.
> Any ideas?
I think we're beginning to experien
[EMAIL PROTECTED] (Steven M. Schweda) writes:
> from Hrvoje Niksic:
>
>> [...] Unfortunately EOL conversions break
>> automatic downloads resumption (REST in FTP),
>
>Could be true.
>
>> manual resumption (wget -c),
>
>Could be true. (I never u
"Maciej W. Rozycki" <[EMAIL PROTECTED]> writes:
> Bugs are of course inevitable and you shouldn't be surprised seeing
> them especially as on exotic platforms (you even admit you've never
> been able to reproduce some of the other's problems on your
> systems).
Please note that a platform doesn't
Василевский Сергей <[EMAIL PROTECTED]> writes:
> sometimes this error appears:
> assertion "ptr != NULL" failed: file "xmalloc.c", line 190
What were you doing when the error appeared? Do you have the rest of
Wget's output?
Dan Jacobson <[EMAIL PROTECTED]> writes:
> Why must -B need -F to take effect? Why can't one do
> xargs wget -B http://bla.com/ -i - < zzz.html fff/gg/h.html
> !
I'm not sure I understand the combination of `xargs' and `-i -'. This
seems to work for me:
$ wget -B http://www.example.com/foo/ -i-
[EMAIL PROTECTED] (Steven M. Schweda) writes:
> It does seem a bit odd that no one has noticed this fundamental
> problem until now, but then I missed it, too.
Long ago I intentionally made Wget use binary mode by default and not
muck with line endings because I believed exact data transfer was
i
"John Haymaker" <[EMAIL PROTECTED]> writes:
> I am trying to download all pages in my site except secure pages that
> require login.
>
> Problem: when wget encounters a secure page requiring the user to log in,
> it hangs there for up to an hour. Then miraculously, it moves on.
By "secure pag
"Post, Mark K" <[EMAIL PROTECTED]> writes:
> I read the entire message, but I probably didn't have to. My
> experience with libtool in packages that really are building
> libraries has been pretty painful. Since wget doesn't build any,
> getting rid of it is one less thing to kill my builds in t
David Fritz <[EMAIL PROTECTED]> writes:
> "I64" is a size prefix akin to "ll". One still needs to specify the
> argument type as in "%I64d" as with "%lld".
That makes sense, thanks for the explanation!
Gisle Vanem <[EMAIL PROTECTED]> writes:
> "Hrvoje Niksic" <[EMAIL PROTECTED]> wrote:
>
>> It should print a line containing "100". If it does, it means
>> we're applying the wrong format. If it doesn't, then we must find
>&
Hrvoje Niksic <[EMAIL PROTECTED]> writes:
> This would indicate that the "%I64" format, which Wget uses to print
> the 64-bit "download sum", doesn't work for you.
For what it's worth, MSDN documents it: http://tinyurl.com/ysrh/.
Could you be compiling
Herold Heiko <[EMAIL PROTECTED]> writes:
> Downloaded: bytes in 2 files
>
> Note missing number of bytes.
This would indicate that the "%I64" format, which Wget uses to print
the 64-bit "download sum", doesn't work for you. What does this
program print?
#include <stdio.h>
int
main (void)
{
  __int64 n = 100;
  printf ("%I64d\n", n);
  return 0;
}
Alain Bench <[EMAIL PROTECTED]> writes:
> Removing separators will break existing apps parsing wget's output.
> Such apps exist?
They do exist, but *any* change in Wget's output will break them.
Since they probably do the equivalent of sed s/,//g anyway, the
removal of separators is likely to be
Thanks to the effort of Mauro Tortonesi and the prior work of Bruno
Haible, Wget has been modified to no longer use Libtool for linking in
external libraries. If you are interested in why that might be a
cause for celebration, read on.
A bit of history: Libtool was integrated in Wget by Dan Hark
<[EMAIL PROTECTED]> writes:
> Sorry for the crosspost, but the wget Web site is a little confusing
> on the point of where to send bug reports/patches.
Sorry about that. In this case, either address is fine, and we don't
mind the crosspost.
> After taking a look at it, i implemented the followi
"Mark Street" <[EMAIL PROTECTED]> writes:
> Many thanks for the explanation and the patch. Yes, this patch
> successfully resolves the problem for my particular test case.
Thanks for testing it. It has been applied to the code and will be in
Wget 1.10.1 and later.
"Oliver Schulze L." <[EMAIL PROTECTED]> writes:
> I think that having a link to an email address is not that useful,
> because people can just write to that email address since it's a
> mailing list.
Good point. An even better link might be to the gmane archive, where
you can read the list, b
"Tony Lewis" <[EMAIL PROTECTED]> writes:
> Hrvoje Niksic wrote:
>
>> In fact, I know of no application that accepts numbers as Wget prints
> them.
>
> Microsoft Calculator does.
Sorry, I forgot to qualify that as "(Unix) command-line application"
Leonid <[EMAIL PROTECTED]> writes:
> Those guys who find numbers like 11782023180 easy to read and can
> tell for a fraction of a second that it was 11Gb
I'm not such a person; Wget would in fact print:
Length: 11782023180 (11.0G)
Alain Bench <[EMAIL PROTECTED]> writes:
> MHO: They are ununderstandable, unusable, unclean, and big. They may
> give a false bad impression of source/project misorganization. We
> want to drop them, wipe any proof of their existence from any
> archives and mirrors, then honestly deny they ever ex
Alain Bench <[EMAIL PROTECTED]> writes:
> On Thursday, June 23, 2005 at 3:16:28 PM +0200, Hrvoje Niksic wrote:
>
>> Since Wget 1.10 also prints sizes in kilobytes/megabytes/etc., I am
>> thinking of removing the thousand separators from size display.
>
> IMHO t
es the problem by:
* Making sure that path consistently gets prepended in all entry
points to cookie code;
* Removing the special logic from path_match.
With that change your test case seems to work, and so do all the other
tests I could think of.
Please let me know if it works for you, and than
Hrvoje Niksic <[EMAIL PROTECTED]> writes:
> Thanks. Is there a copy of C89 (or a close draft) online? It would
> be very useful for checking such things.
Oops: typing `c89 draft' into google produces this as the first link:
http://dev.unicals.com/papers/c89-draft.html
Does anyone feel that the ChangeLog-branches directories distributed
with Wget are desirable or necessary? These side ChangeLogs are
accumulating and they *repeat* ChangeLog text many times over!
For example, the 1.8, 1.9, and 1.10 branch changelogs have a lot of
overlapping contents. The src/Ch
"Oliver Schulze L." <[EMAIL PROTECTED]> writes:
> Looks really nice. Maybe it needs a link to instructions on how to
> subscribe to the mailing list.
You can always add it. :-)
But we already have a link to the home page where the information
resides. Link to subscription details probably do
[EMAIL PROTECTED] (Larry Jones) writes:
> Hrvoje Niksic writes:
>>
>> strpbrk is a BSD 4.3 [1] function apparently also mandated by POSIX,
>> C99, and present on Windows and VMS. Is there a system we care about
>> that doesn't have it?
>
> It was also man
strpbrk is a BSD 4.3 [1] function apparently also mandated by POSIX,
C99, and present on Windows and VMS. Is there a system we care about
that doesn't have it?
Since Wget 1.10 also prints sizes in kilobytes/megabytes/etc., I am
thinking of removing the thousand separators from size display. The
reasons are:
* The separators need to be manually removed when the numbers are
pasted into any software that deals with numbers, such as "bc".
This problem w
Karsten Hopp <[EMAIL PROTECTED]> writes:
>> svn checkout http://svn.dotsrc.org/repo/wget/branches/1.10/ wget-stable
>
> minor issues with that:
> [wget-stable] > ./autogen.sh
> [wget-stable] > ./configure --prefix=/usr/
> configure: configuring for GNU Wget 1.10.1-beta
>
"Matthew J Harms" <[EMAIL PROTECTED]> writes:
> I'm sure you've already had this suggested, and I don't know if it
> will work, due to the complexity of the suggestion, but is there a
> way you could implement the capability of wget to download any file
> that meets a criteria yet use wildcards (i
Does anyone use the Watcom compiler or its open-source offspring?
There are some ugly Watcom-specific ifdefs in Wget that we'd be better
off without -- unless someone is actually using it.
During the last couple of weeks I spent some time improving the
wikipedia's page on Wget, ending up with a complete rewrite of the
original, very terse, page. Please let me know how you like it and if
you think it needs corrections or additions.
http://en.wikipedia.org/wiki/Wget
Will Kuhn <[EMAIL PROTECTED]> writes:
> Apparently wget does not handle single quote or double quote very well.
> wget with the following arguments give error.
>
> wget
> --user-agent='Mozilla/5.0' --cookies=off --header
> 'Cookie: testbounce="testing";
> ih="b'!!!0T#8G(5A!!#c`#8HWs
Hrvoje Niksic <[EMAIL PROTECTED]> writes:
> If you want to check out the 1.10 branch (recommended for
> distributions because it only contains bug fixes), you can use:
>
> svn checkout http://svn.dotsrc.org/repo/wget/trunk/ wget
Oops! The above should read something like:
Mauro Tortonesi <[EMAIL PROTECTED]> writes:
> The new repository is accessible at:
>
> http://svn.dotsrc.org/repo/wget/
For the uninitiated, to check out the repository, you need a reasonably
recent version of the subversion client and issue something like:
svn checkout http://svn.dotsrc.org/
Ariel <[EMAIL PROTECTED]> writes:
> Was looking for an option to skip existing files, and after some time
> (minutes? hours?) of no luck, I looked at that option -nc "don't
> clobber existing files".
clobber == overwrite
http://www.science.uva.nl/~mes/jargon/c/clobber.html
That term has been use
Paul Smith <[EMAIL PROTECTED]> writes:
> I am giving the first steps with wget and I would like to know
> whether it is possible to get a list of the files that wget will
> download before downloading them.
Unfortunately not. Something like this is likely to be added in a
future release.
Althou
Jens Schleusener <[EMAIL PROTECTED]> writes:
> The reason for the above error is as already written - at least in
> my case using the self compiled libtool version 1.5
I don't think the libtool version used on the system makes any
difference (except for a developer at the point of "libtoolizing"