Bent Neumann Jensen [EMAIL PROTECTED] writes:
I've tried to do it this way:
wget --save-cookies cookies.txt --post-data
'Kortnummer=Password=' http://213.173.230.31
wget --load-cookies cookies.txt -p http://213.173.230.31/cgi-bin/tabcntrl
Have you checked the contents of
[EMAIL PROTECTED] (Steven M. Schweda) writes:
Results from wget -V would be much more informative than knowing
the path(s) to the executable(s). (Should I know what SVN is?)
I believe SVN stands for Subversion, the version control software that
runs the repository.
[EMAIL PROTECTED] (Steven M. Schweda) writes:
#define PTR_FORMAT(p) 2 * sizeof (void *), (unsigned long) (p)
No bets, but you might try something like:
#define PTR_FORMAT(p) ((int)(2 * sizeof (void *))), (unsigned long) (p)
You are right; I'll make this change, thanks.
Note that, as
[EMAIL PROTECTED] writes:
This may be useful when some network shares are mounted to the local file system
*** OK, so why do you want to download a file from the local file
system to the local file system?
Because Wget can show the download speed, restart a download with
`-c', etc. I second the
Frank McCown [EMAIL PROTECTED] writes:
But IIS does not handle .. the same way. IIS will simply ignore
.. and produce the page. So the following two URLs are referencing
the same HTML page:
http://www.merseyfire.gov.uk/pages/fire_auth/councillors.htm
and
Mauro Tortonesi [EMAIL PROTECTED] writes:
regex support is planned for the next release of wget. but i was
wondering if we should just extend the existing -A and -R option
instead of creating new ones. what do you think?
It would seriously break backward compatibility. If that is
acceptable,
Frank McCown [EMAIL PROTECTED] writes:
Earlier today I sent an email explaining that wget already handles
.. in the middle of a URL correctly, it just doesn't handle ..
immediately after the domain name correctly.
But it does, at least according to rfc1808, which mandates leading
.. to be
[EMAIL PROTECTED] (Steven M. Schweda) writes:
and adding it fixed many problems with FTP servers that log you in
a non-/ working directory.
Which of those problems would _not_ be fixed by my two-step CWD for
a relative path? That is: [...]
That should work too. On Unix-like FTP servers,
Arne Caspari [EMAIL PROTECTED] writes:
When called like:
wget user:[EMAIL PROTECTED]/foo/bar/file.tgz
and foo or bar is a read/execute protected directory while file.tgz is
user-readable, wget fails to retrieve the file because it tries to CWD
into the directory first.
I think the correct
Mauro Tortonesi [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
Arne Caspari [EMAIL PROTECTED] writes:
I believe that CWD is mandated by the FTP specification, but you're
also right that Wget should try both variants.
i agree. perhaps when retrieving file A/B/F.X we should try to use:
GET
Hrvoje Niksic [EMAIL PROTECTED] writes:
That might work. Also don't prepend the necessary prepending of $CWD
to those paths.
Oops, I meant don't forget to prepend
Keith Moore [EMAIL PROTECTED] writes:
Now that I think about it, I'm convinced the switch should be
named something more positive. Seeing --no-redirections on the
command-line makes perfect sense, but I would really hate to see
no_redirections=on in the .wgetrc file.
Actually the switch
Post, Mark K [EMAIL PROTECTED] writes:
Odd. It didn't take me long to find this:
http://ftp.us.debian.org/debian/pool/main/w/wget/wget_1.10.2-1_i386.deb
It's questionable whether that's installable on stable Debian.
Post, Mark K [EMAIL PROTECTED] writes:
Not really. Debian will let you install whatever you want, provided
the dependencies are satisfied.
Which is what is questionable -- a package from unstable or testing
usually depends on a slightly higher version of libc. Have you tried
it?
If you
Mauro Tortonesi [EMAIL PROTECTED] writes:
i have just deployed a new bugzilla installation to use for wget.
you can find it at the URL:
http://wget.sunsite.dk/bugtracker
I am unable to create an account on this tracker. The account
creation form seemed to complete, but I received no mail
Schatzman, James (Mission Systems) [EMAIL PROTECTED] writes:
However, I have tried versions 1.10, 1.10.1, and 1.10.2, and they
all appear to do the same thing: attempt to connect directly to the
server (at port 443) instead of going through the proxy. Here is
what I see
1) HTTP server, no
Schatzman, James (Mission Systems) [EMAIL PROTECTED] writes:
I have double checked the wget documentation. There is no mention of
the https_proxy parameter. The manual and sample wgetrc that are
provided list http_proxy and ftp_proxy - that is all.
Apparently, the bug is with the
[EMAIL PROTECTED] writes:
Can anyone help ?
Or is it feature request ?
It's a feature request. It would be nice to have support for
compression and for HTTP/1.1 chunked transfer. (I clump those
together because they require changes at the same place in the code.)
Someone did talk about
Ray Arachelian [EMAIL PROTECTED] writes:
wget --serverip=192.168.0.10 http://www.blah.com/restofurl server1
With Wget 1.10 and further you can simulate this in a slightly
roundabout way:
wget --header 'Host: www.blah.com' http://192.168.0.10/restofurl server1
...
Maybe this should make it into
Tony Lewis [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
A program can be built on one machine and run on one or more others.
On one machine, yes, but can it be built on one architecture and run
on another?
But the same Unix build can be run on different file systems.
I'm not aware
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Sat, 8 Oct 2005, Hrvoje Niksic wrote:
As you said, parsing UNIX directory listings is a nightmare. If
someone has a suggestion for better heuristics, please go ahead and
suggest.
Hmm, use MDTM/SIZE to attempt to get at file dates and sizes
Rahul Joshi [EMAIL PROTECTED] writes:
E.g., a wgetrc entry like reject = *404* will reject a file like
Sample404File.html, but reject = "*404*" will not reject the file.
I don't think quotes work in wgetrc. The examples in the manual were
meant to be typed in the shell.
[EMAIL PROTECTED] (Steven M. Schweda) writes:
I haven't actually looked, but if the code does _not_ assume that
the last thing on the line is the file name, then it would seem to
need some work.
The code does assume that the last thing on the line is the file name.
The twist is that a file
HonzaCh [EMAIL PROTECTED] writes:
My localeconv()->thousands_sep (as well as many other struct
members) turns out to be the empty string "" (MSVC 6.0).
How do you know? I mean, what program did you use to check this?
My quick'n'dirty one. See the source below.
Your source neglects to
I consider it a feature. Wget was designed specifically for
downloading in the background; it catches the hangup signal so it
knows not to write to the (now defunct) terminal. The idea is for
downloads in SSH/telnet/modem sessions to keep running if the user is
accidentally disconnected.
If I
Martin Koniczek [EMAIL PROTECTED] writes:
in contrast to the faq (http://www.gnu.org/software/wget/faq.html):
[...]
The FAQ is very imprecise here with its use of the term "funny
characters". [...]
this faq additionally misled me - perhaps just kill the # from the
funny characters listing?
Mauro Tortonesi [EMAIL PROTECTED] writes:
yes, but i was thinking to define wget specific error codes.
I wouldn't object to those. The scripting people might find them
useful.
HonzaCh [EMAIL PROTECTED] writes:
Latest version (1.10.1) turns up a UI bug: the thousand separator
(space according to my local settings) displays as á (character
code 0xA0, see attch.)
Although it does not affect the primary function of WGET, it looks
quite ugly.
Env.: Win2k Pro/Czech
Arthur DiSegna [EMAIL PROTECTED] writes:
grep -v '^#' sites.html | wget --spider -o log.txt -i -
Basically, I am asking WGET to look through sites.html to see if
certain files exist. This gives me an OK and a file size. I would like
to go one step further and get the file date as well.
You could
Rahul Joshi [EMAIL PROTECTED] writes:
The issue of specifying or limiting the number of
links followed in every page:
Can anybody suggest the source file change or otherwise that I can
make to do this? I am not familiar with WGet source, so I would
appreciate any inputs...
In src/recur.c
Dennis Heuer [EMAIL PROTECTED] writes:
Your answer fits only half because I still have to choose -Ahtml,pdf
and I still get *at least* the first HTML page on my disk
The first HTML page will only be saved temporarily. You still
shouldn't be needing to use -Ahtml,pdf instead of just -Apdf.
Owen Cliffe [EMAIL PROTECTED] writes:
Is there a good reason why retrieve tree doesn't just return the
status of the last failed operation on failure?
The original reason (which I don't claim to be good) is because Wget
doesn't stop upon an error, it continues. Because of this, returning a
Dennis Heuer [EMAIL PROTECTED] writes:
I've checked that on a different site and it worked. However: My
mainpoint (why I called this a (design) bug) is still valid. When I
target a page and say -Apdf it is clear that only the pdf links are
valid choices. The options -rl1 should not be
Arthur DiSegna [EMAIL PROTECTED] writes:
I am using -i to read URLs from an HTML file. How can I add
comments to this file without the log showing test.html: Invalid
URL #
You can always preprocess the file using something like:
grep -v '^#' inputfile | wget -i -
or, if you want to
Rahul Joshi [EMAIL PROTECTED] writes:
Does wget have any option/facility to remove the HTML tags of the
retrieved pages so that only the text content can be obtained? For
example:
[...]
No. You will need to use something like `lynx -dump' for that
purpose.
J F [EMAIL PROTECTED] writes:
<td align=left valign=bottom background=index.html><img
src=images/blank.gif width=320 height=7 border=0 alt="" align=bottom><a
href=index.html#top title="Go To Top"><img src=images/fluxbox_top.gif
width=104 height=55 border=0 title="Go to Top"></a></td>
---
Michael Shigorin [EMAIL PROTECTED] writes:
at least one of the people on ALT Linux Team used
`wget --non-verbose' in scripts; that broke with 1.10.
Sorry about that. I'm not sure if it makes sense to resurrect
--non-verbose for 1.10.2 or 1.11... most people seemed to use -nv,
which is still
Youssef Eldakar [EMAIL PROTECTED] writes:
Is an HTTP redirect counted as a hop as controlled by the -l option?
No.
Does Wget handle refreshes made using http-equiv?
Yes, but those are counted as a hop.
Jochen Roderburg [EMAIL PROTECTED] writes:
Hmm, this did not actually try to write over 'index.html', did it ;-)
Do the same with 'timestamping on' and you get
(not surprisingly and with 'all' wget versions I have around) :
index.html: Permission denied
Cannot write to `index.html'
Youssef Eldakar [EMAIL PROTECTED] writes:
I noticed that URLs in <BASE HREF="..."> are not converted when
using the -k option to convert links in recursive downloads. In
which case, they have to be fixed manually in order for browsing to
work. Is there a better solution?
wget -k is supposed to
Linda Walsh [EMAIL PROTECTED] writes:
[...]
To answer the question raised in the subject: obviously, respecting
the robots file does not imply (even jokingly) that Wget's operator
is a robot, but that the program is an automated agent, aka crawler,
which once set up, analyzes HTML and downloads
Linda Walsh [EMAIL PROTECTED] writes:
But I've tried various combinations to download the rpms in the
directory:
wget -r -nH http://mirrors.kernel.org/suse/i386/9.3/suse/i586
wget -r -nH http://mirrors.kernel.org/suse/i386/9.3/suse/i586/.
(both just download an index.html file)
You need
This should now be fixed in the repository, in a slightly different
manner (by setting SSL_MODE_AUTO_RETRY on the SSL context).
Thanks for the report.
Thanks for the report; I've applied this patch:
2005-08-26 Jeremy Shapiro [EMAIL PROTECTED]
* openssl.c (ssl_init): Set SSL_MODE_AUTO_RETRY.
Index: openssl.c
===
--- openssl.c (revision 2063)
+++ openssl.c (working
Jonathan [EMAIL PROTECTED] writes:
Would it be possible (and is anyone else interested) to have the
subject line of messages posted to this list prefixed with '[wget]'?
I am against munging subject lines of mail messages. The mailing list
software provides headers such as `Mailing-List' and
You're right, this should be documented. I was under the impression
that the Subversion's checkout protocol (at least) was
backward-compatible, but I've never actually *tried* it.
Stepan Kasal [EMAIL PROTECTED] writes:
1) I removed the AC_DEFINEs of symbols HAVE_GNUTLS, and HAVE_OPENSSL.
AC_LIB_HAVE_LINKFLAGS defines HAVE_LIBGNUTLS and HAVE_LIBSSL, which
can be used instead. wget.h was fixed to expect these symbols.
(You might think your defines are more aptly named,
Would someone be willing to host an issue tracker for Wget? Of the
ones I've seen bugzilla seems to be the best, but trac is also quite
promising. The Roundup currently installed at
wget-bugs.ferrara.linux.it is nice, but it lacks some features: for
example, you can't set the type of resolution
Herold Heiko [EMAIL PROTECTED] writes:
Windows MSVC binary for 1.10.1 at
http://xoomer.virgilio.it/hherold/
Likely I won't be able to follow wget development and compile
anything with reasonable delay after a given source release for at
least several months, due to personal constraints.
Peter Skye [EMAIL PROTECTED] writes:
Thanks for the -p suggestion, that might be the cure. I think I've
figured out the basic problem -- the HTML page is at
http://www.ucomics.com/ but the image is at
http://images.ucomics.com/.
Then you need to use something like -H -Ducomics.com to tell
ed [EMAIL PROTECTED] writes:
Yes, on the computer with the FTP client it is. Not sure about the
server, though. So I was hoping there was a workaround on the client
end.
Or say my client *isn't* set for UTF-8, would there be a switch or
something to work around it?
Your bug report does not
Carl Ponder [EMAIL PROTECTED] writes:
How about this, then document wget as follows:
By default, for wildcard and recursive operations, wget
*ignores* invisible files (like .profile, .rhosts, etc.)
that begin with '.'.
But that's the catch, it really doesn't ignore
[EMAIL PROTECTED] (Steven M. Schweda) writes:
[...] I for one would prefer Wget to be smarter and try to download
dot files by default, without the user's intervention.
Given the variability in FTP servers (even among UNIX FTP servers) I
don't see how this could be done reliably.
I hoped
Tony Lewis [EMAIL PROTECTED] writes:
Mauro Tortonesi wrote:
this is a very interesting point, but the patch you mentioned above
uses the LIST -a FTP command, which AFAIK is not supported by all FTP
servers.
As I recall, that's why the patch was not accepted. However, it would
be useful
Carl Ponder [EMAIL PROTECTED] writes:
Hey -- how about making the -a the default, then add a command-line
switch that suppresses -a for servers it won't work with?
That would mean using a non-standard extension by default, and putting
the burden on the user to disable it when it misfires. A
Behdad Esfahbod [EMAIL PROTECTED] writes:
It happened to me to unintentionally run two commands:
wget -b -c http://some/file.tar.gz
and hours later I figured out that the 1GB that I've downloaded
is useless since two wget processes have been downloading the
same data twice and appending
Wget's source code repository was migrated from CVS to Subversion. To
check out the latest code base, use the subversion client:
svn co http://svn.dotsrc.org/repo/wget/trunk/ wget
Linda Walsh [EMAIL PROTECTED] writes:
I noticed after my post in the archives that this bug is fixed in
1.10.
Now if I can just get the server-ops to fix their CVS server, that'd
be great -- I've checked out CVS projects from other sites and not
had inbound TCP attempts to some 'auth'
Albert Chin [EMAIL PROTECTED] writes:
On Thu, Aug 11, 2005 at 11:17:25PM +0200, Hrvoje Niksic wrote:
OK, in presence of LFS, Wget will use either strtoll, strtoimax, or
its own strtoll implementation if none are available.
I looked at your configure.in change and it won't work. strtoimax
Albert Chin [EMAIL PROTECTED] writes:
None of the following platforms have strtoll():
HP-UX 10.20, 11.00, 11.11
Do those platforms have 64-bit off_t, i.e. large file support? If so,
do they have another strtoll-like function, such as strtoq?
There is a replacement strtoll() in gnulib but
OK, in presence of LFS, Wget will use either strtoll, strtoimax, or
its own strtoll implementation if none are available.
Mauro Tortonesi [EMAIL PROTECTED] writes:
On Saturday 09 July 2005 10:34 am, Abdurrahman ÇARKACIOĞLU wrote:
MS Internet Explorer can save a web page as a whole. That means all the
images, tables, can be saved as a single file. It is called Web Archive,
single file (*.mht).
Is it possible
Mauro Tortonesi [EMAIL PROTECTED] writes:
oops, my fault. i was in a hurry and i misunderstood what
Abdurrahman was asking. what i wanted to say is that we talked about
supporting the same html file download mode of firefox, in which you
save all the related files in a directory with the same
Mauro Tortonesi [EMAIL PROTECTED] writes:
i agree with hrvoje. but this is just a side-effect of the real
problem: the semantics of -O with a multiple files download is not
well defined.
-O with multiple URLs concatenates all content to the given file.
This is intentional and supported: for
Robin Laurén [EMAIL PROTECTED] writes:
My question is about the number on one of the last lines of the
logged output, the reported download speed. What exactly does
wget's download speed report? Is this the speed of just the data
downloaded, or does the value include the lag time between
Thanks for the report; I believe this bug is fixed in Wget's
subversion repository.
Thanks for the report. The problem seems to come from Wget's use of
AI_ADDRCONFIG hint to getaddrinfo. Wget 1.10.1 will not use that
hint.
Greg Ramos [EMAIL PROTECTED] writes:
I have downloaded two versions of wget, and both give me this error:
This problem is caused by Apache installing a buggy fnmatch.h in the
compiler's default include path. As a workaround, remove the
definition of SYSTEM_FNMATCH in sysdep.h.
Gisle Vanem [EMAIL PROTECTED] writes:
If you adopt this style, I urge you to reconsider the #undef
HAVE_OPENSSL in config.h.
You're right; I never thought through the effect of the #undef lines
on symbols defined via Makefile! configure-generated config.h has the
undefs commented out,
Gisle Vanem [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
Wouldn't you need to have separate targets for linking as well?
Sure. That target would simply depend on $(MSVC_OBJECTS) etc.:
wget-msvc.exe: $(MSVC_OBJECTS)
link $(MSVC_LDFLAGS) -out:$@ $^ $(MSVC_EXT_LIBS)
Personally
I invite you to try out the GnuTLS support in the Wget repository. It
is still very rudimentary (no fancy SSL options), but the basics seem
to work.
Patches that enhance this would be very welcome, as my experience with
SSL in general and GnuTLS in particular is very limited.
Alain Guibert [EMAIL PROTECTED] writes:
(1) Libc 5.4.33's own mktime() produces results wrong by some minutes
for all summer dates when tm_isdst is forced to 0. Wget's
mktime_from_utc() forces tm_isdst=0 at one stage, and produces results
wrong by some minutes only for one hour, beginning at
Abdurrahman ÇARKACIOĞLU [EMAIL PROTECTED] writes:
Here are the results..
---request begin---
GET /images/spk.ico HTTP/1.0
Referer: http://www.spk.gov.tr/
User-Agent: Wget/1.10
Accept: */*
Host: www.spk.gov.tr
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting
Alain Bench [EMAIL PROTECTED] writes:
Not here. This seems to be locale dependent, requiring exact
localized input. Here MS Calculator accepts pasted 123 456 789,01
as correct 123456789.01, but when pasted wget's English
123,456,789.01 it fails, interpreting this as 123.456789 and
beeping.
I believe this patch should fix the problem. Could you apply it and
let me know if it fixes things for you?
2005-07-02 Hrvoje Niksic [EMAIL PROTECTED]
* http.c (gethttp): Except for head_only, use skip_short_body to
skip the non-20x error message before leaving gethttp
Abdurrahman ÇARKACIOĞLU [EMAIL PROTECTED] writes:
Now, it works. Thanks a lot.
But I want to understand what is going on. Was it a bug?
It was a combination of two Wget bugs, one in actual code and other in
MinGW configuration.
Wget 1.9.1 and earlier used to close connections to the server
Abdurrahman ÇARKACIOĞLU [EMAIL PROTECTED] writes:
It's already in the repository.
I think you forgot to put the -DHAVE_SELECT statement
into makefile.src.mingw at
http://svn.dotsrc.org/repo/wget/branches/1.10/windows/.
Am I right ?
That was published in a separate patch -- specifically,
James Wiebe [EMAIL PROTECTED] writes:
I'm writing to report my unsuccessful compile
of WGet ver 1.10 on Windows 2000 and also XP with MSVC++ 6.0.
This is a known bug in MSVC++ 6.0. You can work around it by
compiling retr.c and http.c with no (or at least less) optimization.
James Wiebe [EMAIL PROTECTED] writes:
How do you get past an https login screen (as opposed to a plain
http (non-secure) one)?
The procedure is, as far as I know, exactly the same for both.
Using an idea from the msg "Login string" (Richard Emanilov, Wed, 16 Mar 2005
13:38:09 -0800)
I tried
wget
Abdurrahman ÇARKACIOĞLU [EMAIL PROTECTED] writes:
I successfully compiled Wget 1.10 using MinGW. Although Heiko
Herold's wget 1.10 (the original wget.exe, I mean) (from
http://space.tin.it/computer/hherold/) successfully downloads the
following site, my compiled wget (produced by mingw32-make) hangs
A. Carkaci [EMAIL PROTECTED] writes:
---request begin---
GET /images/spk.ico HTTP/1.0
Referer: http://www.spk.gov.tr/
User-Agent: Wget/1.10
Accept: */*
Host: www.spk.gov.tr
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
Tony Lewis [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
1. Does wget -4 http://... work?
Yes
Then, as a workaround you can put inet4_only=yes in your ~/.wgetrc.
What OS are you running this on?
Red Hat Linux release 6.2 (Zoot)
We should probably find a way to disable IPv6 on systems
Василевский Сергей [EMAIL PROTECTED] writes:
Sometimes this error appears:
assertion ptr != NULL failed: file xmalloc.c, line 190
What were you doing when the error appeared? Do you have the rest of
Wget's output?
Maciej W. Rozycki [EMAIL PROTECTED] writes:
Bugs are of course inevitable and you shouldn't be surprised to see
them, especially on exotic platforms (you even admit you've never
been able to reproduce some of the others' problems on your
systems).
Please note that a platform doesn't
[EMAIL PROTECTED] (Steven M. Schweda) writes:
from Hrvoje Niksic:
[...] Unfortunately EOL conversions break
automatic downloads resumption (REST in FTP),
Could be true.
manual resumption (wget -c),
Could be true. (I never use wget -c.)
It's the consequence of EOL conversion
[EMAIL PROTECTED] (Steven M. Schweda) writes:
It does seem a bit odd that no one has noticed this fundamental
problem until now, but then I missed it, too.
Long ago I intentionally made Wget use binary mode by default and not
muck with line endings because I believed exact data transfer was
Dan Jacobson [EMAIL PROTECTED] writes:
Why must -B need -F to take effect? Why can't one do
xargs wget -B http://bla.com/ -i - <<!
zzz.html fff/gg/h.html
!
I'm not sure I understand the combination of `xargs' and `-i -'. This
seems to work for me:
$ wget -B http://www.example.com/foo/ -i-
Alain Bench [EMAIL PROTECTED] writes:
Removing separators will break existing apps parsing wget's output.
Such apps exist?
They do exist, but *any* change in Wget's output will break them.
Since they probably do the equivalent of sed 's/,//g' anyway, the
removal of separators is likely to be the
Herold Heiko [EMAIL PROTECTED] writes:
Downloaded: bytes in 2 files
Note missing number of bytes.
This would indicate that the %I64 format, which Wget uses to print
the 64-bit download sum, doesn't work for you. What does this
program print?
#include <stdio.h>
int
main (void)
{
__int64 n =
Hrvoje Niksic [EMAIL PROTECTED] writes:
This would indicate that the %I64 format, which Wget uses to print
the 64-bit download sum, doesn't work for you.
For what it's worth, MSDN documents it: http://tinyurl.com/ysrh/.
Could you be compiling Wget with an older C runtime that doesn't
support
David Fritz [EMAIL PROTECTED] writes:
I64 is a size prefix akin to ll. One still needs to specify the
argument type as in %I64d as with %lld.
That makes sense, thanks for the explanation!
Post, Mark K [EMAIL PROTECTED] writes:
I read the entire message, but I probably didn't have to. My
experience with libtool in packages that really are building
libraries has been pretty painful. Since wget doesn't build any,
getting rid of it is one less thing to kill my builds in the
John Haymaker [EMAIL PROTECTED] writes:
I am trying to download all pages in my site except secure pages that
require login.
Problem: when wget encounters a secure page requiring the user to log in,
it hangs there for up to an hour. Then miraculously, it moves on.
By secure pages do you
to cookie code;
* Removing the special logic from path_match.
With that change your test case seems to work, and so do all the other
tests I could think of.
Please let me know if it works for you, and thanks for the detailed
bug report.
2005-06-24 Hrvoje Niksic [EMAIL PROTECTED
Alain Bench [EMAIL PROTECTED] writes:
On Thursday, June 23, 2005 at 3:16:28 PM +0200, Hrvoje Niksic wrote:
Since Wget 1.10 also prints sizes in kilobytes/megabytes/etc., I am
thinking of removing the thousand separators from size display.
IMHO thousand (or myriad) separators
Alain Bench [EMAIL PROTECTED] writes:
MHO: They are ununderstandable, unusable, unclean, and big. They may
give a false bad impression of source/project misorganization. We
want to drop them, wipe any proof of their existence from any
archives and mirrors, then honestly deny they ever
Leonid [EMAIL PROTECTED] writes:
Those guys who find numbers like 11782023180 easy to read and can
tell for a fraction of a second that it was 11Gb
I'm not such a person; Wget would in fact print:
Length: 11782023180 (11.0G)
Tony Lewis [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
In fact, I know of no application that accepts numbers as Wget prints
them.
Microsoft Calculator does.
Sorry, I forgot to qualify that as (Unix) command-line application
or something to that effect. I know that many GUI
Oliver Schulze L. [EMAIL PROTECTED] writes:
I think that having a link to an email address is not that useful,
because people can just write to that email address because it's a
mailing list.
Good point. An even better link might be to the gmane archive, where
you can read the list, but