toad [EMAIL PROTECTED] writes:
** Keep-alive (persistent) HTTP connections are now supported.
Using keep-alive allows Wget to share one TCP/IP connection for
many retrievals, making multiple-file downloads faster and less
stressful for the server and the network.
Only for HTTP? Only for
Arkadiusz Miskiewicz [EMAIL PROTECTED] writes:
and please fix aclocal.m4 (it contains wget specific macros which
should be in acinclude.m4)
I'm not buying that. I think it's perfectly fine to have
Wget-specific macros in aclocal.m4.
and also acconfig.h is missing (autoheader breaks on it).
Samer Nassar [EMAIL PROTECTED] writes:
I am having trouble installing wget 1.7 on a Solaris box. Here is part of
what I am getting:
In file included from log.c:31:
/usr/gnu/lib/gcc-lib/sparc-sun-solaris2.6/2.8.0/include/stdarg.h:163: warning: redefinition of `va_list'
Tomki [EMAIL PROTECTED] writes:
Looking for SSL libraries in default
checking for RSA_new in -lcrypto... no
checking for SSL_new in -lssl... no
Looking for SSL libraries in /usr/local/ssl
checking for RSA_new in -lcrypto... no
checking for SSL_new in -lssl... no
Hmm, we should investigate
Arkadiusz Miskiewicz [EMAIL PROTECTED] writes:
Why are you trying to make things more difficult than they really
are? Now I can't use the new libtool.m4 (and other m4 macros) for wget
without patching it. aclocal.m4 should be autogenerated to make life
easier.
I've never autogenerated
Russ Allbery [EMAIL PROTECTED] writes:
Automake, on the other hand, I've never particularly liked and I think
frequently just makes everything more complicated, not simpler.
Several times I seriously considered switching to Automake, which
might work well for a smaller project like Wget, but
Jan Prikryl [EMAIL PROTECTED] writes:
It seems that -lsocket is not found as it requires -lnsl for
linking. -lnsl is not detected as it does not contain
`gethostbyname()' function.
That's weird. What does libnsl contain if not gethostbyname()?
-AC_CHECK_FUNCS(uname gethostname)
Daniel Stenberg [EMAIL PROTECTED] writes:
(Not related to this, but I thought I could throw this in: One of
the blue-sky dreams I have for a rainy day, is converting wget to
use libcurl as transport layer for FTP(S)/HTTP(S)...)
Such a thing is not entirely out of the question. I'm not
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On 7 Jun 2001, Hrvoje Niksic wrote:
We build with rpath only if the SSL libraries are at a non-standard
location, i.e. one not recognized by the system (we first try to build
without rpath). In that case, building with rpath or its moral
OBone [EMAIL PROTECTED] writes:
I used wget one time, and now every time I connect to the internet,
two pop-up (spam) windows are opened automatically.
I have no idea what popup windows and spam you're referring to,
but I doubt it has anything to do with Wget.
Now I want an answer about that from you
Parsons, Donald [EMAIL PROTECTED] writes:
[...]
Thanks for the report; this will be fixed in the next release.
Until then, you can simply #define MAP_FAILED to -1.
Jochen Hein [EMAIL PROTECTED] writes:
I suggest the following patch:
diff -u -r wget-1.7.orig/src/main.c wget-1.7/src/main.c
--- wget-1.7.orig/src/main.c Sun May 27 21:35:05 2001
+++ wget-1.7/src/main.c Sat Jun 9 17:58:55 2001
@@ -470,7 +470,8 @@
case 'V':
printf
[EMAIL PROTECTED] writes:
I find that wget is taking all my memory.
But you neglected to tell us what you were doing with Wget.
I'm afraid I cannot explain 92M of memory consumption for regular usage,
but I can think of some degenerate cases where this might happen with
no way to prevent it.
Ehud Karni [EMAIL PROTECTED] writes:
On 04 Jun 2001 21:47:05 +0200, Hrvoje Niksic [EMAIL PROTECTED] wrote:
GNU Wget 1.7 has been released. It is available from
ftp://ftp.gnu.org/pub/gnu/wget/wget-1.7.tar.gz and mirrors of that
site (see list of mirror sites at http://www.gnu.org/order
Richard Travett [EMAIL PROTECTED] writes:
I'm going to try this again since last time I got only one response
which unfortunately, although helpful, didn't solve the problem. :-(
I won't include all the logs again (Maybe the length put people off
reading it!) but I'll just ask the
Marty Leisner [EMAIL PROTECTED] writes:
It seems the man page is generated in the build directory...
But it tries to install the man page out of the source directory...
Thanks for the patch; a similar fix is already in the CVS and will be
part of the next release.
Jan Prikryl [EMAIL PROTECTED] writes:
It seems that -lsocket is not found as it requires -lnsl for
linking. -lnsl is not detected as it does not contain
`gethostbyname()' function.
That's weird. What does libnsl contain if not gethostbyname()?
Arkadiusz Miskiewicz [EMAIL PROTECTED] writes:
please try:
wget --mirror http://www.ire.pw.edu.pl/zejim/rois/
Thanks for the report. I believe this patch should fix the problem.
2001-06-14 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (recursive_retrieve): Also check undesirable_urls
I believe I've fixed the most important problems with Wget 1.7 and am
ready to release 1.7.1 on the weekend. Specifically:
* Libtool has been updated to 1.4. This should make Wget build on
platforms where the old libtool failed to produce working
executables.
* The check for OpenSSL now
[EMAIL PROTECTED] writes:
1 - Use RAND_egd() for reading true random data if such is available (this
needs to be checked for in the configure script, as RAND_egd() wasn't
introduced until OpenSSL 0.9.5). This would also benefit from a command
line option to
B. Watson [EMAIL PROTECTED] writes:
partial output from ./configure --with-ssl=/usr/local/ssl:
[...]
Thanks for the report; this will be fixed in the next release.
B. Watson [EMAIL PROTECTED] writes:
my Solaris 7 machine is using egcs-2.95.3, same as one of the Linux
boxes... why is $host_os used to determine which flag to pass,
since the flag is the same on gcc, regardless of which OS it's
running on?
This is not true. -R definitely doesn't work on
Andre Majorel [EMAIL PROTECTED] writes:
Executive summary: complete success.
[...]
Thanks a lot for testing.
Eric Bock [EMAIL PROTECTED] writes:
Here's an odd site (with a really dumb link at the top...)
http://heragency.com
wget crashed with a 182M core when I fed it that site with -r -l0
set :/
Which version of Wget is it? Wget 1.7 seems to cope with it, at least
for me.
Also wget seems to die
Kathryn C/Maureen O [EMAIL PROTECTED] writes:
I recently downloaded and compiled wget 1.6. I've successfully
retrieved documents using the http protocol, but I cannot ftp a file in
ascii mode.
I read the documentation and found that Wget also supports the
'type' feature for FTP
Daniel Stenberg [EMAIL PROTECTED] writes:
On 18 Jun 2001, Hrvoje Niksic wrote:
There are good reasons to use alloca instead of malloc
I know this is taken out of context, but I'm curious about this. What
are the good technical reasons to use alloca instead of malloc?
* Reduced memory
[EMAIL PROTECTED] writes:
[about alloca vs malloc]
If you allocate with malloc and then accidentally overwrite it, you get a
corrupted heap.
If you allocate with alloca and then accidentally overwrite it, you get a
corrupted stack.
Guess which is easier to notice :-)
Seriously, which?
Eric Bock [EMAIL PROTECTED] writes:
This is why many people allocate all the memory they will need at
the beginning, and either never allocate memory again, or
reallocate it (sensibly) as needed.
Thanks, but no thanks. Such programs are fraught with arbitrary
limits just so they can
Bill Bumgarner [EMAIL PROTECTED] writes:
In html-parse, the following case near line 449 is missing the \ in
'\' -- it comes out as '' and, as such, causes some compilers to
barf.
Could you name an actual compiler that barfs on it? The constant ''
is perfectly legal in C, and I'm positive
Bazuka [EMAIL PROTECTED] writes:
If I am running Wget overnight to crawl some sites (say about 50,000
URLs) and it crashes/hangs up for some reason after retrieving half
of them, is it possible to restart it from the point where it
crashed (instead of downloading everything again)?
The
Anees Shaikh [EMAIL PROTECTED] writes:
I'm trying to use the code in html-parse.c (v1.7) in standalone mode
Excellent!
For some reason, <img src=...> tags are recognized but then skipped
almost every time they are encountered. When using the full program
and recursive retrieve, the images
Anees Shaikh [EMAIL PROTECTED] writes:
So I think the problem is with malformed <img> tags. The parser fails
if the tag is of this form:
<img src="/library/homepage/images/curve.gif" alt="" border="0" />
[...]
This problem with <img> tags seems to be quite common (redhat.com,
ibm.com,
Anees Shaikh [EMAIL PROTECTED] writes:
Hrvoje, you mentioned that you planned to modify the parser to
handle these tags. Any ideas on timetable?
How about now? :-)
I have created a simple patch that deals with this. However,
preliminary testing indicated a problem with the semantics.
First, my apologies for the long delay in answering.
The idea behind this patch, and the patch itself, are very
interesting. I'll look into it for Wget 1.8 (1.7.1 should be a
bugfix-only release.)
Several random musings:
* It would be nice to have an option to use only one filter, so that
My sincerest apologies for long time of inactivity.
I am in the process of going through the patch queue in order to apply the
most urgent bugfixes to 1.7 so that we can roll out 1.7.1 and then
concentrate on 1.8, which will contain some of the new features
submitted to the patch list.
I promise to
[EMAIL PROTECTED] writes:
This is a list of URLs which wget 1.7 won't download saying:
404 Not found. But Mozilla/IE/other download just fine...
Do they really? I typed `http://www.123go.3d.pl/files/3.zip' into
Mozilla, and it also replied with Not Found.
Perhaps the download works when
After several months of inactivity, prerelease 2 of Wget 1.7.1 is
available for testing.
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.7.1-pre2.tar.gz
(The `.betas' directory is intentionally unreadable, but the file is
there.)
The intention behind the 1.7.1
Herold Heiko [EMAIL PROTECTED] writes:
OTOH, consider www.goofball.com, create a login,
[...]
wget --load-cookies=cookies2.txt --save-cookies=cookies2s.txt -e
robots=off -nc -e timestamping=off -v -a html.log -r -l0 -Ahtm,html
-X/searchbin http://www.goofball.com
wget does seem to go into
Ferenc VERES [EMAIL PROTECTED] writes:
My download stopped, then the next day I continued with -c, from 34%.
At the end I saw: 134% downloaded, at the end of the lines ;-) (it was
an FTP:// transfer, a 680MB iso image)
GNU Wget 1.5.3
Try it with a later version. I believe that this bug has been
I have just released Wget 1.7.1. It should show up on ftp.gnu.org as
soon as the ftp-upload people process my request. Until then, you can
get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/wget-1.7.1.tar.gz
I have tagged the repository with WGET_1_7_1 tag for future reference
and
Philip Chong [EMAIL PROTECTED] writes:
Currently with wget-1.7, the value of waitretry is ignored from both
the user's ~/.wgetrc and SYSTEM_WGETRC. The compatibility hack in
line 697 of src/main.c overwrites any waitretry value specified in
~/.wgetrc with the wait value, and the only way to
Herold Heiko [EMAIL PROTECTED] writes:
The latest changes require the attached patch to windows/* for the
md5 and res files (borland/watcom stuff not tested).
Thanks for the update. I've applied your patch to CVS.
Note: it's always a good idea to Cc a patch to the patches list.
Bernhard Simon [EMAIL PROTECTED] writes:
Thanks for the patch; I believe this problem was fixed in 1.7.1,
released several hours ago. Could you please check if that version
works out of the box?
Yes, it works out of the box now. Thanks!
Always nice to hear. Thanks for testing it.
Herold Heiko [EMAIL PROTECTED] writes:
Personally I'd be happy either way, but you'll never be able to make
everybody happy. Choose what you prefer
I'd love to choose what I prefer, but I'd like to avoid my wild
preferences ruining it for everyone else. :-) Thanks for the
support, though.
Andre Pang [EMAIL PROTECTED] writes:
On Mon, Nov 19, 2001 at 05:07:31PM +0100, Hrvoje Niksic wrote:
Or, to pick another example, say you want to download the second
kilobyte of a file:
--range=1025..2048
--range=1024..2047
I haven't been following that closely, but how are you going
Daniel Stenberg [EMAIL PROTECTED] writes:
Then again, both versions could be supported if they just use
different syntaxes.
Please note that there is a third version which Andre elided. We're
deciding on one or more of:
--range=1025..2048
--range=1024..2047
--range=1024..2048 # my
Tomas Hjelmberg [EMAIL PROTECTED] writes:
Does anyone know when the <meta name="robots"> tag will work? 1.8?
Doesn't it work already? It's supposed to work as of 1.7.
Daniel Stenberg [EMAIL PROTECTED] writes:
This subject says it all. The leak is minor, the fix could be made
something like this:
What memory leak are you referring to in the subject?
Your patch replaces an assert() with a return NULL. The only way that
assert() could be tripped is by
[EMAIL PROTECTED] writes:
It's not just a memory leak. Length = 0 is declared as a "can't
happen". If length is zero, wget will suddenly end due to the
assert. If a bad input file can lead to length being zero, then
using assert is bad on principle. One should never assert external
input.
Daniel Stenberg [EMAIL PROTECTED] writes:
You're right of course. Hm. No, it can probably only happen if the first byte
in an existing file is a zero-byte, so that strlen() returns 0...
Yup. Or the first byte of any line. I.e. Wget will die if it
encounters the \n\0 sequence of characters,
Hrvoje Niksic [EMAIL PROTECTED]
* url.c (parseurl): Don't depend on the now-obsolete TYPE.
Index: src/url.c
===
RCS file: /pack/anoncvs/wget/src/url.c,v
retrieving revision 1.51
diff -u -r1.51 url.c
--- src/url.c 2001/11/19
Ian Abbott [EMAIL PROTECTED] writes:
Try it with a later version. I believe that this bug has been
fixed in most of its variations.
That stuff only works for recursive transfers, not when resuming
transfer of a single file.
You're right.
The patch in msg01481 (my modified version)
Wojtek Kotwica [EMAIL PROTECTED] writes:
Translating the wget.texi manual into Polish I found the following bugs:
[...]
Thanks for the corrections; I have fixed them in the current sources:
2001-11-22 Hrvoje Niksic [EMAIL PROTECTED]
* wget.texi (Proxies): Fix typo.
(Proxies): Sync
Herold Heiko [EMAIL PROTECTED] writes:
This updates the generation of hlp/info/html when compiling on
windows platform.
Thanks for the patch; I've now applied it to CVS.
Also, could you include a ChangeLog entry with your patches? I've
used this for ChangeLog:
2001-11-22 Hrvoje Niksic
Tomas Hjelmberg [EMAIL PROTECTED] writes:
Sorry, but can't anybody at least say that I'm wrong when I state that the
<meta name="robots"> tag doesn't work?
Has anyone got it to work under any circumstances?
These are two different questions.
The answer to the first one is irrelevant, because
Tomas Hjelmberg [EMAIL PROTECTED] writes:
I want to exclude /var/www/html/tomas.html from being indexed.
It looks like:
[...]
<meta name="robots" content="noindex">
titleTomas/title
[...]
I invoke wget with:
wget -r http://localhost
And tomas.html is unfortunately downloaded
like it.
2001-11-23 Hrvoje Niksic [EMAIL PROTECTED]
* utils.c (determine_screen_width): New function.
* main.c (main): New option `--progress=TYPE'.
(main): Implement compatibility with the old option `--dot-style'.
* init.c: Removed cmd_spec_dotstyle
Herold Heiko [EMAIL PROTECTED] writes:
Currently perl is used to build the pod and thus the man page,
But it's still autodetected, right? And if it's not found, everything
works, except you don't get the man page, but that's ok since the man
page is not really supported anyway.
wanted, I
Thanks for the patch; I've now applied it to the CVS sources.
are host:port if the URL is on a
non-standard port.
2001-11-25 Hrvoje Niksic [EMAIL PROTECTED]
* url.c (reencode_string): Use unsigned char, not char --
otherwise the hex digits come out wrong for 8-bit chars such as
nbsp.
(lowercase_str): New function
Jörg Mattik [EMAIL PROTECTED] writes:
I have found a problem with the new wget 1.7. After the download
with the command:
wget -k -m -np -b -o xmm1.log http://vegemite.chem.nottingham.ac.uk/~xmakemol/
I found the file download.html modified by wget. The critical line 52 in
this file
oded [EMAIL PROTECTED] writes:
I know this topic has been discussed on this list on several
occasions, but I would like to raise the issue once more, if you
please : download throttling (rate limiting).
It's on my personal todo list for Wget 1.8 (to be released soonish).
I'll look into your
FWIW, I've implemented a --limit-rate option. I had no idea correct
rate limiting was that tricky.
For this to work, you will need either a working usleep or select. If
your system doesn't have either (Windows?), you will need to provide a
usleep-like call.
Herold Heiko [EMAIL PROTECTED] writes:
In the current cvs sources HAVE_RANDOM is mentioned in Changelog, used
in a not exactly obvious way in main.c at line 577, but nowhere else.
Is this a work in progress or a leftover of previous code?
Backup code for systems without 'random()' (hint
Herold Heiko [EMAIL PROTECTED] writes:
What about another check: use random() if available, otherwise rand()
if available, otherwise your bare-bones generator? On the other hand,
for what it is currently used there probably isn't any need for that
complication.
Exactly. If there is a
Reza225 [EMAIL PROTECTED] writes:
Hi, Wget does not follow links that are redirected to other pages via
cgi-bin.
It should follow them, as long as they're on the same host. If
they're not on the same host, you should use -H to enable Wget to span
hosts while mirroring.
Herold Heiko [EMAIL PROTECTED] writes:
There seems (looking at the docs) to be no way to say "use bar; in case of
fallback to dot, use style X" - it will fall back to 1K dots always. It
should be possible to specify something like bar:micro, bar:mega or
bar:binary, too: use bar, in case of fallback
Ian Abbott [EMAIL PROTECTED] writes:
You may wish to try out the new --range option in wget 1.8-dev
(available via anonymous CVS),
Note that --range is not yet in the CVS.
Ian Abbott [EMAIL PROTECTED] writes:
I got a segmentation fault when retrieving URLs from a file.
2001-11-27 Ian Abbott [EMAIL PROTECTED]
* retr.c (retrieve_from_file): Initialize `new_file' to NULL to
prevent seg fault.
Good catch. I've applied this, thanks!
Herold Heiko [EMAIL PROTECTED] writes:
[... use of tail -f and log file ...]
Somebody or something always has to be the first :-)
Also, do you know that you can use `--progress=bar:force' and still
log to a file? tail -f will work just fine, although the resulting
file will not be pretty when
Attila Horvath [EMAIL PROTECTED] writes:
I am using version 1.8 (dev) of `wget'
Thanks for testing it.
and am trying to fetch web pages recursively where page names are
duplicated. I don't want duplicate pages to be clobbered
(overwritten) so I'm specifying:
wget -r -l 3 -nc
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 28 Nov 2001 [EMAIL PROTECTED] wrote:
SunOS 4 is known to not have memmove.
Isn't configure supposed to notice that ?
Yes it is (*supposed* to do that)! But then you have to tell it to check for
that and the current configure doesn't.
Herold Heiko [EMAIL PROTECTED] writes:
There seems (looking at the docs) to be no way to say "use bar; in case of
fallback to dot, use style X" - it will fall back to 1K dots always.
This should work now. Here is how it behaves, assuming no
additional wgetrc customization:
$ wget URL
...
Ian Abbott [EMAIL PROTECTED] writes:
On 27 Nov 2001, at 15:16, Hrvoje Niksic wrote:
So, does anyone know about the portability of rand()?
It's in the ANSI/ISO C spec (ISO 9899). It's always been in UNIX
(or at least it's been in there since UNIX 7th Edition), and I
should think it's
Ian Abbott [EMAIL PROTECTED] writes:
Thanks for the suggestion and the code example. Two points, though:
* Isn't it weird that the undefined symbol is _memmove, not memmove?
It looks as if a header file is translating the symbol, thinking
that _memmove exists.
Not really. UNIX C
Daniel Stenberg [EMAIL PROTECTED] writes:
On Thu, 29 Nov 2001, Maciej W. Rozycki wrote:
On Wed, 28 Nov 2001, Ian Abbott wrote:
However, the Linux man page for bcopy(3) does not say the strings can
overlap
Presumably the man page is incorrect
Well, can we actually guarantee that bcopy()
David [EMAIL PROTECTED] writes:
I have a problem on using wget, as follows:
What version of Wget are you using?
I want to download a bunch of files in, say,
www.server.com/dir/files, and I found out that wget is contacting
www.server.com:80, and the files it gets are not what I'm looking
Jerome Lapous [EMAIL PROTECTED] writes:
One option that could be interesting is to print the download result
on standard output instead of to a file. It would avoid permission
problems when the same shell is used by multiple users.
Have you tried `-q -O -'?
Herold Heiko [EMAIL PROTECTED] writes:
giga is not yet documented :-)
It never was. Maybe it should use 10M dots?
Herold Heiko [EMAIL PROTECTED] writes:
Ok, fine. But, in order to avoid misunderstanding, wouldn't it be
better to have a wrapper function (msleep?) and use that where
really millisecond granularity is desired? Otherwise sooner or
later somebody could use usleep where really microsecond
David [EMAIL PROTECTED] writes:
The version I'm using is 1.7.1
That's strange, then. What would you want Wget to contact if not
www.server.com:80 when you specify www.server.com? 80 is the default
TCP port for HTTP transfers.
Christian Fraenkel [EMAIL PROTECTED] writes:
Well, I forgot the attachment *doh*
But you still forgot the ChangeLog. :-) I used this:
2001-11-30 Christian Fraenkel [EMAIL PROTECTED]
* init.c: New command `ssl_egd_sock'.
* main.c (main): New option `--sslegdsock'.
T. Bharath [EMAIL PROTECTED] writes:
There seems to be a memory leak happening when a persistent ssl
connection is enabled. The leak actually happens in gethttp() and
its related set of functions that register and check for persistent
connections. When a server promises a persistent connection we
Torgeir Hansen [EMAIL PROTECTED] writes:
Error compiling wget-1.7.1 on HP-UX10.20 with gcc version 3.0.1:
---
snprintf.c:343: `short unsigned int' is promoted to `int' when passed
through `...'
snprintf.c:343: (so you should pass `int' not `short unsigned int' to
`va_arg')
snprintf.c:357:
Ian Abbott [EMAIL PROTECTED] writes:
A memmove() replacement has already been added to Wget in CVS, but
here is a patch to use bcopy() if available.
I was thinking about this for some time. The reasons I chose to roll
in the memmove() from GNU libit rather than use bcopy are the
following:
[ Sorry for answering so late; I'm going through a backlog. ]
Mark Holt [EMAIL PROTECTED] writes:
I am writing a site copier program for a free web hosting company.
In testing the -k option, I find that it only converts absolute
links to relative links if it has already downloaded the page
Here is a beta version of what is to be 1.8. In this case, beta
does not mean instability, just that I'd like a wider audience to take
a look at the thing before we can call it a prerelease. In either
case, I'm aiming for a release soon, possibly within the week.
Get it from:
Jamie Zawinski [EMAIL PROTECTED] writes:
[...]
It's downloading about every 4th subdirectory under gallery/2001/;
if you look at the index.html file there, you'll see that all links
are in identical syntax, so I don't see why it's downloading 07-13/
but skipping 07-14/.
And then,
Jamie Zawinski [EMAIL PROTECTED] writes:
The log says it all; it's treating the `#' as part of the URL
instead of stripping it.
Thanks for the report; I believe this bug has been fixed in 1.8:
{florida}[~/work/wget/src]$ ./wget http://www.dnalounge.com/backstage/log/latest.html
--04:41:48--
L. Cranswick [EMAIL PROTECTED] writes:
With the latest CVS version of wget (and the 1.7 distribution) is
there a known reason and fix why wget core dumps on the following
site?
Could you please check whether the core dump persists with the latest
CVS version, or with the recently announced
John Poltorak [EMAIL PROTECTED] writes:
Is it possible to include OBJEXT in Makefile.in to make this more
cross-platform?
I suppose so. I mean, `o' is already defined to `.@U@o', but I'm not
exactly sure what the U is supposed to stand for.
Robin B. Lake [EMAIL PROTECTED] writes:
I have the output listing from ./configure and make for
Wget-1.8-Beta under OS 10.1.1 Macintosh. I don't want to bore the
mailing list by including it. To whom should I send it off-list?
If the compilation worked, you needn't send it at all. If the
Mikko Kurki-Suonio [EMAIL PROTECTED] writes:
I.e. all HTML pages from this point downwards, PLUS all the images (etc.)
they refer to -- no matter where they are in the directory tree
I realize -k offers a partial solution, but it doesn't work for offline
viewing.
If I use -np, wget
Robin B. Lake [EMAIL PROTECTED] writes:
I am trying to get real-time stock quotes from my broker's Web site.
If I come in via an http:// request, I get 20-minute delayed data.
If I log in with my name and password via my browser, I get
real-time data. By monitoring the IP packets, it seems
Jan Nieuwenhuizen [EMAIL PROTECTED] writes:
The function log.c:logvprintf () calls vsnprintf without restarting
the stdargs,
True. I still haven't figured out how to solve this elegantly. One
problem with your patch is that it apparently uses a GCC extension:
-void debug_logprintf PARAMS
Here is the next 1.8 beta. Please test it if you can -- try compiling
it on your grandma's Ultrix box, run it on your niece's flashy web
site, see if cookies work, etc.
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta2.tar.gz
(The `.betas' directory is
Hiroshi Takekawa [EMAIL PROTECTED] writes:
Please pay attention to po files before the release of wget 1.8.
I'm not sure how that can be done. Strings do change at the last
moment, and it's impossible to wait for all the translators.
I would appreciate it if you (or the one in charge) would
Jochen Roderburg [EMAIL PROTECTED] writes:
wget.18 ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta2.tar.gz
--08:47:50--
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta2.tar.gz
=> `wget-1.8-beta2.tar.gz/.listing'
Hrvoje Niksic [EMAIL PROTECTED]
* utils.c (file_merge): If BASE doesn't contain a slash, just
return a copy of FILE.
Index: src/utils.c
===
RCS file: /pack/anoncvs/wget/src/utils.c,v
retrieving revision 1.30
diff -u
Ian Abbott [EMAIL PROTECTED] writes:
On 1 Dec 2001 at 4:04, Hrvoje Niksic wrote:
As a TODO entry summed up:
* -p should probably go _two_ more hops on FRAMESET pages.
More generally, I think it probably needs to be made to work for
nested framesets too.
Maybe. You can make it work