I sincerely apologize if you are receiving this e-mail in error.
I have been taking an herbal formula for 10 months and it has saved my
family and my life. Because of this I have made a commitment to get the
word out on this life-changing product, so that it will be available to
everyone. My
I believe I've fixed the most important problems with Wget 1.7 and am
ready to release 1.7.1 on the weekend. Specifically:
* Libtool has been updated to 1.4. This should make Wget build on
platforms where the old libtool failed to produce working
executables.
* The check for OpenSSL now us
[ Note that Karl might not have seen your list-only response. ]
Daniel Stenberg <[EMAIL PROTECTED]> writes:
> On Sat, 9 Jun 2001 [EMAIL PROTECTED] wrote:
>
>> About doing the random number seed for ssl, I dug up what lynx does, and
>> it looks like it wouldn't be difficult to do something simil
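For context, the Lynx approach mentioned here amounts to feeding OpenSSL's
pseudo-random number generator some locally available entropy before the SSL
handshake. A minimal sketch, assuming OpenSSL's RAND_status()/RAND_seed()
interface; the entropy sources below are illustrative only, not what Lynx or
Wget actually use:

/* Illustrative sketch only: seed OpenSSL's PRNG from a few cheap local
   sources when no /dev/random-style device has already seeded it.  */
#include <time.h>
#include <unistd.h>
#include <openssl/rand.h>

static void
seed_ssl_prng (void)
{
  time_t now;
  pid_t pid;

  if (RAND_status ())           /* already sufficiently seeded */
    return;

  now = time (NULL);
  pid = getpid ();
  RAND_seed (&now, sizeof now);
  RAND_seed (&pid, sizeof pid);
  /* A real implementation would mix in better entropy (high-resolution
     timers, a user-supplied random file, etc.) before giving up.  */
}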
Arkadiusz Miskiewicz <[EMAIL PROTECTED]> writes:
> please try:
> wget --mirror http://www.ire.pw.edu.pl/zejim/rois/
Thanks for the report. I believe this patch should fix the problem.
2001-06-14 Hrvoje Niksic <[EMAIL PROTECTED]>
* recur.c (recursive_retrieve): Also check undesirable
Jan Prikryl <[EMAIL PROTECTED]> writes:
>> Jan Prikryl <[EMAIL PROTECTED]> writes:
>>
>> > It seems that -lsocket is not found as it requires -lnsl for
>> > linking. -lnsl is not detected as it does not contain
>> > `gethostbyname()' function.
>>
>> That's weird. What does libnsl contain if no
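For what it's worth, configure-style library checks normally settle this by
trying to link a tiny C program against the candidate library, roughly like
the one below. This is a generic illustration of the mechanism, not Wget's
actual configure test:

/* configure compiles something like this and runs "cc conftest.c -lnsl";
   if the link succeeds, gethostbyname() is assumed to live in libnsl.
   The bogus prototype is deliberate -- only the link step matters.  */
char gethostbyname ();

int
main (void)
{
  gethostbyname ();
  return 0;
}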
At 09:55 13.06.01 +0200, you wrote:
> > 421 Too many users logged for this account. Try again later.
> That's it. You are logged in more times than you are allowed to.
> Seems clear to me.
No, my point was that wget should retry at this point.
> > Is this a known issue? Perhaps it's more
Christian Trefzer <[EMAIL PROTECTED]> writes:
> I am using wget 1.6 and have changed src/html.c to search more html
> tags for resources wget should download. Thus html_allow[] looks a
> bit different (see attached html.c) - the newly added lines have a
> comment attached.
Please note that Wg
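For readers who have not looked at that file, the table being discussed maps
HTML tags to the attributes whose values should be treated as links to
download. A rough sketch of the idea; the struct layout and entries here are
illustrative and not necessarily identical to Wget 1.6's html.c:

/* Illustrative only; the real html_allow[] table in Wget 1.6 may differ.  */
struct tag_attr
{
  const char *tag;   /* HTML tag to scan */
  const char *attr;  /* attribute whose value is a URL */
};

static struct tag_attr html_allow[] = {
  { "a",      "href" },
  { "img",    "src"  },
  { "link",   "href" },
  { "script", "src"  },  /* example of a newly added entry */
  { NULL,     NULL   }
};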
At 22:34 14.06.01 +0200, you wrote:
>Jan Prikryl <[EMAIL PROTECTED]> writes:
>
> >> 421 Too many users logged for this account. Try again later.
> >
> > That's it. You are logged in more times than you are allowed to.
> > Seems clear to me.
>
>I think his point is that Wget should retry. And
"Marty Leisner" <[EMAIL PROTECTED]> writes:
> It seems the man page is generated in the build directory...
>
> But it tries to install the man page out of the source directory...
Thanks for the patch; a similar fix is already in the CVS and will be
part of the next release.
Richard Travett <[EMAIL PROTECTED]> writes:
> I'm going to try this again since last time I got only one response
> which unfortunately, although helpful, didn't solve the problem. :-(
>
> I won't include all the logs again (Maybe the length put people off
> reading it!) but I'll just ask the qu
"Ehud Karni" <[EMAIL PROTECTED]> writes:
> On 04 Jun 2001 21:47:05 +0200, Hrvoje Niksic <[EMAIL PROTECTED]> wrote:
>>
>> GNU Wget 1.7 has been released. It is available from
>> ftp://ftp.gnu.org/pub/gnu/wget/wget-1.7.tar.gz and mirrors of that
>> site (see list of mirror sites at http://www.gnu
[EMAIL PROTECTED] writes:
> I find that wget is taking all my memory.
But you neglected to tell us what you were doing with Wget.
I'm afraid I cannot explain 92M of memory consumption for "regular"
usage, but I can think of some degenerate cases where this might happen,
with no way to prevent it.
> Pl
Kevin Brand <[EMAIL PROTECTED]> writes:
> Which release will contain the fix for wget that will allow an ampersand
> in the URL as part of an RVAL (in escaped form) like this:
>
> ...runit.cgi?var1=rval1&var2=sometext%26moretext
Hopefully release 1.8.
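As background for anyone following the thread: the issue revolves around how
'&' and its '%26' escape are treated inside query strings. A generic
percent-decoding routine looks roughly like this; it is a hypothetical helper
for illustration, not Wget's actual code:

/* Hypothetical helper, not Wget's code: decode %XX escapes in place,
   so "%26" in a query value becomes a literal '&'.  */
#include <ctype.h>
#include <stdlib.h>

static void
url_unescape (char *s)
{
  char *to = s;
  for (; *s; s++, to++)
    {
      if (*s == '%'
          && isxdigit ((unsigned char) s[1])
          && isxdigit ((unsigned char) s[2]))
        {
          char hex[3];
          hex[0] = s[1];
          hex[1] = s[2];
          hex[2] = '\0';
          *to = (char) strtol (hex, NULL, 16);
          s += 2;
        }
      else
        *to = *s;
    }
  *to = '\0';
}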
Jan Prikryl <[EMAIL PROTECTED]> writes:
>> 421 Too many users logged for this account. Try again later.
>
> That's it. You are logged in more times than you are allowed to.
> Seems clear to me.
I think his point is that Wget should retry. And there he is right --
Wget's FTP code was meant
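To make the point concrete: RFC 959 defines 4yz replies as transient negative
completions and 5yz as permanent failures, so a 421 is exactly the kind of
response that warrants waiting and retrying. A hypothetical sketch of such a
check, not Wget's actual FTP code:

/* Hypothetical illustration, not Wget's code: classify an FTP reply as
   transient (worth retrying later) or permanent.  */
static int
ftp_reply_is_transient (int reply_code)
{
  /* 421 "Too many users" falls in the 4yz transient range.  */
  return reply_code >= 400 && reply_code < 500;
}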
Jochen Hein <[EMAIL PROTECTED]> writes:
> I suggest the following patch:
>
> diff -u -r wget-1.7.orig/src/main.c wget-1.7/src/main.c
> --- wget-1.7.orig/src/main.c Sun May 27 21:35:05 2001
> +++ wget-1.7/src/main.c Sat Jun 9 17:58:55 2001
> @@ -470,7 +470,8 @@
> case 'V':
>
Andre Majorel <[EMAIL PROTECTED]> writes:
> Quick and dirty fix: insert the following in utils.c before the
> reference to MAP_FAILED:
>
> #ifndef MAP_FAILED
> # define MAP_FAILED -1
> #endif
Not dirty at all: this fix will be in the next release, except it will
be in sysdep.h, and -1 will b
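Presumably the fallback will end up looking something like the following in
sysdep.h; this is a sketch assuming the customary cast of -1 to mmap()'s
return type, and the exact form in the release may differ:

/* Provide MAP_FAILED on systems whose <sys/mman.h> predates the macro;
   the cast matches mmap()'s (void *) return type.  */
#ifndef MAP_FAILED
# define MAP_FAILED ((void *) -1)
#endif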
"Parsons, Donald" <[EMAIL PROTECTED]> writes:
[...]
Thanks for the report; this will be fixed in the next release.
Until then, you can simply #define MAP_FAILED to -1.
It seems the man page is generated in the build directory...
But it tries to install the man page out of the source directory...
In doc/Makefile:
install.man: $(MAN)
	$(top_srcdir)/mkinstalldirs $(DESTDIR)$(mandir)/man$(manext)
	$(INSTALL_DATA) $(srcdir)/$(MAN) $(DESTDIR)$(mand
Hi All,
I'm going to try this again since last time I got only one response
which unfortunately, although helpful, didn't solve the problem. :-(
I won't include all the logs again (Maybe the length put people off
reading it!) but I'll just ask the question:
If I use wget to ftp a file from a re
On 04 Jun 2001 21:47:05 +0200, Hrvoje Niksic <[EMAIL PROTECTED]> wrote:
>
> GNU Wget 1.7 has been released. It is available from
> ftp://ftp.gnu.org/pub/gnu/wget/wget-1.7.tar.gz and mirrors of that
> site (see list of mirror sites at http://www.gnu.org/order/ftp.html).
I downloaded wget-1.7, wh
No problem for me.
I'm extremely low on time these days, but I'll move that page (again,
sigh) somewhere else asap.
Suggestions for good free webservers are welcome; I'm familiar with
geocities only, but I'd rather not return there - this was the second
time they suspended that tiny single page s
Hi!
For all who cannot download the windows binaries,
they are now available through my site:
http://www.jensroesner.de/wgetgui/data/wget20010605-17b.zip
And while you are there, why not download wGetGUI v0.4?
:) http://www.jensroesner.de/wgetgui
If Heiko is reading this:
May I just keep the fil
Hi Jens and Chad,
Using this or the links from the wget page, I consistently get only 35K of
the zip file and of course this is not a valid zip file.
I've had this problem now for several days, maybe a week or more.
Don't know what other information would help diagnose the source of the
proble
On Fri, 15 Jun 2001, Jens Rösner wrote:
> Strange, it works for me with this link
> http://space.tin.it/computer/hherold/wget20010605-17b.zip
> the old binary "1.6" is not available.
> If you cannot download it (have you tried with wGet? :),
> I can mail it to you, or if more people h
Hi Chad!
Strange, it works for me with this link
http://space.tin.it/computer/hherold/wget20010605-17b.zip
the old binary "1.6" is not available.
If you cannot download it (have you tried with wGet? :),
I can mail it to you, or if more people have the problem, add it
temporarily to my site.
CU
J
Hey all,
I'm still unable to download wget binary from
http://space.tin.it/computer/hherold/ for either 1.6 or 1.7. Anyone have a
good link?
Chad
"Story, Ian" wrote:
> > I have been a very happy user of wget for a long time. However, today I
> > noticed that some sites, that don't run on port 80, don't work well with
> > wget. For instance, when I tell wget to go get http://www.yahoo.com, it
> > automatically puts :80 at the end, like th
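The usual way this class of problem is handled is to include the port in a
reconstructed URL (or Host header) only when it differs from the scheme's
default. A hypothetical sketch, not Wget's actual code:

/* Hypothetical sketch, not Wget's code: print a rebuilt http URL,
   omitting the port when it is the scheme default (80).  */
#include <stdio.h>

static void
print_http_url (const char *host, int port, const char *path)
{
  if (port == 80)
    printf ("http://%s%s\n", host, path);
  else
    printf ("http://%s:%d%s\n", host, port, path);
}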
Mike Castle wrote:
> I try to build all autoconfed packages outside of the source directory.
> (It is suggested that they allow this type of build in the GNU Coding
> Standards.)
>
> The generated man page, wget.1, ends up in the build directory, but install
> looks for it in srcdir:
Yes, this
We have been using wget with the -p option to retrieve page requisites.
We have noticed that it does not appear to work when a certain tag is
encountered in the requested page.
The tag and its href are copied verbatim, and required images etc. are
not retrieved and mapped locally.
By way of example, one