On Thu, 30 May 2002 03:43:06 +0200, Hrvoje Niksic
[EMAIL PROTECTED] wrote:
Ian Abbott [EMAIL PROTECTED] writes:
This is a bit late,
Sorry it didn't make it in. I guess we could publish it on the web
site, so that people who wish to compile 1.8.2 with Borland C++ can do
so. Heiko's Wget
On Wed, 29 May 2002 05:14:14 +0200, Hrvoje Niksic [EMAIL PROTECTED] wrote:
Wget 1.8.2, a bugfix release of Wget, has been released, and is now
available from the GNU ftp site:
ftp://ftp.gnu.org/pub/gnu/wget/wget-1.8.2.tar.gz
This is a bit late, but here is a patch to compile it with
On Sat, 25 May 2002 19:03:45 +0200, Hrvoje Niksic
[EMAIL PROTECTED] wrote:
Ian Abbott [EMAIL PROTECTED] writes:
The 1.8.2 branch is pretty similar to 1.8.1 at the moment and
doesn't compile with any version of Borland C++.
Should we care to fix that before the release? I'm not sure how
On Fri, 24 May 2002 20:34:38 +0400, Valery Kondakoff [EMAIL PROTECTED]
wrote:
I'm not sure I understand what exactly '2>&1' means. As far as I
understand '>' is a redirection sign. So - '1' means stdout and '2'
means stderr?
They refer to the three standard file descriptors - 0 is standard input
On Mon, 27 May 2002 16:22:57 +0200, Hrvoje Niksic
[EMAIL PROTECTED] wrote:
Jacques Beigbeder [EMAIL PROTECTED] writes:
I ran into a trouble with:
wget -m http://some/site
because of a line like:
<img src="a.gif" v:shapes=...>
v:shapes contains a character ':', so a.gif isn't
On Fri, 24 May 2002 15:41:01 +0400, Valery Kondakoff [EMAIL PROTECTED]
wrote:
Hello, Herold!
On 24 May 2002, you wrote to me:
HH You could do something like tail -f on the logfile if you have a similar
HH program installed, or log to output and | tee logfile, but all of those
HH require
On Fri, 24 May 2002 08:03:15 -0700 (PDT), Doug Kaufman
[EMAIL PROTECTED] wrote:
On Fri, 24 May 2002, Valery Kondakoff wrote:
I downloaded two win32 'tee' ports, and they work as expected when
I enter something like this on the command line: 'wget.exe -V |
tee.exe wget.log', but after I
On Wed, 22 May 2002 18:04:34 +0200, Herold Heiko
[EMAIL PROTECTED] wrote:
Latest cvs should compile correctly with borland compilers.
The latest CVS (main branch) should compile correctly with Borland C++
5.52 (which is a free download from Borland's site), but will not
compile with earlier
On Tue, 21 May 2002 19:24:01 +0200, Hrvoje Niksic
[EMAIL PROTECTED] wrote:
[Windows '?' problem]
Ian, feel free to apply the necessary change to the 1.8.2 branch.
Okay, I'll do it after work today. I've been a little busy the last few
days!
On Tue, 21 May 2002 06:04:59 +0200, Hrvoje Niksic
[EMAIL PROTECTED] wrote:
As promised, here comes the first (and hopefully only) pre-test for
the 1.8.2 bugfix release. Get it from:
http://fly.srk.fer.hr/~hniksic/wget-1.8.2-pre2.tar.gz
Windows versions will still have problems saving
On Fri, 17 May 2002 11:24:25 +0100, Ian Abbott [EMAIL PROTECTED]
wrote:
On Fri, 17 May 2002 08:34:27 +0200, Jan Klepac [EMAIL PROTECTED]
wrote:
I'd like to download all archive files wn16pcm.r[0..9][0..9] from the
directory on ftp server but
wget --passive-ftp ftp://ftp.ims.uni-stuttgart.de/pub
On Fri, 17 May 2002 12:41:21 +0200, Stephan Beyer [EMAIL PROTECTED]
wrote:
not interested in adding the Gopher feature to wget or should I still wait
some time?
I have no objections to adding gopher support, but it's up to the main
developer (Hrvoje Niksic) whether it ends up in GNU Wget. I
On Fri, 17 May 2002 16:59:07 +0400, Pavel Stepchenko [EMAIL PROTECTED]
wrote:
#!/bin/sh
wget="/usr/local/bin/wget -t0 -nr -nc -x --timeout=20 --wait=61 --waitretry=120"
$wget ftp://nonanonymous:[EMAIL PROTECTED]/file1.zip
sleep 60
$wget ftp://nonanonymous:[EMAIL PROTECTED]/file2.zip
Why WGET
On Wed, 15 May 2002 23:41:39 +0200, [EMAIL PROTECTED]
[EMAIL PROTECTED] wrote:
Hi, I generated the file you wanted (with the -d option). I also
used the --load-cookies option.
The generated file can be found at:
http://bigben.pointclark.net/~bertra_b/wget_debug
Note: I replaced the values of the
On Thu, 16 May 2002 12:22:42 +0200, Gurkan Sengun
[EMAIL PROTECTED] wrote:
what about this parameter
With no FILE, or when FILE is -, read standard input.
(read url's actually)
This is not a bug. Please use [EMAIL PROTECTED] for feature requests.
It's a nice idea, but rather than `-' it
On Wed, 15 May 2002 18:44:19 +0900, Kiyotaka Doumae [EMAIL PROTECTED]
wrote:
I found a bug in wget with HTTPS recursive get, and propose
a patch.
Thanks for the bug report and the proposed patch. The current scheme
comparison checks are getting messy, so I'll write a function to check
schemes
On 12 May 2002 02:54:52 -0500, asher [EMAIL PROTECTED] wrote:
hi, I've been trying to figure out how wget prints all over the screen
without using curses, and I'm hoping someone can help. from the code,
I'm pretty sure it's just printing to the C-stream stderr, but I can't
for the life of me
On Tue, 7 May 2002 17:18:57 +0800, Fung Chai [EMAIL PROTECTED]
wrote:
I went through the source code (src/retr.c) of wget-1.8.1 and noticed that
the ftp_proxy must be HTTP; the user cannot specify it as ftp://proxy:port.
In the direct mode (ie, use_proxy is set to false), retrieve_url() will use
On Fri, 3 May 2002 18:37:22 +0200, Emmanuel Jeandel
[EMAIL PROTECTED] wrote:
ejeandel@yoknapatawpha:~$ wget -r a:b
Segmentation fault
Patient: Doctor, it hurts when I do this
Doctor: Well don't do that then!
Seriously, this is already fixed in CVS.
On Fri, 3 May 2002 14:14:37 +0200, [EMAIL PROTECTED] wrote:
Cannot write to
`www.travelocity.com/Vacations/0,,TRAVELOCITY||Y,00.html@HPTRACK=icon_vac'
(No such file or directory).
Presumably this happens because the pipes, in particular, are illegal chars
for a filename. So my question is:
On Wed, 1 May 2002 22:12:08 +0300, robots [EMAIL PROTECTED]
wrote:
<HTML><HEAD></HEAD><BODY>
<FONT>F-Secure give you the W32.Klez.E removal tools<br>
W32.Klez.E is a dangerous virus that spread through email.<br>
<br>
For more information, please visit http://www.F-Secure.com</FONT></BODY></HTML>
Just in case there
On Mon, 29 Apr 2002 12:03:23 -0500 (CDT), you wrote:
While using wget with www.slashdot.org, the site makes use of HREFs in
the following manner: '<A HREF="//slashdot.org/image.gif">'. It appears
that when wget is following the link, it is then looking for
On 22 Apr 2002 at 21:38, Renaud Saliou wrote:
Hi,
wget -t 3 -d -r -l 3 -H --random-wait -nd --delete-after
-A.jpg,.gif,.zip,.png,.pdf http://http://www.microsoft.com
DEBUG output created by Wget 1.8.1 on linux-gnu.
zsh: segmentation fault wget -t 3 -d -r -l 3 -H --random-wait -nd
On 23 Apr 2002 at 18:19, Hrvoje Niksic wrote:
On technical grounds, it might be hard to shoehorn Wget's mode of
operation into what `tar' expects. For example, Wget might need to
revisit directories in random order. I'm not sure if a tar stream is
allowed to do that.
You can add stuff to
On 19 Apr 2002 at 10:42, Daniel Stenberg wrote:
On Fri, 19 Apr 2002, System Attendant wrote:
ScanMail for Microsoft Exchange has taken action on the message, please
refer to the contents of this message for further details.
Please.
Can the admin of this ScanMail stop polluting this
On 19 Apr 2002 at 16:30, Hrvoje Niksic wrote:
To quote from there:
[...] Only hosts within the specified domain can set a cookie for
a domain and domains must have at least two (2) or three (3)
periods in them to prevent domains of the form: .com, .edu,
and va.us. Any
On 11 Apr 2002 at 18:55, Nelson H. F. Beebe wrote:
what happens if you configure it with the option
--x-includes=/usr/local/include ?
On SGI IRIX 6.5, in a clean directory, I unbundled wget-1.8.1.tar.gz,
and did this:
% env CC=c89 ./configure --x-includes=/usr/local/include
On 12 Apr 2002 at 17:21, Thomas Lussnig wrote:
So that if one fd become -1 the loader take an new url and initate the
download.
And then scheduling would work with select(int, ...); what about this
idea?
It would certainly make handling the logging output a bit of a
challenge,
On 11 Apr 2002 at 21:00, Hrvoje Niksic wrote:
This change is fine with me. I vaguely remember that this test is
performed in two places; you might want to create a function.
Certainly. Where's the best place for it? utils.c?
. One of those
places performed a case-insensitive comparison so I made my
function do that too.
Hrvoje, you may wish to review whether checking the new extensions
in all three places (but particularly recur.c) is a good idea or
not before I commit the patch.
src/ChangeLog entry:
2002-04-12 Ian
On 12 Apr 2002 at 14:12, [EMAIL PROTECTED] wrote:
IGaming Exchange and IGaming News News Letter information
You have chosen to remove yourself from all of the IGaming Exchange
and IGaming News email list. If you have any questions or comments
about the news letters please feel free to
On 11 Apr 2002 at 19:14, Hrvoje Niksic wrote:
Nelson H. F. Beebe [EMAIL PROTECTED] writes:
c89 -I. -I. -I/opt/include -DHAVE_CONFIG_H
-DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\" -DLOCALEDIR=\"/usr/local/share/locale\" -O
-c connect.c
cc-1164 c89: ERROR File = connect.c, Line = 94
On 10 Apr 2002 at 3:09, Jens Rösner wrote:
wgetrc works fine under windows (always has)
however, .wgetrc is not possible, but
maybe . does mean in root dir under Unix?
The code does different stuff for Windows. Instead of looking for
'.wgetrc' in the user's home directory, it looks for a
On 9 Apr 2002 at 10:34, Hrvoje Niksic wrote:
Ian Abbott [EMAIL PROTECTED] writes:
On 5 Apr 2002 at 18:17, Noel Koethe wrote:
Will this be changed so the user could use -nv with /dev/null
and get only errors or warnings displayed?
So what I think you want is for any log message tagged
On 8 Apr 2002 at 11:43, Urs Thuermann wrote:
Please CC: any answers to my email address, since I'm not on this
list.
I'd like wget to get the time stamp of a file that is downloaded via
FTP and to set the mtime after writing the file to the local disk.
When using HTTP, this already
On 9 Apr 2002 at 16:52, Ian Abbott wrote:
Wget recently adopted use of another extension (SIZE) and has long
supported another extension (REST), so it could potentially adopt
other extensions if commonly used.
Correction: 'REST' is a standard FTP protocol command, not an
extension.
On 4 Apr 2002 at 17:13, Matthew Boedicker wrote:
I am trying to wget Apache log files (via ftp) and since the new file will
always contain at least the old, I want it to overwrite the file each time.
Is there any way to do this? If there isn't, may I suggest it as a new
option?
I agree a
committing it. The
patch does not include any documentation changes - these will
follow if the patch is committed.
N.B. The patch contains a form-feed. I'm not sure if that will
survive the email passage.
2002-04-05 Ian Abbott [EMAIL PROTECTED]
* wget.h (enum log_options): Set order
On 4 Apr 2002 at 5:51, Tristan Horn wrote:
Just wanted to point out that as of version 1.8.1, wget doesn't correctly
recognize '<A HREF="//foo/bar">'-style links.
tris.net/index.html: merge("http://tris.net/", "//www.arrl.org/") ->
http://tris.net//www.arrl.org/
(it should return
On 4 Apr 2002 at 13:21, Robert Mücke wrote:
So it seems to be important to correct this behaviour. I think you only need
to set up a test site (maybe with some subdirs) containing one file with
an erroneous href= tag to reproduce this (maybe only in parts
depending on your server
On 3 Apr 2002 at 14:56, Markus Werle wrote:
Jens Rösner wrote:
So, I do not know what your problem is, but it is neither wget's
nor cuj's fault, AFAICT.
:-(
I've just built Wget 1.7 on Linux and it seemed to download your
problem file okay. So I don't know what your problem is either!
On 3 Apr 2002 at 17:09, Markus Werle wrote:
Ian Abbott wrote:
On 3 Apr 2002 at 14:56, Markus Werle wrote:
I've just built Wget 1.7 on Linux and it seemed to download your
problem file okay. So I don't know what your problem is either!
Ah! The kind of problem I like most!
Did You
On 28 Mar 2002 at 18:01, Jens Rösner wrote:
I came across a crash caused by a cookie
two days ago. I disabled cookies and it worked.
I'm hoping you had debug output on when it crashed, otherwise this
is a different crash to the one I already know about. Can you
confirm this, please?
On 31 Mar 2002 at 14:23, ¶À«¾§ wrote:
may I ask a question?
does wget offer a put function? (FTP put)
No current version of wget offers this function.
I need the wget functionality, but the reverse way, like put...
can wget do it? or is there any tool that offers this?
There is a command-line tool called
On 26 Mar 2002 at 19:33, Tony Lewis wrote:
I wrote:
wget is parsing the attributes within the script tag, i.e., <script
src="url">. It does not examine the content between <script> and
</script>.
and Ian Abbott responded:
I think it does, actually, but that is mostly harmless.
You're
On 26 Mar 2002 at 7:05, Tony Lewis wrote:
Csaba Ráduly wrote:
I see that wget handles SCRIPT with tag_find_urls, i.e. it tries to
parse whatever is inside it.
Why was this implemented? JavaScript is mostly
used to construct links programmatically. wget is likely to find
bogus URLs
On 26 Mar 2002 at 19:01, Jens Rösner wrote:
I am using wget to parse a local html file which has numerous links into
the www.
Now, I only want hosts that include certain strings like
-H -Daudi,vw,online.de
It's probably worth noting that the comparisons between the -D
strings and the
On 22 Mar 2002 at 4:08, Hrvoje Niksic wrote:
The suggestion of having more than one admin is good, as long as there
are people who volunteer to do it besides me.
I'd volunteer too, but don't want to be the only person moderating
the lists for the same reasons as yourself. (I'm also completely
On 19 Mar 2002 at 22:53, Löfstrand Thomas wrote:
I use wget to get files from an FTP server.
The proxy server is Symantec's Web Security 2.0 product for Solaris,
which has an antivirus function.
I have used wget with the -d option to see what is going on, and it seems
like the proxy server returns
On 12 Mar 2002 at 3:18, sr111 wrote:
I have to modify some files in order to build
win32 port of wget using the free Borland C++Builder
compiler. Please refer to the attachment file for the
details.
I've modified Chin-yuan Kuo's patch for the current CVS. It builds
fine with the
This seems more appropriate for the main Wget list. The
wget-patches list is for patches!
--- Forwarded message follows ---
From: Tony Lewis [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Subject: Proposed new --unfollowed-links option for
--- Forwarded message follows ---
From: Tony Lewis [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Subject: Processing of JavaScript
Date sent: Fri, 8 Mar 2002 00:04:43 -0800
Some web sites include URL references within
--- Forwarded message follows ---
From: Tony Lewis [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Subject: Automatic posting to forms
Date sent: Thu, 7 Mar 2002 23:43:28 -0800
As promised in my earlier note, there is a second
On 8 Mar 2002 at 10:50, Mathias Kratzer wrote:
I admit that the lines in my original file contain a really stupid
syntax error. As an absolute beginner with the Markup Languages I have
just tried to learn from some hyperlink examples but obviously
misunderstood their formal
On 7 Mar 2002 at 17:50, Mathias Kratzer wrote:
While calling Wget 1.5.2 by
wget -F -O 69_4_522_Ref.res -i 69_4_522_Ref.mrq
on the attached file 69_4_522_Ref.mrq has worked very well, I am left
with the error message
No URLs found in 69_4_522_Ref.mrq
whenever I try the same
On 6 Mar 2002 at 12:43, Mats Palmgren wrote:
I have a cron job that downloads Mozilla every night using wget.
Last night I got:
wget: retr.c:253: calc_rate: Assertion `msecs >= 0' failed.
I think this can happen if the system time is reset backwards while
wget is downloading stuff.
On 20 Feb 2002 at 12:54, Noel Koethe wrote:
wget 1.8.1 is shipped with the files in doc/
wget.info
wget.info-1
wget.info-2
wget.info-3
wget.info-4
They are built out of wget.texi if I remove them
and makeinfo is installed.
The files are removed when running make realclean.
I think
Here is a patch for a potential feature change. I'm not sending it
to the wget-patches list yet, as I'm not sure if it should be
applied as is, or at all.
The feature change is a minor amendment to the (bogus) test for
whether an existing local copy of a file is text/html when the or
not when
[The message I'm replying to was sent to [EMAIL PROTECTED]. I'm
continuing the thread on [EMAIL PROTECTED] as there is no bug and
I'm turning it into a discussion about features.]
On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote:
I've tried -w 30
--waitretry=30
--wait=30
On 14 Feb 2002 at 16:02, Steven Enderle wrote:
Sorry for not including any version information.
This is version 1.8.1, which I am using.
Sorry for not reading your bug report properly. I should have
realised that this was a different bug to the hundreds (it seems!)
of other reports about
On 14 Feb 2002 at 10:41, Steven Enderle wrote:
assertion `percentage <= 100' failed: file progress.c, line 552
zsh: abort (core dumped) wget -m -c --tries=0
ftp://ftp.scene.org/pub/music/artists/nutcase/mp3/timeofourlives.mp3
hope this helps in any way.
Thanks for the report. That's a
On 12 Feb 2002 at 12:30, Holger Pfaff wrote:
I'm having trouble using wget 1.8.[01] over a (squid24-) proxy
to mirror a ftp-directory:
# setenv ftp_proxy http://139.21.68.25:
# wget181 -r -np -l0 ftp://ftp.funet.fi/pub/Linux/mirrors/redhat/redhat/linux/updates
--12:06:58--
On 12 Feb 2002 at 7:54, Winston Smith wrote:
# wget181 -r -np -l0
ftp://ftp.funet.fi/pub/Linux/mirrors/redhat/redhat/linux/updates
ummm... looks like the -l0 might be limiting your recursion level to 0
levels
No. '-l0' is the same as '-l inf'.
On 8 Feb 2002 at 4:26, Fred Holmes wrote:
At 02:54 AM 2/8/2002, Hrvoje Niksic wrote:
Wget currently uses KB as abbreviation for kilobyte. In a Debian
bug report someone suggested that kB should be used because it is
more correct. The reporter however failed to cite the reference for
this,
On 4 Feb 2002 at 15:21, Christian Busch wrote:
Hello,
I have a question. On an ftp-site that we need to mirror, our login is
wget -cm
ftp://christian.busch%40brainjunction.de:**xx**@esd.intraware.com/
as you see I tried to encode the @ as %40 as described in the manual.
This does
On 1 Feb 2002 at 8:17, Daniel Stenberg wrote:
You may count this mail as advocating for HTTP 1.1 support, yes! ;-)
I did write down some minimal requirements for HTTP/1.1 support on
a scrap of paper recently. It's probably still buried under the
more recent strata of crap on my desk somewhere!
On 31 Jan 2002 at 9:25, Fred Holmes wrote:
wget -N http://www.karenware.com/progs/*.*
fails with a "not found" whether the filespec is * or *.*
The * syntax works just fine with ftp
Is there a syntax that will get all files with http?
You could try
wget -m -l 1 -n
On 31 Jan 2002 at 8:41, Bruce BrackBill wrote:
The problem is that my web pages are served up by PHP
and the content length is not defined. So, as the manual states,
I use --ignore-length. But when wget retrieves an image
it slows right down, possibly because it is ignoring
the
On 31 Jan 2002 at 9:48, Bruce BrackBill wrote:
Thanks for your response, Ian. When I use it without
--ignore-length option it appears that wget SOMETIMES ignores
the last_modified_date OR wget says to itself ( hey, I see the
file is older than the local copy, but hey, since the server
isn't
On 17 Jan 2002 at 2:15, Hrvoje Niksic wrote:
Michael Jennings [EMAIL PROTECTED] writes:
WGet returns an error message when the .wgetrc file is terminated
with an MS-DOS end-of-file mark (Control-Z). MS-DOS is the
command-line language for all versions of Windows, so ignoring the
On 21 Jan 2002 at 14:56, Thomas Lussnig wrote:
Why not just open the wgetrc file in text mode using
fopen(name, "r") instead of "rb"? Does that introduce other
problems?
I think it has to do with comments, because the definition is that
starting with '#' the rest of the line
is ignored. And
On 17 Jan 2002 at 18:17, Hrvoje Niksic wrote:
Ian Abbott [EMAIL PROTECTED] writes:
I'm also a little worried about the (time_t *)&cookie->expiry_time
cast, as cookie->expiry_time is of type unsigned long. Is a time_t
guaranteed to be the same size as an unsigned long?
It's not, but I have
On 16 Jan 2002 at 17:50, Hrvoje Niksic wrote:
Wget's strptime implementation comes from an older version of glibc.
Perhaps we should simply sync it with the latest one from glibc, which
is obviously capable of handling it?
That sounds like a good plan.
On 16 Jan 2002 at 17:45, Hrvoje Niksic wrote:
Aside from google, ~0UL is Wget's default value for the expiry time,
meaning the cookie is non-permanent and valid throughout the session.
Since Wget sets the value, Wget should be able to print it in DEBUG
mode.
Do you think this patch would
On 16 Jan 2002 at 8:02, David Robinson (AU) wrote:
In the meantime, however, '?' is problematic for Win32 users. It stops WGET
from working properly whenever it is found within a URL. Can we fix it,
please?
My proposal for using escape sequences in filenames for problem
characters is up for
On 15 Jan 2002 at 14:48, Brent Morgan wrote:
Thanks to everyone for looking at this problem. I am not a developer
and at my wits' end with this problem. I did determine with a different
cookie required site that it is still not working.
Could you change line 1017 of cmpt.c to read as
I came across this extract from a table on a website:
<td ALIGN=CENTER VALIGN=CENTER WIDTH=120 HEIGHT=120><a
href="66B27885.htm" onMouseOver="msover1('Pic1','thumbnails/MO66B27885.jpg');"
onMouseOut="msout1('Pic1','thumbnails/66B27885.jpg');"><img
SRC="thumbnails/66B27885.jpg" NAME="Pic1" BORDER=0></a></td>
Note the string
On 15 Jan 2002 at 0:27, Hrvoje Niksic wrote:
Brent Morgan [EMAIL PROTECTED] writes:
The -d debug option crashes wget just after it reads the input file.
Huh? Ouch! Wget on Windows is much less stable than I imagined. Can
you run it under a debugger and see what causes the crash?
I
This is an initial proposal for naming the files and directories
that Wget creates, based on the URLs of the retrieved documents.
At the moment there are many complaints about Wget failing to save
documents which have '?' in their URLs when running under Windows,
for example. In general, the set
On 10 Jan 2002 at 17:09, Matt Butt wrote:
I've just tried to download a 3Gb+ file (over a network using HTTP) with
WGet and it died at exactly 2Gb. Can this limitation be removed?
In principle, changes could be made to allow wget to be
configured
for large file support, by using the
On 11 Jan 2002 at 10:51, Picot Chappell wrote:
Thanks for your response. I tried the same command, using your URL, and it
worked fine. So I took a look at the site I was retrieving for the failed
test.
It's an SSL site (didn't think about it before) and I noticed 2 things. The
Frame
On 8 Jan 2002 at 20:31, Mike wrote:
What I'm looking for is something like the way FTP_Lite operates,
Can I nominate a single log file in the wgetrc for use by all the
wget processes that spawn off from my bash ?
There is the -a FILE (--append-output=FILE) option to append to a
logfile. A
On 7 Jan 2002 at 11:52, Jan Starzynski wrote:
for GNU Wget 1.8 I get the following assertion failed message:
wget: progress.c:673: create_image: Assertion `p - bp->buffer <= bp->width'
failed.
(snip)
In the changelogs of 1.8.1 I could not find a hint that this has been fixed
until
/ChangeLog entry:
2002-01-07 Ian Abbott [EMAIL PROTECTED]
* url.c (uri_merge_1): Deal with net path relative URL (one that
starts with //).
And the actual patch:
Index: src/url.c
===
RCS file: /pack/anoncvs/wget/src/url.c
On 3 Jan 2002 at 13:58, Henric Blomgren wrote:
Wget-bug:
GNU Wget 1.8
[...]
[root@MAGI .temporary]# wget: progress.c:673: create_image: Assertion `p -
bp->buffer <= bp->width' failed.
Please use Wget 1.8.1. That bug has already been fixed!
On 18 Dec 2001 at 23:13, Hrvoje Niksic wrote:
Ian Abbott [EMAIL PROTECTED] writes:
If I have a website http://somesite/ with three files on it:
index.html, a.html and b.html, such that index.html links only to
a.html and a.html links only to b.html then the following command
On 19 Dec 2001 at 17:40, Alexey Aphanasyev wrote:
Hrvoje Niksic wrote:
The `gnu-md5.o' object is missing. Can you show us the output from
`configure'?
Yes, sure. Please find it attached below.
Have you tried running make distclean before ./configure? It is
possible that some of your
have the Referer set to that set by the --referer
option or nothing at all, and not necessarily the URL of the
referring page.
src/ChangeLog entry:
2001-12-18 Ian Abbott [EMAIL PROTECTED]
* recur.c (retrieve_tree): Pass on referring URL when retrieving
recursed URL.
Index: src
I don't have time to look at this problem today, but I thought I'd
mention it now to defer the 1.8.1 release.
If I have a website http://somesite/ with three files on it:
index.html, a.html and b.html, such that index.html links only to
a.html and a.html links only to b.html then the following
On 14 Dec 2001 at 14:49, Peng GUAN wrote:
Maybe a bug in file fnmatch.c, line 54:
(n == string || ((flags & FNM_PATHNAME) && n[-1] == '/'))
the n[-1] should be changed to *(n-1).
I like the easy ones. Those are equivalent in C. As to which of the
two looks the nicest is a matter of aesthetics
On 11 Dec 2001 at 18:40, [EMAIL PROTECTED] wrote:
It seems to me that if an output_document is specified, it is being
clobbered at the very beginning (unless always_rest is true). Later in
http_loop stat() comes up with zero length. Hence there's always a size
mismatch when --output-document
On 11 Dec 2001 at 16:09, Hrvoje Niksic wrote:
Summer Breeze [EMAIL PROTECTED] writes:
Here is a sample entry:
66.28.29.44 - - [08/Dec/2001:18:21:20 -0500] "GET /index4.html%0A
HTTP/1.0" 403 280 "-" "Wget/1.6"
/index4.html%0A looks like a page is trying to link to /index4.html,
but the
On 1 Dec 2001 at 4:04, Hrvoje Niksic wrote:
As a TODO entry summed up:
* -p should probably go _two_ more hops on FRAMESET pages.
More generally, I think it probably needs to be made to work for
nested framesets too.
On 29 Nov 2001 at 12:48, Herold Heiko wrote:
--12:27:26-- http://www.cnn.com/
(try: 3) => `www.cnn.com/index.html'
Found www.cnn.com in host_name_addresses_map (008D01B0)
Releasing 008D01B0 (new refcount 1).
Retrying.
(etc.)
Same with other hosts
Could somebody please confirm if
On 29 Nov 2001 at 13:14, Daniel Stenberg wrote:
On Thu, 29 Nov 2001, Maciej W. Rozycki wrote:
On Wed, 28 Nov 2001, Ian Abbott wrote:
However, the Linux man page for bcopy(3) does not say the strings can
overlap
Presumably the man page is incorrect
Yes, I think so.
Well, can we
On 29 Nov 2001 at 14:40, Hrvoje Niksic wrote:
Ian, can you clarify what you meant by BSD man pages? Which BSD?
NetBSD: http://www.tac.eu.org/cgi-bin/man-cgi?bcopy+3
OpenBSD: http://www.openbsd.org/cgi-bin/man.cgi?query=bcopy&sektion=3
FreeBSD:
On 28 Nov 2001 at 18:08, Hrvoje Niksic wrote:
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 28 Nov 2001, zefiro wrote:
ld: Undefined symbol
_memmove
Do you have any suggestion ?
SunOS 4 is known to not have memmove.
May I suggest adding the following (or
On 27 Nov 2001, at 15:16, Hrvoje Niksic wrote:
So, does anyone know about the portability of rand()?
It's in the ANSI/ISO C spec (ISO 9899). It's always been in UNIX
(or at least it's been in there since UNIX 7th Edition), and I
should think it's always been in the MS-DOS compilers, but I
I got a segmentation fault when retrieving URLs from a file.
2001-11-27 Ian Abbott [EMAIL PROTECTED]
* retr.c (retrieve_from_file): Initialize `new_file' to NULL to
prevent seg fault.
Index: src/retr.c
===
RCS
On 27 Nov 2001 at 13:07, John Masinter wrote:
It seems that wget will download an entire large file regardless of what
I specify for the quota. For example I am trying to download only the
first 100K of a 800K file. I specify this:
wget -Q 100K http://url-goes-here
It then proceeds to