I get the following error:
==> SIZE compact-3.0-rc4.iso ... wget: xmalloc.c:190: checking_free:
Assertion `ptr != ((void *)0)' failed.
zsh: 23726 abort wget
what's this?
--
Eugene
Hello list,
I am a big fan of wget, but I discovered a minor annoyance (not sure if
it even is a bug):
When downloading multiple files with wget to a single output
(e.g. wget -Oout http://file1 http://file2 http://file3), the timestamp
of the resulting file becomes the timestamp of the *last*
On Monday 08 August 2005 11:29 am, Hrvoje Niksic wrote:
Jeroen Demeyer [EMAIL PROTECTED] writes:
I am a big fan of wget, but I discovered a minor annoyance (not sure
if it even is a bug):
When downloading multiple files with wget to a single output
(e.g. wget -Oout http://file1
Mauro Tortonesi [EMAIL PROTECTED] writes:
i agree with hrvoje. but this is just a side-effect of the real
problem: the semantics of -O with a multiple files download is not
well defined.
-O with multiple URLs concatenates all content to the given file.
This is intentional and supported: for
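A minimal sketch of that append semantics (simulated here with printf so no network or wget invocation is needed; the file contents are illustrative assumptions):

```shell
#!/bin/sh
# Rough model of `wget -O out http://file1 http://file2`:
# the first response body creates/truncates the output file,
# and each later body is appended to the same file.
printf 'contents of file1\n' >  out   # first URL: create/truncate
printf 'contents of file2\n' >> out   # subsequent URLs: append
cat out
```

So the resulting file holds the bodies of all URLs in order, which is why attributes such as the timestamp can only reflect one of them.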
Robin Laurén [EMAIL PROTECTED] writes:
My question is about the number on one of the last lines of the
logged output, the reported download speed. What exactly does
wget's download speed report? Is this the speed of just the data
downloaded, or does the value include the lag time between
Thanks for the report; I believe this bug is fixed in Wget's
subversion repository.
On Sunday 10 July 2005 09:52 am, Tony Lewis wrote:
Thomas Boerner wrote:
Is this behaviour: robots.txt takes precedence over -p a bug or
a feature?
It is a feature. If you want to ignore robots.txt, use this command line:
wget -p -k www.heise.de/index.html -e robots=off
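If you want that behavior permanently rather than per invocation, the same switch can be set in your ~/.wgetrc (this is the wgetrc equivalent of `-e robots=off`):

```
# ~/.wgetrc: ignore robots.txt for all wget runs
robots = off
```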
hrvoje was
On Saturday 09 July 2005 10:43 am, Robert Scheck wrote:
Hi folks,
today I noticed that there isn't an up-to-date translation for wget 1.10, so
I retranslated the de.po file.
And currently, 1.10.1-beta1's de.po is exactly the same as that of 1.10, so the
translation can also be used for 1.10, if
On Wednesday 03 August 2005 08:14 am, dan1 wrote:
Hello.
I have been using wget for a long time now. I like it very much.
However, I have two enhancement requests that I think are important and
very useful:
1. There should be a 'download acceleration' mode that triggers several
downloads at
I hope that doesn't happen. While respecting robots.txt is not an
absolute requirement, it is considered polite. I would not want the
default behavior of wget to be considered impolite.
Mark Post
-----Original Message-----
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: Monday, August
On Saturday 09 July 2005 10:34 am, Abdurrahman ÇARKACIOĞLU wrote:
MS Internet Explorer can save a web page as a whole. That means all the
images, tables, etc. can be saved in one file. It is called a Web Archive,
single file (*.mht).
Is this possible with wget?
not at the moment, but it's a
I am new to wget and I was wondering if anyone out
there can assist me with the following error messages
in my config.log file.
What do I need to do to get wget working? Please
respond!
$ ./configure
PATH: /usr/ucb
## --- ##
## Core tests. ##
## --- ##
On Thursday 04 August 2005 03:43 pm, kayode giwa wrote:
I am new to wget and I was wondering if anyone out
there can assist me with the following error messages
in my config.log file.
What do I need to do to get wget working? Please
respond!
i am not familiar with solaris, but it seems
I would say the analogy is closer to a very rabid person operating a web
browser. I've never been greatly inconvenienced by having to re-run a
download while ignoring the robots.txt file. As I said, respecting
robots.txt is not a requirement, but it is polite. I prefer my tools to
be polite
hi to everybody,
i am very sorry that for the last few weeks i have been completely silent, but
hurricanes:
http://www.tortonesi.com/cgi-bin/blosxom.cgi/2005/07/11#dennis-20050711
and work:
http://www.tortonesi.com/cgi-bin/blosxom.cgi/2005/07/27#jsr82ext-20050727