kayode giwa <[EMAIL PROTECTED]> writes:
> I am new to wget and I was wondering if anyone out
> there can assist me with the following error messages
> in my config.log file.
> What do I need to do to get wget working? Please
> respond!
>
>
>
> $ ./configure
>
>
> PATH: /usr/ucb
>
>
> ## --
Hi everybody,
I am very sorry that for the last few weeks I have been completely silent, but
hurricanes:
http://www.tortonesi.com/cgi-bin/blosxom.cgi/2005/07/11#dennis-20050711
and work:
http://www.tortonesi.com/cgi-bin/blosxom.cgi/2005/07/27#jsr82ext-20050727
http://www.tortonesi.com/cgi-bi
I would say the analogy is closer to a very rapid person operating a web
browser. I've never been greatly inconvenienced by having to re-run a
download while ignoring the robots.txt file. As I said, respecting
robots.txt is not a requirement, but it is polite. I prefer my tools to
be polite unle
On Thursday 04 August 2005 03:43 pm, kayode giwa wrote:
> I am new to wget and I was wondering if anyone out
> there can assist me with the following error messages
> in my config.log file.
> What do I need to do to get wget working? Please
> respond!
I am not familiar with Solaris, but it seems
I am new to wget and I was wondering if anyone out
there can assist me with the following error messages
in my config.log file.
What do I need to do to get wget working? Please
respond!
$ ./configure
PATH: /usr/ucb
## ----------- ##
## Core tests. ##
## ----------- ##
configure:150
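For what it's worth, a frequent cause of configure failures on Solaris is that the broken compiler stub in /usr/ucb is found before a real compiler. A minimal sketch of the usual workaround, assuming gcc is installed (the paths are assumptions, adjust them to your system):

  # make sure a working compiler is found before /usr/ucb, then re-run configure
  PATH=/usr/local/bin:/usr/ccs/bin:/usr/bin:$PATH
  export PATH
  CC=gcc ./configure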
On Monday 08 August 2005 07:30 pm, Post, Mark K wrote:
> I hope that doesn't happen. While respecting robots.txt is not an
> absolute requirement, it is considered polite. I would not want the
> default behavior of wget to be considered impolite.
IMVHO, Hrvoje has a good point when he says that
On Saturday 09 July 2005 10:34 am, Abdurrahman ÇARKACIOĞLU wrote:
> MS Internet Explorer can save a web page as a whole. That means all the
> images and tables can be saved in a single file. It is called "Web Archive,
> single file (*.mht)".
>
> Is this possible with wget?
Not at the moment, but it
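For reference, the closest thing wget offers today is fetching the page together with its requisites and converting the links, which leaves a directory of files rather than a single .mht archive (the URL below is just a placeholder):

  wget -p -k http://example.com/page.html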
I hope that doesn't happen. While respecting robots.txt is not an
absolute requirement, it is considered polite. I would not want the
default behavior of wget to be considered impolite.
Mark Post
-----Original Message-----
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: Monday, August 0
On Wednesday 03 August 2005 08:14 am, dan1 wrote:
> Hello.
>
> I have been using wget for a long time now. I like it very much.
>
> However, I have two enhancement requests that I think are important and
> very useful:
>
> 1. There should be a 'download acceleration' mode that triggers several
> down
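Until something like that exists, a crude approximation is to run several wget processes in parallel from the shell (the URLs are placeholders):

  wget http://example.com/part1 &
  wget http://example.com/part2 &
  wait    # block until both background downloads finish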
I have just made a few changes to the wget development page:
http://www.gnu.org/software/wget/wgetdev.html
to add instructions on how to access our new Subversion repository and to remove
every reference to the old and no longer supported CVS repository.
This was a badly needed change that I had to
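For anyone new to Subversion, the checkout-and-build cycle looks roughly like this; the repository URL below is only a placeholder, the real one is on the development page above:

  svn checkout svn://svn.example.org/wget/trunk wget
  cd wget
  # a fresh checkout may need the autotools run before the usual ./configure && make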
On Saturday 09 July 2005 10:43 am, Robert Scheck wrote:
> Hi folks,
>
> today I noticed that there isn't an up-to-date translation of wget 1.10, so
> I retranslated the de.po file.
>
> And currently, 1.10.1-beta1's de.po is exactly the same as that of 1.10, so the
> translation can also be used for 1.10,
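For anyone who wants to try the updated catalog before it appears in a release, the standard gettext steps should work; the install path is an assumption and varies between systems:

  msgfmt -c -o wget.mo de.po                    # compile and sanity-check the catalog
  cp wget.mo /usr/share/locale/de/LC_MESSAGES/  # system-dependent location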
On Sunday 10 July 2005 09:52 am, Tony Lewis wrote:
> Thomas Boerner wrote:
> > Is this behaviour: "robots.txt takes precedence over -p" a bug or
> > a feature?
>
> It is a feature. If you want to ignore robots.txt, use this command line:
>
> wget -p -k www.heise.de/index.html -e robots=off
hrvoje
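For the record, the same override can be made permanent by putting it in a wgetrc file instead of passing -e on every invocation:

  # in ~/.wgetrc
  robots = off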
Thanks for the report; I believe this bug is fixed in Wget's
subversion repository.
Robin Laurén <[EMAIL PROTECTED]> writes:
> My question is about the number on one of the last lines of the
> logged output, the reported download speed. What exactly does
> wget's download speed report? Is this the speed of just the data
> downloaded, or does the value include the lag time betwe
Mauro Tortonesi <[EMAIL PROTECTED]> writes:
> I agree with Hrvoje, but this is just a side effect of the real
> problem: the semantics of -O with a multiple-file download are not
> well defined.
-O with multiple URLs concatenates all content to the given file.
This is intentional and supported: f
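In other words, a command like the following is expected to produce one file containing all three documents in the order given (the URLs are placeholders):

  wget -O combined.html http://example.com/a.html http://example.com/b.html http://example.com/c.html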
On Monday 08 August 2005 11:29 am, Hrvoje Niksic wrote:
> Jeroen Demeyer <[EMAIL PROTECTED]> writes:
> > I am a big fan of wget, but I discovered a minor annoyance (not sure
> > if it even is a bug):
> >
> > When downloading multiple files with wget to a single output
> > (e.g. wget -Oout http://fi
A few comments about the bug tracker saga...
roundup is a really cool piece of software, but it seems that its developers
don't really give a damn about backward compatibility and painless upgrades:
http://roundup.sourceforge.net/doc-0.8/upgrading.html
(well, I can't really blame them since th
"Phill Bertolus" <[EMAIL PROTECTED]> writes:
> Hi guys,
>
> in http.c there are a couple of lines that say
>
> if (opt.save_headers)
> fwrite(head,1,strlen(head), fp);
>
> Unfortunately head has been deallocated by this time in 1.10. The
> 1.9.1 version correctly saved the head info in to
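For anyone trying to reproduce this, opt.save_headers corresponds to the --save-headers option, so the broken code path should be reachable with something like (placeholder URL):

  wget --save-headers http://example.com/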
Jeroen Demeyer <[EMAIL PROTECTED]> writes:
> I am a big fan of wget, but I discovered a minor annoyance (not sure
> if it even is a bug):
>
> When downloading multiple files with wget to a single output
> (e.g. wget -Oout http://file1 http://file2 http://file3), the
> timestamp of the resulting fi
Hello list,
I am a big fan of wget, but I discovered a minor annoyance (not sure if
it even is a bug):
When downloading multiple files with wget to a single output
(e.g. wget -Oout http://file1 http://file2 http://file3), the timestamp
of the resulting file becomes the timestamp of the *last* fil
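Until the -O semantics are settled, one trivial workaround is to reset the mtime yourself after the download finishes (file names as in the example above):

  wget -O out http://file1 http://file2 http://file3
  touch out    # replace the last URL's Last-Modified stamp with the current time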
I am getting the following error:
==> SIZE compact-3.0-rc4.iso ... wget: xmalloc.c:190: checking_free:
Assertion `ptr != ((void *)0)' failed.
zsh: 23726 abort wget
what's this?
--
Eugene
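A stack backtrace would make this much easier to diagnose; if your wget binary was built with debugging symbols, something along these lines works (the URL is a placeholder for the FTP site you were fetching from):

  gdb --args wget ftp://example.org/pub/compact-3.0-rc4.iso
  (gdb) run
  (gdb) bt    # print the backtrace once the assertion aborts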
Situation:
when a site is already downloaded, you know that some of the files are not
needed and you delete them, but you still want to monitor for new files that
arrive on the main site after date XXX.
Suggestion:
add one more option, "--after-date" or "--newer-than-days" or "--max-age",
or anything similar, for example:
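To make the idea concrete, usage of the proposed (currently non-existent) option might look like:

  # hypothetical, not implemented in wget
  wget -m --after-date=YYYY-MM-DD http://example.com/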
Best Regards