Hi,
The manual says
"If the local file does not exist, or the sizes of the files do not
match, Wget will download the remote file no matter what the
time-stamps say."
In two cases I'm not seeing this:
1) With if-modified-since I don't believe the content-length is
checked at all
2) Without
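For what it's worth, here is a minimal sketch (plain C, not wget's actual
source) of the rule the manual describes: re-download when the local file
is missing, when the sizes differ, and otherwise fall back to comparing
time-stamps. The If-Modified-Since case from point 1) bypasses this path,
since the server answers based on the timestamp alone and no
Content-Length comparison happens on the client side.

/* Sketch of the documented time-stamping decision; not wget's code. */
#include <stdbool.h>
#include <sys/stat.h>
#include <time.h>

static bool
should_redownload (const char *local_path, long remote_size,
                   time_t remote_mtime)
{
  struct stat st;

  if (stat (local_path, &st) != 0)
    return true;                       /* local file does not exist */
  if (remote_size >= 0 && st.st_size != remote_size)
    return true;                       /* sizes do not match */
  return st.st_mtime < remote_mtime;   /* otherwise compare time-stamps */
}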
* main.c: Add "--rejected-log" option.
* init.c: Add "rejectedlog" command.
* options.h: Add "rejected_log" parameter string.
* wget.texi: Add brief documentation on new --rejected-log option.
* recur.c: Optionally log details of URLs not traversed.
Add reject_reason enum.
(download_chil
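To make the intent of the change concrete, here is an illustrative sketch;
the enum members and helper name below are hypothetical, not the
identifiers the patch actually adds to src/recur.c:

/* Hypothetical sketch of reason-tagged rejection logging. */
#include <stdio.h>

enum reject_reason
  {
    REJECT_BLACKLISTED,
    REJECT_ROBOTS_EXCLUDED,
    REJECT_CROSSED_HOST,
    REJECT_DEPTH_EXCEEDED
  };

static void
log_rejected_url (FILE *rejected_log, enum reject_reason reason,
                  const char *url, const char *parent_url)
{
  if (!rejected_log)
    return;                    /* --rejected-log not given: stay silent */
  fprintf (rejected_log, "%d\t%s\t%s\n", (int) reason, url, parent_url);
}

The point of carrying an explicit reason code is that the log can show why
each URL was not traversed, together with the page it was found on.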
Hey thanks for the reply,
On Mon, Jul 27, 2015 at 05:03:53PM +0200, Giuseppe Scrivano wrote:
> I tried the patch but some tests fail after I apply it, could you verify
> that "make check" passes without problems? It seems wget exits with the
> wrong return code.
I was unaware there were tests. I
Jookia <166...@gmail.com> writes:
> This allows you to figure out why URLs are being rejected and some context
> around it.
> ---
> doc/wget.texi |   5 ++
> src/init.c    |   2 +
> src/main.c    |   3 ++
> src/options.h |   2 +
> src/recur.c   | 149
>
> what happens when you also specify -e robots=off in the command?
Then it works.
Thanks a lot!
-- dave
On Mon, Jul 27, 2015 at 11:53 AM, Giuseppe Scrivano wrote:
> Dave Ohlsson writes:
>
> > I have tried several wget options, with no luck.
> >
> > What could be the problem?
>
> what happe
Giuseppe Scrivano wrote:
> Dave Ohlsson writes:
>> I have tried several wget options, with no luck.
>> What could be the problem?
> what happens when you also specify -e robots=off in the command?
You're correct. I forgot I had 'robots = off' in my
%WGETRC% file.
--
--gv
Dave Ohlsson wrote:
> I would like to download this page:
> https://noppa.aalto.fi/noppa/kurssi/ms-a0210/viikkoharjoitukset
> as well as its subpages, especially the .pdf documents:
> ...
> When I give this command:
> $ wget --page-requisites --convert-links --recursive --level=0
> --no-check-
Dave Ohlsson writes:
> I have tried several wget options, with no luck.
>
> What could be the problem?
what happens when you also specify -e robots=off in the command?
Regards,
Giuseppe
Hi,
I would like to download this page:
https://noppa.aalto.fi/noppa/kurssi/ms-a0210/viikkoharjoitukset
as well as its subpages, especially the .pdf documents:
https://noppa.aalto.fi/noppa/kurssi/ms-a0210/viikkoharjoitukset/MS-A0210_thursday_30_oct.pdf
https://noppa.aalto.fi/noppa/kurssi/