On Sun, Jun 22, 2014 at 1:10 AM, György Chityil
<[email protected]> wrote:
> Thank you so much! This is a perfect reply. Couldn't have asked for more.
>
> While not a bug, an additional idea came to my mind while reading your
> reply. If this robots checking feature will be fixed, it would be great to
> be able to enable robots checking for simple, one-off requests as well.

That does not make sense. The robots.txt file is meant to be checked
and adhered to by robots / spiders crawling a website, not by clients
fetching a single page or a few pages. Wget should NOT check the
robots file for single downloads; it should follow the rules in
robots.txt only when acting like a spider, i.e. when in recursive
mode.
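For what it's worth, the kind of check a spider performs can be sketched
with Python's urllib.robotparser (the robots.txt content and URLs below
are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a crawler might have fetched from a site.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A recursive crawler consults the parsed rules before each fetch;
# a one-off download of a known URL has no reason to do this.
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/public/page.html"))   # True
```

This is exactly the per-URL gatekeeping that only matters when you are
discovering and following links automatically, which is why it belongs
in recursive mode and nowhere else.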


-- 
Thanking You,
Darshit Shah
