https://bugzilla.wikimedia.org/show_bug.cgi?id=62468

--- Comment #6 from Nathan Larson <nathanlarson3...@gmail.com> ---
(In reply to Nemo from comment #5)
> IA doesn't crawl on request.
> On the question of which directives ("Allow" and others) do or should
> take precedence, please see (and reply at)
> https://archive.org/post/1004436/googles-robotstxt-rules-interpreted-too-strictly-by-wayback-machine

I might reply to that as more information becomes available. Today, I set my
site's robots.txt to say:

User-agent: *
Disallow: /w/

User-agent: ia_archiver
Allow: /*&action=raw

So I guess that a few months from now, I'll be able to check whether the
Wayback Machine's archive of my wiki for 12 March 2014 and later includes the
raw pages. If it doesn't, I think that's a bug.
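
For anyone who wants to sanity-check the precedence question before the crawl
data shows up, here is a minimal Python sketch of the Google-style "longest
match wins" rule that the archive.org thread above discusses. It is not the
Wayback Machine's actual matcher, and the second rule group below merges the
Disallow and Allow lines into one group purely for illustration (under strict
per-group semantics, the ia_archiver group above contains only the Allow
line).

import re

def pattern_to_regex(pattern):
    # Translate a robots.txt path pattern ('*' wildcard, optional
    # trailing '$' anchor) into an anchored regular expression.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.compile("^" + regex)

def is_allowed(rules, path):
    # rules: list of (directive, pattern) pairs for one user-agent group.
    # The longest matching pattern decides; "allow" wins a length tie;
    # a path matched by no rule is allowed by default.
    best_len, best_directive = -1, "allow"
    for directive, pattern in rules:
        if pattern_to_regex(pattern).match(path):
            if len(pattern) > best_len or (
                len(pattern) == best_len and directive == "allow"
            ):
                best_len, best_directive = len(pattern), directive
    return best_directive == "allow"

star_rules = [("disallow", "/w/")]
ia_rules = [("disallow", "/w/"), ("allow", "/*&action=raw")]  # merged for illustration
path = "/w/index.php?title=Main_Page&action=raw"
print(is_allowed(star_rules, path))  # False: Disallow: /w/ is the only match
print(is_allowed(ia_rules, path))    # True: the longer Allow pattern wins

Under that interpretation, the raw-page URLs should be fetchable by
ia_archiver even though /w/ is disallowed for everyone else; whether the
Wayback Machine applies the same longest-match tie-breaking is exactly what
the linked thread is asking about.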
