--- Comment #6 from Nathan Larson <> ---
(In reply to Nemo from comment #5)
> IA doesn't crawl on request.
> On what "Allow" directives and other directives do or should take
> precedence, please see (and reply) on
> strictly-by-wayback-machine

I might reply to that, as more information becomes available. Today, I set my
site's robots.txt to say:

User-agent: *
Disallow: /w/

User-agent: ia_archiver
Allow: /*&action=raw

So, I guess a few months from now I'll see whether the archive of my wiki for
12 March 2014 and thereafter includes the raw pages. If not, that's a bug, I think.
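For what it's worth, here is a minimal sketch of how a crawler following the
longest-match rule (as in RFC 9309 and Google's robots.txt documentation) would
evaluate those directives. The function names and the sample path
`/w/index.php?title=Foo&action=raw` are my own illustration, not anything from
the bug; and it assumes the crawler picks the most specific matching
user-agent group, so ia_archiver would only see the Allow line anyway:

```python
import re

def pattern_to_regex(pattern):
    """Convert a robots.txt path pattern ('*' wildcard, optional
    trailing '$' anchor) into an anchored regular expression."""
    anchored = pattern.endswith('$')
    if anchored:
        pattern = pattern[:-1]
    # Escape everything except '*', which becomes '.*'.
    regex = '.*'.join(re.escape(part) for part in pattern.split('*'))
    return re.compile(regex + ('$' if anchored else ''))

def is_allowed(rules, path):
    """rules: list of (directive, pattern) tuples from one user-agent
    group. Longest matching pattern wins; on a tie, Allow wins.
    With no matching rule at all, the default is allow."""
    best_len, best_allow = -1, True
    for directive, pattern in rules:
        if pattern_to_regex(pattern).match(path):
            if len(pattern) > best_len:
                best_len, best_allow = len(pattern), (directive == 'Allow')
            elif len(pattern) == best_len and directive == 'Allow':
                best_allow = True
    return best_allow

# The ia_archiver group from my robots.txt contains only the Allow rule:
ia_group = [('Allow', '/*&action=raw')]
print(is_allowed(ia_group, '/w/index.php?title=Foo&action=raw'))  # True

# Even if a crawler (non-standardly) merged in the '*' group, the longer
# Allow pattern would still beat 'Disallow: /w/' for raw-page URLs:
merged = [('Disallow', '/w/'), ('Allow', '/*&action=raw')]
print(is_allowed(merged, '/w/index.php?title=Foo&action=raw'))  # True
print(is_allowed(merged, '/w/index.php?title=Foo'))             # False
```

Whether ia_archiver actually implements longest-match precedence is exactly
what the archive should reveal in a few months.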

Wikibugs-l mailing list
