--- Comment #1 from Alexandros Kosiaris <akosia...@wikimedia.org> ---
Commenting just to make one thing clear: changing robots.txt will not make the
Internet Archive automagically archive pads, because no links to them exist for
any spider to follow. Pads whose links have been posted elsewhere might get
archived, but whether that happens depends entirely on IA's spider
implementation. The "no links" problem could be solved by a page that lists all
pads, which in turn could be built with one of the various pad-listing plugins,
but last we checked none of them were production quality.
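For reference, even a fully permissive robots.txt only tells crawlers what they
are *allowed* to fetch; it does not give them anything to discover. A minimal
sketch of what the change would look like (the /p/ pad path is the Etherpad
default and is an assumption here, as is the hypothetical listing page):

```
# Hypothetical robots.txt sketch, not the deployed file.
# Allowing the Internet Archive crawler to fetch pads under /p/
# still archives nothing unless some crawlable page links to them,
# e.g. a (currently nonexistent) /list-of-pads index.
User-agent: ia_archiver
Allow: /p/

User-agent: *
Disallow: /p/
```

The point stands regardless of the exact rules: discovery, not permission, is
the missing piece.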
Some more info can be found here:
Wikibugs-l mailing list