--- Comment #3 from Tim Landscheidt <> ---
(In reply to comment #2)
> Why would the first be a WONTFIX?

Because there are tools that are linked from every wiki page, and any spider
accessing them brings the house down.  As tools are created and updated without
any review by admins, and wiki edits are not monitored either, blacklisting
them after the meltdown doesn't work.

So unlimited spider access is not possible.

> For the second see the docs,

Unfortunately, there is no specification for robots.txt; that's the core of the
problem.

> Allow: /$
>
> is supposed to work (at least with Google).

According to [[de:Robots Exclusion Standard]], it works with Googlebot, Yahoo!
Slurp and msnbot.  And the other spiders?  Will they read it in the same way,
or as "/"?  How do we whitelist "/?Rules"?
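For illustration, here is a sketch of what such a robots.txt might look like.
Note the caveats: "Allow" and the "$" end-of-URL anchor are extensions honored
by Googlebot, Yahoo! Slurp and msnbot, not part of the original Robots
Exclusion Standard, and whether "/?Rules" matches as intended depends entirely
on the crawler's parser.

```
# Sketch only -- behavior varies by crawler:
# - Crawlers that honor "$" should allow exactly "/" and nothing below it.
# - Crawlers that ignore "$" may read "Allow: /$" as "Allow: /",
#   defeating the Disallow below.
# - "/?Rules" is a literal path match here; query-string handling
#   is likewise crawler-specific.
User-agent: *
Allow: /$
Allow: /?Rules
Disallow: /
```

Rule order matters too: some implementations take the first matching rule,
others the most specific one, so even this minimal file can be read four
different ways.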

Wikibugs-l mailing list
