So if we can ban bots from the page histories, or block the bot user agents, then we might have a cure. Perhaps we just need to upgrade our MediaWiki software, or find out how other sites using this software deal with the same issue of bots reading page histories.

The wiki could be configured to use /haskellwiki/index.php?.. URLs for diffs (I believe this can be done by changing $wgScript). Then robots.txt could be changed to
> Disallow: /haskellwiki/index.php
which bans robots from everything except normal pages.
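
Something like this might work, as a rough sketch (assuming the wiki lives under /haskellwiki/ and that we can edit LocalSettings.php and the site's robots.txt; $wgArticlePath is my guess for keeping normal page URLs off index.php, and it would need a matching web server rewrite rule):

    # LocalSettings.php: keep diffs, histories, edits, etc. on index.php,
    # but serve ordinary pages as /haskellwiki/Page_Title
    $wgScript      = "/haskellwiki/index.php";
    $wgArticlePath = "/haskellwiki/$1";

    # robots.txt: block everything that goes through index.php
    # (diffs, old revisions, edit pages); normal pages stay crawlable
    User-agent: *
    Disallow: /haskellwiki/index.php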

Twan
