On 21.10.2012 15:13, imacat wrote:
     I found the following rule in the robots.txt of our wiki:

User-Agent: *
Disallow: /

     Does anyone know whether there is any special reason why it is set
this way?  Does anyone have a reason to keep it?  I'm thinking of
removing this rule.

+1, blocking all search robots makes no sense.
Google etc. are also much more successful at finding relevant results for non-trivial searches. The wiki's built-in search had many problems [1], although many of them have since been fixed.
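
If we don't want to drop robots.txt entirely, a more targeted rule set could keep crawlers out of the dynamic script endpoints while leaving article pages indexable. Just a sketch; the actual paths depend on how our wiki maps its script and article URLs, here assumed to be /w/ and /wiki/ respectively:

User-Agent: *
# keep bots out of the script path (index.php, api.php, ...) -- assumed to be /w/
Disallow: /w/
# article pages stay crawlable; Allow is honored by the major crawlers
Allow: /wiki/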

[1] http://www.mediawiki.org/wiki/Search_issues

Herbert
