As no objection was raised, I have removed robots.txt on our wiki host.
Please speak up if anyone still finds a problem that would require
keeping it.
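
For reference, a middle-ground alternative to deleting robots.txt
outright would be to let crawlers index article pages while keeping
dynamic MediaWiki URLs (edit forms, histories, diffs) out of search
results. A rough sketch, assuming article pages are served under /wiki/
and script URLs under /index.php (the actual paths on our host may
differ, and note that "Allow" is honored by the major crawlers but is
not part of the original robots.txt standard):

User-agent: *
Allow: /wiki/
Disallow: /index.php

This keeps Google etc. able to index content while avoiding crawler
load on action URLs; removing the file entirely has the same effect for
indexing, just without that protection.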

On 2012/10/22 17:33, Rory O'Farrell said:
> On Mon, 22 Oct 2012 11:29:18 +0200
> Herbert Duerr <h...@apache.org> wrote:
> 
>> On 21.10.2012 15:13, imacat wrote:
>>>      I found the following rule in the robots.txt of our wiki:
>>>
>>> User-Agent: *
>>> Disallow: /
>>>
>>>      Does anyone know if there is any special reason why it is set so?  Does
>>> anyone have any reason to keep it?  I'm thinking of removing this rule.
>>
>> +1, blocking all search robots makes no sense.
>> Google etc. are also much more successful in finding relevant results to 
>> non-trivial searches. The wiki-builtin search had many problems [1], 
>> many of which are fixed in the meantime though.
>>
>> [1] http://www.mediawiki.org/wiki/Search_issues
>>
>> Herbert
>>
> 
> +1
> 


-- 
Best regards,
imacat ^_*' <ima...@mail.imacat.idv.tw>
PGP Key http://www.imacat.idv.tw/me/pgpkey.asc

<<Woman's Voice>> News: http://www.wov.idv.tw/
Tavern IMACAT's http://www.imacat.idv.tw/
Woman in FOSS in Taiwan http://wofoss.blogspot.com/
Apache OpenOffice http://www.openoffice.org/
EducOO/OOo4Kids Taiwan http://www.educoo.tw/
Greenfoot Taiwan http://greenfoot.westart.tw/
