Assuming you have Wikimedia-style URLs:
User-agent: *
Disallow: /w/
Disallow: /wiki/Special:Search
Disallow: /wiki/Special:Random
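
If you want to sanity-check rules like these before the crawlers hit them, Python's standard-library robot parser can evaluate them locally. A quick sketch (the `/w/index.php` and `/wiki/Main_Page` paths are just example URLs for a default Wikimedia-style layout):

```python
from urllib.robotparser import RobotFileParser

# The same rules as above, pasted in for a local check.
rules = """\
User-agent: *
Disallow: /w/
Disallow: /wiki/Special:Search
Disallow: /wiki/Special:Random
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Script URLs under /w/ (edit, history, diffs) are blocked;
# plain article views under /wiki/ are still crawlable.
print(rp.can_fetch("*", "/w/index.php?title=Main_Page&action=history"))  # False
print(rp.can_fetch("*", "/wiki/Main_Page"))  # True
print(rp.can_fetch("*", "/wiki/Special:Search"))  # False
```

That split (block the script path, allow the pretty-URL article path) is the main point: the bots keep indexing your content but stop hammering index.php with edit/history/diff requests.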

Your server will be able to handle a lot more traffic if you set up as much
caching as you can <http://www.mediawiki.org/wiki/Manual:Cache>.  No sense
letting all that spare RAM rot. :)


On Sat, Jan 31, 2009 at 10:02 PM, Philip Beach <[email protected]> wrote:

> I already have checked the access logs. It appears that Google and Yahoo are
> indeed generating a lot of traffic. Good idea Rob, I've been working on this
> for a while.
>
> Just out of curiosity, what should my robots.txt look like for MediaWiki?
> Does anything need to be disallowed?
>
> On Sat, Jan 31, 2009 at 8:30 PM, Platonides <[email protected]> wrote:
>
> > You should check the access logs to see what is causing the error.
> >
> >
> > _______________________________________________
> > MediaWiki-l mailing list
> > [email protected]
> > https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
> >
