Google offers such a tool with their Webmaster Tools
<https://www.google.com/webmasters/tools>; I'm sure there are plenty of
similar tools out there.
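
If you'd rather check the rules locally before deploying anything, here is a
quick sketch using Python's standard urllib.robotparser; mywiki.com stands in
for your domain, and the rules are the ones suggested below:

from urllib.robotparser import RobotFileParser

# Candidate rules, parsed without touching the live site.
rules = """\
User-agent: *
Disallow: /Special:Search
Disallow: /Special:Random
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())

# The disallowed paths should be blocked for every crawler...
print(rp.can_fetch("*", "http://mywiki.com/Special:Random"))  # False
# ...while ordinary pages stay crawlable.
print(rp.can_fetch("*", "http://mywiki.com/Main_Page"))       # True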

On Sun, Feb 1, 2009 at 1:17 PM, Philip Beach <beachboy4...@gmail.com> wrote:

> OK, thanks, but how can I be sure it's working before all my pages drop
> off Google (if it's wrong)? Is there some way to validate it in the
> context of my site?
>
> Thanks again
>
>
> On 2/1/09, Benjamin Lees <emufarm...@gmail.com> wrote:
> > Try
> > User-agent: *
> > Disallow: /index.php
> > Disallow: /skins/
> > Disallow: /Special:Search
> > Disallow: /Special:Random
> >
> > Some other good rules to include are
> > Disallow: /MediaWiki:
> > Disallow: /Template:
> >
> > and maybe
> > Disallow: /Category:
> >
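Putting those suggestions together for root-level URLs, the whole file would
look something like this, with the /Category: line being the optional one:

User-agent: *
Disallow: /index.php
Disallow: /skins/
Disallow: /Special:Search
Disallow: /Special:Random
Disallow: /MediaWiki:
Disallow: /Template:
Disallow: /Category:
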
> > On Sun, Feb 1, 2009 at 2:06 AM, Philip Beach <beachboy4...@gmail.com> wrote:
> >
> >> Haha, true about rotting RAM; I'll look into that. I am not using
> >> Wikimedia-style URLs, sadly :( it just didn't happen when the site was
> >> first set up, and I can't move it now for various reasons. All of my
> >> files are in the web root /. However, through an Apache alias, my URL
> >> is mywiki.com/Pagename.
> >>
> >> How would robots.txt look for that? Would I simply drop the preceding
> >> /wiki, like this?
> >>
> >> User-agent: *
> >> Disallow: /Special:Search
> >> Disallow: /Special:Random
> >>
> >> Thanks a ton!
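
For reference, an Apache alias of the kind described usually boils down to a
rewrite of roughly this shape. This is only a sketch of one common setup, not
necessarily the exact config in use:

# Map mywiki.com/Pagename onto the MediaWiki entry point in the web root,
# leaving requests for real files and directories untouched.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^/?(.*)$ /index.php?title=$1 [QSA,L]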
> >>
> >> On Sun, Feb 1, 2009 at 1:47 AM, Benjamin Lees <emufarm...@gmail.com>
> >> wrote:
> >>
> >> > Assuming you have Wikimedia-style URLs:
> >> > User-agent: *
> >> > Disallow: /w/
> >> > Disallow: /wiki/Special:Search
> >> > Disallow: /wiki/Special:Random
> >> >
> >> > Your server will be able to handle a lot more if you set up as much
> >> > caching as you can <http://www.mediawiki.org/wiki/Manual:Cache>. No
> >> > sense letting all that spare RAM rot. :)
> >> >
> >> >
> >> > On Sat, Jan 31, 2009 at 10:02 PM, Philip Beach
> >> > <beachboy4...@gmail.com> wrote:
> >> >
> >> > > I have already checked the access logs. It appears that Google and
> >> > > Yahoo are indeed generating a lot of traffic. Good idea, Rob; I've
> >> > > been working on this for a while.
> >> > >
> >> > > Just out of curiosity, what should my robots.txt look like for
> >> > > MediaWiki? Does anything need to be disallowed?
> >> > >
> >> > > On Sat, Jan 31, 2009 at 8:30 PM, Platonides
> >> > > <platoni...@gmail.com> wrote:
> >> > >
> >> > > > You should check the access logs to see what is causing the error.
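
For anyone wondering how to skim the logs quickly: assuming Apache's combined
log format and access.log as a placeholder path, this counts requests per
user agent, which makes heavy crawlers obvious:

awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn | head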