https://bugzilla.wikimedia.org/show_bug.cgi?id=33406
--- Comment #25 from MZMcBride <[email protected]> ---
From <https://bugzilla.wikimedia.org/robots.txt>:

---
User-agent: *
Disallow: /*.cgi
Disallow: /*show_bug.cgi*ctype=*
Allow: /
Allow: /*index.cgi
Allow: /*show_bug.cgi
Allow: /*describecomponents.cgi
Allow: /*page.cgi
---

http://www.robotstxt.org/faq/robotstxt.html seems to indicate that wildcards are unsupported in robots.txt files:

---
Wildcards are _not_ supported: instead of 'Disallow: /tmp/*' just say 'Disallow: /tmp/'.
---

There also seems to be an assumption that Allow rules can override previous Disallow rules. I'm not sure whether this is actually the case: if *.cgi is disallowed, will *show_bug.cgi become allowed by a later directive?

https://encrypted.google.com/search?hl=en&q=site%3Abugzilla.wikimedia.org indicates that, as stated in comment 24, bugzilla.wikimedia.org is currently not being indexed by Google at all.

--
You are receiving this mail because:
You are on the CC list for the bug.
_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l
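[Editor's note: the robotstxt.org FAQ quoted above describes the original 1996 draft, which indeed has no wildcards and no Allow. Major crawlers such as Googlebot, however, documented extended semantics (later standardized in RFC 9309): `*` matches any character sequence, and among all rules that match a URL, the one with the longest pattern wins, with Allow beating Disallow on a tie. Under those semantics a longer Allow rule can override a shorter Disallow. The sketch below is a hypothetical, simplified matcher for this file's rules, written to illustrate that precedence question; it is not how any particular crawler is implemented.]

```python
import re

def compile_rule(pattern):
    # Translate a robots.txt path pattern into an anchored regex:
    # '*' matches any character sequence; a trailing '$' anchors the end.
    regex = re.escape(pattern).replace(r'\*', '.*')
    if regex.endswith(r'\$'):
        regex = regex[:-2] + '$'
    return re.compile(regex)

# The rules from <https://bugzilla.wikimedia.org/robots.txt> quoted above.
RULES = [
    ('Disallow', '/*.cgi'),
    ('Disallow', '/*show_bug.cgi*ctype=*'),
    ('Allow', '/'),
    ('Allow', '/*index.cgi'),
    ('Allow', '/*show_bug.cgi'),
    ('Allow', '/*describecomponents.cgi'),
    ('Allow', '/*page.cgi'),
]

def is_allowed(path, rules=RULES):
    # Longest-match precedence (RFC 9309 / Googlebot): the matching rule
    # with the longest pattern wins; on a tie, Allow beats Disallow.
    best_verb, best_pattern = 'Allow', ''   # no matching rule -> allowed
    for verb, pattern in rules:
        if compile_rule(pattern).match(path):
            if len(pattern) > len(best_pattern) or (
                    len(pattern) == len(best_pattern) and verb == 'Allow'):
                best_verb, best_pattern = verb, pattern
    return best_verb == 'Allow'
```

For example, `/show_bug.cgi?id=33406` matches both `Disallow: /*.cgi` (6 characters) and `Allow: /*show_bug.cgi` (14 characters); the longer Allow wins, so the page is crawlable, whereas a ctype= URL is caught by the still-longer `Disallow: /*show_bug.cgi*ctype=*`. Note that parsers following the 1996 draft, including older versions of Python's urllib.robotparser, do not interpret `*` this way, which may be why the site was not being indexed.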
