>>>>> "FC" == Faue, Caralynn <[email protected]> writes:

FC> Hello,

FC> I am so sorry for the intrusion, but I have come across your
FC> comment on bugzilla
FC> (https://bugzilla.wikimedia.org/show_bug.cgi?id=8473) regarding
FC> allowing search engines to crawl specialallpages.php. I am
FC> somewhat of a MediaWiki/PHP newbie, but our organization
FC> does run a WIMP (Windows, IIS, MySQL, PHP) installation of
FC> MediaWiki. I have been maintaining code in
FC> specialallpages.php that allows Microsoft Search Server
FC> 2008 to crawl the wiki. The code is similar to this:

FC> $wgOut->setRobotpolicy( 'index,follow' );

FC> This was working, but recently my customization of this
FC> file appears to be ignored (when I view source, the meta
FC> tag shows noindex,nofollow again). Do you have any insight
FC> into why this might be happening? I am not sure exactly
FC> when it started, but I noticed it after our upgrade to
FC> MediaWiki 1.13.3. I have verified that the source code was
FC> not overwritten during the upgrade:
FC> $wgOut->setRobotpolicy( 'index,follow' ); is still in
FC> specialallpages.php.

FC> Again, I am sorry to just email you directly; I have been
FC> watching the bugzilla site for updates, but none have been
FC> posted.

I'll Cc the newsgroup.
I ended up using sitemaps instead, but I would love to stop
making sitemaps if the aforementioned bug were fixed.
(I never stray outside of LocalSettings.php with my changes.)
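For what it's worth, one way to keep such a customization inside
LocalSettings.php (rather than patching specialallpages.php) is a hook
that re-asserts the robot policy after the special page has set its
own. A minimal, untested sketch, assuming MediaWiki 1.13's
BeforePageDisplay hook; the function name wfAllpagesRobotPolicy is
mine:

```php
# In LocalSettings.php -- loosen the robot policy for
# Special:AllPages without touching core files.
$wgHooks['BeforePageDisplay'][] = 'wfAllpagesRobotPolicy';

function wfAllpagesRobotPolicy( &$out, &$skin ) {
	global $wgTitle;
	// BeforePageDisplay runs just before output, i.e. after the
	// special page has already called setRobotPolicy() itself,
	// so this call should win.
	if ( $wgTitle && $wgTitle->isSpecial( 'Allpages' ) ) {
		$out->setRobotPolicy( 'index,follow' );
	}
	return true; // let other hooks run
}
```

No guarantee this survives whatever changed in 1.13.3, but it at least
keeps the change upgrade-proof.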

FC> Thanks in advance,

FC> Caralynn Faue

FC> Application Developer

FC> MTS Systems Corporation


_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l