Comment #3 on issue 3904 by [email protected]: robots.txt: disallow old
stable and old development doc
http://code.google.com/p/lilypond/issues/detail?id=3904
There is a serious problem with Google search preferring old versions of
the documentation to the newest one, but totally disabling search of the
old versions seems like a bad idea: with a blanket disallow, those pages
can't be found even when you explicitly include the version number as a
search keyword or search for a command that has since been renamed.
Sometimes I want to search old documentation. Disallowing all user agents
in robots.txt also makes the pages disappear retroactively from the
Wayback Machine, which is clearly not the intention here.
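A narrower robots.txt could keep old trees out of general search indexes
while still letting the Internet Archive's crawler through. A sketch (the
paths and version numbers here are illustrative, not LilyPond's actual
layout):

```
# Internet Archive's crawler: allow everything so the pages
# remain reachable through the Wayback Machine.
User-agent: ia_archiver
Disallow:

# All other crawlers: keep old versioned docs out of search
# indexes. (Illustrative paths only.)
User-agent: *
Disallow: /doc/v2.16/
Disallow: /doc/v2.17/
```

More specific records take precedence over the `*` record for crawlers
that honor them, so ia_archiver would ignore the general disallow.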
I think the best approach would be to publish the documentation for the
current stable version under both /doc/${VERSION} and /doc/current, with
/doc/current as the preferred prefix for most documentation links. That
would let the up-to-date documentation build PageRank over time instead
of starting from scratch every time the version number is bumped;
eventually it would consistently outrank the versioned URLs.
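Serving the same tree under both prefixes is straightforward on the
server side. A sketch, assuming an Apache server and illustrative paths
and version numbers (the actual site layout may differ; a symlink on
disk would work equally well):

```
# Expose the current stable docs under a stable prefix as well
# as the versioned one. "2.18" is an example version number.
Alias /doc/current /var/www/lilypond/doc/2.18
Alias /doc/2.18    /var/www/lilypond/doc/2.18
```

To avoid the two prefixes competing in search results, the pages under
the versioned prefix could also declare the /doc/current URL as
canonical via a `<link rel="canonical" href="...">` tag in their
headers.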
It would also be nice if a prominent notice were displayed at the top of
pages from other versions, like the warning shown when viewing an old
revision of a Wikipedia article.
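Such a notice could be a simple static banner added to the old pages'
templates. Illustrative markup only (class name, version number, and
link target are assumptions, not anything the site currently uses):

```
<!-- Illustrative warning banner for pages from superseded releases -->
<div class="old-version-warning">
  This page documents LilyPond 2.16, which is not the current
  stable release. See the
  <a href="/doc/current/">latest documentation</a>.
</div>
```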
--
You received this message because this project is configured to send all
issue notifications to this address.
You may adjust your notification preferences at:
https://code.google.com/hosting/settings