I was just wondering whether built-in support for dynamic robots.txt files and Google Sitemaps wouldn't be nice. I think that dynamically generating Google Sitemap files would really help Typo users get their pages out to the masses, and controlling the robots.txt file could reduce load and keep unwanted content out of the search engines. For example, there is no need for Google to always be crawling the RSS feeds.
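A dynamic sitemap could be sketched roughly like this in plain Ruby, using the REXML standard library. This is just an illustration, not a proposed Typo patch: the article data, URLs, and field names below are made up, and in Typo itself the list would come from the Article model inside a controller action.

```ruby
require "rexml/document"

# Hypothetical article data -- in Typo this would come from the
# Article model; the URLs and dates here are purely illustrative.
articles = [
  { url: "http://example.com/articles/2005/11/01/hello",    updated: Time.utc(2005, 11, 1) },
  { url: "http://example.com/articles/2005/11/05/sitemaps", updated: Time.utc(2005, 11, 5) }
]

doc = REXML::Document.new
doc << REXML::XMLDecl.new("1.0", "UTF-8")

# Namespace from the Sitemap protocol version current at the time (0.84).
urlset = doc.add_element("urlset",
  "xmlns" => "http://www.google.com/schemas/sitemap/0.84")

articles.each do |article|
  url = urlset.add_element("url")
  url.add_element("loc").text = article[:url]
  # lastmod uses W3C Datetime format; date-only is allowed.
  url.add_element("lastmod").text = article[:updated].strftime("%Y-%m-%d")
end

xml = ""
doc.write(xml)
puts xml
```

Wired up as a controller action rendering `text/xml`, the same loop would give every Typo install a live sitemap with no manual upkeep; a similar action could serve a robots.txt that disallows the feed URLs.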
Having some control over this in the admin section is something I would love. How about the rest of you?

Robots.txt info:
http://www.robotstxt.org/wc/robots.html
http://www.pageresource.com/zine/robotstxt.htm

Google Sitemaps info:
https://www.google.com/webmasters/sitemaps/docs/en/navigation.html
https://www.google.com/webmasters/sitemaps/docs/en/faq.html

--
--------------
Jon Gretar Borgthorsson
http://www.jongretar.net/

_______________________________________________
Typo-list mailing list
[email protected]
http://rubyforge.org/mailman/listinfo/typo-list
