Do we need this for every subpage (like de, download, www), or can it be implemented more centrally?
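For context, note that crawlers only ever fetch /robots.txt from the root of each host, so whether one file suffices depends on how the subsites are served. If de, download, and www are paths under a single host, one root-level file like the sketch below (assumed layout, not the actual site configuration) would cover them all; if they are separate subdomains, each host needs its own copy.

```
# Hypothetical site-root robots.txt blocking all crawlers during cleanup.
# Crawlers request /robots.txt per host, so separate subdomains
# (de.*, download.*, www.*) would each need their own file.
User-agent: *
Disallow: /
```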

Marcus



On 08/22/2011 04:57 PM, [email protected] wrote:
Author: rbircher
Date: Mon Aug 22 14:57:21 2011
New Revision: 1160285

URL: http://svn.apache.org/viewvc?rev=1160285&view=rev
Log:
Update robots.txt and disallow all agents during the clean-up phase

Modified:
     incubator/ooo/site/trunk/content/openofficeorg/de/robots.txt

Modified: incubator/ooo/site/trunk/content/openofficeorg/de/robots.txt
URL: http://svn.apache.org/viewvc/incubator/ooo/site/trunk/content/openofficeorg/de/robots.txt?rev=1160285&r1=1160284&r2=1160285&view=diff
==============================================================================
--- incubator/ooo/site/trunk/content/openofficeorg/de/robots.txt (original)
+++ incubator/ooo/site/trunk/content/openofficeorg/de/robots.txt Mon Aug 22 14:57:21 2011
@@ -1,16 +1,2 @@
  User-Agent: *
-Disallow: /source/
-Disallow: /issues/
-Disallow: /search/
-Disallow: /project/
-Disallow: /nonav/
-Disallow: /testdir/
-Disallow: /servlets/ErrorPage
-Disallow: /servlets/NewsItemView
-Disallow: /servlets/ProjectDocumentList
-Disallow: /servlets/ProjectHome
-Disallow: /servlets/ProjectMailingListList
-Disallow: /servlets/ProjectMemberList
-Disallow: /servlets/ProjectNewsList
-Disallow: /servlets/Search
-Disallow: /servlets/SearchList
+Disallow: /
