[ http://jira.xwiki.org/jira/browse/XWIKI-343?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_22785 ]
Sergiu Dumitriu commented on XWIKI-343:
---------------------------------------
I think we do need a default robots.txt file; otherwise an open wiki will
have its pages deleted one day, when Google decides to follow all the
Delete links...
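For illustration, a minimal default along those lines might look like the
following (the /xwiki context root and the exact action paths are
assumptions for the sketch, not a shipped default):

    User-agent: *
    Disallow: /xwiki/bin/delete/
    Disallow: /xwiki/bin/edit/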
> Default robots.txt paths in web build
> -------------------------------------
>
> Key: XWIKI-343
> URL: http://jira.xwiki.org/jira/browse/XWIKI-343
> Project: XWiki Core
> Issue Type: Improvement
> Components: Admin, Build, Infrastructure and Tests, Packaging
> Affects Versions: 1.0 B1
> Environment: all
> Reporter: Brian M. Thomas
> Priority: Minor
> Fix For: Future
>
>
> Because a web application is the best judge of which of its paths are
> useful and appropriate for indexing by web crawlers, provision should be
> made for the application's deployment process to set paths, starting at
> its context root, in the server's robots.txt file.
> There is, at present, no known standard for doing this; we will look for
> one, and create one if none exists already.
> As an initial guess, this would probably be just a list of paths to
> disallow (relative to the context root), read by the deployment process
> and used to replace the existing Disallow lines in robots.txt that start
> with the application's deployed context root, formatted appropriately. A
> sed script could do this on Unix machines, as sketched below.
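
For illustration, a minimal shell/sed sketch of that replacement step (the
file locations, the /xwiki context root, and the disallow-paths list name
are all assumptions, not an agreed format):

    #!/bin/sh
    # Sketch only; assumes robots.txt ends inside a "User-agent: *"
    # record, so the appended Disallow lines still apply to it.
    CONTEXT_ROOT=/xwiki            # deployed context root (assumed)
    ROBOTS=/var/www/robots.txt     # server's robots.txt (assumed location)
    PATHS=disallow-paths.txt       # one path per line, relative to the
                                   # context root, e.g. /bin/delete/

    # Drop the existing Disallow lines for this context root...
    sed "\#^Disallow: ${CONTEXT_ROOT}/#d" "$ROBOTS" > "$ROBOTS.new"

    # ...then append one Disallow line per path from the application's list.
    while read -r p; do
      printf 'Disallow: %s%s\n' "$CONTEXT_ROOT" "$p"
    done < "$PATHS" >> "$ROBOTS.new"

    mv "$ROBOTS.new" "$ROBOTS"

Writing to a temporary file and renaming it into place keeps the update
atomic, so crawlers never see a half-rewritten robots.txt.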