Here's an idea that might be a little complicated to describe: Every time a [potential] user interacts with the documentation, there is human attention, intelligence, and effort engaging with the system. What if the system were designed to harness some of that?
If a documentation framework included an interface that let users provide feedback in a variety of ways (e.g., annotations, comments, ratings), that feedback could be useful information. If a user is known, their feedback could be qualified or weighted based on [expectations generated from] their history.

If that analysis were automated and tied into the ticket system, developers and maintainers would only need to address fairly abstract alerts. Developer interaction with the ticket system could then feed back into the analysis system, but I suspect I am probably way off the reservation at this point. I also suspect some kind of rules engine would need to be integrated, either embedded or as an extension.
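Just to make the weighting idea concrete, here is a minimal sketch in Python. It assumes a hypothetical feedback store and ticket API; names like Feedback, user_weight, analyze, and open_ticket are illustrative only and are not part of Fossil or any existing documentation framework.

    # Sketch: weight each feedback item by the submitting user's history,
    # aggregate per documentation page, and flag pages that need attention.
    from dataclasses import dataclass
    from collections import defaultdict

    @dataclass
    class Feedback:
        user: str        # known user, or "" for anonymous
        page: str        # documentation page the feedback targets
        rating: float    # -1.0 (problem) .. +1.0 (praise)
        comment: str

    def user_weight(history):
        """Weight a user's feedback by how often it was previously accepted."""
        accepted = history.get("accepted", 0)
        total = history.get("total", 0)
        if total == 0:
            return 0.5                     # unknown or new user: neutral weight
        return 0.25 + 0.75 * (accepted / total)

    def analyze(feedback_items, histories, threshold=-2.0):
        """Aggregate weighted ratings per page; return pages below threshold."""
        score = defaultdict(float)
        for fb in feedback_items:
            score[fb.page] += fb.rating * user_weight(histories.get(fb.user, {}))
        return [page for page, s in score.items() if s <= threshold]

    # Each flagged page would become one abstract alert/ticket for maintainers,
    # e.g. open_ticket("Doc feedback: review " + page)  -- hypothetical call.

The point is that maintainers would never see individual comments unless they chose to drill down; they would only see the per-page alerts, and closing or rejecting an alert could update the contributing users' histories for the next round of weighting.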

