So, the first step of the rules project -- the sandbox idea, and the build scripts that "compile" code from the sandbox into the "rules" directory -- is pretty much done, as http://wiki.apache.org/spamassassin/RulesProjectPlan notes.
If you're running an existing auto-mass-checker or buildbot slave, you need to ensure it has the svn credentials set up to check out the current source tree. (See below for details.)

We may still be making some changes in the next few weeks -- e.g. there's an issue that may require another set of "source" rules files -- but the fundamentals are operational. This gives us the ability to keep rules in a "source tree", where they can preserve their revision history, be kept alongside as-yet-unpromoted development rules, and break without breaking the entire "built" ruleset.

The next step is to get the mass-checking systems up to scratch. There are a few improvements to make to those, and the new buildbot to add. The task list is at http://wiki.apache.org/spamassassin/RulesProjectPlan , if you're curious...

PS: HOW TO FIX YOUR AUTOMATED CHECKOUT

As promised above -- details on fixing the svn credentials so your automated scripts can get the latest code:

- su to the user the automated code is running as;
- cd to the SVN checkout directory it uses;
- type "svn up";
- hit "p" (to accept the certificate permanently) when it asks whether you want to accept the svn server's certificate.

That should do it. --j.
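The steps above can be sketched as a short interactive shell session. Note this is a sketch only: the username ("masscheck") and the checkout path are assumptions -- substitute whatever user and directory your own automation actually uses.

```shell
# Restore svn credentials for an automated checkout.
# "masscheck" and the checkout path below are assumptions;
# use your own automation user and its checkout directory.

su - masscheck            # become the user the automated code runs as
cd ~/spamassassin/trunk   # cd to the SVN checkout directory it uses
svn up                    # when svn asks about the server's certificate,
                          # answer "p" to accept it permanently
```

After accepting the certificate once with "p", svn caches it for that user, so subsequent unattended "svn up" runs from cron or buildbot will proceed without prompting.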
