https://issues.apache.org/SpamAssassin/show_bug.cgi?id=7013

--- Comment #10 from Kevin A. McGrail <[email protected]> ---
(In reply to Mark Martinec from comment #7)
> I'd be happier if this change were reverted and the definition
> of the BAYES_99 rule reverted to what we had before.
> 
> The new BAYES_999 (under whatever name) would then just match
> a range from 0.999 to 1.0 and just contribute some additional
> score points, complementing the BAYES_99.
> 
> It would fix/regain compatibility with the existing code base
> and non-disruptively add another scoring possibility.
> 
> Inventing a new tag could still be done later if needed.

+1 as well.

I'm running a make test now and will commit in a few.

Can one of you please post an update to dev/users announcing that BAYES_99 is
being reverted and that BAYES_999 will be scored as an overlapping, additive
rule?
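
For anyone reading along, here's a rough sketch of what the reverted and
overlapping definitions in 23_bayes look like (ranges per this discussion;
the describe text is from memory and may not match the commit exactly):

  body BAYES_99       eval:check_bayes('0.99', '1.00')
  describe BAYES_99   Bayes spam probability is 99 to 100%
  tflags BAYES_99     learn

  body BAYES_999      eval:check_bayes('0.999', '1.00')
  describe BAYES_999  Bayes spam probability is 99.9 to 100%
  tflags BAYES_999    learn

Since the ranges overlap, a message with a Bayes probability above 0.999 hits
both rules and gets both scores added.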

The BAYES_999 score will drop to 0.2 tomorrow, so people who have set high
scores on it might want to lower them.
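
Anyone who has raised it locally can hold it down with a one-line local.cf
override until the new default lands (the value here is just illustrative):

  score BAYES_999 0.2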

John, FYI, the scores are static.  The rules are in 23_bayes and the scores
in 50_scores.  The sandbox work I did is what started this mess ;-)
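
For reference, a 50_scores entry carries four values, one per score set
(no net/no bayes, net only, bayes only, bayes+net), so the committed lines
should look roughly like this (numbers are placeholders, not the generated
scores):

  score BAYES_99   0 0 3.5 3.5
  score BAYES_999  0 0 0.2 0.2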

Regards,
KAM
