On 2011/03/01 2:38 PM, [email protected] wrote:
https://issues.apache.org/SpamAssassin/show_bug.cgi?id=6534


Replying to list as I doubt this needs to be attached to the bug...

http://ruleqa.spamassassin.org/20110226-r1074804-n/T_RCVD_IN_UCEPROTECT_L1/detail
It has very high overlaps with MSPIKE_BL at 80%, PSBL at 73% and HOSTKARMA_BL
at 89%.

Should overlaps with BLs that aren't included in the published rules really count against it? That could certainly explain why I'm seeing better success over here. Also, what about tests for L2 and L3? Are those included somewhere?
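For what it's worth, my understanding of the overlap numbers on the detail page is something like the sketch below: for each pair of rules, the fraction of one rule's hits that also hit the other. The message IDs here are purely made up for illustration, not real corpus data.

```python
def overlap(hits_a, hits_b):
    """Fraction of rule A's hits that also hit rule B (my reading of ruleqa's overlap %)."""
    a, b = set(hits_a), set(hits_b)
    if not a:
        return 0.0
    return len(a & b) / len(a)

# Toy message IDs, purely illustrative.
ucep   = {"m1", "m2", "m3", "m4", "m5"}
mspike = {"m1", "m2", "m3", "m4", "m9"}
print(overlap(ucep, mspike))  # 4 of 5 shared -> 0.8
```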

Yet despite its similarities with those blacklists with high safety ratings,
2% of our ham corpus from the past week hit UCEPROTECT_L1, and that figure has
been pretty consistent for the previous four weeks.


I haven't spent much time over the years reading ruleqa results, but is it normal for a rule's S/O to bounce all over the place from week to week, or is that an indication of a problem in the testing methodology? Also, is it safe to assume that stats from grayed-out contributors are excluded from the overall stats?
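Just so we're talking about the same number: I'm assuming S/O is computed as spam hits over total hits, roughly like this (the hit counts below are invented for the example):

```python
def s_over_o(spam_hits, ham_hits):
    """S/O: fraction of a rule's total hits that landed in the spam corpus."""
    total = spam_hits + ham_hits
    return spam_hits / total if total else 0.0

# Invented counts: a rule hitting 980 spam and 20 ham messages.
print(s_over_o(980, 20))  # -> 0.98
```

If that's right, week-to-week swings would just reflect shifts in the relative hit counts across the contributed corpora.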

Every tracked ham hit on my end has been traced to a message from constantcontact.com servers. I can certainly see how they would end up on a blacklist, but they also skew the ham results. Is there any way to check whether that's the case for the corpus, too?
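If anyone with corpus access wants to spot-check this, something along these lines would do it: scan the Received headers of the ham messages that hit the rule for constantcontact.com. The sample message here is fabricated; point it at real messages instead.

```python
import email

def via_constantcontact(raw_message):
    """True if any Received header mentions constantcontact.com."""
    msg = email.message_from_string(raw_message)
    return any("constantcontact.com" in h.lower()
               for h in (msg.get_all("Received") or []))

# Fabricated sample message for illustration only.
sample = (
    "Received: from ccm01.constantcontact.com (ccm01.constantcontact.com)\n"
    "Subject: newsletter\n"
    "\n"
    "body\n"
)
print(via_constantcontact(sample))  # -> True
```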

--
/Jason
