Dana Holland said:
> Chris Thielen wrote:
>
>> Dana, could you paste the output from "sa-learn --dump magic"?
>
> # sa-learn --dump magic
> 0.000          0          2          0  non-token data: bayes db version
> 0.000          0       2411          0  non-token data: nspam
> 0.000          0       6972          0  non-token data: nham
> 0.000          0    1146536          0  non-token data: ntokens
> 0.000          0 1077124857          0  non-token data: oldest atime
> 0.000          0 1077205613          0  non-token data: newest atime
> 0.000          0 1077205625          0  non-token data: last journal sync atime
> 0.000          0 1077205769          0  non-token data: last expiry atime
> 0.000          0      43200          0  non-token data: last expire atime delta
> 0.000          0     194438          0  non-token data: last expire reduction count
>

OK, scratch that line of thought. Your Bayes DB looks well trained and
has plenty of tokens.
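For anyone reading along: the "well trained" judgment comes from the nspam/nham counts in that dump. Here is a rough Python sketch (not part of SpamAssassin, purely illustrative) that parses the `--dump magic` lines and checks the counts against 200, the default `bayes_min_spam_num`/`bayes_min_ham_num` thresholds below which the Bayes classifier won't score at all:

```python
import re

# Sample lines in the format emitted by `sa-learn --dump magic`
# (values taken from the dump quoted above).
DUMP = """\
0.000          0          2          0  non-token data: bayes db version
0.000          0       2411          0  non-token data: nspam
0.000          0       6972          0  non-token data: nham
0.000          0    1146536          0  non-token data: ntokens
"""

def parse_magic(text):
    """Map each 'non-token data' label to its integer value."""
    stats = {}
    for line in text.splitlines():
        m = re.match(r"\S+\s+\S+\s+(\d+)\s+\S+\s+non-token data: (.+)", line)
        if m:
            stats[m.group(2).strip()] = int(m.group(1))
    return stats

stats = parse_magic(DUMP)
# Bayes scoring only kicks in once both counts reach the (default)
# minimums of 200 learned spam and 200 learned ham messages.
trained = stats["nspam"] >= 200 and stats["nham"] >= 200
print(stats["nspam"], stats["nham"], trained)
```

With the counts above (2411 spam, 6972 ham, over a million tokens), the DB is comfortably past those minimums, which is what Chris's reply is pointing out.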


--
Chris Thielen

Easily generate SpamAssassin rules to catch obfuscated spam phrases
(0BFU$C/\TED SPA/\/\ P|-|RA$ES):
http://www.sandgnat.com/cmos/
