On 2010-08-26 16:11, Grant Peel wrote:
Hi all,

I have several hundred domains on a box. Each domain's mail is controlled by a specific UNIX user.

Inside every user's directory, they have a user_prefs file.

While I have use_bayes 0 in the main config, some users have opted to turn on Bayes in their user_prefs.

This morning I noticed that one particular user's ~/.spamassassin/bayes* files had grown to 1.5 GB.

I have put:

use_bayes 0
bayes_auto_learn        0
bayes_auto_expire       1
bayes_expiry_max_db_size 50000

in the local.cf file, and restarted spamd.

The database did not appear to trim, so I tried:

sa-learn -u "user" -D --force-expire

and the database is still 1.5 GB.

I know I am doing something(s) incorrect, but can't figure out what.

How do I properly trim the offending file(s)?

Is there a command to trim all databases (users) on the box?

Any advice would be appreciated.

I bet the biggest is bayes_seen.
You can safely delete the bayes_seen file (unless you plan to "unlearn" msgs). It will start growing again, fast.
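If you want to clear it for every user in one go, something along these lines should do it (just a sketch; the /home/<user>/.spamassassin layout is an assumption, adjust to wherever your per-domain users keep their files):

# remove every user's bayes_seen (only if you never plan to "unlearn" mail)
find /home -maxdepth 3 -type f -path '*/.spamassassin/bayes_seen' -delete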

the bayes_toks file is the one which gets trimmed by expiration.
bayes_seen is what I call a parasite :-)
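You can check how big a user's token database actually is (token count, last-expiry time and so on) before and after an expiry run with:

sa-learn -u "user" --dump magic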

On a busy box, to avoid freezes I'd recommend setting
bayes_auto_expire       0

and do a cron'd force-expire during low traffic hours, either daily or weekly, depending on the bayes_toks size.
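A rough sketch of such a cron'd script, assuming the per-user databases live under /home/<user>/.spamassassin (paths and schedule are examples, adjust to your setup):

#!/bin/sh
# bayes-expire.sh -- run from cron during low traffic hours, e.g.
#   30 3 * * 0 root /usr/local/sbin/bayes-expire.sh
for dir in /home/*/.spamassassin; do
    [ -f "$dir/bayes_toks" ] || continue   # skip users who never enabled Bayes
    user=$(basename "$(dirname "$dir")")
    sa-learn -u "$user" --force-expire
done

With bayes_auto_expire set to 0, a force-expire like this is the only thing that will trim bayes_toks, so make sure the job actually runs.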




hth
