https://bz.apache.org/SpamAssassin/show_bug.cgi?id=7797

Noah Meyerhans <[email protected]> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |[email protected]

--- Comment #4 from Noah Meyerhans <[email protected]> ---
(In reply to Bill Cole from comment #3)

> Beyond that problem of there being little spam to catch above 500k and SA
> being bad at catching the big spam which does exist, there is a risk in SA's
> rule system of the "catastrophic backtracking" problem inherent in all
> regular expression matching systems. We believe that we have eliminated
> risky rules in the default ruleset but users can still create risky rules
> themselves, and limiting the scan size provides some protection.
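The backtracking risk Bill describes is easy to demonstrate. A minimal sketch (in Python rather than an actual SA rule, and using Python's backtracking `re` engine as a stand-in for the Perl-style matching SA relies on): nested quantifiers like `(a+)+` blow up exponentially on a failing subject, which is exactly why scan size acts as a damage limiter for user-written rules.

```python
import re
import time

# Nested quantifiers such as (a+)+ are the classic "catastrophic
# backtracking" shape: on a subject that cannot match, the engine must
# try exponentially many ways to partition the run of 'a's before it
# can report failure.
risky = re.compile(r'^(a+)+$')

for n in (14, 16, 18, 20):
    subject = 'a' * n + 'b'  # the trailing 'b' guarantees the match fails
    start = time.perf_counter()
    assert risky.match(subject) is None
    # Runtime grows exponentially in n; larger scanned input means a
    # proportionally larger window for this kind of blow-up.
    print(f'n={n}: {time.perf_counter() - start:.4f}s')
```

The same pattern matches a run of 'a's instantly; only the failure path explodes, which is why such rules can look fine in casual testing.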

In the corpora that were examined when making the assertion that
relatively little spam is >500k in size, were attachments accounted for?
I've seen quite a few spam messages recently in the 1-2 MB range that include
(sometimes zipped) PDF attachments.  I'd expect attached Word/Excel docs to
reach similar sizes. Plugins like OLEVBMacro and PDFInfo can be useful against
this sort of spam, but only if they actually see the message.

> Because this is a limit that users (or distributions) are entirely free to
> adjust to their own tastes and because the rationales for it ARE NOT
> obsoleted by technological advancement, I do not believe that we should
> change the default limit.

You're right that the rationale hasn't changed, but I believe the appropriate
response to that rationale has. The economics of sending large spam have
changed, as have the economics of filtering it.  So the request isn't to
reconsider the rationale, but to reconsider how we apply it.
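For reference, the limit being discussed is enforced client-side. A hedged fragment, assuming the stock spamc client (whose -s/--max-size flag caps the message size sent to spamd, defaulting to 500 KB); the filename is illustrative:

```shell
# spamc passes messages larger than its max-size through unscanned.
# Raising the ceiling to 2 MB so attachment-heavy spam is still scanned:
spamc --max-size=2097152 < message.eml
```

Sites using other glue (amavisd, MIMEDefang, etc.) have their own equivalent knobs, so the default here mostly matters for out-of-the-box behavior.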

-- 
You are receiving this mail because:
You are the assignee for the bug.
