On 12/27/2010 11:46 AM, Jack L. Stone wrote:
At 02:26 PM 12.27.2010 -0500, David F. Skoll wrote:
On Mon, 27 Dec 2010 11:16:23 -0800
Ted Mittelstaedt <t...@ipinc.net> wrote:

Larry Wall never envisioned the octopus monstrosity that Perl has
become.

Um.

Just because you can write overly-complex slow Perl code doesn't mean that
all Perl code is necessarily overly-complex or slow.

Not that I am unhappy with the existence of SA, but anyone who uses it
must understand that an enormous amount of CPU power is wasted on SA
merely due to the inefficiency of its being written in Perl.

While Perl is part of the problem, a lot of the problem is SA itself
and some of it is simply the nature of content-based anti-spam
techniques... slinging around regexes, normalizing HTML, extracting
URLs sanely, extracting Bayes tokens, etc. is going to be slow no
matter how you do it.

Regards,

David.
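
To put a rough shape on the per-message work described above, here is a
minimal sketch of Bayes-style token extraction in Perl. It is not
SpamAssassin's actual tokenizer (the real one is far more elaborate);
the token rules here are simplified assumptions for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Simplified Bayes-style tokenizer: split the body into word-ish
    # tokens, fold case, and drop tokens that are too short or too long.
    # Even this cut-down version has to walk every byte of the message.
    sub extract_tokens {
        my ($body) = @_;
        my %count;
        for my $tok ($body =~ /([A-Za-z0-9'.-]+)/g) {
            $tok = lc $tok;
            next if length($tok) < 3 || length($tok) > 15;
            $count{$tok}++;
        }
        return \%count;
    }

    my $tokens = extract_tokens("Cheap pi11s!! Click http://example.com now");
    print "$_ => $tokens->{$_}\n" for sort keys %$tokens;

Multiply this by HTML normalization, URL extraction, and a few hundred
regex rules per message, and the cost is there no matter what language
it is written in.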


In my case a very small percentage of mail actually reaches SA because of
several filters in front of it. Sendmail, Regex-milter, Greylist-milter,
and other milters catch most of the truly bad stuff and then finally
hand off what remains to SA. Thus, my server load is not so bad now. It
used to be heavy indeed before I added the front filters.
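
For what it's worth, that ordering is just declaration order in
sendmail's .mc file: milters run in the order listed, so the cheap
filters sit in front and SA (via a milter) comes last. A sketch along
these lines, where the socket paths and timeouts are placeholder
assumptions and the names follow the common milter-regex,
milter-greylist, and spamass-milter packages:

    dnl Milters run in declaration order: cheap checks first, SA last.
    INPUT_MAIL_FILTER(`milter-regex', `S=unix:/var/run/milter-regex.sock, T=S:30s;R:2m')
    INPUT_MAIL_FILTER(`milter-greylist', `S=unix:/var/run/milter-greylist.sock, T=S:30s;R:2m')
    INPUT_MAIL_FILTER(`spamass-milter', `S=unix:/var/run/spamass-milter.sock, T=S:30s;R:4m')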


We also run ClamAV. Yes, I know most virus emitters are going to be
blacklisted and SA would catch them anyway, but this gives us some
visibility into how much of the incoming spam is actually viruses.

Greylisting, though, is by far the best. But I have noticed an
increasing number of sites out there - and these are large sites - that
apparently are honked off that people greylist, and they will bounce
delivery of mail that receives a 4xx error, in violation of the
standard. Off the top of my head, I seem to remember seeing this from
several airline company mailers that send out advertisements to their
frequent-flyer members and electronic ticketing receipts. Jerks!

Ted
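
For reference, the mechanics behind the complaint: a greylister answers
the first delivery attempt from an unknown (client IP, sender,
recipient) triplet with a 4xx temporary failure, and the standard (RFC
5321) expects the sending MTA to queue the message and retry, not
bounce it. A bare-bones sketch of that decision in Perl, in-memory only
and with an assumed 5-minute window (real greylist milters persist the
triplet database):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Bare-bones greylisting: temp-fail (451) the first attempt from an
    # unseen (ip, sender, recipient) triplet; accept retries that arrive
    # after the delay window has passed.
    my %first_seen;
    my $delay = 300;    # seconds a sender must wait before retrying

    sub greylist {
        my ($ip, $from, $to) = @_;
        my $key   = join '/', $ip, $from, $to;
        my $first = $first_seen{$key} //= time;
        return (time - $first >= $delay)
            ? '250 OK'
            : '451 4.7.1 Greylisted, please retry later';
    }

    print greylist('192.0.2.1', 'ads@example.com', 'me@example.net'), "\n";

A well-behaved MTA sees the 451, queues the message, and delivers on a
later attempt; the mailers above treat it as a hard failure and bounce
instead.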
