You seem to be talking all around the real issue, which is that all "global" (not "individual") content filtering should be done prior to invoking any third-party apps, because that is where it is most appropriately done. Individual content filtering is user-specific, so it makes sense that this filtering be done after messages are returned to IMail, just prior to delivery.
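To make the ordering concrete, here is a minimal sketch of the kind of Postfix/amavisd-new wiring described below, assuming the stock amavisd-new ports (10024 out, 10025 back); the exact options are illustrative, not a drop-in config:

```
# main.cf -- messages that survive Postfix's own (global) restrictions
# are handed to the external content filter listening on 10024
content_filter = smtp-amavis:[127.0.0.1]:10024

# master.cf -- transport used to send mail out to amavisd-new
smtp-amavis unix  -  -  n  -  2  smtp
    -o smtp_data_done_timeout=1200
    -o disable_dns_lookups=yes

# master.cf -- re-injection listener: mail coming back from amavisd-new
# skips further global filtering and goes straight to delivery
127.0.0.1:10025 inet  n  -  n  -  -  smtpd
    -o content_filter=
    -o mynetworks=127.0.0.0/8
    -o smtpd_recipient_restrictions=permit_mynetworks,reject
```

The empty `-o content_filter=` on the return path is what enforces "no second pass of global filtering"; per-user filtering (e.g. Procmail) still runs at final delivery.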
Of all of the different mail servers we manage at our data centers, including Postfix and Sendmail, I can't imagine not doing all of the processing you can before handing off to a third-party app for additional processing. It just would not make sense to do part of the message spam processing, send it to some other app for processing, and then receive it back to do more spam processing. What a waste of time and resources. If at any point in the process you can reach final disposition before invoking other processes, then you should do so.

When we set up amavisd-new as a Postfix content filter, we first do all the spam processing we want to do with Postfix; if the message has not been disposed of there, we inject it into Amavis, where we invoke SpamAssassin, Razor, and DCC; and if the message is not disposed of somewhere along this process, we send it back to Postfix for final delivery--not for more spam processing.

Now that's not to say that more individual content filtering is not done after Postfix receives the message back. Users can set up additional content filtering with Procmail if they like. But again, this is individual content filtering, not global content filtering, and should be done as the last test prior to delivery.

Although I don't directly manage any of the Sendmail servers, I will check to see how they handle their content filtering as well. But from the looks of the headers, it appears to be very similar to how we run our Postfix servers on Linux.

Okay, so let's just agree to admit that it's a major design flaw in IMail v.8 and leave it at that--and hope that Ipswitch will fix it in a future update... ;-)

Bill

----- Original Message -----
From: "Sanford Whiteman" <[EMAIL PROTECTED]>
To: "Bill Landry" <[EMAIL PROTECTED]>
Sent: Friday, June 27, 2003 12:46 PM
Subject: Re[8]: [Declude.JunkMail] Test on Imail X-header

> > Scott has mentioned on this list many times in
> > the past the process order between IMail and
> > Declude...
> Yes, and you're not interpreting the order correctly.
>
> IMail has always performed all content-based filtering after submission, during the filtering/delivery stage once represented by SMTP32 and now succeeded by QM (your quip "in case you haven't noticed" unfortunately shows that you didn't even read one of my posts in which I described the performance pitfalls of adding YET ANOTHER system service, splitting queue management and content scanning into two services instead of the one unified executable that has been offered--once a process, now a service).
>
> I think the fundamental thing you're missing is that the SMTP daemon does not do content filtering, and never has. Envelope filtering "powered" by the KILL.LST is not content filtering, by definition. The content filtering, which used to be limited to the basic IMail rules engine but now has been beefed up to include a seeded list of spam phrases and patterns and statistical averaging as well, has always been done by the delivery process: again, the basic content filtering used to be done by SMTP32, now the enhanced content filtering is done by QM. This is, again, unsurprising.
>
> Your projections about the wisdom of putting content filtering into the SMTP daemon do not agree with best practices for Win32 system programming (nor do I know of *nix milters that operate at such a level). They may make sense to you, but they don't make sense on a system that could be easily socket- and process-starved by such a decision...if you'd worked on an enterprise WinSock application, you'd understand.
>
> And the resources that you're suggesting are "wasted" by spooling a file, then triggering another process to read and scan the file are only IPC resources, and the disk and CPU resources have been appropriately moved away from the primary, non-multi-threaded daemon to be managed by a self-scheduling secondary daemon.
> As I tried to explain two posts ago, to further expand this into a secondary content scanning daemon and a tertiary delivery daemon would mean that files *not* selected for deletion after content scanning would need to be spooled by SMTPD, read by CS, written by CS, read by QM, and written again by QM--suboptimal design that, if not mandated by third-party interoperability concerns (as clearly it was not), should be avoided.
>
> -Sandy
>
> --
> ------------------------------------
> Sanford Whiteman, Chief Technologist
> Broadleaf Systems, a division of
> Cypress Integrated Systems, Inc.
> mailto:[EMAIL PROTECTED]
> ------------------------------------
>
> ---
> [This E-mail was scanned for viruses by Declude Virus (http://www.declude.com)]
>
> ---
> This E-mail came from the Declude.JunkMail mailing list. To
> unsubscribe, just send an E-mail to [EMAIL PROTECTED], and
> type "unsubscribe Declude.JunkMail". The archives can be found
> at http://www.mail-archive.com.
> ---
