Regarding false positives, I'll argue that the SURBL lists have fewer false positives than Sniffer.
 
I'm testing the invURIBL external test against the individual lists rather than multi.surbl.org.
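For anyone unfamiliar with how these external tests work underneath: checking a domain against a SURBL zone is just a DNS A-record lookup of the domain prepended to the zone name, where any answer means "listed" and NXDOMAIN means "not listed". A minimal sketch (the helper names and the zone list here are my own illustration, not invURIBL's code):

```python
import socket

# The SURBL zones tested individually above (instead of multi.surbl.org).
ZONES = ["sc.surbl.org", "ws.surbl.org", "ob.surbl.org", "ab.surbl.org"]

def surbl_query_name(domain: str, zone: str) -> str:
    """Build the DNS name to look up: <domain>.<zone>."""
    return f"{domain.strip('.').lower()}.{zone}"

def is_listed(domain: str, zone: str) -> bool:
    """Listed if the query name resolves to an A record (typically in
    127.0.0.0/8); a resolution failure means the domain is not listed."""
    try:
        socket.gethostbyname(surbl_query_name(domain, zone))
        return True
    except socket.gaierror:
        return False

# Usage: any(is_listed("example.com", z) for z in ZONES)
```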
 
In regards to False Positives,
From Jan 1-8, here's a breakdown of the e-mails that hit these lists but fell within my ham or tag weights.
 
ab.surbl.org   (1)
                    1  on an email from the Imail list. This could be a bad domain referenced in the email or a false positive.
 
jw list referenced locally  (1)
                   1 fp
 
sc.surbl.org  (2)
                   2 false hams. OK, why don't I just call those spam!
 
ob.surbl.org  (8)
                    3 false hams. OK, spam.
                    2 fp
                    2 questionable
 
ws.surbl.org  (18):
                      10 fp
                      7 grey items
                      1 Imail list hit.
 
Sniffer over the same period has 227 hits.
Many were grey mail (Walmart, Travelzoo, yournewsletters, tep1, Birthday Express) and quite a few were false positives (Roving, ExactTarget, my web provider).
 
To take another comparison,
On the whole, I would consider Spamhaus's SBL my most reliable IP4R test, and it is my highest-weighted one. Yet SBL had 37 false positives over the Jan 1-8 time frame, more than all of the surbl.org tests combined.
 
----- Original Message -----
From: Matt
Sent: Sunday, January 09, 2005 9:23 AM
Subject: Re: [Declude.JunkMail] Sniffer vs. SURBL

Andy,

I'm not sure how you are seeing the results that you are seeing.  Sniffer tags from 95% to 97.5% of spam on any given day on my system with a good portion of what gets through being either fresh spam sources, niche spam or backscatter.  Unless there was something wrong, it is impossible for invURIBL to be tagging 11% more than Sniffer, even if invURIBL tagged 100% of it.

As someone else pointed out, I would be cautious about how you weight the combined SURBL zone, as it clearly has false positives and will exacerbate any issues that you might already be having with blocking legitimate commercial E-mail.  These resources are great for catching spam, but unfortunately many people out there will submit almost anything that is commercial, and their choices end up applying to everyone who uses the zone.

Another thing to consider is that Sniffer already cross-checks with SURBL as a way of helping to verify the payload URL strings.  I have found and reported several false positives to Sniffer that were tagged in this manner, and SURBL appears to be much safer than invURIBL as a whole.  I consider the double hit with SURBL to be fairly safe because the zone is time-limited and there is a delay in Sniffer adding new rulebases, so I rarely get double hits on false positives.  Things would be significantly different, however, with a longer-lived zone.
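The double-hit weighting being discussed can be sketched as a simple scoring rule, where agreement between two independent tests earns extra confidence. The weights below are made-up placeholders for illustration, not Declude's, Sniffer's, or anyone's actual settings:

```python
# Hypothetical weights; real values are site- and test-specific.
SNIFFER_WEIGHT = 5
SURBL_WEIGHT = 5
DOUBLE_HIT_BONUS = 3  # extra confidence when both tests agree

def spam_score(sniffer_hit: bool, surbl_hit: bool) -> int:
    """Combine two test results; a double hit scores more than either alone."""
    score = 0
    if sniffer_hit:
        score += SNIFFER_WEIGHT
    if surbl_hit:
        score += SURBL_WEIGHT
    if sniffer_hit and surbl_hit:
        score += DOUBLE_HIT_BONUS
    return score
```

(Note that Andy's configuration below deliberately sets no such bonus, i.e. DOUBLE_HIT_BONUS = 0, so the two tests are purely complementary.)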

Matt



Andy Schmidt wrote:
Hi,
 
Today I finally took the time (I didn't have) and ran both Sniffer and SURBL Tests (using http://www.invariantsystems.com/invURIBL/).
 
Result:
 
  1,860  tagged by invURIBL only -> gain over Sniffer = 21%
  8,926  tagged by BOTH invURIBL AND Sniffer
     962  tagged by Sniffer only -> gain over invURIBL = 11%
 
In other words:
 
If I ran ONLY Sniffer, I would have missed the additional 21% of messages that were detected by checking against SURBL.
If I ran ONLY SURBL, I would have missed the 11% of messages that only Sniffer found.
I have configured Declude so that the two tests are complementary (no extra weight when BOTH tests fail vs. only ONE.)
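For what it's worth, the gain percentages above work out if each tool's exclusive catch is divided by the 8,926 messages both tools tagged; that is my reading of the arithmetic, not a formula stated anywhere:

```python
inv_only = 1860      # tagged by invURIBL only
both = 8926          # tagged by both invURIBL and Sniffer
sniffer_only = 962   # tagged by Sniffer only

gain_over_sniffer = round(inv_only / both * 100)   # -> 21 (%)
gain_over_inv = round(sniffer_only / both * 100)   # -> 11 (%)
```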
 
My conclusion:
 
Both Sniffer and invURIBL are worth their money...
 
 
PS: here are the "raw" numbers:
 
DLAnalyzer(4.0.5 - 12/21/2004) Report Generated At 1/9/2005 12:48:14 AM For Argos.net
Breakdown Of Messages That Failed: INV-URIBL
Messages That Matched: 10,786
TEST             # FAILED   Percentage
IPNOTINMX..........10,372.......96.16%
SNIFFER.............8,926.......82.76%
NOLEGITCONTENT......8,673.......80.41%
SPAMCOP.............4,983.......46.20%
SORBS...............4,521.......41.92%
XBL-DYNA............4,470.......41.44%
 
Breakdown Of Messages That Failed: SNIFFER
Messages That Matched: 9,888
TEST             # FAILED   Percentage
IPNOTINMX...........9,611.......97.20%
INV-URIBL...........8,926.......90.27%
NOLEGITCONTENT......8,788.......88.88%
SPAMCOP.............5,208.......52.67%
XBL-DYNA............4,672.......47.25%
SORBS...............4,664.......47.17%

Best Regards
Andy Schmidt

Phone:  +1 201 934-3414 x20 (Business)
Fax:    +1 201 934-9206

 

-- 
=====================================================
MailPure custom filters for Declude JunkMail Pro.
http://www.mailpure.com/software/
=====================================================
