http://bugzilla.spamassassin.org/show_bug.cgi?id=4200


[EMAIL PROTECTED] changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
   Target Milestone|Undefined                   |Future




------- Additional Comments From [EMAIL PROTECTED]  2005-05-22 22:06 -------
Would it be possible to enhance the URI blacklist systems to track this data? 
I'm thinking of something like: 

1) The blacklist system caches all URI queries, with each cache entry tracking
the number of requests and the timestamps of the oldest and newest requests.

2) The cache is purged on a regular basis of all entries whose newest request is
stale, i.e. the domain has not been queried recently.

3) The cache is dumped on a regular basis, collecting all entries whose oldest
request is significantly old (see the sketch after this list).

4) When a query comes in for a domain that matches one of those "old" entries
but is *not* on any blacklist and not on any whitelist (since domains like
Yahoo are whitelisted), it gets a specific response that SA is then able to
detect as an "old" (and not blacklisted) domain. That response can then drive
a negative score, with the weight set by the perceptron.
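
To make the bookkeeping in steps 1-3 concrete, here is a minimal Python
sketch, assuming a simple in-memory structure; a real blacklist server would
need persistent storage and distributed updates, and all the names and
thresholds here (UriCache, STALE_AFTER, OLD_AFTER) are invented for
illustration, not existing URIBL or SA code:

    import time

    STALE_AFTER = 90 * 86400   # purge entries not queried for ~90 days (assumed)
    OLD_AFTER = 180 * 86400    # "old" threshold: six months, per the proposal

    class UriCache:
        def __init__(self):
            # domain -> [request_count, oldest_request, newest_request]
            self.entries = {}

        def record(self, domain, now=None):
            # Step 1: log a URI query, tracking count plus oldest/newest times.
            now = now if now is not None else time.time()
            entry = self.entries.setdefault(domain, [0, now, now])
            entry[0] += 1
            entry[2] = now

        def purge_stale(self, now=None):
            # Step 2: drop entries whose newest request is stale.
            now = now if now is not None else time.time()
            self.entries = {d: e for d, e in self.entries.items()
                            if now - e[2] < STALE_AFTER}

        def dump_old(self, now=None):
            # Step 3: collect domains whose oldest request is significantly old.
            now = now if now is not None else time.time()
            return [d for d, e in self.entries.items()
                    if now - e[1] >= OLD_AFTER]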

Only domains that are referenced in URIs will be logged, and only those that
are not on whitelists or blacklists. The response will indicate that a) yes,
this is a domain seen at least occasionally in URIs, b) it's not whitelisted
the way Yahoo is, c) it's not blacklisted (or greylisted, if in URIBL), and d)
it's old enough not to be a quickly recycled domain ("old enough" might mean
six months, or perhaps a year; the actual age threshold is to be determined by
experiment). A sketch of that classification follows.
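
The lookup side of step 4 might then classify each query roughly as follows;
the 127.0.0.x values are hypothetical stand-ins (127.0.0.2 is the conventional
DNSBL "listed" answer, but the "aged" code would be whatever the list
operators choose), and old_domains would be the set produced by dump_old()
in the sketch above:

    NOT_LISTED = None              # NXDOMAIN-style "no answer": domain unknown
    BLACKLISTED = "127.0.0.2"      # conventional DNSBL "listed" answer
    AGED_RESPONSE = "127.0.0.128"  # hypothetical answer for "old but unlisted"

    def lookup(domain, whitelist, blacklist, old_domains):
        # Classify a queried domain per conditions a)-d) above.
        if domain in whitelist:
            return NOT_LISTED      # b) whitelisted (e.g. Yahoo): stay silent
        if domain in blacklist:
            return BLACKLISTED     # c) blacklisted: normal listing applies
        if domain in old_domains:
            return AGED_RESPONSE   # a) + d) seen in URIs and old enough; SA
                                   # detects this answer and scores negatively
        return NOT_LISTED          # unknown or too recently seen: no answer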

Would this be viable? 





