On 1/11/2008 11:25 AM, [EMAIL PROTECTED] wrote:
http://issues.apache.org/SpamAssassin/show_bug.cgi?id=5777
------- Additional Comments From [EMAIL PROTECTED] 2008-01-11 02:25 -------
(In reply to comment #19)
on this point -- does it make a difference to the querier's DNS cache
whether the domain or subdomain exists or not? In other words, a spammer
can equally launch this attack by filling a spam with foo1.com, foo2.com, ...
foo99999.com. They don't even have to register those domains -- the effect
on the DNSBL's server will be the same.
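
A minimal sketch of that point, assuming a toy resolver cache (all names
and counts below are made up): whether a queried name is listed or
nonexistent, each unique name costs the DNSBL exactly one authoritative
query, since negative answers are cached per-name just like positive ones
(RFC 2308).

    # Toy model of a resolver sitting in front of a DNSBL's
    # authoritative server. All names and numbers are illustrative.
    listed = {"spamsite.example.com"}   # hypothetical zone contents

    cache = {}                  # name -> cached answer
    authoritative_queries = 0

    def lookup(name):
        global authoritative_queries
        if name in cache:       # cache hit: no load on the DNSBL
            return cache[name]
        authoritative_queries += 1  # miss: one query hits the server
        answer = "listed" if name in listed else "nxdomain"
        cache[name] = answer    # negative answers are cached too
        return answer

    # A spam run stuffed with 99,999 unique, unregistered domains:
    for i in range(1, 100000):
        lookup("foo%d.com" % i)

    print(authoritative_queries)  # 99999 -- one per unique name,
                                  # exactly as if they all existed
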
The problem is multidimensional:
1. Spammers do use their botnets to automatically set up free host sites. The
number of such sites that they can set up is many times larger than the size of
their botnets, assuming each bot can set up more than one site, which it
certainly can. Trying to store and serve those alone may break the nameservers.
This bears repeating: the number of subdomain sites is easily larger than the
size of the botnets themselves, possibly by multiple orders of magnitude.
Botnets are large, and so are the resources of large freehosts. Sender-IP
blacklists are already orders of magnitude larger than current URIBLs; a URIBL
that attempts to capture every abused freehost subdomain is potentially much
larger than sender-IP blacklists, i.e., larger than the botnets by several
orders of magnitude (rough numbers are sketched below). I hope that's
sufficiently clear.
So the googlepages subdomain used in one spam run, for at most 15 minutes,
is cached differently than blahasdasdasd.info, created by some obscure
script, hosted on some domaincontrol.com NS, and also used for one
15-minute spam run?
Seems you haven't noticed that dedicated-domain usage has decreased.
For a spammer it's far more efficient to use googlepages and redirect to a
static site, which lasts longer and will probably go unnoticed, than to
set up vhosts every 30 minutes.
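
To put rough, purely hypothetical numbers on point 1 (the botnet size and
the per-bot figure below are placeholders, not measurements):

    # Back-of-the-envelope arithmetic for point 1; every figure here
    # is a hypothetical placeholder, not a measurement.
    bots = 100000               # assumed botnet size
    sites_per_bot = 100         # assumed freehost signups per bot

    subdomain_entries = bots * sites_per_bot
    print(subdomain_entries)    # 10,000,000 potential list entries

    # If a large sender-IP blacklist carries on the order of a few
    # million entries, a URIBL chasing every abused freehost
    # subdomain could plausibly dwarf it.
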
2. Caching of potentially many millions of different subdomains may be
ineffective. There's little reason to re-use the same subdomain in each new
spam when it's trivial to set up another subdomain site. Making caching less
effective is another possible attack against the nameservers.
Exactly, so whether it's .info or a googlepages subdomain, the result is
the same.
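
A quick sketch of the caching claim in point 2, with made-up campaign
sizes: single-use subdomains drive the cache hit rate to zero no matter
what the TTL is, while a reused domain is answered from cache almost
every time.

    # Cache hit rate for two hypothetical campaigns of equal size.
    def hit_rate(names):
        seen = set()
        hits = 0
        for name in names:
            if name in seen:
                hits += 1       # answer served from cache
            seen.add(name)
        return float(hits) / len(names)

    spams = 100000

    # Campaign A: every spam points at the same reused domain.
    reused = ["static-site.example.com"] * spams
    # Campaign B: every spam gets a freshly created subdomain.
    throwaway = ["user%d.freehost.example.com" % i for i in range(spams)]

    print(hit_rate(reused))     # ~1.0: almost every lookup is a hit
    print(hit_rate(throwaway))  # 0.0: every lookup reaches the DNSBL
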
3. This puts the size of the blacklist under the control of how efficiently
botnets can set up new free-hosting sites. Unfortunately that process is
probably very efficient. It effectively cedes control of the blacklist to
the botnet operators.
If the list can't expire stale entries and carries a 4-year burden of
stale data, then it's obvious that zone size will become a problem.
You'd have the same problem whether it's .info, .org, or googlepages
subdomains.
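
One possible shape of an expiry policy, sketched under the assumption that
the list tracks a last-seen time per name (this is hypothetical, not how
any particular URIBL actually works):

    import time

    MAX_AGE = 30 * 86400        # assumed retention window: 30 days

    last_seen = {}              # name -> unix time of last sighting in spam

    def record_sighting(name):
        last_seen[name] = time.time()

    def expire_stale():
        # Drop anything not seen within the retention window, so the
        # zone tracks current abuse instead of years of history.
        cutoff = time.time() - MAX_AGE
        for name in [n for n, t in last_seen.items() if t < cutoff]:
            del last_seen[name]

    record_sighting("user123.freehost.example.com")
    expire_stale()
    print(len(last_seen))       # entries still considered active
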