Kelson Vibber wrote:
At 03:59 PM 4/22/2004, Tom Allison wrote:

You would have to capture not only the IP, but all the domains associated with that IP. Unfortunately, the load against the whois/DNS database servers might be a bit more than we can tolerate.


As has been stated elsewhere, whois does not scale well enough for any sort of automated tool.

The way SURBL works, it does not matter whether the URLs submitted contain IP addresses or domain names - whatever is actually *in the URL* goes into the queue that eventually ends up on the list. I believe someone mentioned that the SpamCop reporting facility (which is where SURBL gets its data) already handles some redirects. They also whitelist frequently-referenced legit domains.

So if spam contains <a href="http://1.2.3.4/evilphisher.php">http://www.amazon.com/</a>, it will be 1.2.3.4 that gets submitted and ends up in SURBL.
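For what it's worth, here is a minimal Python sketch of that lookup side, just to make the point concrete: it pulls the host out of each href target (the visible anchor text is ignored entirely) and builds the DNS name a SURBL-style check would query. The zone name multi.surbl.org, the helper names, and the octet-reversal rule for IP hosts are assumptions for illustration, not a description of SURBL's or SpamCop's actual code.

# Illustrative sketch only -- extract href targets from a message body and
# form SURBL-style DNS query names.  Zone name and IP-reversal convention
# are assumptions for this example.
import re
import socket
from urllib.parse import urlparse

HREF_RE = re.compile(r'href\s*=\s*"([^"]+)"', re.IGNORECASE)

def hosts_from_hrefs(body):
    """Yield the host of each href target; the visible link text never matters."""
    for url in HREF_RE.findall(body):
        host = urlparse(url).hostname
        if host:
            yield host

def dnsbl_query_name(host, zone="multi.surbl.org"):
    """Build the name to look up; dotted-quad hosts are reversed, DNSBL-style."""
    octets = host.split(".")
    if len(octets) == 4 and all(o.isdigit() for o in octets):
        host = ".".join(reversed(octets))
    return "%s.%s" % (host, zone)

def is_listed(host, zone="multi.surbl.org"):
    """If the name resolves inside the zone it is listed; NXDOMAIN means clean."""
    try:
        socket.gethostbyname(dnsbl_query_name(host, zone))
        return True
    except socket.gaierror:
        return False

spam = '<a href="http://1.2.3.4/evilphisher.php">http://www.amazon.com/</a>'
for h in hosts_from_hrefs(spam):
    print(h, "->", dnsbl_query_name(h))   # 1.2.3.4 -> 4.3.2.1.multi.surbl.org

So only 1.2.3.4 is ever looked at; www.amazon.com never leaves the anchor text.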


What if... the SURBL server were the only whois client? Would it still be too much?

DNS caching would become extremely important.


DNS caching is useful for any DNSBL. I don't see what makes SURBL different in that respect.
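Agreed. Just to show the shape of it, here is a toy per-process cache wrapped around that kind of lookup (class and parameter names invented for this sketch; in practice a local caching nameserver does the same job and honors the records' real TTLs):

# Toy TTL cache in front of DNSBL lookups, so repeated checks of the same
# host don't each cost a DNS round trip.  A fixed TTL is a simplification;
# a real resolver would honor the TTL on the record itself.
import socket
import time

class CachedDnsblLookup:
    def __init__(self, zone="multi.surbl.org", ttl=300.0):
        self.zone = zone
        self.ttl = ttl                # seconds to trust a cached answer
        self._cache = {}              # host -> (listed, expiry)

    def is_listed(self, host):
        now = time.monotonic()
        hit = self._cache.get(host)
        if hit and hit[1] > now:      # still fresh, no DNS query needed
            return hit[0]
        try:
            socket.gethostbyname("%s.%s" % (host, self.zone))
            listed = True
        except socket.gaierror:
            listed = False
        self._cache[host] = (listed, now + self.ttl)
        return listed

checker = CachedDnsblLookup()
print(checker.is_listed("4.3.2.1"))   # cached for five minutes after the first call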


You're right; I wasn't thinking it through. You would almost need to cache whois.
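If someone did go down that road, the caching would look much the same as the DNS case, just with far longer lifetimes, since registration data changes slowly. A rough sketch (it shells out to the system whois client; the one-day TTL is an arbitrary choice):

# Sketch only: memoize raw whois responses to avoid hammering the registries.
import subprocess
import time

_whois_cache = {}   # domain -> (raw response, expiry)

def cached_whois(domain, ttl=86400.0):
    now = time.monotonic()
    hit = _whois_cache.get(domain)
    if hit and hit[1] > now:
        return hit[0]
    out = subprocess.run(["whois", domain], capture_output=True,
                         text=True, timeout=30).stdout
    _whois_cache[domain] = (out, now + ttl)
    return out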



