It wouldn't make sense to repeatedly download what was essentially the same list with small changes. It would make much more sense to distribute one large file, downloaded once, followed by small "diff" files applied to it, each containing the additions and deletions since the previous diff.

If this were done in a distributed way, almost like DNS, where anyone could fetch it from anyone else, there would simply be too many systems running it for a DDoS attack to kill it.
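The "one big file plus diffs" scheme described above can be sketched in a few lines. This is a hypothetical illustration, not anything from Sniffer or Declude: assume each diff file has one change per line, "+" prefixing an added IP and "-" a removed one.

```python
def apply_diff(blocklist, diff_lines):
    """Apply add/remove lines from a diff file to a set of blocked IPs."""
    for line in diff_lines:
        line = line.strip()
        if not line:
            continue
        op, ip = line[0], line[1:]
        if op == "+":
            blocklist.add(ip)       # addition since the last diff
        elif op == "-":
            blocklist.discard(ip)   # deletion since the last diff
    return blocklist

blocked = {"10.0.0.1", "10.0.0.2"}
apply_diff(blocked, ["+10.0.0.3", "-10.0.0.1"])
print(sorted(blocked))  # ['10.0.0.2', '10.0.0.3']
```

Each client would download the full list once and then fetch only these small diff files on subsequent updates.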

At 02:52 PM 9/26/2003, Markus Gufler wrote:

> DNS blacklist databases are very much larger than the Sniffer
> rule set files.

A text file containing only IP addresses can be zipped down to around a third of its size: a file containing 200 IPs has an original size of 3.1 kB, and the zipped file is 1.1 kB. (The zip algorithm will probably do better on larger files, because there are more repeated three-digit strings.)

Multiplying that by 100,000, a blacklist containing 20 million "bad" IPs would come to a roughly 110 MB file (zipped).

But these 20 million IPs are just the initial value. I have no exact figures, but I assume there would be somewhere between 1,000 and 10,000 new/removed IPs per day.

If my arithmetic holds, we can expect daily updates of roughly 5.5 to 55 kB (zipped).
That shouldn't be a problem.
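Redoing the back-of-the-envelope arithmetic with the figure above (1.1 kB zipped for 200 IPs, i.e. about 5.5 bytes per IP) gives the full-list and daily-diff sizes directly:

```python
# 1.1 kB (1100 bytes) zipped for 200 IPs -> ~5.5 bytes per IP, per the mail.
ZIPPED_BYTES_PER_IP = 1100 / 200

full_list_mb = 20_000_000 * ZIPPED_BYTES_PER_IP / 1_000_000  # whole blacklist
daily_low_kb = 1_000 * ZIPPED_BYTES_PER_IP / 1_000           # quiet day
daily_high_kb = 10_000 * ZIPPED_BYTES_PER_IP / 1_000         # busy day
print(full_list_mb, daily_low_kb, daily_high_kb)  # 110.0 5.5 55.0
```

So the one-time download is the only heavy transfer; the daily diffs are tens of kilobytes at most.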

Markus


---
This E-mail came from the Declude.JunkMail mailing list.  To
unsubscribe, just send an E-mail to [EMAIL PROTECTED], and
type "unsubscribe Declude.JunkMail".  The archives can be found
at http://www.mail-archive.com.
