[...]


So for now a URL blacklist should be at least 4KB in size, but I think that 
doubling from 4KB to 8KB is not such a big jump.
The main issue I was weighing was whether to use one DB field of size X or 
another field type that has indexes.

For now I have used MySQL TEXT, which doesn't have indexes, but only the first 
query takes more than 0.00 ms.
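
For comparison, here is a minimal sketch of the indexed-column approach. The 
table and column names are made up, and it uses sqlite3 only so the example is 
self-contained; the same idea applies to MySQL with an indexed VARCHAR column:

# Minimal sketch (hypothetical schema): an indexed column lets the DB do a
# B-tree lookup instead of scanning a large unindexed TEXT field.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical blacklist table: one URL per row, with a unique index on the column.
cur.execute("CREATE TABLE blacklist (url TEXT NOT NULL)")
cur.execute("CREATE UNIQUE INDEX idx_blacklist_url ON blacklist (url)")

cur.executemany(
    "INSERT INTO blacklist (url) VALUES (?)",
    [("badsite.example/ads",), ("tracker.example/pixel",)],
)
conn.commit()

def is_blacklisted(url: str) -> bool:
    # Exact-match lookup; the index keeps this a tree lookup instead of a full scan.
    cur.execute("SELECT 1 FROM blacklist WHERE url = ? LIMIT 1", (url,))
    return cur.fetchone() is not None

print(is_blacklisted("badsite.example/ads"))   # True
print(is_blacklisted("goodsite.example/"))     # False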

I have tried a couple of key-value DBs and other DBs, but it seems like all of 
them have some kind of step which is the slowest, after which they run fast.

I have compared MySQL to key-value stores, and the main difference is the 
on-disk size of the DB, which is important if the plan is to filter many, many 
specific URLs rather than relying only on patterns.

Amos (or anyone else): since you patched squidGuard, maybe you have a basic 
understanding of its lookup algorithm?
I started reading the squidGuard code, but all of a sudden I lost my way in it 
and things got a bit too complicated (for me) to understand how they do it. 
(Hints welcome.)
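
For reference, this is not squidGuard's actual code, just a sketch of the 
generic domain-suffix lookup that this kind of filter commonly uses (the 
blacklist data here is made up):

# Sketch only: check the full hostname and then each parent domain against a
# set of blacklisted domains, stopping before the bare TLD.
BLACKLISTED_DOMAINS = {"ads.example", "tracker.example"}  # hypothetical data

def host_is_blacklisted(host: str) -> bool:
    labels = host.lower().rstrip(".").split(".")
    # "banner.ads.example" is checked as "banner.ads.example", then "ads.example".
    for i in range(len(labels) - 1):
        candidate = ".".join(labels[i:])
        if candidate in BLACKLISTED_DOMAINS:
            return True
    return False

print(host_is_blacklisted("banner.ads.example"))  # True
print(host_is_blacklisted("www.example.org"))     # False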

Eliezer,

It is not clear what you want to achieve...  If you just want to use a URL 
filter, I suggest using ufdbGuard. I am the author and provide support; there 
are regular updates, it is multithreaded, holds only one copy of the database 
in memory, and has a documented proprietary database format which is 3-4 times 
faster than squidGuard.

Marcus

Thanks,
Eliezer
