I know there is the BigEvil list we can use as a URL blacklist, and that it is not really a good idea to collect URLs automatically and put them into BigEvil. But I feel a little bit sorry for Chris Santere, who has to check it manually every day. So my question is: has there been any attempt to automate this time-consuming process, of course with some smart URL test before a URL is put on the blacklist?

If not, maybe we could have a URL blacklist server that SA can query with a URL. The blacklist server would check the URL in its database, and if it is not there, it would try to download the URL through a content filter like DansGuardian to decide whether the page looks like spam or not. DansGuardian (www.dansguardian.org) is very good at content-filtering web traffic; it works like SA, with a score for every "bad" word, and it blocks access if the sum of the scores is more than the threshold.
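Just to make the idea concrete, the DansGuardian-style scoring the server could apply to a fetched page might look roughly like this sketch. The phrase list, weights, and threshold here are made up for illustration; they are not DansGuardian's real data or API:

```python
# Hypothetical sketch of weighted-phrase scoring, in the spirit of
# DansGuardian: each "bad" phrase carries a weight, and a page is
# treated as spam-like when the summed weights exceed a threshold.

BAD_PHRASES = {          # illustrative weights only, not real DansGuardian data
    "free offer": 30,
    "viagra": 50,
    "click here": 10,
}
THRESHOLD = 50           # made-up cutoff for this example

def looks_spammy(page_text: str,
                 phrases: dict = BAD_PHRASES,
                 threshold: int = THRESHOLD) -> bool:
    """Sum the weights of all matched phrases; flag the page if the
    total reaches the threshold."""
    text = page_text.lower()
    score = sum(weight for phrase, weight in phrases.items() if phrase in text)
    return score >= threshold

print(looks_spammy("Click here for a FREE OFFER on viagra!"))  # 30+50+10 = 90 -> True
print(looks_spammy("Minutes of the monthly meeting"))          # 0 -> False
```

The blacklist server would only add the URL to its database when this kind of test says the downloaded page is spam-like, so a human would still only need to review the borderline cases.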
cahya.
