A while back I mentioned that a combination of errors had allowed my teen son access to the net well past the normal midnight cutoff. He surfed for almost two hours and downloaded close to 50 megabytes of porn from about 25 different sites. Mind you, squidGuard was *not* disabled; it was working perfectly. Those 25 sites simply were not in my database.
I have figured out how he was able to do that, and I found a solution that is working for me. I wanted to share it with the group in the hope that it might help someone else. The problem is not very high-tech and you may consider the solution a bit heavy-handed, but it works. If you aren't using the porn expressionlist you are simply fighting a losing battle. Either start using it or give up.

Let me demonstrate the origin of the problem. Using a login that has porn blocked (and that includes the expressionlist), go to <www.google.com> and enter the 'f' word as your single search term. You should be blocked. (If you are not blocked, you need to add the 'f' word to your expressions file. As a point of reference, it works fine if you insert it between Adultsite and Adultsonly, separated by another '|'.) Once you've made that change and bounced squid (squid -k reconfigure), you should be blocked when you search for the 'f' word on Google.

Now go to <www.excite.com> and search for the 'f' word. If your system is like mine was, you will receive the first page of a long list of results. How long do you think it would take to find unblocked sites if you went down the list and <Shift><Click>ed on every link? You are pretty well guaranteed to get through to all the porn you can handle. Try it. The expressionlist works beautifully with some search engines; with others (like Excite), it's as if the search terms are invisible.

The solution? I tested all of the popular search engines and made a list of those that get around the expressionlist. I then added that list to a rewrite group in my config file. Now when you request http://www.excite.com you actually get http://www.google.com. It's working, and there have been *no* complaints!
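If you want to sanity-check an expressionlist entry before bouncing squid, you can do it from the shell: squidGuard matches its expressions case-insensitively against the whole requested URL, which is essentially what grep -Ei does. A minimal sketch, using "fword" as a stand-in for the actual term (the alternation fragment here is illustrative, not your real file):

```shell
# Hypothetical fragment of the porn expressionlist alternation;
# "fword" stands in for the term you inserted between adultsite
# and adultsonly.
EXPR='(adultsite|fword|adultsonly)'

# squidGuard tests expressions case-insensitively against the full URL,
# so a Google search for the term should match:
echo "http://www.google.com/search?q=fword" | grep -Eiq "$EXPR" && echo blocked
```

If this prints "blocked" for a search URL containing your term, the regex itself is sound and any remaining leak is elsewhere (as with Excite, below).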
Here are the details. Add this rewrite group to your squidGuard.conf file, after the destination groups and before the acl section:

----------- rewrite group ------------
rew srch-engines {
    s@http://www.altavista.com@http://www.google.com@iR
    s@http://www.excite.com@http://www.google.com@iR
    s@http://www.askjeeves.com@http://www.google.com@iR
    s@http://www.aj.com@http://www.google.com@iR
    s@http://www.dogpile.com@http://www.google.com@iR
    s@http://hotbot.lycos.com@http://www.google.com@iR
    s@http://www.ask.com@http://www.google.com@iR
    s@http://infoseek.go.com@http://www.google.com@iR
    s@http://www.3bp.com@http://www.google.com@iR
    s@http://www.locate.com@http://www.google.com@iR
    s@http://www.teoma.com@http://www.google.com@iR
    s@http://www.search.com@http://www.google.com@iR
    s@http://search.msn.com@http://www.google.com@iR
    s@http://www.alltheweb.com@http://www.google.com@iR
    s@http://www.infopeople.org/search/@http://www.google.com@iR
    s@http://www.searchenginecolossus.com/@http://www.google.com@iR
    logfile /usr/local/squidGuard/log/srch-engines.log
}
-------- end ------------------

Now, for those acl groups to which you want this to apply, add:

    rewrite srch-engines

For example:

acl {
    students {
        rewrite srch-engines
        pass allow !porn !drugs ..... etc
        redirect .........
    }
}

These changes plugged a big hole in my system. I hope you find this information useful.

Rick Matthews
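For anyone unfamiliar with the rule syntax: squidGuard's rewrite rules are sed-style s@from@to@ substitutions, where the trailing 'i' makes the match case-insensitive and 'R' tells squidGuard to answer with an HTTP redirect to the rewritten URL rather than silently fetching it. You can preview what one of these rules does with sed itself (GNU sed spells the case-insensitive flag as uppercase 'I'):

```shell
# The Excite rule from the rewrite group above, replayed through GNU sed.
# Note the mixed-case hostname still matches because of the I flag,
# just as squidGuard's 'i' flag would match it.
echo "http://WWW.EXCITE.COM/search?q=test" \
  | sed 's@http://www.excite.com@http://www.google.com@I'
# -> http://www.google.com/search?q=test
```

This is only a preview of the substitution; squidGuard applies the real rule and issues the redirect itself.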
