Squid 2.6.STABLE14

I've got a small dilemma. I've written an external Perl helper that returns "OK" or "ERR" depending on regular expressions and/or domains stored in an external PostgreSQL database.
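For reference, here's roughly what the helper looks like. This is a simplified sketch; the database name, credentials, and the "blocked"/"pattern" table and column names are illustrative, not my real schema (it assumes DBI with DBD::Pg):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

$| = 1;  # Squid expects unbuffered replies from helpers

# Illustrative connection details, not my real ones.
my $dbh = DBI->connect('dbi:Pg:dbname=filter', 'squid', 'secret',
                       { RaiseError => 1, AutoCommit => 1 });

# One lookup per request: does any stored regex match this URL?
my $sth = $dbh->prepare('SELECT 1 FROM blocked WHERE ? ~ pattern LIMIT 1');

while (my $url = <STDIN>) {
    chomp $url;
    $sth->execute($url);
    my ($hit) = $sth->fetchrow_array;
    $sth->finish;
    print $hit ? "OK\n" : "ERR\n";
}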

What I've noticed is that for each URL/FQDN requested, Squid passes 'every' URL embedded in the web page, one by one, to the external helpers that are running (until I add the asynchronous piece that allows just one instance of the helper to run, which then forks child processes as needed). My external helper makes a call to a PostgreSQL database and checks each domain/URL against a single table of 50K entries, and this KILLS performance and the end-user experience.
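The squid.conf side is wired up something like this (the helper path and acl name are illustrative). As I understand it, the ttl=/negative_ttl= options let Squid cache helper verdicts so a repeated URL doesn't trigger another database round trip, but the first hit on each URL is still painful:

# Cache verdicts for 5 minutes (1 minute for ERR) to cut repeat lookups.
external_acl_type urlcheck ttl=300 negative_ttl=60 children=5 %URI /usr/local/bin/urlcheck.pl
acl blocked external urlcheck
http_access deny blocked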

Does anyone have any suggestions on how to make this work well? Or Henrik, do you have any suggestions as to where I might start looking in the Squid code to modify how URLs are passed to the external helper?
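One idea I've been toying with on the helper side: load all 50K patterns into memory once at startup and match in-process, so there is no per-URL database round trip at all. A rough sketch, using the same illustrative table/column names as above:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

$| = 1;

my $dbh = DBI->connect('dbi:Pg:dbname=filter', 'squid', 'secret',
                       { RaiseError => 1 });

# Fetch every pattern once at startup and precompile with qr//.
my @patterns = map { qr/$_->[0]/ }
               @{ $dbh->selectall_arrayref('SELECT pattern FROM blocked') };
$dbh->disconnect;

while (my $url = <STDIN>) {
    chomp $url;
    my $hit = 0;
    for my $re (@patterns) {
        if ($url =~ $re) { $hit = 1; last; }
    }
    print $hit ? "OK\n" : "ERR\n";
}

Precompiling with qr// avoids recompiling on every request, but scanning 50K regexes per URL is still linear; splitting the table into exact-match domains (a simple hash lookup) plus a much smaller set of true regexes would probably cut most of the cost. Still, I'd appreciate pointers on the Squid side too.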

Thanks, list!


--
Louis Gonzales
[EMAIL PROTECTED]
http://www.linuxlouis.net
