On Fri, Mar 16, 2012 at 01:47:52AM +1300, Amos Jeffries wrote:
Basically it boils down to Solaris being past its end-of-life as an
operating system.
It just has to be said: please stop blurting stupid FUD like that.
No need to reply.
On Mon, Nov 01, 2010 at 03:00:21PM +, decl...@is.bbc.co.uk wrote:
Besides that, I have a laaarge url_regexp file to process, and I was
wondering if there was any benefit to trying to break this regexp out to a
perl helper process (and if anyone has a precooked setup doing this I can
On Fri, Oct 29, 2010 at 11:14:55PM +0200, ma...@it-schuth.net wrote:
Hey,
how is it possible to set up Squid 2.7 with ClamAV, WITHOUT DansGuardian?
thanks!
Since 2.7 doesn't have good ICAP support, you can't use c-icap.
Have a look at HAVP http://www.server-side.de/.
Forget about
On Thu, Jun 24, 2010 at 08:39:07AM +0200, Matus UHLAR - fantomas wrote:
On 24.06.10 10:05, senthilkumaar2021 wrote:
In order to use anti-virus scanning with a Squid proxy, which one is
suitable: C-ICAP with ClamAV or HAVP with ClamAV? We have a request rate
of around 300-350
On Fri, Jun 11, 2010 at 11:47:38AM -0700, Kurt Buff wrote:
2010/6/11 Henrik Nordström hen...@henriknordstrom.net:
fre 2010-06-11 klockan 11:02 -0700 skrev Kurt Buff:
All,
I've been biggling around the interwebs, and cruising through the FAQs
and manuals, and don't see how to block this
On Sat, Jun 12, 2010 at 10:07:51AM +0300, Henrik K wrote:
On Fri, Jun 11, 2010 at 11:47:38AM -0700, Kurt Buff wrote:
2010/6/11 Henrik Nordström hen...@henriknordstrom.net:
fre 2010-06-11 klockan 11:02 -0700 skrev Kurt Buff:
All,
I've been biggling around the interwebs, and cruising
On Wed, Mar 17, 2010 at 05:27:26AM -0700, George Herbert wrote:
If it is that same gnu malloc issue on pattern matching, then a
restart of Squid should clear it up temporarily. It would
consistently appear after some time, after the restart, though.
You could either automatically restart
On Tue, Mar 16, 2010 at 08:58:27PM +0100, Henrik Nordström wrote:
mån 2010-03-15 klockan 18:47 +0200 skrev Henrik K:
If you don't want this limitation, you can use HAVP. It scans the file while
it's being transferred to the client, keeping a small part of it buffered
(in case of virus
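The buffering scheme described above can be sketched in Python (a toy illustration of the assumed mechanics, not HAVP's actual code; the 4096-byte hold-back window is made up):

```python
# HAVP-style streaming sketch: forward data to the client as it arrives,
# but always hold back the last `keep` bytes until the scanner has seen
# the whole file.

def stream_with_holdback(chunks, keep=4096):
    """Yield data for the client, retaining the last `keep` bytes buffered."""
    buf = b""
    for chunk in chunks:
        buf += chunk
        if len(buf) > keep:
            send, buf = buf[:-keep], buf[-keep:]
            yield send
    # Real HAVP releases the tail only after the scanner reports the file
    # clean; here we release it unconditionally to show the buffering.
    yield buf
```

This is why a client sees the download progressing almost immediately, yet an infected file can still be withheld: the final window is never sent until the verdict is in.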
On Wed, Mar 17, 2010 at 03:53:01AM +, Amos Jeffries wrote:
So HAVP is designed specifically to send the client scanned parts of the file
before the entire thing is checked?
Right. Of course all this is configurable.
That explains something that I was wondering about...
Consider this
On Mon, Mar 15, 2010 at 12:30:11PM +0100, Stefan Reible wrote:
PS: I have a second problem with downloading big files: is it possible
to send any info about the download progress to the web browser? Like
opening an AJAX script or something else.
If you don't want this limitation, you can
On Tue, Feb 16, 2010 at 03:25:24AM -0800, davefu wrote:
Is there a way to avoid this double traffic generation?
Redirector-based AV scanners are flawed and inefficient by design.
Use some sane package like HAVP or C-ICAP. Google for them.
On Tue, Jun 16, 2009 at 03:12:07PM +1200, Amos Jeffries wrote:
If it scales well and is faster than the existing dstdomain it would be a
welcome addition.
If Squid still has the problem of not answering clients during a reload,
that should be fixed too. Fast lookups are one thing, but
On Thu, Apr 16, 2009 at 03:57:35PM +0200, Frank Fiene wrote:
Hi, I have a problem with integrating ClamAV with squivir2 into Squid.
Not that it helps with this specific question, but get yourself a real tool.
Redirector-based virus scanners are flawed by design. You want to look at
proxy-based
On Fri, Apr 17, 2009 at 07:13:35AM +0200, Frank Fiene wrote:
Am 16.04.2009 um 17:52 schrieb Henrik K:
On Thu, Apr 16, 2009 at 03:57:35PM +0200, Frank Fiene wrote:
Hi, I have a problem with integrating ClamAV with squivir2 into Squid.
Not that it helps with this specific question, but get
On Fri, Feb 06, 2009 at 10:28:45AM +0700, ??? wrote:
I guess there is no solution?
You can download safebrowsing db and use it locally.
Check some examples:
On Tue, Jan 06, 2009 at 11:49:57PM -0700, Joseph L. Casale wrote:
Depends on your chosen ACL type and the number of patterns.
Many regex may be slower than DG, many dstdomain or dst may improve
response time.
It looks like the lists are far too large for any regex-type ACLs but
the acl
On Wed, Jan 07, 2009 at 09:53:49AM -0800, Chuck Kollars wrote:
What kind of performance issues should I expect if I remove squidGuard
and simply make a series of ACLs pointing to Shalla BL files directly,
then denying them with http_access deny statements?
I have to admit I don't know
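A minimal sketch of what the question describes in squid.conf (the ACL name and file path are hypothetical; a dstdomain file lists one domain per line):

```
acl shalla_adv dstdomain "/etc/squid/shallalist/adv/domains"
http_access deny shalla_adv
```

As the surrounding threads note, dstdomain lists of this shape are matched against Squid's internal tree and scale far better than url_regex ACLs of the same size.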
On Wed, Dec 03, 2008 at 06:43:31PM +1300, Amos Jeffries wrote:
Henrik K wrote:
On Mon, Nov 17, 2008 at 03:00:06PM -0800, Jeff Gerard wrote:
Thanks so much...I'll definitely give this a try...but...
apparently I'm not sure what to do here..
Should I simply set LDFLAGS=-lpcreposix -lpcre
On Mon, Nov 17, 2008 at 03:00:06PM -0800, Jeff Gerard wrote:
Thanks so much...I'll definitely give this a try...but...
apparently I'm not sure what to do here..
Should I simply
set LDFLAGS=-lpcreposix -lpcre
then run my ./configure?
or??
Right..
export LDFLAGS=-lpcreposix -lpcre
On Fri, Nov 14, 2008 at 10:00:24PM -0600, Jeff Gerard wrote:
I have an ACL that I use to block ad sites. One of the regexes that I use
is:
_ad_. I have discovered that I need to tweak this regex to fail when it
finds the following pattern: http://something.com/path/some_ad_test.js
I
On Sun, Oct 19, 2008 at 04:51:16PM +1300, Amos Jeffries wrote:
A fair test would be reversing the hostname, which is a very cheap operation. ;)
No. Because most users will not write their ACL regex normally, and the
regex has to match a forward-coded domain anyway. The squid algorithm
works
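For contrast with regex matching, dstdomain-style matching is essentially a suffix test on the hostname. A toy sketch of the semantics (real Squid keeps the list in a splay tree rather than looping over it):

```python
def matches_dstdomain(host, domains):
    """Sketch of dstdomain semantics: 'example.com' matches only that exact
    host, while '.example.com' matches the domain and any subdomain."""
    host = host.lower().rstrip(".")
    for d in domains:
        d = d.lower()
        if d.startswith("."):
            if host == d[1:] or host.endswith(d):
                return True
        elif host == d:
            return True
    return False
```

Because the test is anchored at the end of the hostname, no backtracking is involved, which is the core of why dstdomain lists stay fast where equivalent regex lists do not.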
On Fri, Oct 17, 2008 at 10:24:21PM +0200, Henrik Nordstrom wrote:
On tor, 2008-10-16 at 12:02 +0300, Henrik K wrote:
Optimizing 1000 x www.foo.bar/randomstuff into a _single_
www.foobar.com/(r(egex|and(om)?)|fuba[rz]) regex is nowhere near linear.
Even if it's all random servers
On Sat, Oct 18, 2008 at 12:44:46PM +0300, Henrik K wrote:
On Fri, Oct 17, 2008 at 10:24:21PM +0200, Henrik Nordstrom wrote:
On tor, 2008-10-16 at 12:02 +0300, Henrik K wrote:
Optimizing 1000 x www.foo.bar/randomstuff into a _single_
www.foobar.com/(r(egex|and(om)?)|fuba[rz]) regex
On Sat, Oct 18, 2008 at 11:54:52PM +1300, Amos Jeffries wrote:
Henrik K wrote:
On Sat, Oct 18, 2008 at 12:44:46PM +0300, Henrik K wrote:
Not sure what the splay code does in Squid, didn't have time to grab it.
But a simple test with Perl:
- Grepped some hostnames from wwwlogs etc
- Regexp
On Thu, Oct 16, 2008 at 01:56:59AM +0800, howard chen wrote:
Hello,
On Wed, Oct 15, 2008 at 10:14 PM, Henrik K [EMAIL PROTECTED] wrote:
On Wed, Oct 15, 2008 at 03:42:20PM +0200, Henrik Nordstrom wrote:
Any suggestions for having large ACLs in a high-traffic server?
Avoid using regex
On Thu, Oct 16, 2008 at 10:10:23AM +0200, Henrik Nordstrom wrote:
On ons, 2008-10-15 at 17:14 +0300, Henrik K wrote:
Avoid using regex based acls.
It's fine if you use Perl + Regexp::Assemble to optimize them. And link
Squid with PCRE. Sometimes you just need to block more specific URLs
On Wed, Oct 15, 2008 at 03:42:20PM +0200, Henrik Nordstrom wrote:
Any suggestions for having large ACLs in a high-traffic server?
Avoid using regex based acls.
It's fine if you use Perl + Regexp::Assemble to optimize them. And link
Squid with PCRE. Sometimes you just need to block more
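Regexp::Assemble is a Perl module; the same basic idea, folding many alternatives into one compiled pattern so the engine scans each URL once, can be sketched in Python (a simplified illustration that skips the module's trie-based prefix merging; the sample patterns are made up):

```python
import re

def assemble(patterns):
    """Combine many alternatives into a single compiled regex.
    Unlike Regexp::Assemble this does no prefix sharing, but one
    compiled alternation still beats looping over thousands of
    separately compiled regexes for every request."""
    return re.compile("|".join("(?:%s)" % p for p in patterns))

blocked = assemble([
    r"ads\.example\.net",
    r"banner_ad_\d+\.gif",
    r"tracker\d*\.example\.org",
])
```

Feeding the assembled pattern to a PCRE-linked Squid (or a helper) then costs one regex evaluation per URL instead of one per list entry.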
On Sun, Oct 12, 2008 at 12:31:45PM +0300, Ali Hardogan wrote:
Hello,
What is the best way to have full control over HTTP traffic that goes
through a Squid-enabled firewall?
Don't allow outside connections from clients, and don't use transparent mode.
Force users to configure the proxy in their browser.
On
On Mon, Oct 13, 2008 at 01:40:06AM +0300, Ali Hardogan wrote:
Depending on your OS/firewall, you may have the ability to search packets for
HTTP traffic. But it is intensive, not foolproof, and an unnecessary kludge.
Right. And I can't use Squid for that. Instead I need to rely on
another
On Mon, Oct 06, 2008 at 11:03:10AM -0700, simon benedict wrote:
Dear All,
I have the following setup, which has been working fine for a long time
Redhat 9
squid-2.4.STABLE7-4
I also have Shoreline Firewall on the same Squid server
now I would appreciate it if someone could advise and help me
1)
On Fri, Apr 25, 2008 at 04:05:19PM -0400, Steven Pfister wrote:
Does Apache + mod_security allow reverse proxying to https servers? The
server is using both http and https currently, and I don't know enough
about the actual server to know if doing everything over http is feasible.
Apache
On Fri, Apr 25, 2008 at 09:51:53AM -0400, Steven Pfister wrote:
Does squid as it's installed do any kind of checking of URLs for signs of
attacks, or does something additional need to be installed (and what's
popular for that)?
More likely you would want to use Apache with mod_security as
On Mon, Apr 21, 2008 at 12:11:49AM +0200, Henrik Nordstrom wrote:
tor 2008-04-17 klockan 08:02 -0300 skrev Cassiano Martin:
It's an anti-virus proxy which uses ClamAV. You can use it together with Squid.
Or better yet, if using Squid-3 you can plug clamav directly into Squid
using ICAP and