Hi Alex,
That would be awesome if it works. I will try this option.
Thanks,
Jatin
On 23 Aug 2014, at 10:24, Alex Rousskov rouss...@measurement-factory.com
wrote:
On 08/21/2014 07:06 PM, Jatin Bhasin wrote:
So, can somebody tell me whether there is a way to pass a flag to squid
from
Have a look at cache_dir in squid.conf. It has min-size and max-size
options, so you can specify size ranges for the objects cached in
different cache_dirs.
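For example, a minimal squid.conf sketch of this idea (the paths, store
type, and sizes here are illustrative assumptions, not from the thread):

```
# Objects up to 1 MB go to the first cache_dir,
# objects from 1 MB up to 512 MB go to the second.
cache_dir aufs /var/spool/squid/small 10000 16 256 max-size=1048576
cache_dir aufs /var/spool/squid/large 50000 16 256 min-size=1048577 max-size=536870912
```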
Hi Fred,
Sounds good. We already have some proxy-server tools (such as Squid with
DansGuardian) to block nudity sites (including images, content, videos,
etc.). Is there any specific reason for going with this API
(nudityimagesfilterforsquid)?
Thanks,
Visolve Squid
On 8/23/2014
Hello Visolve,
Is your DansGuardian able to block all porn/sexy websites and images,
including very new domains that have just been released?
How do you block those images from Google/Yahoo search over HTTPS?
Here, a WebFilter is not enough... you need a real-time image filter :o)
Bye Fred
-Message
Hi Fred,
Sure, we may need a real-time image filter for advanced image filtering.
It may also be possible if we configure a banned regular expression list
in DansGuardian. It counts occurrences of words on a site, and if a word
exceeds its limit (3-4 occurrences of the same word, e.g. "porn"), then
DansGuardian will
On 23/08/2014 7:08 a.m., Stakres wrote:
Hi Guys,
We just released a new free tool for Squid: Nudity Images Filter for Squid
https://sourceforge.net/projects/nudityimagesfilterforsquid/
It's probably best to avoid PHP for publicly distributed helpers, at
least if you want them to be used
Hi Visolve,
Sure, you could do it with DansGuardian. Personally I prefer and advise
UfdbGuard, which is - from my point of view - much more powerful in terms
of possibilities than DansGuardian. That's my opinion only; people are
free to use what they need...
Did you try our API? Maybe you could
Hello Fred,
Thanks for your suggestion. We will surely look into your API.
Regards,
Visolve Squid
On 8/23/2014 5:57 PM, Vdoctor wrote:
Hi Visolve,
Sure, you could do it with DansGuardian. Personally I prefer and advise
UfdbGuard, which is - from my point of view - much more powerful in terms of
Hi,
I'm using Squid 3.3.8 as a transparent proxy. It works fine with HTTP,
but I'd like to avoid caching HTTPS sites and just determine whether the
requested URL is listed as denied in Squid (via 'acl dstdom_regex', for
instance); otherwise, just make Squid act as a proxy to the URL's
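A minimal squid.conf sketch of that ACL idea (the regex file name and
the exact rules are assumptions - the thread contains no config - and it
presumes the HTTPS requests actually reach Squid in your interception
setup):

```
# Hypothetical file listing denied domain patterns, one regex per line
acl blocked_sites dstdom_regex -i "/etc/squid/blocked.regex"
acl CONNECT method CONNECT

# Refuse denied domains (plain HTTP and HTTPS CONNECT tunnels alike)
http_access deny blocked_sites

# Never cache tunneled (HTTPS) traffic; Squid just relays it
cache deny CONNECT
```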
Thanks for that, I missed those parameters!
Naturally, I'd start with just one cache_dir and make modifications later.
There are people reporting they could not get this to work because the
files wouldn't be distributed properly - can you think of any hidden
gotchas when setting these two parameters?
In the past, with older Squids, there was a bug involving a conflict
between the general parameter maximum_object_size and the sequence
(or maybe the value?) of cache_dir max-size.
I can't remember exactly; I think maximum_object_size has to appear
before cache_dir in squid.conf, imposing the highest
On Saturday 23 August 2014 at 17:55:26 (EU time), babajaga wrote:
In the past, with older Squids, there was a bug involving a conflict
between the general parameter maximum_object_size and the sequence
(or maybe the value?) of cache_dir max-size.
No, it was the sequence they
So, to sum it all up (please correct me if I'm wrong) - it is possible to
have multiple cache_dirs AND instruct a single Squid instance to place
files in those caches according to file-size criteria using the
min-size/max-size options on the cache_dir directive. Also,
maximum_object_size
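Putting the pieces of this thread together, a sketch of such a
squid.conf (sizes, paths, and store type are illustrative assumptions):

```
# Place maximum_object_size before the cache_dir lines; with older
# Squid versions the ordering reportedly mattered.
maximum_object_size 512 MB

# Objects up to 1 MB land in the first dir, larger ones in the second.
cache_dir aufs /var/spool/squid/small 10000 16 256 max-size=1048576
cache_dir aufs /var/spool/squid/large 50000 16 256 min-size=1048577
```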
On 24/08/2014 6:06 a.m., dxun wrote:
So, to sum it all up (please correct me if I'm wrong) - it is possible to
have multiple cache_dirs AND instruct a single Squid instance to place
files in those caches according to file-size criteria using the
min-size/max-size options on the cache_dir
On 24/08/2014 1:00 a.m., Nicolás wrote:
Hi,
I'm using Squid 3.3.8 as a transparent proxy. It works fine with HTTP,
but I'd like to avoid caching HTTPS sites and just determine whether the
requested URL is listed as denied in Squid (via 'acl dstdom_regex', for
instance); otherwise, just make