Re: [squid-users] blockVirgin Works for CONNECT but Custom Response does not work
Hi Alex,

That will be awesome if that works. I will try this option.

Thanks,
Jatin

On 23 Aug 2014, at 10:24, Alex Rousskov rouss...@measurement-factory.com wrote:

On 08/21/2014 07:06 PM, Jatin Bhasin wrote:
> So, can somebody suggest whether there is a way to pass a flag to Squid
> from an eCAP adapter to decrypt a site regardless of what the ACL says?
> For example, if I have an acl as below which says do not decrypt
> www.888.com, could my eCAP adapter pass a message to Squid asking it to
> decrypt www.888.com (for that session only) and ignore the below acl?
> Is it possible?

Given a recent-enough Squid version, an adaptation service can control Squid behavior via the annotations mechanism and the note ACL associated with it. For example, your eCAP adapter can return an X-Bump:yes annotation(**) that Squid can then match using the note ACL. Something along these untested lines:

    acl toBump note X-Bump yes
    ssl_bump server-first toBump
    ssl_bump server-first ...
    ssl_bump none all

This mechanism should be supported for ssl_bump ACLs, but I have not tested that claim myself.

HTH,

Alex.

(**) In eCAP terminology, an X-Bump:yes annotation is an adapter transaction option named X-Bump with a yes value. See libecap::Options, which is a parent of libecap::adapter::Xaction.
[squid-users] Re: Filter squid cached files to multiple cache dirs
Have a look at cache_dir in squid.conf. There are the options min-size and max-size, so you can specify size ranges for the objects cached in different cache_dirs.

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Filter-squid-cached-files-to-multiple-cache-dirs-tp4667347p4667349.html
Sent from the Squid - Users mailing list archive at Nabble.com.
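As a minimal, untested sketch of that idea (the directory paths and size thresholds here are made up for illustration only):

```
# objects up to 64 KB go to the first dir, anything larger to the second
cache_dir aufs /cache/small 1000 16 256 max-size=65536
cache_dir aufs /cache/large 20000 16 256 min-size=65537
```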
Re: [squid-users] Nudity Images Filter for Squid
Hi Fred,

Sounds good. We already have some proxy-server tools (like Squid with DansGuardian) to block nudity sites (including images, content, videos, etc.). Is there any specific reason for going with this API (nudityimagesfilterforsquid)?

Thanks,
Visolve Squid

On 8/23/2014 12:38 AM, Stakres wrote:
> Hi Guys,
>
> We just released a new free tool for Squid: Nudity Images Filter for Squid
> https://sourceforge.net/projects/nudityimagesfilterforsquid/
>
> You can specify the MaxResol and the MaxScore for the block. All details
> are in the readme.txt:
> http://sourceforge.net/projects/nudityimagesfilterforsquid/files/readme.txt/download
>
> Important:
> - We provide the API for free; we cannot guarantee it will work with your
> Squid installation, which is why you must test on a separate Squid before
> going to production.
> - We do not compile statistics based on your requests and we do not share
> data with marketing teams or external companies; we also do not use your
> data for our internal needs.
> - If you are interested in a local implementation of our API in your
> network, just drop us an email at supp...@unveiltech.com
>
> Your feedback is welcome...
>
> Bye Fred
>
> --
> View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Nudity-Images-Filter-for-Squid-tp4667345.html
RE: [squid-users] Nudity Images Filter for Squid
Hello Visolve,

Is your DansGuardian able to block all porn/sexy websites/images, including the very new domains just released? How do you block those images from Google/Yahoo search over https? Here, WebFilter is not enough... you need a real-time images filter :o)

Bye Fred
Re: [squid-users] Nudity Images Filter for Squid
Hi Fred,

Sure, we may need a real-time image filter for advanced image filtering. Something similar is also possible if we configure the banned regular expression list in DansGuardian: it counts words on a site and, if a word (e.g. "porn") exceeds its limit (3-4 occurrences), DansGuardian will automatically block the site. And we are also not sure about very newly released domains.

Thanks & Regards,
Visolve Squid
Re: [squid-users] Nudity Images Filter for Squid
On 23/08/2014 7:08 a.m., Stakres wrote:
> Hi Guys,
> We just released a new free tool for Squid: Nudity Images Filter for Squid
> https://sourceforge.net/projects/nudityimagesfilterforsquid/

It's probably best to avoid PHP for publicly distributed helpers, at least if you want them to be used widely. PHP CLI is an unusual interpreter to have installed, and it has known issues with engine timeouts closing the helper scripts unexpectedly while they are in use by Squid.

Amos
RE: [squid-users] Nudity Images Filter for Squid
Hi Visolve,

Sure, you could do it with DansGuardian. Personally I prefer and advise UfdbGuard, which is - from my point of view - much more powerful in terms of possibilities than DansGuardian. That's my opinion only; people are free to use what they need...

Did you try our API? Maybe you could find new opportunities with it :o)

Bye Fred
Re: [squid-users] Nudity Images Filter for Squid
Hello Fred,

Thanks for your suggestion. We will certainly look into your API.

Regards,
Visolve Squid
[squid-users] Only checking URLs via Squid for SSL
Hi,

I'm using Squid 3.3.8 as a transparent proxy. It works fine with HTTP, but I'd like to avoid caching HTTPS sites and just determine whether the requested URL is listed as denied in Squid (via 'acl dstdom_regex', for instance); otherwise Squid should just act as a proxy to the URL's content. Is that even possible without using SSL Bump? If not, could you recommend the simplest way of achieving this?

Thanks
[squid-users] Re: Filter squid cached files to multiple cache dirs
Thanks for that, I missed those parameters! Naturally, I'd start with just one cache_dir and make modifications later. There are people reporting they could not get this to work, as the files wouldn't be distributed properly - can you think of any hidden gotchas when setting these two params?

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Filter-squid-cached-files-to-multiple-cache-dirs-tp4667347p4667357.html
[squid-users] Re: Filter squid cached files to multiple cache dirs
In the past, with older Squids, there was a bug involving a conflict between the global parameter maximum_object_size and cache_dir max-size, related to their sequence (or maybe their values?). I can't remember exactly, but I think maximum_object_size has to come before cache_dir in squid.conf, imposing the highest limit, and it should not contradict cache_dir max-size.

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Filter-squid-cached-files-to-multiple-cache-dirs-tp4667347p4667358.html
Re: [squid-users] Re: Filter squid cached files to multiple cache dirs
On Saturday 23 August 2014 at 17:55:26 (EU time), babajaga wrote:
> In the past, with older squids, there was a bug regarding a conflict with
> the general parm maximum_object_size regarding the sequence (may be:
> value ?) of cache_dir max-size and max_obj_size.

No, it was the sequence in which they appeared in the config file.

> Can't exactly remember, think, max_obj_ has to be before cache_dir in
> squid.conf, imposing the highest limit. Should not contradict cache_dir
> max-size.

http://www.squid-cache.org/mail-archive/squid-users/201404/0159.html

So, just to be on the safe side (I don't know in which Squid version this bug was fixed): if you have a maximum_object_size parameter in your configuration file, make sure it occurs before your cache_dir definitions.

Antony.

--
BASIC is to computer languages what Roman numerals are to arithmetic.

Please reply to the list; please *don't* CC me.
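Following that advice, a safe ordering would look something like this (the path and sizes are only illustrative):

```
# global limit first...
maximum_object_size 64 MB
# ...then the cache_dir definitions
cache_dir aufs /var/spool/squid 10000 16 256
```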
[squid-users] Re: Filter squid cached files to multiple cache dirs
So, to sum it all up (please correct me if I'm wrong): it is possible to have multiple cache_dirs AND instruct a single Squid instance to place files in those caches according to file-size criteria, using the min-size/max-size options on the cache_dir directive. Also, the maximum_object_size directive is basically a global max-size applied to all cache_dirs, so it has to be specified BEFORE any particular cache_dir configuration.

If that is the case, I am wondering: is this separation actually inadvisable for any reason? Is there a better way to separate files according to their transiency and underlying data-store speed? What would you recommend?

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Filter-squid-cached-files-to-multiple-cache-dirs-tp4667347p4667360.html
Re: [squid-users] Re: Filter squid cached files to multiple cache dirs
On 24/08/2014 6:06 a.m., dxun wrote:
> So, to sum it all up (please correct me if I'm wrong) - it is possible to
> have multiple cache_dirs AND instruct a single squid instance to place
> files in those caches according to file size criteria using the
> min-size/max-size params on the cache_dir directive. Also, the
> maximum_object_size directive is basically a global max-size param
> applied to all cache_dirs, so it has to be specified BEFORE any
> particular cache_dir configuration.

Sort of.

* The default value for maximum_object_size is 4 MB, which is used until you change it.

* maximum_object_size is the default value for the cache_dir max-size=N parameter. Its current value is applied only if you omit that parameter from a cache_dir.

For example (not a good idea to actually do it like this):

    # default maximum_object_size is 4 MB
    cache_dir ufs /a 100 16 256
    maximum_object_size 8 MB
    cache_dir ufs /b 100 16 256
    maximum_object_size 2 MB
    cache_dir ufs /c 100 16 256

is the same as writing:

    cache_dir ufs /a 100 16 256 max-size=4194304
    cache_dir ufs /b 100 16 256 max-size=8388608
    cache_dir ufs /c 100 16 256 max-size=2097152

> If that is the case, I am wondering - is this separation actually
> inadvisable for any reason?

It is advised for better performance on high-throughput configurations with multiple cache_dirs. It does not matter for other configurations.

> Is there a better way to separate files according to their transiency
> and underlying data store speed?

Squid automatically separates out the most recently and frequently used objects for storage in the high-speed RAM cache. It also monitors the drive I/O stats for overloading. There is just no differentiation between HDD and SSD speeds (yet) - although, indirectly via the loading checks, an SSD can see more object throughput than an HDD.

The rock cache type is designed to reduce disk I/O load for objects with high temporal locality (pages often requested or updated together in bunches), particularly if they are small objects.

Transiency is handled in memory, by RAM-caching objects for a while before they go near disk. This is controlled by maximum_object_size_in_memory; objects over that limit will incur disk I/O regardless of transiency in older Squid. The upcoming 3.5 releases only involve disk for them if they are actually cacheable.

> What would you recommend?

The upstream recommendation is to configure maximum_object_size, then your cache_dirs ordered by the size of objects going in there (smallest to largest). Also, use a rock type cache_dir for the smallest objects(*). It can be placed on the same HDD as an AUFS cache; working together, a rock cache for small objects and an AUFS cache for large objects can utilize larger HDD sizes better than either cache type alone.

(*) 32 KB object size is the limit for rock in current stable releases; that is about to be increased with squid-3.5.

Based on theory and second-hand reports: I would only use an SSD for a rock type cache, with the block size parameter of the rock cache sized to match the SSD sector or page size, so that writing a single rock block/page bumps each SSD sector/page only once towards its lifetime write limit.

Amos
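One hypothetical reading of that recommendation could look like this (paths and cache sizes are invented, and it assumes a current stable release where rock objects are capped at 32 KB):

```
# global size limit first, before any cache_dir
maximum_object_size 512 MB
# rock for the smallest objects (32 KB limit in current stable releases)
cache_dir rock /cache/rock 1000 max-size=32768
# AUFS for everything larger, on the same disk
cache_dir aufs /cache/aufs 20000 16 256 min-size=32769
```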
Re: [squid-users] Only checking URLs via Squid for SSL
On 24/08/2014 1:00 a.m., Nicolás wrote:
> Hi, I'm using Squid 3.3.8 as a transparent proxy. It works fine with
> HTTP, but I'd like to avoid caching HTTPS sites and just determine
> whether the requested URL is listed as denied in Squid (via
> 'acl dstdom_regex', for instance); otherwise just make Squid act as a
> proxy to the URL's content. Is that even possible without using SSL
> Bump? Otherwise, could you recommend the simplest way of achieving this?

No, it is only possible with bumping. For transparent interception of port 443 (HTTPS), use squid-3.4 with server-first bumping at minimum, preferably squid-3.5 with peek-n-splice when it comes out.

If you bump and still do not want to cache for some reason, the cache access control can be used like so:

    acl HTTPS proto HTTPS
    cache deny HTTPS

Amos
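Putting those pieces together for the original question, a rough, untested sketch might look like the following. The regex file path and ACL names are made up, and the https_port intercept/ssl-bump line with its certificate options is omitted:

```
# deny listed domains once the TLS has been bumped
acl blockedsites dstdom_regex "/etc/squid/blocked.regex"
http_access deny blockedsites
# bump everything (squid-3.4 server-first style)
ssl_bump server-first all
# do not cache the bumped HTTPS traffic
acl HTTPS proto HTTPS
cache deny HTTPS
```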