Re: [squid-users] Re: Effort for port 3.1 to windows?
On 26/04/2011 14:07, sichent wrote: Well, thanks for the pointer, but as far as I can see that is an installer; how did you generate the binary? Well, we got the binaries from Acme Consulting (linked from the Squid web site); they seem to have quite thorough instructions on how to build the sources. I'm surprised no one has taken on Squid on Windows seriously (or I didn't find it), but by far most proxy software I have tried on Windows has various problems, while Squid is the best at the moment, I think. The idea with Windows (at least it works for me) is to have a small VM with a Linux + Squid instance (256MB+) running on Windows that serves all possible needs :) ... so I can live without a Windows port of Squid quite happily. If you are interested, here is the howto that I use to set up Squid in a VM box: http://sichent.wordpress.com/2011/03/31/tiny-web-proxy-and-content-filtering-appliance-version-1-2/ best regards sich Well sichent, it's very nice, and many sysadmins can do that almost without the need of this guide. The main problem with a proxy server is its real-time behaviour; a VM as a proxy can work with almost real-time response, which in many cases will be enough, but it is not a solution for a real-time server of ISP or enterprise class. Yes, you can manage to have a VM with a SAN or NAS that gives you really good performance, comparable to a native proxy machine. But many users already have a running Windows machine, and installing a VM on it means risking the machine's response time and complicating some system issues. I must also say that if a Linux program can make it to the Windows world, it means a lot for the programmers. I probably won't use the Windows binaries, but that is because I'm well aware of my Linux system environment and its capabilities. Regards Eliezer
Re: [squid-users] Squid 3.2.0.7 beta is available
On 19/04/2011 16:44, Ralf Hildebrandt wrote: * Amos Jeffries squ...@treenet.co.nz: Arg! now I'm going crazy (and blind). Fix applied to trunk. http://www.squid-cache.org/Versions/v3/3.HEAD/changesets/squid-3-11387.patch Working... I am starting testing in two days, on Ubuntu 10.04 x64. The patch was applied in the helpers directory, and Squid was compiled and installed using: Squid Cache: Version 3.2.0.7 configure options: '--prefix=/opt/squid3207' '--includedir=/include' '--mandir=/share/man' '--infodir=/share/info' '--localstatedir=/opt/squid3207/var' '--disable-maintainer-mode' '--disable-dependency-tracking' '--disable-silent-rules' '--enable-inline' '--enable-async-io=8' '--enable-storeio=ufs,aufs,diskd' '--enable-removal-policies=lru,heap' '--enable-delay-pools' '--enable-cache-digests' '--enable-underscores' '--enable-icap-client' '--enable-follow-x-forwarded-for' '--enable-digest-auth-helpers=ldap,password' '--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' '--enable-arp-acl' '--enable-esi' '--disable-translation' '--with-logdir=/opt/squid3207/var/log/' '--with-pidfile=/var/run/squid3207.pid' '--with-filedescriptors=65536' '--with-large-files' '--with-default-user=proxy' '--enable-linux-netfilter' '--enable-ltdl-convenience' About the soft reconfigure option: I have a bash script idea for a reconfigure, for Squid 3.1+ versions. It will use a second running and configured instance of Squid to make reconfiguring less painful for users. Eliezer
[squid-users] ask n need youtube cache
(Help) I'm new to Squid and I will build a Squid server with YouTube caching. Can any Squid master here help me enable YouTube caching with Squid or third-party software (like youtube_cache)? Thanks in advance -- Best regards, rioda78.squid mailto:rioda7878.sq...@gmail.com
Re: [squid-users] ask n need youtube cache
2011/4/27 rioda78.squid rioda78.sq...@gmail.com: (Help) I'm new to Squid and I will build a Squid server with YouTube caching. Can any Squid master here help me enable YouTube caching with Squid or third-party software (like youtube_cache)? Is this wiki article helpful to you? http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube Regards.
[squid-users] Filtering based on content size.
The reply_body_max_size directive prevents users from downloading very large files. The following Squid configuration only allows downloads of up to 15MB from the IP range given in the acl officelan; if the size is larger, it simply blocks the user with an error message. acl officelan src 192.168.1.0-192.168.1.54 reply_body_max_size 15 MB officelan Is there any way I can do something else instead of showing the error page to the user when the size is more than 15MB? Something similar to the following: if ( reply_body_size > 15MB ) { // do this; } else { // do that; } Regards Supratik
[squid-users] Problem regarding Basic Authentication
Hello everybody, I'm trying to run a small Squid proxy server on my home network and I would like to use authentication on it. I decided to start with the basic authentication method; maybe later I'd like to switch to MySQL. Now the problem is this: my proxy never asks users for authentication. I added the following lines to my /etc/squid/squid.conf: auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd auth_param basic children 5 auth_param basic realm Squid proxy server by maffo auth_param basic credentialsttl 2 hours and finally I created the password file by issuing the following command (as root): htpasswd -c /etc/squid/passwd test (just to create a test user) and I gave him the password 'test'. Now, the passwd file is correctly populated but Squid still doesn't ask for a user/password. If I run /usr/lib/squid/ncsa_auth /etc/squid/passwd and send it 'test:test' on its input, it always returns ERR. Can anybody help me please?
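For context, the auth_param directives alone do not make Squid challenge clients; an http_access rule referencing a proxy_auth ACL is also required before Squid sends a 407 challenge. A minimal sketch of the missing wiring, reusing the paths from the post above (the rule ordering is an assumption about the rest of the configuration):

```
# Helper setup as already configured in the post
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic children 5
auth_param basic realm Squid proxy server by maffo
auth_param basic credentialsttl 2 hours

# Without a proxy_auth ACL referenced by http_access,
# Squid never asks the browser for credentials.
acl authenticated proxy_auth REQUIRED
http_access allow authenticated
http_access deny all
```

As for ncsa_auth returning ERR on 'test:test': the password hash format may be the issue, since some htpasswd builds default to a hash that older ncsa_auth helpers cannot verify; recreating the file with `htpasswd -d` (crypt format) may be worth a try.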
Re: [squid-users] ask n need youtube cache
On 27/04/2011 11:53, rioda78.squid wrote: (Help) I'm new to Squid and I will build a Squid server with YouTube caching. Can any Squid master here help me enable YouTube caching with Squid or third-party software (like youtube_cache)? Thanks in advance Using a third-party tool is different from using Squid 2.7 with store_url_rewrite. You should look at the wiki pages about dynamic content, and also at: http://code.google.com/p/youtube-cache/ Andre made this nice tool for YouTube caching. There is also an idea of using ICAP to do YouTube caching; if you wish to try making it work yourself and open-source it, many will be glad. Regards Eliezer
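For reference, the store_url_rewrite approach Eliezer mentions is wired up in Squid 2.7 roughly as in this sketch (the helper path and the URL pattern are illustrative assumptions; the wiki page cited earlier in the thread has the full details):

```
# squid.conf fragment for Squid 2.7 (store_url_rewrite does not exist in 3.1)
storeurl_rewrite_program /usr/local/bin/youtube_store_url.pl
storeurl_rewrite_children 5

# Only send YouTube video requests to the rewriter
acl youtube_videos url_regex -i (get_video|videoplayback)\?
storeurl_access allow youtube_videos
storeurl_access deny all
```

The rewriter helper itself maps the many per-request video URLs to one canonical store key, which is what makes the dynamic content cacheable.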
[squid-users] Two external acl helpers
Hello, Is it possible to use two helpers in Squid3? 1. I need to authenticate users via squid_kerb_auth. 2. I need to check if the user is in an AD group - squid_kerb_ldap. 3. I need to check if the user is in my local database (has performed some action on the server) - I'll write my own simple helper. Is it possible to combine (2) and (3) so that: if the user is not in AD - deny HTTP; if the user is in AD but my_helper returns ERR - deny HTTP with my own page; if squid_kerb_ldap AND my_helper are OK - allow HTTP traffic. Regards Rafal
[squid-users] Re: Effort for port 3.1 to windows?
The main problem with a proxy server is its real-time behaviour; a VM as a proxy can work with almost real-time response, which in many cases will be enough, but it is not a solution for a real-time server of ISP or enterprise class. Well, you're probably right, but in those kinds of environments (if it is Windows, of course) the better idea would be to use Forefront TMG or ISA from Microsoft, and cost does not matter much there, I guess... Many users already have a running Windows machine, and installing a VM on it means risking the machine's response time and complicating some system issues. OK :) I must also say that if a Linux program can make it to the Windows world, it means a lot for the programmers. +1 Well, if you finally manage to build the latest binaries we will happily build an MSI for them, as we are also quite interested in the 3+ Squid capabilities (namely ICAP). best regards, sich
[squid-users] Squid error with WARNING: HTTP header contains NULL characters
Steps to reproduce: 1. Go to http://sourceforge.net/projects/sarg/files/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz/download 2. Attempt to download 3. Squid will display error page saying The requested URL could not be retrieved and The HTTP Response message received from the contacted server could not be understood or was otherwise malformed. Please contact the site operator. cache.log contains the error below: 2011/04/27 11:53:25| WARNING: HTTP: Invalid Response: Bad header encountered from http://downloads.sourceforge.net/project/sarg/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz?r=ts=1303923196use_mirror=cdnetworks-us-1 AKA downloads.sourceforge.net/project/sarg/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz?r=ts=1303923196use_mirror=cdnetworks-us-1 2011/04/27 11:53:25| ctx: enter level 0: 'http://downloads.sourceforge.net/project/sarg/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz?r=ts=1303923196use_mirror=cdnetworks-us-1' 2011/04/27 11:53:25| WARNING: HTTP header contains NULL characters {Access-Control-Allow-Origin: * X-Powered-By: PHP/5.2.9 Content-Disposition: attachment; filename=sarg-2.3.1.tar.gz} NULL {Access-Control-Allow-Origin: * X-Powered-By: PHP/5.2.9 Content-Disposition: attachment; filename=sarg-2.3.1.tar.gz 2011/04/27 11:53:25| ctx: exit level 0 Here is a squid -v Squid Cache: Version 3.1.12.1 configure options: 'CHOST=i686-pc-linux-gnu' 'CFLAGS=-march=prescott -O2 -pipe -fomit-frame-pointer' 'CXXFLAGS=' '--prefix=/usr' '--includedir=/include' '--mandir=/share/man' '--infodir=/share/info' '--sysconfdir=/etc' '--localstatedir=/var' '--libexecdir=/lib/squid3' '--disable-maintainer-mode' '--disable-dependency-tracking' '--disable-silent-rules' '--srcdir=.' 
'--datadir=/usr/share/squid3' '--sysconfdir=/etc/squid3' '--mandir=/usr/share/man' '--enable-inline' '--enable-async-io=8' '--with-cppunit-basedir=/usr' '--enable-storeio=ufs,aufs,diskd' '--enable-removal-policies=heap' '--enable-delay-pools' '--enable-cache-digests' '--enable-icap-client' '--enable-underscore' '--enable-follow-x-forwarded-for' '--enable-auth=basic,digest,ntlm,negotiate' '--enable-basic-auth-helpers=LDAP,MSNT,NCSA,PAM,YP,getpwnam,multi-domain-NTLM' '--enable-digest-auth-helpers=ldap,password' '--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' '--enable-snmp' '--enable-epoll' '--with-large-files' '--with-filedescriptors=65536' '--enable-arp-acl' '--enable-zph-qos' '--enable-esi' '--with-logdir=/var/log/squid3' '--with-pidfile=/var/run/squid3.pid' '--with-filedescriptors=65536' '--with-large-files' '--enable-linux-netfilter' '--with-default-user=proxy' --with-squid=/opt/squid-3.1.12.1 SourceForge is not the only website that does this; not all websites do, but some. So far every affected website has been broken in the same header line, Content-Disposition. I also have Wireshark captures from a machine running outside Squid and one running inside. Any help with this issue would be appreciated. Thank you.
RE: [squid-users] can't access site fna.gov.co:8081
You aren't allowing tunneling/CONNECT to TCP/8081. It would appear that you need to adjust your ACLs to allow this. -Original Message- From: Oscar Andrés Eraso Moncayo [mailto:oscar.er...@sisa.com.co] Sent: Wednesday, April 27, 2011 1:07 PM To: squid-users@squid-cache.org Subject: [squid-users] can't access site fna.gov.co:8081 Hello, I can't access the site https://www.fna.gov.co:8081/BancaVirtual/a/login.jsp; the error in the log is 1303923860.335 4 10.120.5.41 TCP_DENIED/403 1455 CONNECT www.fna.gov.co:8081 With no proxy settings in the browser, access works. Help me please, best regards,
Re: [squid-users] can't access site fna.gov.co:8081
On 27/04/11 18:08, Baird, Josh wrote: You aren't allowing tunneling/CONNECT to TCP/8081. It would appear that you need to adjust your ACLs to allow this. -Original Message- From: Oscar Andrés Eraso Moncayo [mailto:oscar.er...@sisa.com.co] Sent: Wednesday, April 27, 2011 1:07 PM To: squid-users@squid-cache.org Subject: [squid-users] can't access site fna.gov.co:8081 Hello, I can't access the site https://www.fna.gov.co:8081/BancaVirtual/a/login.jsp; the error in the log is 1303923860.335 4 10.120.5.41 TCP_DENIED/403 1455 CONNECT www.fna.gov.co:8081 With no proxy settings in the browser, access works. Help me please, best regards, The default in Squid is to only allow CONNECT to port 443. You need to extend this ACL to allow other ports on CONNECT. This is *dangerous*, so only allow it per dstdomain. Alex
RE: [squid-users] can't access site fna.gov.co:8081
How do I do that? Thanks!! From: Alex Crow [a...@nanogherkin.com] Sent: Wednesday, April 27, 2011 12:54 PM To: Oscar Andrés Eraso Moncayo Cc: squid-users@squid-cache.org Subject: Re: [squid-users] can't access site fna.gov.co:8081 On 27/04/11 18:08, Baird, Josh wrote: You aren't allowing tunneling/CONNECT to TCP/8081. It would appear that you need to adjust your ACLs to allow this. -Original Message- From: Oscar Andrés Eraso Moncayo [mailto:oscar.er...@sisa.com.co] Sent: Wednesday, April 27, 2011 1:07 PM To: squid-users@squid-cache.org Subject: [squid-users] can't access site fna.gov.co:8081 Hello, I can't access the site https://www.fna.gov.co:8081/BancaVirtual/a/login.jsp; the error in the log is 1303923860.335 4 10.120.5.41 TCP_DENIED/403 1455 CONNECT www.fna.gov.co:8081 With no proxy settings in the browser, access works. Help me please, best regards, The default in Squid is to only allow CONNECT to port 443. You need to extend this ACL to allow other ports on CONNECT. This is *dangerous*, so only allow it per dstdomain. Alex
[squid-users] 3.2.0.7 compile error on fedora 14.
Hi, I was trying to investigate if the sslBump problem recently reported by Will Metcalf is a result of bug 1991 that SSL does not work well with FreeBSD kqueue. So I tried to test the same setting on a Linux. The system is a RedHat Fedora 14. Gcc version 4.5.1. 20100924, and OpenSSL 1.0.0d-fips 8 Feb 2011 ./configure '--disable-loadable-modules' '--disable-esi' '--enable-icap_client' '--enable-ssl' '--enable-auth' '--enable-ssl-crtd' --enable-ltdl-convenience However I was blocked by the compile error: make[3]: Entering directory `/home/fming/squid-3.2.0.7/src/ssl' g++ -DHAVE_CONFIG_H -I../.. -I../../include -I../../lib -I../../src -I../../include -Wall -Wpointer-arith -Wwrite-strings -Wcomments -Werror -pipe -D_REENTRANT -g -O2 -MT ssl_crtd.o -MD -MP -MF .deps/ssl_crtd.Tpo -c -o ssl_crtd.o ssl_crtd.cc In file included from ssl_crtd.cc:9:0: ../../src/ssl/certificate_db.h: In static member function 'static long unsigned int Ssl::CertificateDb::index_serial_hash_LHASH_HASH(const void*)': ../../src/ssl/certificate_db.h:113:12: error: duplicate 'const' ../../src/ssl/certificate_db.h:113:12: error: invalid conversion from 'const void*' to 'const char***' ../../src/ssl/certificate_db.h:113:12: error: 'index_serial_hash_hash' was not declared in this scope ../../src/ssl/certificate_db.h: In static member function 'static int Ssl::CertificateDb::index_serial_cmp_LHASH_COMP(const void*, const void*)': ../../src/ssl/certificate_db.h:114:12: error: duplicate 'const' ../../src/ssl/certificate_db.h:114:12: error: invalid conversion from 'const void*' to 'const char***' ../../src/ssl/certificate_db.h:114:12: error: duplicate 'const' ../../src/ssl/certificate_db.h:114:12: error: invalid conversion from 'const void*' to 'const char***' ../../src/ssl/certificate_db.h:114:12: error: 'index_serial_cmp_cmp' was not declared in this scope ../../src/ssl/certificate_db.h: In static member function 'static long unsigned int Ssl::CertificateDb::index_name_hash_LHASH_HASH(const void*)': 
../../src/ssl/certificate_db.h:115:12: error: duplicate 'const' ../../src/ssl/certificate_db.h:115:12: error: invalid conversion from 'const void*' to 'const char***' ../../src/ssl/certificate_db.h:115:12: error: 'index_name_hash_hash' was not declared in this scope ../../src/ssl/certificate_db.h: In static member function 'static int Ssl::CertificateDb::index_name_cmp_LHASH_COMP(const void*, const void*)': ../../src/ssl/certificate_db.h:116:12: error: duplicate 'const' ../../src/ssl/certificate_db.h:116:12: error: invalid conversion from 'const void*' to 'const char***' ../../src/ssl/certificate_db.h:116:12: error: duplicate 'const' ../../src/ssl/certificate_db.h:116:12: error: invalid conversion from 'const void*' to 'const char***' ../../src/ssl/certificate_db.h:116:12: error: 'index_name_cmp_cmp' was not declared in this scope make[3]: *** [ssl_crtd.o] Error 1 Any help is appreciated, Ming
Re: [squid-users] can't access site fna.gov.co:8081
On 27/04/11 19:39, Oscar Andrés Eraso Moncayo wrote: How do I do that? Thanks!! If you show the relevant parts of your squid.conf I'm sure many people here will help you with it. Cheers Alex
RE: [squid-users] can't access site fna.gov.co:8081
Hi, squid.conf:
**
http_port 127.0.0.1:3030
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
cache_mem 1024 MB
cache_dir ufs /var/spool/squid 4096 16 256
access_log /var/log/squid/access.log squid
authenticate_ip_ttl 1 hours
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl all src 0.0.0.0/0.0.0.0
acl localhost src 127.0.0.1/255.255.255.255
#acl msn_messenger req_mime_type -i ^application/x-msn-messenger$
#acl msn_url url_regex -i gateway.dll
http_access allow localhost
#http_access deny msn_messenger
#http_access deny msn_method msn_url
http_access deny all
http_reply_access allow all
icp_access allow all
error_directory /usr/share/squid/errors/Spanish
client_db off
log_fqdn off
***
Best regards, From: Alex Crow [a...@nanogherkin.com] Sent: Wednesday, April 27, 2011 2:27 PM To: Oscar Andrés Eraso Moncayo Cc: squid-users@squid-cache.org Subject: Re: [squid-users] can't access site fna.gov.co:8081 On 27/04/11 19:39, Oscar Andrés Eraso Moncayo wrote: How do I do that? Thanks!! If you show the relevant parts of your squid.conf I'm sure many people here will help you with it. Cheers Alex
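Following Alex's earlier advice in this thread, a minimal sketch of what could be added to the configuration above, placed before the "http_access deny all" line (restricting the rule to the one destination domain is the safe approach he recommends):

```
acl CONNECT method CONNECT
acl fna_site dstdomain www.fna.gov.co
acl fna_port port 8081

# Allow CONNECT tunnels to this one site on port 8081 only
http_access allow CONNECT fna_site fna_port
```

Opening CONNECT to arbitrary ports would let clients tunnel anything through the proxy, which is why the rule pairs the port with a dstdomain ACL.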
Re: [squid-users] I would like to use Squid for caching but it is imperative that all files be cached.
Thanks for your reply again Amos, I am familiar with wget and curl for fetching headers and files. I think I can come up with a solution using either wget or curl to create a static cache. I will write a simple server application to serve the snapshot of the headers and content. The simple server could fetch files not in the cache as requested. I could redirect traffic to the simple server using iptables. Cheers, Dan
Re: [squid-users] Squid error with WARNING: HTTP header contains NULL characters
On Wed, 27 Apr 2011 12:04:23 -0500, Sam Klinger wrote: Steps to reproduce: 1. Go to http://sourceforge.net/projects/sarg/files/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz/download 2. Attempt to download 3. Squid will display error page saying The requested URL could not be retrieved and The HTTP Response message received from the contacted server could not be understood or was otherwise malformed. Please contact the site operator. cache.log contains the error below: 2011/04/27 11:53:25| WARNING: HTTP: Invalid Response: Bad header encountered from http://downloads.sourceforge.net/project/sarg/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz?r=ts=1303923196use_mirror=cdnetworks-us-1 AKA downloads.sourceforge.net/project/sarg/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz?r=ts=1303923196use_mirror=cdnetworks-us-1 2011/04/27 11:53:25| ctx: enter level 0: 'http://downloads.sourceforge.net/project/sarg/sarg/sarg-2.3.1/sarg-2.3.1.tar.gz?r=ts=1303923196use_mirror=cdnetworks-us-1' 2011/04/27 11:53:25| WARNING: HTTP header contains NULL characters {Access-Control-Allow-Origin: * X-Powered-By: PHP/5.2.9 Content-Disposition: attachment; filename=sarg-2.3.1.tar.gz} NULL {Access-Control-Allow-Origin: * X-Powered-By: PHP/5.2.9 Content-Disposition: attachment; filename=sarg-2.3.1.tar.gz 2011/04/27 11:53:25| ctx: exit level 0 Here is a squid -v Squid Cache: Version 3.1.12.1 configure options: 'CHOST=i686-pc-linux-gnu' 'CFLAGS=-march=prescott -O2 -pipe -fomit-frame-pointer' 'CXXFLAGS=' '--prefix=/usr' '--includedir=/include' '--mandir=/share/man' '--infodir=/share/info' '--sysconfdir=/etc' '--localstatedir=/var' '--libexecdir=/lib/squid3' '--disable-maintainer-mode' '--disable-dependency-tracking' '--disable-silent-rules' '--srcdir=.' 
'--datadir=/usr/share/squid3' '--sysconfdir=/etc/squid3' '--mandir=/usr/share/man' '--enable-inline' '--enable-async-io=8' '--with-cppunit-basedir=/usr' '--enable-storeio=ufs,aufs,diskd' '--enable-removal-policies=heap' '--enable-delay-pools' '--enable-cache-digests' '--enable-icap-client' '--enable-underscore' '--enable-follow-x-forwarded-for' '--enable-auth=basic,digest,ntlm,negotiate' '--enable-basic-auth-helpers=LDAP,MSNT,NCSA,PAM,YP,getpwnam,multi-domain-NTLM' '--enable-digest-auth-helpers=ldap,password' '--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' '--enable-snmp' '--enable-epoll' '--with-large-files' '--with-filedescriptors=65536' '--enable-arp-acl' '--enable-zph-qos' '--enable-esi' '--with-logdir=/var/log/squid3' '--with-pidfile=/var/run/squid3.pid' '--with-filedescriptors=65536' '--with-large-files' '--enable-linux-netfilter' '--with-default-user=proxy' --with-squid=/opt/squid-3.1.12.1 SourceForge is not the only website that does this; not all websites do, but some. So far every affected website has been broken in the same header line, Content-Disposition. I also have Wireshark captures from a machine running outside Squid and one running inside. Any help with this issue would be appreciated. Thank you. Squid is doing all that is possible to be done in these circumstances. The HTTP headers are sent with a binary connection terminator (NULL) right in the middle of an ASCII-only portion of the protocol. The cache.log trace shows the full header block with the NULL in the middle where it is occurring. Do not be fooled by the duplicate nature of headers in that trace.
That is actually what squid has received: Access-Control-Allow-Origin: *\r\n X-Powered-By: PHP/5.2.9\r\n Content-Disposition: attachment; filename=sarg-2.3.1.tar.gz\0 Access-Control-Allow-Origin: *\r\n X-Powered-By: PHP/5.2.9\r\n Content-Disposition: attachment; filename=sarg-2.3.1.tar.gz\0 Normally one need only report to the source website that their server or script is broken. Nowadays you may also have to trace the whole relay path looking for broken content adapters. Amos
Re: [squid-users] Two external acl helpers
On Wed, 27 Apr 2011 15:42:51 +0200, Rafal Zawierta wrote: Hello, Is it possible to use two helpers in Squid3? 1. I need to authenticate users via squid_kerb_auth. 2. I need to check if the user is in an AD group - squid_kerb_ldap. 3. I need to check if the user is in my local database (has performed some action on the server) - I'll write my own simple helper. Is it possible to combine (2) and (3) so that: if the user is not in AD - deny HTTP; if the user is in AD but my_helper returns ERR - deny HTTP with my own page; if squid_kerb_ldap AND my_helper are OK - allow HTTP traffic. Yes, this has been possible in all Squid since 2.6, AFAIK: one auth_param negotiate program to validate that the credentials are correct, and multiple named external_acl_type helpers to check the groups. Amos
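A rough sketch of the layered setup Amos describes, covering Rafal's three checks (the helper paths, the AD group name, and the my_helper command are assumptions; my_helper stands for the poster's own planned helper):

```
# (1) Validate Kerberos credentials
auth_param negotiate program /usr/lib/squid/squid_kerb_auth
auth_param negotiate children 10
acl authenticated proxy_auth REQUIRED

# (2) Check AD group membership via squid_kerb_ldap
external_acl_type ad_group ttl=300 %LOGIN /usr/lib/squid/squid_kerb_ldap -g InternetUsers
acl in_ad_group external ad_group

# (3) Check the local database with the custom helper
external_acl_type local_db ttl=60 %LOGIN /usr/local/bin/my_helper
acl in_local_db external local_db

http_access deny !authenticated
http_access deny !in_ad_group
# Serve a custom error page when my_helper answers ERR
deny_info ERR_NO_LOCAL_ACTION in_local_db
http_access deny !in_local_db
http_access allow authenticated
http_access deny all
```

The deny_info line is what maps the my_helper failure to a custom page (ERR_NO_LOCAL_ACTION would be a template the poster writes in the errors directory).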
Re: [squid-users] Filtering based on content size.
On Wed, 27 Apr 2011 15:36:52 +0530, Supratik Goswami wrote: The reply_body_max_size directive prevents users from downloading very large files. The following configuration in Squid only allows download of size 15MB from IP range mentioned in the acl officelan. If the size is more it simply restricts the user with an error message. What would you have it do? Amos
Re: [squid-users] 3.2.0.7 compile error on fedora 14.
On 28/04/11 06:58, Ming Fu wrote: Hi, I was trying to investigate if the sslBump problem recently reported by Will Metcalf is a result of bug 1991 that SSL does not work well with FreeBSD kqueue. Hmm, interesting idea. That might be involved. We have had several independent mentions about the 3.2.0.7 and 3.1.12.1 failure of SslBump. I was thinking it was probably due to the I/O handling changes we made to CONNECT in those releases. I am awaiting some kind of fix from someone interested. For the record the developers working on and with SslBump do not read this list so everybody who is reporting beta code bugs here (instead of squid-dev mailing list where such reports are supposed to go) is not getting anything close to expert attention. So I tried to test the same setting on a Linux. The system is a RedHat Fedora 14. Gcc version 4.5.1. 20100924, and OpenSSL 1.0.0d-fips 8 Feb 2011 ./configure '--disable-loadable-modules' '--disable-esi' '--enable-icap_client' '--enable-ssl' '--enable-auth' '--enable-ssl-crtd' --enable-ltdl-convenience However I was blocked by the compile error: make[3]: Entering directory `/home/fming/squid-3.2.0.7/src/ssl' g++ -DHAVE_CONFIG_H -I../.. -I../../include -I../../lib -I../../src -I../../include -Wall -Wpointer-arith -Wwrite-strings -Wcomments -Werror -pipe -D_REENTRANT -g -O2 -MT ssl_crtd.o -MD -MP -MF .deps/ssl_crtd.Tpo -c -o ssl_crtd.o ssl_crtd.cc In file included from ssl_crtd.cc:9:0: ../../src/ssl/certificate_db.h: In static member function 'static long unsigned int Ssl::CertificateDb::index_serial_hash_LHASH_HASH(const void*)': ../../src/ssl/certificate_db.h:113:12: error: duplicate 'const' ../../src/ssl/certificate_db.h:113:12: error: invalid conversion from 'const void*' to 'const char***' ../../src/ssl/certificate_db.h:113:12: error: 'index_serial_hash_hash' was not declared in this scope Interesting. This is the compiler complaining about the OpenSSL header definition of IMPLEMENT_LHASH_HASH_FN. 
I'm not sure where these char*** or double const are coming from. The Squid definition is char** and the OpenSSL (0.9.8o) macro I can see does not add any reference or pointer or const to the parameters. Amos -- Please be using Current Stable Squid 2.7.STABLE9 or 3.1.12 Beta testers wanted for 3.2.0.7 and 3.1.12.1
[squid-users] Custom error 400 page
Hi Everyone, I am playing around with reverse proxying with cache, and I have a bit of a problem: some of the pages use 302 redirects, so I can't cache them. If the webserver goes down it returns an HTTP error 400. Can I customise an error page in C:\squid\share\errors\English? If so, which one, or do I make a new one? This is the line from the access.log: 1303970032.096 15 127.0.0.1 TCP_MISS/400 323 GET http://www.site.com/about.aspx - FIRST_UP_PARENT/myAccel text/html -- Regards Morgan Storey
Re: [squid-users] Filtering based on content size.
@Amos Thanks for your reply. Currently I am using an ACL to filter the file extensions .exe, .iso and .zip, and with tcp_outgoing_address I am able to change the source IP; this works fine with source-based routing. I want to filter by size (e.g. 15MB), which I am unable to do using an ACL. On the other hand, reply_body_max_size does filter based on size, but I am not able to use it for my requirement. Is this a limitation of Squid? Please let me know if there is any way to resolve this issue. Regards Supratik On Thu, Apr 28, 2011 at 5:01 AM, Amos Jeffries squ...@treenet.co.nz wrote: On Wed, 27 Apr 2011 15:36:52 +0530, Supratik Goswami wrote: The reply_body_max_size directive prevents users from downloading very large files. The following configuration in Squid only allows download of size 15MB from IP range mentioned in the acl officelan. If the size is more it simply restricts the user with an error message. What would you have it do? Amos