Re: [squid-users] compiled squid size
Strip the binary:

    strip -s squid

Sent from my iPhone

> On 9 Sep 2016, at 10:58, mzgmedia wrote:
>
> I think the param is -g and it is already there in the CFLAGS and CXXFLAGS.
>
> --
> View this message in context:
> http://squid-web-proxy-cache.1019090.n4.nabble.com/compliled-squid-size-tp4679405p4679436.html
> Sent from the Squid - Users mailing list archive at Nabble.com.

___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users
Re: [squid-users] squid-3.4.8 intercept
That's because you have not allowed your local network in Squid. You have to allow your network range, 66.159.32.0/24.

2014-11-18 15:59 GMT-02:00 Frank fr...@cronomagic.com:

Hi,

Since upgrading from 3.1.22 to 3.4.8 I have been unable to get transparent mode to accept my IP. I see "permission denied" in the transaction when I do a packet dump. I have read the documentation and made the changes for 3.4.8. I even allowed everything, and still no go. I also compiled Squid; here is my configure script:

    ./configure \
      --prefix=/usr/share/squid-3.4.8 \
      --libdir=/usr/lib${LIBDIRSUFFIX} \
      --sysconfdir=/etc/squid \
      --localstatedir=/var/log/squid \
      --datadir=/usr/share/squid-3.4.8 \
      --with-pidfile=/var/run/squid/squid.pid \
      --mandir=/usr/man \
      --with-logdir=/var/log/squid \
      --enable-snmp \
      --enable-ipf-transparent \
      --enable-ipfw-transparent
    # --enable-auth=basic \
    # --enable-basic-auth-helpers=NCSA \
    # --enable-linux-netfilter \
    # --enable-async-io \
    # --disable-strict-error-checking

The machine my browser is on: 66.159.32.31
The machine running Squid: 66.159.47.22

Here is my squid.conf:

    #
    # Recommended minimum configuration:
    #
    cache_effective_user squid
    cache_effective_group squid

    # Example rule allowing access from your local networks.
    # Adapt to list your (internal) IP networks from where browsing
    # should be allowed
    acl localnet src all                # RFC1918 possible internal network
    #acl localnet src 66.159.32.0/24    # RFC1918 possible internal network
    #acl localnet src 108.161.167.0/24  # RFC1918 possible internal network
    #acl localnet src 66.159.47.0/24    # RFC1918 possible internal network
    #acl localnet src 127.0.0.0/24      # RFC1918 possible internal network

    acl SSL_ports port 443
    acl Safe_ports port 80          # http
    acl Safe_ports port 21          # ftp
    acl Safe_ports port 443         # https
    acl Safe_ports port 70          # gopher
    acl Safe_ports port 210         # wais
    acl Safe_ports port 1025-65535  # unregistered ports
    acl Safe_ports port 280         # http-mgmt
    acl Safe_ports port 488         # gss-http
    acl Safe_ports port 591         # filemaker
    acl Safe_ports port 777         # multiling http
    acl CONNECT method CONNECT

    #
    # Recommended minimum Access Permission configuration:
    #
    # Deny requests to certain unsafe ports
    http_access deny !Safe_ports
    # Deny CONNECT to other than secure SSL ports
    ###http_access deny CONNECT !SSL_ports
    # Only allow cachemgr access from localhost
    ###http_access allow localhost manager
    ###http_access deny manager
    # We strongly recommend the following be uncommented to protect innocent
    # web applications running on the proxy server who think the only
    # one who can access services on localhost is a local user
    http_access deny to_localhost

    #
    # INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
    #
    # Example rule allowing access from your local networks.
    # Adapt localnet in the ACL section to list your (internal) IP networks
    # from where browsing should be allowed
    http_access allow localnet
    http_access allow localhost
    # And finally deny all other access to this proxy
    #http_access deny all
    http_access allow all

    # Squid normally listens to port 3128
    http_port 3128
    http_port 3129 intercept
    always_direct allow all

    # Uncomment and adjust the following to add a disk cache directory.
    cache_dir ufs /usr/share/squid/cache 100 32 512
    # Leave coredumps in the first cache dir
    coredump_dir /var/log/squid/cache/squid
    #
    # Add any of your own refresh_pattern entries above these.
    #
    refresh_pattern ^ftp:             1440  20%  10080
    refresh_pattern ^gopher:          1440  0%   1440
    refresh_pattern -i (/cgi-bin/|\?) 0     0%   0
    refresh_pattern .                 0     20%  4320

And I have configured my browser to use HTTP proxy 66.159.47.22, port 3129. I also set up iptables on my machine as follows, and that didn't work either; same permission denied:

    /sbin/iptables -t nat -A OUTPUT -p tcp -s 66.159.32.31 --dport 80 -j DNAT --to 66.159.47.22:3129

Let me know if further info is needed. Any help would be greatly appreciated.

--
Regards,
Frank
Torontour Network Administrator
fr...@cronomagic.com
514-341-1579 EXT-214
1-800-427-6012 Ext-214
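Two things are worth noting about the setup above: an intercept port (3129 here) only accepts NAT-diverted traffic, so pointing a browser at it as a regular proxy is rejected by Squid 3.4 (the plain 3128 port is for that), and the interception rule normally lives on the Squid box itself. A sketch of the conventional rule, assuming LAN traffic is routed through the Squid machine (run as root; the subnet is taken from the thread):

```
# On the machine running Squid: divert LAN port-80 traffic to the intercept port.
iptables -t nat -A PREROUTING -s 66.159.32.0/24 -p tcp --dport 80 \
         -j REDIRECT --to-ports 3129
```

This is a sketch, not the poster's configuration; a DNAT issued on the client machine (as in the message above) sends packets to the intercept port but the replies do not route back through the same NAT state, which is one common way to end up with the symptoms described.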
Re: [squid-users] illegal instruction with 3.4.6 (no problem with 3.4.4)
Please provide more details, like log output or a coredump backtrace. Thanks.

2014-08-27 11:44 GMT-03:00 Alfredo Rezinovsky alfr...@fing.uncu.edu.ar:

Squid exits with an "illegal instruction" message.

Squid Cache: Version 3.4.6-20140826-r13168

configure options:

    '--disable-auth' '--disable-auto-locale' '--disable-cache-digests'
    '--disable-cpu-profiling' '--disable-debug-cbdata' '--disable-delay-pools'
    '--disable-devpoll' '--disable-ecap' '--disable-esi' '--disable-eui'
    '--disable-external-acl-helpers' '--disable-follow-x-forwarded-for'
    '--disable-forw-via-db' '--enable-gnuregex' '--disable-htcp'
    '--disable-icap-client' '--disable-ident-lookups' '--enable-internal-dns'
    '--disable-ipf-transparent' '--disable-ipfw-transparent' '--disable-ipv6'
    '--disable-leakfinder' '--disable-pf-transparent' '--disable-poll'
    '--disable-select' '--disable-snmp' '--disable-ssl' '--disable-stacktraces'
    '--disable-translation' '--disable-url-rewrite-helpers' '--disable-wccp'
    '--disable-wccpv2' '--disable-win32-service' '--disable-x-accelerator-vary'
    '--disable-icmp' '--disable-storeid-rewrite-helpers' '--enable-async-io'
    '--enable-disk-io' '--enable-epoll' '--enable-http-violations'
    '--enable-inline' '--enable-kill-parent-hack' '--enable-linux-netfilter'
    '--enable-log-daemon-helpers' '--enable-removal-policies' '--enable-storeio'
    '--enable-unlinkd' '--enable-x-accelerator-vary' '--enable-zph-qos'
    '--with-default-user=nobody' '--with-pthreads' '--with-included-ltdl'
    --enable-ltdl-convenience

No problem with Squid Cache: Version 3.4.4-20140323-r13111 with the same compile options.

The CPU is an Intel(R) Xeon(R) CPU E5345 @ 2.33GHz supporting:

    fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36
    clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx lm constant_tsc
    arch_perfmon pebs bts rep_good nopl aperfmperf pni dtes64 monitor ds_cpl
    vmx est tm2 ssse3 cx16 xtpr pdcm dca lahf_lm dtherm tpr_shadow

It seems to be trying to use nonexistent instructions due to a CPU misdetection.

--
Alfrenovsky
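If the crash really is an unsupported instruction, the faulting opcode can be inspected and the build pinned to an ISA this CPU has. A sketch only; the binary path is an assumption, and `-march=core2` is chosen because it matches the E5345's SSSE3-era feature set (no SSE4):

```
# Show the instruction at the crash address (assumes a core file named "core"):
gdb /usr/local/squid/sbin/squid core -batch -ex 'x/i $pc'

# Rebuild targeting the older ISA instead of whatever the compiler auto-detected:
./configure CFLAGS='-O2 -march=core2' CXXFLAGS='-O2 -march=core2'
```

If the rebuilt binary no longer faults, the earlier build was being compiled for a newer instruction set than the deployment CPU supports.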
Re: [squid-users] Very slow initial reply
On my Squid box it shows a DNS failure:

    2014/08/26 15:15:09.243 kid1| ModEpoll.cc(139) SetSelect: FD 8, type=1, handler=1, client_data=0, timeout=0
    2014/08/26 15:15:09.243 kid1| dns_internal.cc(1362) idnsRead: idnsRead: FD 8: received 55 bytes from 127.0.0.1:53
    2014/08/26 15:15:09.243 kid1| dns_internal.cc(1169) idnsGrokReply: idnsGrokReply: QID 0xf689, -2 answers
    2014/08/26 15:15:09.244 kid1| dns_internal.cc(1234) idnsGrokReply: idnsGrokReply: error Server Failure: The name server was unable to process this query. (2)
    2014/08/26 15:15:09.244 kid1| dns_internal.cc(1092) idnsCallback: Merging DNS results www.lusitania.pt A has 3 RR, has -2 RR
    2014/08/26 15:15:09.244 kid1| dns_internal.cc(1125) idnsCallback: Sending 3 (OK) DNS results to caller.
    2014/08/26 15:15:09.244 kid1| ipcache.cc(498) ipcacheParse: ipcacheParse: 3 answers for 'www.lusitania.pt'
    2014/08/26 15:15:09.244 kid1| ipcache.cc(556) ipcacheParse: ipcacheParse: www.lusitania.pt #0 212.55.134.4
    2014/08/26 15:15:09.244 kid1| ipcache.cc(556) ipcacheParse: ipcacheParse: www.lusitania.pt #1 62.28.187.7
    2014/08/26 15:15:09.245 kid1| client_side_request.cc(546) hostHeaderIpVerify: validate IP 62.28.187.7:80 non-match from Host: IP 212.55.134.4
    2014/08/26 15:15:09.245 kid1| client_side_request.cc(541) hostHeaderIpVerify: validate IP 62.28.187.7:80 possible from Host:

Thanks

2014-08-26 14:32 GMT-03:00 Bruno Guerreiro bruno.guerre...@ine.pt:

Hello.

Some of our users are complaining about very slow access to some sites. After some tests I've noticed that the time between Squid receiving the request and actually connecting to the site itself is very high. After this wait, all the objects in the page are fetched rather quickly. I've tried upgrading to 3.4 but the issue persists. No auth is in place, and the Squid server is connected to the internet via full NAT. Connecting directly from the server, or via some other proxy software like nginx, works perfectly.

Here are some of the sites (these are Portuguese insurance companies):

www.nseguros.pt
www.lusitania.pt
www.logo.pt

Any ideas? Thanks in advance.

Bruno Guerreiro
DMSI/IT
Instituto Nacional de Estatística
Tel: 218440448 - Ext: 1657

Confidentiality Warning: This e-mail message (and any attached files) is confidential and is intended solely for the use of the individual or entity to whom it is addressed. If you are not the intended recipient of this message please notify the sender and delete and destroy all copies immediately.
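The "-2 answers" lines suggest that one of the paired lookups is coming back SERVFAIL from the local resolver (Squid issues A and AAAA queries together and merges the results). That can be reproduced outside Squid; a network-dependent sketch against the same resolver:

```
dig @127.0.0.1 www.lusitania.pt A    +time=2 +tries=1
dig @127.0.0.1 www.lusitania.pt AAAA +time=2 +tries=1   # watch for "status: SERVFAIL"
```

If one of the two queries stalls or SERVFAILs, the delay before the first connection is the resolver timing out rather than Squid itself.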
Re: [squid-users] Anybody using squid on openWRT ?
Unfortunately the OpenWrt squid package is very outdated and buggy. I've tried it, but I gave up. I'm not sure, but they tend not to include software written in C++; 99% of the package repository is C source software, which may be one reason they keep an older Squid version, which is not written in C++.

2014-08-22 7:48 GMT-03:00 babajaga augustus_me...@yahoo.de:

I am trying to use the official package for OpenWrt, which is based on Squid 2.7 only. Having detected some DNS issues: does anybody use Squid on OpenWrt, and which Squid version?

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Anybody-using-squid-on-openWRT-tp4667335.html
Sent from the Squid - Users mailing list archive at Nabble.com.
Re: [squid-users] Re: Never used Squid, need to access it
If you don't know where squid.conf is, you can locate it by running:

    find / -name squid.conf 2>/dev/null

It will print the full path to the config file. Once found, you will need to know how to manage it. Your server might be running an old Squid version, so pay attention to newer ACL types that your Squid may not accept.

2014-07-25 7:59 GMT-03:00 babajaga augustus_me...@yahoo.de:

> how to actually access the software itself.

Please be more specific. What do you want to know or achieve? (Usually the config files are found either in /etc or in /usr/local/squid/etc.) Search for squid.conf; that's the entry point for the features used. Depending on whether Squid has been installed from a binary package or not, you might also find sources.

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Never-used-Squid-need-to-access-it-tp4667025p4667026.html
Sent from the Squid - Users mailing list archive at Nabble.com.
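The `2>/dev/null` matters because a root-wide find emits "Permission denied" noise on unreadable directories. The same pattern against a throwaway tree (paths made up for the demo):

```shell
# Create a fake tree containing a squid.conf, then locate it.
mkdir -p /tmp/fakeroot/etc/squid
touch /tmp/fakeroot/etc/squid/squid.conf
find /tmp/fakeroot -name squid.conf 2>/dev/null
# -> /tmp/fakeroot/etc/squid/squid.conf
```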
Re: [squid-users] YouTube Resolution Locker
Yes, as the YouTube accelerator cache does. Only YouTube URLs need to be rewritten, so you don't need to forward all URLs to the Store-ID helper.

2014-07-25 23:25 GMT-03:00 Amm ammdispose-sq...@yahoo.com:

On 07/25/2014 09:03 PM, Stakres wrote:
> Hi All,
> Free API to lock resolution in YouTube players via your preferred Squid cache.
> https://sourceforge.net/projects/youtuberesolutionlocker/

BIG WARNING: I looked at the script out of curiosity. It sends all queries to storeid.unveiltech.com in the background.

Amm
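A hedged squid.conf sketch of limiting a Store-ID helper to YouTube traffic (Squid 3.4 directive names; the helper path and the exact domain list are assumptions, not from the thread):

```
acl yt_urls dstdomain .youtube.com .googlevideo.com
store_id_program /usr/local/bin/storeid_helper
store_id_children 5
store_id_access allow yt_urls
store_id_access deny all
```

With the `store_id_access` ACL in place, only matching requests are sent to the helper; everything else bypasses it entirely.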
Re: [squid-users] Squid v3.3.8 SSL Bumping Issues
> Fundamentally, my intent is to set up Squid for home use to block advertising, malware, and in particular, perform content adaptation. One of my specific goals is to modify search URL paths to restrict explicit search returns (e.g. affixing safe=active to any Google search path).

Hi David,

I did some work to filter Google explicit search by DNS hijacking and tinyproxy. I redirect google.* to nosslsearch.google.com and use my modified version of tinyproxy, which transparently intercepts *only* Google traffic for now. I did it for an embedded platform. If you'd like to check it out and test it: https://github.com/polaco1782/tinyproxy

2014-07-08 23:17 GMT-03:00 David Marcos davem.busin...@gmail.com:

Hi,

I have been attempting to configure SSL bumping with Squid v3.3.8. I have a well-configured Squid proxy for HTTP and HTTP intercept proxying. I am now trying to expand the configuration to bump SSL connections. I believe I have the basics of the configuration correct for both direct HTTPS proxying and intercepted HTTPS, but am having a few issues that I would appreciate some input on. Specifically:

a. HTTPS Page Rendering: Some HTTPS pages load fine. However, I have found that if I try to log in to online banking or other secure pages, either (1) the page does not render properly (I get flat, unorganized text) or (2) the page simply does not load. With respect to the latter, some pages simply bring me right back to the login page; there seems to be some kind of behind-the-scenes redirection that is being rejected and preventing logging in. What recommendations might anyone have to tweak my configuration to address these issues?

b. HTTP Strict Transport Security (HSTS): Some pages flat-out reject any SSL bumping due to HSTS. I am using Chrome, which I'm sure aggravates the issue. Is there a way to configure Squid to get around HSTS? (Yes, I know this may be a dumb question given how HSTS works, but I would appreciate any insight.)
Fundamentally, my intent is to set up Squid for home use to block advertising, malware, and in particular, perform content adaptation. One of my specific goals is to modify search URL paths to restrict explicit search returns (e.g. affixing safe=active to any Google search path). I have additionally configured ICAP with SquidClamav, multiple ACLs for blocking ads and malware, and SquidGuard for additional domain and URL blocking. SquidGuard is also successfully manipulating *unencrypted* Google, Yahoo, and Bing URL paths to insert commands to suppress explicit search returns. (I should note that when I tested SSL bumping, I disabled ICAP, SquidGuard, and the ad/malware ACLs; the issues described above persisted.)

Below is my squid.conf file to help out. Thanks in advance,

Dave

    #BEGIN FILE#
    hosts_file /etc/hosts
    visible_hostname proxyserver
    shutdown_lifetime 5 seconds
    coredump_dir /tmp
    dns_nameservers 192.168.1.1 208.67.222.222 208.67.220.220
    half_closed_clients off
    negative_ttl 0
    negative_dns_ttl 2 minutes
    http_port 127.0.0.1:3128
    http_port 192.168.1.1:3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid3/certs/cert.crt key=/etc/squid3/certs/cert.key
    http_port 192.168.1.1:3129 intercept
    https_port 192.168.1.1:3130 intercept ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid3/certs/cert.crt key=/etc/squid3/certs/cert.key
    sslcrtd_program /usr/lib/squid3/ssl_crtd -s /disk/dyn-certs/sslcrtd_db -M 4MB
    sslcrtd_children 5
    udp_incoming_address 192.168.1.1
    pinger_enable off
    forwarded_for delete
    via off
    memory_replacement_policy heap GDSF
    cache_replacement_policy heap LFUDA
    maximum_object_size_in_memory 1 MB
    minimum_object_size 0 KB
    maximum_object_size 64 MB
    memory_pools off
    cache_mem 256 MB
    cache_dir aufs /disk/squid-cache 25000 32 512
    cache_swap_low 95
    cache_swap_high 97
    ipcache_size 10240
    fqdncache_size 2048
    quick_abort_min 0 KB
    quick_abort_max 0 KB
    max_filedescriptors 4096
    read_ahead_gap 512 KB
    client_lifetime 6 hours
    connect_timeout 10 seconds
    log_icp_queries off
    buffered_logs on
    debug_options ALL,1
    logformat squid %tg %6tr %A %Ss/%03Hs UA=%{User-Agent}h XFF=%{X-Forwarded-For}h CKE=- %rm %ru %un %Sh/%A %mt BYTES=%st
    access_log stdio:/var/log/squid/access.log squid
    cache_log /var/log/squid/cache.log
    cache_store_log none #/var/log/squid/store.log
    icap_enable on
    icap_send_client_ip on
    icap_send_client_username on
    icap_client_username_encode off
    icap_client_username_header X-Authenticated-User
    icap_preview_enable on
    icap_preview_size 1024
    icap_service sqclamav_req reqmod_precache bypass=1 icap://127.0.0.1:1344/squidclamav
    adaptation_access sqclamav_req allow all
    icap_service sqclamav_resp respmod_precache bypass=1 icap://127.0.0.1:1344/squidclamav
    adaptation_access sqclamav_resp allow all
    refresh_pattern -i
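On the safe=active goal: a minimal sketch of a rewrite helper in the pre-Squid-3.4 `url_rewrite_program` style (the helper echoes the possibly rewritten URL per request line). Everything here is hypothetical: file name, patterns, and the placement of the helper are all made up for illustration.

```shell
# Write a minimal, hypothetical url_rewrite_program.
cat > /tmp/safesearch.sh <<'EOF'
#!/bin/sh
# Squid sends "URL client/fqdn user method" per line; answer with the
# (possibly rewritten) URL. Loop ends only at EOF.
while read url rest; do
  case "$url" in
    *google.*/search*safe=active*) echo "$url" ;;      # already safe
    *google.*/search\?*) echo "${url}&safe=active" ;;  # append to query string
    *) echo "$url" ;;                                  # leave everything else alone
  esac
done
EOF
chmod +x /tmp/safesearch.sh
echo "http://www.google.com/search?q=test 10.0.0.1/- - GET" | /tmp/safesearch.sh
# -> http://www.google.com/search?q=test&safe=active
```

Note this only works on unencrypted search traffic (or after a successful bump); HSTS-pinned HTTPS search is exactly the hard case described above.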
Re: [squid-users] Detecting proxy server
If you issue:

    lynx -head -dump http://www.google.com

and the proxy server is set to add headers, you will see them, even if it is transparent:

    HTTP/1.1 302 Found
    Cache-Control: private
    Content-Type: text/html; charset=UTF-8
    Location: http://www.google.com.br/?gfe_rd=cr&ei=xJy6U66VDqKk8weLnYGIAw
    Content-Length: 262
    Date: Mon, 07 Jul 2014 13:12:36 GMT
    Server: GFE/2.0
    Alternate-Protocol: 80:quic
    X-Cache: MISS from firewall.securegateway.localnet
    Via: 1.1 firewall.securegateway.localnet (squid/3.4.5)
    Connection: close

2014-07-07 9:58 GMT-03:00 Vinay C vinayc.ma...@gmail.com:

Hi All,

Is there any way I can detect which proxy server (preferably by name, like Squid, Websense etc.) my HTTP request passed through, either at the web server side or at the request-initiating client side?

Thanks,
-Vinay
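Scripted, the same check is just a grep over the header dump for the two headers Squid adds. A self-contained sketch using a canned response in place of live traffic (the file name is made up; the headers are from the dump above):

```shell
# Simulate the header check offline with a canned response.
cat > /tmp/resp.txt <<'EOF'
HTTP/1.1 302 Found
Content-Type: text/html; charset=UTF-8
X-Cache: MISS from firewall.securegateway.localnet
Via: 1.1 firewall.securegateway.localnet (squid/3.4.5)
EOF
grep -iE '^(Via|X-Cache):' /tmp/resp.txt
# prints the two proxy-added header lines, including the squid version
```

Note a proxy configured with `via off` and `forwarded_for delete` (as in the SSL-bumping config earlier in this digest) suppresses exactly these fingerprints.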
Re: [squid-users] Problem with HTTP redirection and IPTABLES?
1 - Your iptables is missing the DNAT target; you may try using the REDIRECT target instead.
2 - In Squid 3.1+ the transparent option has been split. Use 'intercept' to catch DNAT packets.

2014-07-03 11:25 GMT-03:00 Mark jensen ngiw2...@hotmail.com:

I have followed this tutorial to redirect HTTP traffic to Squid listening on 8080: http://wiki.squid-cache.org/ConfigExamples/Intercept/AtSource

My questions are:

1- When I try this command:

    iptables -t nat -A OUTPUT -p tcp --dport 80 -j DNAT --to-destination SQUIDIP:8080

an error returns: unknown option --to-destination (iptables version 1.4.7)

2- I'm using Squid 3.1.10. What option should I choose:

    http_port 8080 transparent

or

    http_port 8080 intercept

mark
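Putting both answers together for the at-source case, a sketch (run as root; the `-m owner` exclusion is an assumed addition so Squid's own outgoing connections are not looped back into the proxy):

```
# squid.conf: 'intercept' is the 3.1+ name for NAT-diverted traffic.
http_port 8080 intercept

# iptables: REDIRECT rewrites to a local port, so no --to-destination is needed.
iptables -t nat -A OUTPUT -p tcp --dport 80 \
         -m owner ! --uid-owner squid -j REDIRECT --to-ports 8080
```

`transparent` still parses on 3.1.10 but is the deprecated spelling of the same thing; `intercept` is the forward-compatible choice.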
Re: [squid-users] Squid auth_param children not closing
Your script is not shutting down when Squid closes its stdin/stdout, so Squid starts a new copy every time.

2014-06-23 19:56 GMT-03:00 Romeo Mihalcea romeo.mihal...@gmail.com:

I have this authentication setup:

    auth_param basic program /usr/bin/python /etc/auth.py
    auth_param basic children 5 startup=5 idle=1
    auth_param basic realm Please login
    auth_param basic credentialsttl 1 hours

Quite frequently I check with a cron job for changes in my users' data (passwords etc.) and I issue a squid3 -k reconfigure whenever I detect a change. The problem is that each time I issue this command, Squid spawns 5 new authentication listeners (/usr/bin/python /etc/auth.py), and it quickly adds up to thousands given the amount of activity we have on our servers. Any ideas on how I can do this better?
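The fix the reply describes is simply letting the read loop end at EOF. A minimal basic-auth helper sketch in shell (the real helper here is Python, where the equivalent is a `for line in sys.stdin:` loop; the file name and always-OK logic below are hypothetical):

```shell
cat > /tmp/auth.sh <<'EOF'
#!/bin/sh
# Basic auth helper protocol: read "username password", answer OK or ERR.
# `read` returns non-zero at EOF, so the helper exits when Squid closes
# its stdin during "squid -k reconfigure" or shutdown.
while read user pass; do
  echo "OK"          # placeholder: real credential check goes here
done
EOF
chmod +x /tmp/auth.sh
printf 'alice secret\n' | /tmp/auth.sh
# -> OK
```

A helper that instead blocks on something other than stdin (a socket, a sleep loop) never notices the closed pipe, which is exactly how orphans accumulate across reconfigures.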
Re: [squid-users] memory_cache_shared no support for atomic operations
Ah OK, thank you very much. If someone could fix the documentation and add a note that only 64-bit is supported...

Thanks

2014-06-10 1:47 GMT-03:00 Amos Jeffries squ...@treenet.co.nz:

On 10/06/2014 10:10 a.m., Eliezer Croitoru wrote:
> On 06/10/2014 12:43 AM, Cassiano Martin wrote:
>> Yes, it's a 32-bit custom-built OS.
> As far as I can remember the shared memory needed a 64-bit OS and HW. I am not 100% sure yet.

Yes, 64-bit atomics are required. And for now it is also restricted to GNU-style atomic operations, although with a bit of patching these could be replaced with C++11 standardized atomics.

Amos
Re: [squid-users] Re: Squid 3.4.x Videos/Music Booster
I've rewritten your scripts as a native C++ binary. If you'd like to include it in your package, feel free to contact me.

Thanks

2014-06-09 16:42 GMT-03:00 Stakres vdoc...@neuf.fr:

Hi All,

Version *1.02* of https://sourceforge.net/projects/squidvideosbooster/ is released, including:
- Scripts in Perl, Ruby and Python.
- Special key 'porn' to de-duplicate (or not) porn web sites.

Bye Fred

--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Squid-3-4-x-Videos-Music-Booster-tp4666154p4666272.html
Sent from the Squid - Users mailing list archive at Nabble.com.
Re: [squid-users] Re: Squid 3.4.x Videos/Music Booster
Anyway, here it is. Compile it with:

    g++ booster.cpp -lcurl

http://pastebin.com/raw.php?i=5HKk8rQq

Admins, sorry about this off-topic!

2014-06-11 1:10 GMT-03:00 Cassiano Martin cassi...@polaco.pro.br:

> I've rewritten your scripts as a native C++ binary. If you'd like to include it in your package, feel free to contact me.
>
> Thanks
>
> 2014-06-09 16:42 GMT-03:00 Stakres vdoc...@neuf.fr:
>
> Hi All,
>
> Version *1.02* of https://sourceforge.net/projects/squidvideosbooster/ is released, including:
> - Scripts in Perl, Ruby and Python.
> - Special key 'porn' to de-duplicate (or not) porn web sites.
>
> Bye Fred
>
> --
> View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Squid-3-4-x-Videos-Music-Booster-tp4666154p4666272.html
> Sent from the Squid - Users mailing list archive at Nabble.com.
[squid-users] memory_cache_shared no support for atomic operations
Hello there. I'm trying to set up my Squid workers to share cache_mem, but when I activate memory_cache_shared I get this error:

    FATAL: memory_cache_shared is on, but no support for atomic operations detected

What am I missing? Does this imply something related to kernel configuration? I've not found any tips regarding this.

Thanks
Re: [squid-users] external_acl_mode
Use deny_info for that. An external ACL helper just tells Squid what is OK and what is not. You can pass some data back to Squid as key=value pairs, but I think they are only used to replace tags on response pages.

2014-05-25 6:51 GMT-03:00 Jose-Marcio Martins jose-marcio.mart...@mines-paristech.fr:

On 05/25/2014 01:08 AM, Luis Daniel Lucio Quiroz wrote:
> use url_rewrite_*

I'm converting a url_rewrite_* program to external_acl*. Thanks either way.

2014-05-24 13:00 GMT-04:00 Jose-Marcio Martins jose-marcio.mart...@mines-paristech.fr:

Hello,

I'm writing an external_acl helper to use with our proxy (combined with http_access). Is there a way to specify an alternative URL, as redirect helpers do, when access to the requested URL is denied?

Thanks

José-Marcio
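A hedged squid.conf sketch of the deny_info approach (the helper path, ACL name, and block-page URL are all made up; the `status:URL` redirect form needs Squid 3.2 or later):

```
external_acl_type myfilter %URI /usr/local/bin/my_helper
acl blocked external myfilter
http_access deny blocked
deny_info 302:http://blockpage.example.com/denied.html blocked
```

The deny_info line is matched against the last ACL that caused the denial, so the redirect fires exactly when the helper's verdict blocks the request.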
Fwd: [squid-users] External ACLs strange behavior
Amos,

The issue seems to be resolved, but for the sake of curiosity: I had some sysctl tweaks in place,

    net.netfilter.nf_conntrack_tcp_timeout_established = 3600
    net.netfilter.nf_conntrack_tcp_loose = 0

Does that interfere with the helpers? I'm not sure about it, but why is a socket opened on loopback for each helper? After I changed these values to

    net.netfilter.nf_conntrack_tcp_timeout_established = 7440
    net.netfilter.nf_conntrack_tcp_loose = 1

Squid completely stopped reloading the helpers every hour. I'm not sure if I'm right; maybe I'm just saying BS. :-)

Thanks

-- Forwarded message --
From: Cassiano Martin cassi...@polaco.pro.br
Date: 2014-05-19 13:40 GMT-03:00
Subject: Fwd: [squid-users] External ACLs strange behavior
To: squid-users@squid-cache.org

I'll test it again without this check; let's see if it works normally then. Thanks!

-- Forwarded message --
From: Cassiano Martin cassi...@polaco.pro.br
Date: 2014-05-19 13:39 GMT-03:00
Subject: Fwd: [squid-users] External ACLs strange behavior
To: squid-users@squid-cache.org

Hmm, I didn't know about that.

-- Forwarded message --
From: Amos Jeffries squ...@treenet.co.nz
Date: 2014-05-19 11:49 GMT-03:00
Subject: Re: [squid-users] External ACLs strange behavior
To: squid-users@squid-cache.org

On 20/05/2014 12:46 a.m., Cassiano Martin wrote:
> I'm still having a strange issue with external ACLs. Sometimes I get this in my squid logs:
>
>     ERR message=
>     2014/05/19 08:33:23 kid1| WARNING: securegateway_cfs #Hlpr0 exited
>     2014/05/19 08:33:23 kid1| Too few securegateway_cfs processes are running (need 1/5)
>     2014/05/19 08:33:23 kid1| Starting new helpers
>     2014/05/19 08:33:23 kid1| helperOpenServers: Starting 1/5 'squid_filter' processes
>     2014/05/19 08:33:26 kid1| WARNING: securegateway_cfs #Hlpr0 exited
>     2014/05/19 08:33:26 kid1| Too few securegateway_cfs processes are running (need 1/5)
>     2014/05/19 08:33:26 kid1| Closing HTTP port [::]:3128
>     2014/05/19 08:33:26 kid1| Closing HTTP port [::]:3129
>     2014/05/19 08:33:26 kid1| storeDirWriteCleanLogs: Starting...
>     2014/05/19 08:33:26 kid1| Finished. Wrote 0 entries.
>     2014/05/19 08:33:26 kid1| Took 0.00 seconds ( 0.00 entries/sec).
>     FATAL: The securegateway_cfs helpers are crashing too rapidly, need help!
>
> And this is my fake external ACL for testing purposes:
>
>     int main(int argc, char** argv)
>     {
>         string line;
>         while(getline(cin, line)) {
>             if(line.length() > 1)
>                 CFS::print_response_err();
>             else
>                 exit(0);
>         }
>     }
>
> Why does Squid say the redirector exited, and catch a FATAL? This code does not exit until Squid closes stdin.

No, this code also exits when Squid delivers only 1 byte of input, for example when you pass it a single format code which has no data available. Squid will send the line, which is 1 character long.

Amos
[squid-users] External ACLs strange behavior

I'm still having a strange issue with external ACLs. Sometimes I get this in my squid logs:

    ERR message=
    2014/05/19 08:33:23 kid1| WARNING: securegateway_cfs #Hlpr0 exited
    2014/05/19 08:33:23 kid1| Too few securegateway_cfs processes are running (need 1/5)
    2014/05/19 08:33:23 kid1| Starting new helpers
    2014/05/19 08:33:23 kid1| helperOpenServers: Starting 1/5 'squid_filter' processes
    2014/05/19 08:33:26 kid1| WARNING: securegateway_cfs #Hlpr0 exited
    2014/05/19 08:33:26 kid1| Too few securegateway_cfs processes are running (need 1/5)
    2014/05/19 08:33:26 kid1| Closing HTTP port [::]:3128
    2014/05/19 08:33:26 kid1| Closing HTTP port [::]:3129
    2014/05/19 08:33:26 kid1| storeDirWriteCleanLogs: Starting...
    2014/05/19 08:33:26 kid1| Finished. Wrote 0 entries.
    2014/05/19 08:33:26 kid1| Took 0.00 seconds ( 0.00 entries/sec).
    FATAL: The securegateway_cfs helpers are crashing too rapidly, need help!

And this is my fake external ACL for testing purposes:

    int main(int argc, char** argv)
    {
        string line;
        while(getline(cin, line)) {
            if(line.length() > 1)
                CFS::print_response_err();
            else
                exit(0);
        }
    }

Why does Squid say the redirector exited, and catch a FATAL? This code does not exit until Squid closes stdin.

Thanks
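Given Amos's explanation later in this thread, a helper that answers every line (even a 1-byte one) and leaves the loop only at EOF avoids the "crashing too rapidly" FATAL. A shell sketch of that shape (the C++ helper's `CFS::print_response_err()` call is replaced by a plain ERR reply; the file name is made up):

```shell
cat > /tmp/extacl.sh <<'EOF'
#!/bin/sh
# External ACL helper protocol: exactly one reply line per request line.
# Never exit on short or empty input; only EOF ends the loop.
while read line; do
  echo "ERR"
done
EOF
chmod +x /tmp/extacl.sh
printf '\nx\n' | /tmp/extacl.sh   # two input lines -> two ERR replies
```

The difference from the code above is that a short line gets an answer instead of `exit(0)`, so Squid never sees the helper disappear mid-conversation.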
[squid-users] Re: Segfault in CommSelectEngine::checkEvents
    b7fff000-b8000000 rw-p 00022000 00:01 114   /lib/ld-2.17.so
    bffdf000-c0000000 rw-p 00000000 00:00 0     [stack]

    Program received signal SIGABRT, Aborted.
    0xb7fdb424 in __kernel_vsyscall ()
    (gdb)

Sorry about the symbols; the binary is stripped.

2014-03-20 12:02 GMT-03:00 Cassiano Martin cassi...@polaco.pro.br:

Guys, I'm facing a very strange issue. I have compiled Squid versions from 3.3.3 to 3.4.4, and all of them crash in the same place. The architecture is mips64. I've attached a backtrace:

    Program received signal SIGSEGV, Segmentation fault.
    0x104771ec in CommSelectEngine::checkEvents (this=0x7fff6910, timeout=24) at comm.cc:2058
    2058    comm.cc: No such file or directory.
    (gdb) bt
    #0  0x104771ec in CommSelectEngine::checkEvents (this=0x7fff6910, timeout=24) at comm.cc:2058
    #1  0x10229a90 in EventLoop::checkEngine (this=0x7fff6920, engine=0x7fff6910, primary=true) at EventLoop.cc:55
    #2  0x1022a004 in EventLoop::runOnce (this=0x7fff6920) at EventLoop.cc:129
    #3  0x10229d9c in EventLoop::run (this=0x7fff6920) at EventLoop.cc:99
    #4  0x102fa25c in SquidMain (argc=2, argv=0x7fff6af4) at main.cc:1534
    #5  0x102f91c0 in SquidMainSafe (argc=2, argv=0x7fff6af4) at main.cc:1262
    #6  0x102f9170 in main (argc=2, argv=0x7fff6af4) at main.cc:1254

It happens when I open some connections and close the web browser. I have no idea what I can do to solve this. Squid is in transparent mode.

Thanks
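For a backtrace with symbols, the build can be redone with debug info and run unstripped straight from the source tree. A sketch only; the flags and in-tree binary path are assumptions:

```
./configure CFLAGS='-g -O0' CXXFLAGS='-g -O0'
make -j4
# Run the unstripped in-tree binary in no-daemon mode under gdb,
# instead of installing (and stripping) it first:
gdb --args ./src/squid -N -d1
```

With `-O0` the line numbers in frames like `comm.cc:2058` then point at real, un-inlined statements, which makes the crash site far easier to interpret.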
Fwd: [squid-users] Fwd: Segfault in CommSelectEngine::checkEvents
Squid is in both modes. It accepts a directly configured proxy, and interception mode. Here's my configuration file (squid.conf). It's auto-generated by a system daemon which I've written.

acl localnet src 10.0.0.0/8
acl localnet src 172.16.0.0/12
acl localnet src 192.168.0.0/16
acl localnet src fc00::/7
acl localnet src fe80::/10
acl SSL_ports port 443
acl CONNECT method CONNECT
dns_nameservers 127.0.0.1
hierarchy_stoplist cgi-bin ?
http_access allow manager localhost
http_access deny manager
http_access deny CONNECT !SSL_ports
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
acl localdst dst 127.0.0.1/32
acl localdst dst 192.168.100.0/24
acl localdst dst 192.168.150.0/24
acl localdst dst 192.168.50.0/24
acl localdst dst 192.168.175.0/24
acl localdst dst 172.16.10.0/24
acl source_Vendas_443 src 192.168.50.3
acl dport_Vendas_443 port 443
tcp_outgoing_address 189.27.236.136 source_Vendas_443 dport_Vendas_443 !localdst
acl source_Devel443 src 192.168.150.177
acl source_Devel443 src 192.168.150.13
acl source_Devel443 src 192.168.150.8
acl source_Devel443 src 192.168.150.95
acl source_Devel443 src 192.168.150.196
acl dport_Devel443 port 443
tcp_outgoing_address 189.27.236.136 source_Devel443 dport_Devel443 !localdst
acl source_Wireless443 src 192.168.175.242
acl source_Wireless443 src 192.168.175.21
acl dport_Wireless443 port 443
tcp_outgoing_address 187.113.225.9 source_Wireless443 dport_Wireless443 !localdst
acl source_WiFi443 src 192.168.175.0/24
acl dport_WiFi443 port 443
tcp_outgoing_address 189.27.236.136 source_WiFi443 dport_WiFi443 !localdst
acl source_voip src 192.168.50.33
tcp_outgoing_address 187.113.225.9 source_voip !localdst
acl source_L4D2 src 192.168.100.78
tcp_outgoing_address 187.113.225.9 source_L4D2 !localdst
external_acl_type securegateway_cfs ipv4 %DST %PROTO %PORT /usr/bin/squid_filter
acl grp_IDB src 192.168.100.0/24 192.168.150.0/24 192.168.175.0/24 172.16.10.0/24
acl cat_Teste external securegateway_cfs 09,0B,0E,10,12,19,56,58,5C
http_access deny cat_Teste grp_IDB
acl out_balance random 1/2
tcp_outgoing_address 187.113.225.9 out_balance !localdst
tcp_outgoing_address 189.27.236.136 out_balance !localdst
http_access allow localnet
http_access allow localhost
http_access deny all
pid_filename /var/run/squid.pid
half_closed_clients off
memory_replacement_policy heap GDSF
cache_replacement_policy heap LFUDA
cache_effective_user squid
cache_effective_group squid
cache_mem 8 MB
memory_pools off
workers 1
visible_hostname firewall.securegateway.localnet
coredump_dir none
access_log daemon:
logfile_daemon /usr/bin/squid_logger
cache_log /var/squid/logs/cache.log
http_port 3128
http_port 3129 intercept
qos_flows local-hit=0x1c

NB: when testing, I also disabled ACLs, external ACLs, and the logging daemon, but no success at all. Squid still crashes in the same place. 2014-04-03 19:12 GMT-03:00 Eliezer Croitoru elie...@ngtech.co.il: On 03/20/2014 05:42 PM, Cassiano Martin wrote: Squid is in transparent mode. tproxy or redirect targets on iptables? Eliezer
[squid-users] Fwd: Segfault in CommSelectEngine::checkEvents
I'm also facing this issue on x86. At some point squid crashes and dumps the same error. Here's how I've configured squid:

SQUID_CONF_OPT = --sysconfdir=/etc/squid \
    --localstatedir=/var \
    --datadir=/usr/share/squid \
    --enable-linux-netfilter \
    --enable-removal-policies=lru,heap \
    --with-filedescriptors=65535 \
    --disable-ident-lookups \
    --with-krb5-config=no \
    --enable-auth-basic=fake getpwnam \
    --enable-auth-digest=file \
    --enable-auth-negotiate=wrapper \
    --enable-auth-ntlm=fake \
    --enable-basic-auth-helpers=NCSA \
    --disable-strict-error-checking \
    --enable-storeio=aufs \
    --enable-disk-io=DiskThreads \
    --enable-arp-acl \
    --enable-delay-pools \
    --enable-kill-parent-hack \
    --enable-http-violations \
    --enable-follow-x-forwarded-for \
    --enable-zph-qos \
    --with-netfilter-conntrack \
    --enable-external-acl-helpers=file_userip

I'm really out of ideas; I've never faced an issue like this. -- Forwarded message -- From: Cassiano Martin cassi...@polaco.pro.br Date: 2014-03-20 12:02 GMT-03:00 Subject: Segfault in CommSelectEngine::checkEvents To: squid-users@squid-cache.org Guys, I'm facing a very strange issue. I have compiled squid, from version 3.3.3 to 3.4.4, and ALL of them crash in the same place. The architecture is mips64. I've attached a backtrace.

Program received signal SIGSEGV, Segmentation fault.
0x104771ec in CommSelectEngine::checkEvents (this=0x7fff6910, timeout=24) at comm.cc:2058
2058    comm.cc: No such file or directory.
(gdb) bt
#0 0x104771ec in CommSelectEngine::checkEvents (this=0x7fff6910, timeout=24) at comm.cc:2058
#1 0x10229a90 in EventLoop::checkEngine (this=0x7fff6920, engine=0x7fff6910, primary=true) at EventLoop.cc:55
#2 0x1022a004 in EventLoop::runOnce (this=0x7fff6920) at EventLoop.cc:129
#3 0x10229d9c in EventLoop::run (this=0x7fff6920) at EventLoop.cc:99
#4 0x102fa25c in SquidMain (argc=2, argv=0x7fff6af4) at main.cc:1534
#5 0x102f91c0 in SquidMainSafe (argc=2, argv=0x7fff6af4) at main.cc:1262
#6 0x102f9170 in main (argc=2, argv=0x7fff6af4) at main.cc:1254

It happens when I open some connections and close the web browser. I have no idea what I can do to solve this. Squid is in transparent mode. Thanks
[squid-users] Fwd: Segfault in CommSelectEngine::checkEvents
Guys, I'm facing a very strange issue. I have compiled squid, from version 3.3.3 to 3.4.4, and ALL of them crash in the same place. The architecture is mips64, big endian, n32 binary format. I've attached a backtrace.

Program received signal SIGSEGV, Segmentation fault.
0x104771ec in CommSelectEngine::checkEvents (this=0x7fff6910, timeout=24) at comm.cc:2058
2058    comm.cc: No such file or directory.
(gdb) bt
#0 0x104771ec in CommSelectEngine::checkEvents (this=0x7fff6910, timeout=24) at comm.cc:2058
#1 0x10229a90 in EventLoop::checkEngine (this=0x7fff6920, engine=0x7fff6910, primary=true) at EventLoop.cc:55
#2 0x1022a004 in EventLoop::runOnce (this=0x7fff6920) at EventLoop.cc:129
#3 0x10229d9c in EventLoop::run (this=0x7fff6920) at EventLoop.cc:99
#4 0x102fa25c in SquidMain (argc=2, argv=0x7fff6af4) at main.cc:1534
#5 0x102f91c0 in SquidMainSafe (argc=2, argv=0x7fff6af4) at main.cc:1262
#6 0x102f9170 in main (argc=2, argv=0x7fff6af4) at main.cc:1254

It happens when I open some connections, for example open a page and click a lot of links, and then close the web browser. I have no idea what I can do to solve this. Squid is in transparent mode. Thanks
Re: [squid-users] Squid Disconnection problem for messenger
Do you mean MSN Messenger? No, it won't pass through squid - only if you block port 1863. Are you sure it is squid causing this? Amod Kulkarni wrote: Hello, I am using a squid proxy in our network. For the last few days we have been facing a problem with our squid proxy server: it gets disconnected continuously. We have not set any access list or iptables on squid, but the problem still occurs. Could you please tell me how I can get out of this problem? Please look into my request and suggest a solution as early as possible. -Amod
Re: [squid-users] Problem while accessing the site
Yup, and this could be the same problem you are having with your messaging software - it looks like DNS failures. Amos Jeffries wrote: Amod Kulkarni wrote: Hello Squid, at present I have a problem accessing a URL. The problem details are given below: The requested URL could not be retrieved. While trying to retrieve the URL: http://googlewebtoolkit.blogspot.com/ The following error was encountered: Unable to determine IP address from host name for googlewebtoolkit.blogspot.com The dnsserver returned: Name Error: The domain name does not exist. This means that: The cache was not able to resolve the hostname presented in the URL. Check if the address is correct. Please suggest a solution for this. Check that your DNS server is working for that domain. Squid uses either the system default resolver(s) or some you might have configured. Between them they are not providing any DNS data for that domain. Amos
Re: [squid-users] Block Windows Live Messenger with Squid
Messenger uses port 1863 TCP for communication, and some HTTPS SOAP requests to M$ servers. You need to block this port using iptables:

iptables -A FORWARD -p tcp --dport 1863 -j DROP
iptables -A FORWARD -p tcp --sport 1863 -j DROP

adnann5 wrote: Hi Guys, I have a running, transparently working copy of squid 2.6 STABLE 19 on a Linux FC9 box. I wanted to block MSN/Windows Live Messenger through it, so I've added the following to my squid.conf:

acl msnmime req_mime_type ^application/x-msn-messenger
acl msngw url_regex -i gateway.dll
http_access deny msnmime
http_access deny msngw

but messenger is still signing in... Does anybody have another solution? Regards
Re: [squid-users] Squid logging to a database
You could write one yourself, as squid supports a 'logfile' daemon, or try MySAR. It reads the squid logfile and imports all its content into a MySQL database. Get it at mysar.sf.net, and the fast C-code version at www.polaco.pro.br/mysar. Wundy wrote: Hi! I currently have a transparent squid in place that uses squidGuard as a blocking filter. The squid version is 2.6 STABLE. I would like to import the access.log into a database for safe keeping and easy access with queries - a request from the company I am doing my internship with. I have read in various places that Squid 3.0 would support this, but I can't find any solid info on it. Is it still in development? And if so, is it stable enough to try? My other option would be to write a perl script that reads access.log and imports it into the database. I have a question about this: is it possible to read access.log and afterwards clear the file, so that I can just read the entire file every time I sync the database? How would squid react to this clearing? Also, Squid rotates the logfiles. I currently have logfile_rotate set to 0, so it wouldn't rotate then? But everything gets logged to access.log.1. Can I still change this?
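The 'logfile' daemon route mentioned above works by Squid launching the daemon (the program named by logfile_daemon, used when access_log is set to daemon:) and feeding it one command per line on stdin. As I recall the stock daemon protocol, the first byte of each line is an opcode - 'L' for a log line to record, with codes like 'R'/'T'/'O'/'F' covering rotate/truncate/reopen/flush - but verify the codes against your Squid release before building a database importer on top. A sketch of the dispatch:

```python
import sys

def parse_command(raw: str):
    """Split one daemon-protocol line into (opcode, payload).
    Opcodes assumed from memory: L=log line, R=rotate, T=truncate,
    O=reopen, F=flush."""
    if not raw:
        return None, ""
    return raw[0], raw[1:]

def main() -> None:
    # Loop until Squid closes our stdin.
    for raw in sys.stdin:
        op, payload = parse_command(raw.rstrip("\n"))
        if op == "L":
            # A real importer would INSERT the payload into MySQL here;
            # this sketch just echoes it to stderr.
            sys.stderr.write("access-log entry: %s\n" % payload)
        # 'R'/'T'/'O'/'F' would rotate/truncate/reopen/flush the backing store.

if __name__ == "__main__":
    main()
```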
Re: [squid-users] Strange behaviour with website and squid
Got it using lynx. Thanks. Henrik Nordstrom wrote: On tis, 2008-05-13 at 07:38 -0300, Cassiano Martin wrote: Got the same problem here; after loading the site in IE, it doesn't show up when reloading. header:

HTTP/1.1 200 OK
Server: Oracle9iAS/9.0.2.2.0 Oracle HTTP Server
Last-Modified: Mon, 17 Sep 2007 13:17:10 GMT
Cache-Control: private
Content-Type: text/html
Content-Length: 160
Date: Tue, 13 May 2008 10:34:04 GMT
X-Varnish: 1353841216 1353839708
Age: 97
Via: 1.1 varnish
Connection: close

Is this the header that was sent to MSIE? Or is this header collected with some other tool? Regards Henrik
Re: [squid-users] Strange behaviour with website and squid
Got the same problem here; after loading the site in IE, it doesn't show up when reloading. header:

HTTP/1.1 200 OK
Server: Oracle9iAS/9.0.2.2.0 Oracle HTTP Server
Last-Modified: Mon, 17 Sep 2007 13:17:10 GMT
Cache-Control: private
Content-Type: text/html
Content-Length: 160
Date: Tue, 13 May 2008 10:34:04 GMT
X-Varnish: 1353841216 1353839708
Age: 97
Via: 1.1 varnish
Connection: close

I'm using squid3-stable1, transparent mode. Thomas Kirk wrote: Hi there list members. I have a very peculiar problem with our squid cache. When a user visits the homepage http://www.aarhuskommune.dk it loads perfectly. The second time the site is visited, nothing happens. The problem only occurs when we use IE or Opera - with Firefox everything works as expected. The web page loads fine if we bypass the squid cache in all browsers, which is why I write this email to the list :) We have been experimenting with cookies from the site. When we delete the cookie that is set, we can browse the website again. Very strange... If anybody has an idea what's going on, please advise - we are hitting a wall here. Please let me know if you need further details. Sincerely Thomas
Re: [squid-users] denied urls between two hours
Acl Type: time
Description: Time of day, and day of week
Usage: acl aclname time [day-abbreviations] [h1:m1-h2:m2]
day-abbreviations: S - Sunday, M - Monday, T - Tuesday, W - Wednesday, H - Thursday, F - Friday, A - Saturday
h1:m1 must be less than h2:m2
Example: acl ACLTIME time M 9:00-17:00
ACLTIME refers to Monday from 9:00 to 17:00.

E. Traas wrote: Hi, how can I use squid to deny listed URLs between two hours (e.g. 07:00 and 16:00)? Erwin
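Combining a time ACL with a URL list answers the original question directly. A squid.conf sketch - the ACL names and the /etc/squid/denied-urls.txt path are illustrative, not standard:

```
# Deny the listed URLs Monday-Friday between 07:00 and 16:00
acl badurls url_regex -i "/etc/squid/denied-urls.txt"
acl workhours time MTWHF 07:00-16:00
http_access deny badurls workhours
```

Order matters: this deny line must appear before any broader http_access allow rules that would otherwise match first.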
Re: [squid-users] http://www.synopsys.com connection timeout
Nothing here; everything works fine for me. I'm using squid 3.0 stable. Which squid version are you using? Thanks. [EMAIL PROTECTED] wrote: Clients are behind a firewall (OpenBSD PF) and access the web via a squid proxy. Accessing http://www.synopsys.com fails with a connection timeout error. The squid access log says TCP_MISS/504. A client outside the firewall can access this site. Is anyone else accessing this site via a squid proxy?
Re: [squid-users] how to check virus using squid
Google for HAVP. It's an anti-virus proxy which uses ClamAV. You can use it together with squid. Anil Saini wrote: how can one check for viruses on the host machines browsing through squid? just want to identify the viruses - Anil Saini M.E. - Software Systems B.E. - Electronics and Communication Project Assistant CISCO LAB Information Processing Center Unit BITS-PILANI
Re: [squid-users] per user quota
Or you can continue to develop the user quota system for MySAR. :-) Quota per IP address is working fine, but it's missing many other things. You can get it at: http://www.polaco.pro.br/mysar/testing PS: the code is a mess, look out ;-) Fabio Silva wrote: You can try http://www.ledge.co.za/software/squint/squish - but if it were native to squid, it would be better. On Mon, Apr 7, 2008 at 12:22 AM, Adrian Chadd [EMAIL PROTECTED] wrote: On Fri, Apr 04, 2008, Marco Berizzi wrote: Hi folks, I would like to know if there is any plan for adding per-user quota (per hour/day) to a future squid version. You can implement it using an external_acl and some log parsing. Incremental quota support may appear in a subsequent release. Adrian -- - Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support - - $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -
Re: [squid-users] Limiting download size
Tag Name: reply_body_max_size
Usage: reply_body_max_size (KB)
Description: This option specifies the maximum size of a reply body. It can be used to prevent users from downloading very large files, such as MP3s and movies. The reply size is checked twice. First, when we get the reply headers, we check the content-length value. If the content-length value exists and is larger than this parameter, the request is denied and the user receives an error message saying the request or reply is too large. If there is no content-length and the reply size exceeds this limit, the client's connection is simply closed and they will receive a partial reply.
Default: reply_body_max_size 0
If this parameter is set to zero (the default), there will be no limit imposed.

piyush joshi wrote: Dear All, I want to allow users of my squid server to download only those files from the internet which are less than 5 MB in size, but I do not know how to do this. Please help me so that I can add this feature to my squid server.
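Going by the documentation quoted above (value in KB, no ACL clause in that syntax), the original 5 MB question would come down to roughly the following. Note that later Squid versions changed this directive to take a size with units plus an allow/deny ACL list, so check the squid.conf.default shipped with your release:

```
# squid.conf fragment - older KB-based syntax, per the documentation quoted above
# 5 MB = 5120 KB; larger replies are denied (or cut short if no Content-Length)
reply_body_max_size 5120
```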
Re: [squid-users] How can I tell if snmp has been compiled into Squid?
Try squid -v - it should report the configure parameters. Ed Flecko wrote: Hi folks, I'm running OpenBSD 4.2 and have installed the Squid package using the pkg_add method. I'm trying to set up SNMP monitoring with no success. I keep getting an Invalid ACL type 'snmp_community' error message, so now I'm wondering if SNMP has been compiled in. Is there a command I can run on Squid to see what options have been compiled in? Thank you, Ed
Re: [squid-users] Squid + ClamAV
You could try HAVP http://www.server-side.de/ - at least it works fine for me. troxlinux wrote: there is not much information - do you have it implemented? what OS do you have? greetings 2008/3/25, Henrik Nordstrom [EMAIL PROTECTED]: My recommended method: Squid-3 + c-icap. There are other methods as well, such as viralator, but ICAP is much better. Regards Henrik
Re: [squid-users] problem with access_log
Are you looking at cache.log or access.log? It looks like you're in the wrong file. Ramashish Baranwal wrote: Hi, I am trying to prevent logging of certain URLs using acls on access_log. The corresponding part of my squid.conf looks like:

acl test_url url_regex .*test.*
# don't log test_url
access_log none test_url
# log others
access_log log-file-path squid

Squid, however, is not honoring the acl. It logs everything. The log for the request http://netdev.com/test/ looks like: 1204882748.408 RELEASE -1 15A550F13DEC5BEE462C4DDCA8645838 403 1204882748 0 1204882748 text/html 1091/1091 GET http://netdev.com/test/ What am I missing here? My squid version is squid/2.6.STABLE16. Thanks in advance, Ram
Re: [squid-users] Youtube video cache
Adrian Chadd wrote: On Mon, Mar 03, 2008, Cassiano Martin wrote: Hi all! Has anyone had success caching youtube videos? I tried it but it didn't work for me. I followed all of Adrian's steps, but no success at all. The trouble is that it's a moving target and I'm having to try and keep things updated. I'm trying to organise better, updated documentation, but it's only for paying clients at the present time. Trying to keep the documentation updated while keeping an eye on what they're up to requires time! I'm using squid: Squid Cache: Version 2.7.DEVEL0-20080303 configure options: '--enable-delay-pools' '--enable-cache-digests' '--enable-poll' '--disable-ident-lookups' '--enable-truncate' '--enable-removal-policies' '--enable-arp-acl' '--enable-ssl' Thanks. Okay, I'm trying to figure out what changed. If I find what is happening, I'll post to the list. Thanks Adrian.
[squid-users] Youtube video cache
Hi all! Has anyone had success caching youtube videos? I tried it but it didn't work for me. I followed all of Adrian's steps, but no success at all. I'm using squid: Squid Cache: Version 2.7.DEVEL0-20080303 configure options: '--enable-delay-pools' '--enable-cache-digests' '--enable-poll' '--disable-ident-lookups' '--enable-truncate' '--enable-removal-policies' '--enable-arp-acl' '--enable-ssl' Thanks.
Re: [squid-users] Squid 3.0 Stable1 with MySql Logging
Amos Jeffries wrote: Hemming Tero wrote: Hi, I have read that Squid 3.0 supports logging access.log and cache.log to a MySQL database. I haven't found any examples of how to deploy it, and have searched all over the net and forums. Are any tutorials or examples available? Thanks, Tero, Finland. Where did you read that? There is some work still being done on it for 3.1. If you want to test, contact the developer for it or squid-dev. Amos. There are third-party applications that import squid logs into MySQL, such as MySAR. I also work on the development of a C version, which is faster than the interpreted-language one. You can get it at http://mysar.sf.net and the C importer at http://www.polaco.pro.br/mysar
Re: [squid-users] Squid 3.0 Stable1 with MySql Logging
Adrian Chadd wrote: On Mon, Feb 18, 2008, Cassiano Martin wrote: Where did you read that? There is some work still being done on it for 3.1. If you want to test, contact the developer for it or squid-dev. Amos. There are third-party applications that import squid logs into MySQL, such as MySAR. I also work on the development of a C version, which is faster than the interpreted-language one. And Squid-2.7, along with a future Squid-3 release, makes hooking into MySQL rather easy. An external program can be fed all the logs via a local socket, without having to mess around and patch the source, or execute tail -f on anything. It's easy to do if you know a little Perl, or can employ someone who knows a little Perl. Adrian. Interesting feature... I'll take a deeper look at this :-) Thanks Adrian :-)
Re: [squid-users] Squid in Transparent ?
Cassiano Martin wrote: Yes, they can use the proxy, even manually set. Phibee Network Operation Center wrote: Hi, is it a problem to use squid with an iptables redirect (80 to 8080) and not put transparent on http_port 8080 in squid.conf? And if I do put http_port 8080 transparent, can users who manually configure the proxy in their web browser continue to use it? Thanks for your help Jerome
Re: [squid-users] squid source compile
pokeman wrote: can I compile squid so the squid.conf is hidden and nobody can view the settings? Certainly not - only by modifying the source code. Sounds like a hardcoded config ;-)
Re: [squid-users] squid Version 2.6.STABLE16 crashing with url_rewriters]
Goj, Dirk wrote: Hi there. Yesterday my proxy started crashing with the following error message:

2008/01/31 14:04:12| Starting Squid Cache version 2.6.STABLE16 for i386-debian-linux-gnu...
2008/01/31 14:04:12| Process ID 22919
2008/01/31 14:04:12| With 1024 file descriptors available
2008/01/31 14:04:12| Using epoll for the IO loop
2008/01/31 14:04:12| Performing DNS Tests...
2008/01/31 14:04:12| Successful DNS name lookup tests...
2008/01/31 14:04:12| DNS Socket created at 0.0.0.0, port 37010, FD 6
2008/01/31 14:04:12| Adding nameserver 62.206.1.4 from /etc/resolv.conf
2008/01/31 14:04:12| Adding nameserver 62.206.2.4 from /etc/resolv.conf
2008/01/31 14:04:12| helperOpenServers: Starting 5 'squidGuard' processes
2008/01/31 14:04:12| User-Agent logging is disabled.
2008/01/31 14:04:12| Referer logging is disabled.
2008/01/31 14:04:12| Unlinkd pipe opened on FD 16
2008/01/31 14:04:12| Swap maxSize 1024 KB, estimated 787692 objects
2008/01/31 14:04:12| Target number of buckets: 39384
2008/01/31 14:04:12| Using 65536 Store buckets
2008/01/31 14:04:12| Max Mem size: 8192 KB
2008/01/31 14:04:12| Max Swap size: 1024 KB
2008/01/31 14:04:12| Local cache digest enabled; rebuild/rewrite every 3600/3600 sec
2008/01/31 14:04:12| Rebuilding storage in /usr/local/squid/var/cache (CLEAN)
2008/01/31 14:04:12| Using Least Load store dir selection
2008/01/31 14:04:12| Current Directory is /
2008/01/31 14:04:12| Loaded Icons.
2008/01/31 14:04:12| Accepting proxy HTTP connections at 0.0.0.0, port 3128, FD 18
2008/01/31 14:04:12| Accepting ICP messages at 0.0.0.0, port 3130, FD 19.
2008/01/31 14:04:12| Accepting HTCP messages on port 4827, FD 20.
2008/01/31 14:04:12| Accepting SNMP messages on port 3401, FD 21.
2008/01/31 14:04:12| WCCP Disabled.
2008/01/31 14:04:12| Ready to serve requests.
2008/01/31 14:04:12| Store rebuilding is 2.8% complete
2008/01/31 14:04:12| WARNING: url_rewriter #1 (FD 7) exited
2008/01/31 14:04:13| Done reading /usr/local/squid/var/cache swaplog (146478 entries)
2008/01/31 14:04:13| Finished rebuilding storage from disk.
2008/01/31 14:04:13| 146478 Entries scanned
2008/01/31 14:04:13| 0 Invalid entries.
2008/01/31 14:04:13| 0 With invalid flags.
2008/01/31 14:04:13| 146478 Objects loaded.
2008/01/31 14:04:13| 0 Objects expired.
2008/01/31 14:04:13| 0 Objects cancelled.
2008/01/31 14:04:13| 0 Duplicate URLs purged.
2008/01/31 14:04:13| 0 Swapfile clashes avoided.
2008/01/31 14:04:13| Took 0.7 seconds (195566.6 objects/sec).
2008/01/31 14:04:13| Beginning Validation Procedure
2008/01/31 14:04:13| Completed Validation Procedure
2008/01/31 14:04:13| Validated 146478 Entries
2008/01/31 14:04:13| store_swap_size = 2671964k
2008/01/31 14:04:13| WARNING: url_rewriter #2 (FD 8) exited
2008/01/31 14:04:13| storeLateRelease: released 0 objects
2008/01/31 14:04:14| WARNING: url_rewriter #3 (FD 9) exited
2008/01/31 14:04:14| Too few url_rewriter processes are running
2008/01/31 14:04:14| storeDirWriteCleanLogs: Starting...
2008/01/31 14:04:14| WARNING: Closing open FD 18
2008/01/31 14:04:14| commSetEvents: epoll_ctl(EPOLL_CTL_DEL): failed on fd=18: (1) Operation not permitted
2008/01/31 14:04:14| 65536 entries written so far.
2008/01/31 14:04:14| 131072 entries written so far.
2008/01/31 14:04:14| Finished. Wrote 146478 entries.
2008/01/31 14:04:14| Took 0.0 seconds (6418843.1 entries/sec).
FATAL: The url_rewriter helpers are crashing too rapidly, need help!
Squid Cache (Version 2.6.STABLE16): Terminated abnormally.
CPU Usage: 0.580 seconds = 0.260 user + 0.320 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 0
Memory usage for squid via mallinfo():
total space in arena: 16272 KB
Ordinary blocks: 16186 KB 11 blks
Small blocks: 0 KB 0 blks
Holding blocks: 504 KB 2 blks
Free Small blocks: 0 KB
Free Ordinary blocks: 85 KB
Total in use: 16690 KB 99%
Total free: 85 KB 1%

When I restart squid from webmin or from ssh, the same thing happens. I changed nothing in squid itself. I searched the internet and found the suggestion that the helper (squidGuard) may have too big a logfile, but it's not the logfile. When I disable the squidGuard helper, squid runs normally. The squidGuard itself hasn't changed either... access rights to the squidGuard DBs are OK - every folder and file has the same user that squid is running under... I'm out of ideas... Do any of you have ideas? Thx! Dirk Goj

Kind regards, Udo Täubrich Betreuungs GmbH, i. A. Dirk Goj, Westpreussenstr. 38, 47809 Krefeld, Tel: 02151-1570 1766, Fax: 02151-1570 1789
Geschäftsführer: Ralph Krahl, Roland Stübing; Registergericht: Amtsgericht Krefeld HRB 8906
Re: [squid-users] How to Clear Cache in SQUID
Shut down squid and recreate the cache:

squid -k shutdown
rm -rf /var/squid/cache/* (check if this is the real path)
squid -z
squid

Tek Bahadur Limbu wrote: Hi, TRM wrote: Hi List, my hard disk is getting full and I want to clear the cache. How can I do that? If your cache resides in /var/squid/cache/: rm -fr /var/squid/cache/ then recreate the directory and rebuild it!!
Re: [squid-users] Squid Quota
Amos Jeffries wrote: Cassiano Martin wrote: Hi All! I wrote a squid quota daemon (sorry admin, if this is not the right place to announce!) and it's working, but in testing stages. It's a squid redirector with a MySQL DB as backend, and a log reader which feeds the DB with information. You can set how much a user, or an IP, can use, in MB per day. I want someone to help me with the project, as I don't have too much free time. Anyone interested, please contact me. I'll be glad. :-) Thanks! Hi ;-) Very interesting! :-) In fact, this project is an extension to the MySAR report generator. The log reader is the same as in that project: it uses a DB and fills it in with all the information it gathers from the squid log file. I just have a few questions: Is the log reader run daily after rotate, or in real time? I use it to read every minute, so it keeps the DB updated. Without updating the DB, the redirector daemon doesn't know about traffic sizes. I know it's not a perfect way, but it does the job very well. Can it handle multiple seemingly-identical requests per second? Yes. All operation is done in memory, using a fast search tree. My next step is to use a hash-based search. Is this an open-source collective, private, or commercial enterprise? It will be open source; I'll release it soon. What language(s) are involved? C, PHP, and MySQL as backend. Amos. Thanks! Cassiano Martin
[squid-users] Squid Quota
Hi All! I wrote a squid quota daemon (sorry admin, if this is not the right place to announce!) and it's working, but in testing stages. It's a squid redirector with a MySQL DB as backend, and a log reader which feeds the DB with information. You can set how much a user, or an IP, can use, in MB per day. I want someone to help me with the project, as I don't have too much free time. Anyone interested, please contact me. I'll be glad. :-) Thanks!
[squid-users] Squid URL rewrite_program
Hello all! According to the squid documentation, the URL rewriter is fed lines with the following format: URL SP client_ip / fqdn SP user SP method SP urlgroup [SP kvpairs] NL. So I ask: is it possible to customize the format, or is it hardcoded in the squid sources? Well, what I need is the reply body size, but squid does not tell the rewriter about sizes :-) Any help or ideas... Thanks!
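For reference, a rewriter that consumes the line format above can be sketched as below; the http-to-https toggle is a placeholder rule, not a recommendation, and an empty reply line tells Squid to leave the URL untouched. It also shows why the body size cannot appear in this format: the rewriter runs before the request is forwarded, so no reply (and no reply size) exists yet.

```python
import sys

def rewrite(request_line: str) -> str:
    """Take one 'URL ip/fqdn user method urlgroup [kvpairs]' line from
    Squid and return the reply: a rewritten URL, or "" for 'no change'.
    The http->https toggle below is only a placeholder rule."""
    fields = request_line.split()
    if not fields:
        return ""  # malformed input: leave the URL alone
    url = fields[0]
    if url.startswith("http://example.test/"):  # hypothetical match rule
        return url.replace("http://", "https://", 1)
    return ""  # empty reply = do not rewrite

def main() -> None:
    # Loop until Squid closes our stdin; one reply line per request line.
    for line in sys.stdin:
        sys.stdout.write(rewrite(line.rstrip("\n")) + "\n")
        sys.stdout.flush()  # flush each reply, or Squid stalls

if __name__ == "__main__":
    main()
```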