[squid-users] R: Squid 2.7 for Windows Bug Report
Hi, I am still finding it difficult to compile Squid with the --enable-ssl option; see the attached files for my efforts so far. How can you help me resolve this error, in either STABLE8 or STABLE9? Also, can I take full advantage of the HTTPS features in Squid if I don't compile it with --enable-ssl?

Sorry, but I really don't know how to help you:
squid2.7.8make_error.txt: no errors ...
squid2.7.9make_error.txt: no errors ...
I have just run a build of the latest 2.7 with OpenSSL, with no errors; I cannot reproduce the Stack.c error. Your configure output seems to be OK. You should ask the OpenSSL people about problems related to the latest OpenSSL on MinGW.

Regards,
Guido Serassio
Acme Consulting S.r.l. - Microsoft Gold Certified Partner
Via Lucia Savarino, 1 - 10098 Rivoli (TO) - ITALY
Tel.: +39.011.9530135 Fax: +39.011.9781115
Email: guido.seras...@acmeconsulting.it
WWW: http://www.acmeconsulting.it
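For reference, a typical SSL-enabled 2.7 build runs configure with the SSL options before make; this is a sketch only, and the OpenSSL path is an assumption — point it at wherever your MinGW OpenSSL is actually installed:

```shell
# Build Squid 2.7 with SSL gatewaying support.
# The --with-openssl path is an assumption; adjust to your OpenSSL install.
./configure --enable-ssl --with-openssl=/usr/local/ssl
make
```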
[squid-users] Zero Sized Reply when doing POST
Hi, I've set up a firewall (Endian Firewall) in a small network. This firewall uses the Squid cache. Users must upload CSV files to an extranet website. When they upload/submit the CSV files, they get: Zero Sized Reply.
- The proxy is set to transparent.
- The site doesn't work in Firefox/Chrome (decent browsers). It works only in IE!
- Uploading works, but only one CSV file at a time. If they upload them all together, they get Zero Sized Reply.
- I have cleaned cookies and temp files from the IE configuration.
Any ideas? Thanks
[squid-users] support
hi, what browser do users need, if my network uses Squid, to be allowed to go outside to the Internet?

Regards,
Badrul Hisyam Mohamad
Engineer, Apex Communications Sdn. Bhd.
12th Floor, Menara Hap Seng, Jalan P. Ramlee, 50250 Kuala Lumpur
http://www.apex.com.my
Tel: 603-21485855 Fax: 603-21485411
Re: [squid-users] support
On Saturday 2 October 2010 at 09:54:42, Badrul Mohamad wrote:
> hi, what browser do users need, if my network uses Squid, to be allowed to go outside to the Internet? [...]

Any of them; Opera, IE, Mozilla, Chromium, Konqueror, and Arora all work very well.
LD
[squid-users] Ahead Caching
Hello Squid User Group,
I wonder how I can configure Squid to load web pages ahead of time. A while ago I saw a Perl script that forced ahead-of-time caching of web pages. I searched the forums and only found the topology where requests are made from the Internet to a Squid server that redirects to specific web servers. This is not what I want. What I want is that when a page is requested through Squid, Squid itself requests the pages linked from the original page. For example, when I open the page http://google.com, Squid would also request the pages http://videos.google.com, http://news.google.com etc. and keep them in the cache, so when I open http://videos.google.com, Squid returns the cached page. I think this is perfectly possible, but I haven't found references on how to do it. Please let me know if I was not clear, because I am using a translator.
-- Flaviane
[squid-users] Ecap compression
Hello everyone. Has anyone used the compression utility with eCAP with the latest releases of 3.1? I tried it a while back and it was giving me problems, like restarting the Squid process every once in a while. I would like to know whether that has been fixed, since 3.1 is now stable. Thanks.
Re: [squid-users] Ahead Caching
It's trivial to run wget or curl on the same server that the Squid proxy is on and access pages through it, directing the output to /dev/null, in order to prime the cache. But there's no explicit way to tell Squid "please pull this URL into your cache" without an actual HTTP request for that page.

Also, don't forget that only objects that aren't marked with no-cache or no-store Cache-Control headers will be stored in Squid's cache, which for sites like Google's main pages will result in not very much being cached at all beyond inline graphics.

-C

On Oct 2, 2010, at 11:05 AM, flaviane athayde wrote:
> Hello Squid User Group
> I wonder how I can configure Squid to load web pages ahead of time. [...]
> -- Flaviane
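The priming idea above can be sketched as a small script. This is a sketch under assumptions: the proxy address (127.0.0.1:3128) and the URL list are placeholders, and the demo call runs in a dry-run mode so nothing is actually fetched here.

```shell
#!/bin/sh
# Prime the Squid cache by requesting pages through the proxy with wget.
# Assumption: Squid listens on 127.0.0.1:3128 -- adjust to your setup.
PROXY="http://127.0.0.1:3128"

# With DRY_RUN set, print what would be fetched instead of fetching it.
prime() {
  if [ -n "$DRY_RUN" ]; then
    echo "would fetch $1 via $PROXY"
  else
    # -p also pulls page requisites (images, CSS) into the cache;
    # the body itself is thrown away -- the side effect is what we want.
    http_proxy="$PROXY" wget -q -O /dev/null -p "$1"
  fi
}

# Demo (dry run): the pages mentioned in the thread
DRY_RUN=1
prime "http://videos.google.com/"
prime "http://news.google.com/"
```

Unset DRY_RUN to perform real fetches; whether anything ends up cached still depends on the Cache-Control headers, as noted above.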
Re: [squid-users] Ahead Caching
Hello Chris,
Thanks for your fast reply. I tried to put together a shell script that reads the Squid log and runs wget with the -r -l1 -p flags on the logged URLs, but wget's own requests also show up in the log, creating an infinite loop that I can't resolve. Is there a shell script that does this the way I want? Thanks again.

2010/10/2 Chris Woodfield rek...@semihuman.com:
> It's trivial to run wget or curl on the same server that the squid proxy is on and access pages through it, directing the output to /dev/null, in order to prime the cache. But there's no explicit way to tell squid to "please pull this URL into your cache" without an actual HTTP request for that page. [...]

-- Flaviane
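One way to break the feedback loop described above: since the priming wget runs on the proxy host itself, its requests appear in access.log with a loopback client address, and those lines can simply be skipped. A sketch, assuming the default native access.log field order (time, elapsed, client, code, bytes, method, URL) and placeholder paths; the actual wget wiring is left commented out because it would hit the network.

```shell
#!/bin/sh
# Follow squid's access.log and emit URLs to prime, skipping requests the
# primer itself made. The primer runs on the proxy box and connects via
# loopback, so its requests log a client address of 127.0.0.1 -- filtering
# those out prevents the infinite loop.
# Assumptions: log path, proxy address, default native log format.
LOG=/var/log/squid/access.log
PROXY="http://127.0.0.1:3128"

prime_from_log() {
  # Fields: time elapsed client code/status bytes method URL ...
  while read -r _time _elapsed client _code _bytes _method url _rest; do
    case "$client" in
      127.0.0.1|::1) continue ;;   # our own priming requests: skip
    esac
    case "$url" in
      http://*) echo "$url" ;;     # candidate URL for priming
    esac
  done
}

# Example wiring (commented out -- would fetch from the network):
# tail -F "$LOG" | prime_from_log | while read -r u; do
#   http_proxy="$PROXY" wget -q -O /dev/null -r -l1 -p "$u"
# done
```

A seen-URL list (e.g. appending to a file and checking it with grep) would additionally stop the same page from being primed repeatedly.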
[squid-users] squid syslog-ng, problem rotate
I have configured Squid to send its log to syslog, using the following configuration:

access_log syslog squid

When Squid rotates the log, it stops sending log messages to syslog-ng and starts sending them to the /var/log/messages file. In syslog-ng I have the following configuration: a program that reads, from standard input, the lines that Squid sends.

source s_squid {
    # standard Linux log source (this is the default place for the syslog()
    # function to send logs to)
    unix-stream("/dev/log");
};
filter f_squid { program("squid") and match("TCP_|UDP_|ERR_"); };
destination d_squid_prog {
    program("/usr/local/quota" template("$MSGONLY\n") log_fifo_size(5));
};
log { source(s_squid); filter(f_squid); destination(d_squid_prog); };

Is there another way for my program to read the Squid access_log from standard input? Sorry for my English.
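One alternative that avoids syslog entirely is to point access_log at a named pipe (FIFO) and keep the reader program attached to it; the reader then receives each log line on its input as it is written. This is a sketch under assumptions: the FIFO path is a placeholder, and in squid.conf it would be referenced as something like `access_log /var/log/squid/access.fifo squid`. In the demo below, a plain printf stands in for Squid and `read` stands in for the consuming program.

```shell
#!/bin/sh
# Demonstrate the FIFO approach locally, with no squid involved:
# the writer side plays squid's role, the reader side plays the
# log-processing program's role.
dir=$(mktemp -d)
fifo="$dir/access.fifo"
mkfifo "$fifo"

# Writer (squid's role): opens the FIFO and writes one access-log line.
# Opening for write blocks until a reader attaches, so run it in background.
printf '1286000000.123 23 192.168.1.5 TCP_MISS/200 1234 GET http://example.com/ -\n' > "$fifo" &

# Reader (your program's role): receives the line on standard input.
read -r logline < "$fifo"
echo "reader got: $logline"

wait
rm -rf "$dir"
```

One caveat with this approach: if the reader dies while Squid is writing, Squid gets a broken pipe, so the reader needs to be supervised (restarted automatically) in practice.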