Re: [squid-users] Recommended Store Size

2008-11-26 Thread Stand H
Hi Chris,

 The rule of thumb I've read previously is storage
 equivalent to a week's traffic.  If you pass an
 average of 30GB per day, a storage size of 210GB is a good
 start.

I have two squid servers. Each processes around 120GB a day, with about 43% 
request hit ratio and 25% byte hit ratio. The cache size is 300GB with 6GB of 
memory. Per that rule of thumb, can I increase my cache size?
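
(As a quick check against that rule of thumb: 120GB/day x 7 days is roughly
840GB of traffic per server per week, so a 300GB cache is well below a week's
worth of storage.)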

Thank you.

Stand


  


Re: [squid-users] compilation errors on rhel 5

2008-11-26 Thread John Doe
Maybe missing a c++ lib?
But I never tried Squid 3.x, only 2.x...
On my CentOS I have those ( rpm -qa | grep ++ ):

  libstdc++-4.1.2-42.el5
  libstdc++-devel-4.1.2-42.el5
  libsigc++20-2.0.17-1.el5.rf
  gcc-c++-4.1.2-42.el5
  compat-libstdc++-296-2.96-138
  compat-libstdc++-33-3.2.3-61
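
If any of those are missing, something roughly like this should pull them in
(package names as they appear in the CentOS 5 repos; adjust for your release):

  yum install gcc-c++ libstdc++ libstdc++-devel compat-libstdc++-33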

JD



- Original Message 
 From: Mario Remy Almeida [EMAIL PROTECTED]
 To: Squid Users squid-users@squid-cache.org
 Sent: Wednesday, November 26, 2008 6:42:08 AM
 Subject: [squid-users] compilation errors on rhel 5
 
 Hi All
 
 Since compilation failed on Ubuntu 8.10, I thought I'd give it a try on RHEL 5
 32-bit, but no luck.
 
 I had all the "No such file or directory" error messages posted in the earlier
 mail, and also got this error message.
 
 I ran make and got the errors below:
 
 debug.o: In function `operator<<':
 /usr/include/c++/4.3/ostream:517: undefined reference to
 `std::basic_ostream<char, std::char_traits<char> >&
 std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char,
 std::char_traits<char> >&, char const*, int)'
 /usr/include/c++/4.3/ostream:517: undefined reference to
 `std::basic_ostream<char, std::char_traits<char> >&
 std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char,
 std::char_traits<char> >&, char const*, int)'
 debug.o: In function `operator<<':
 /home/remy/rnd/squid-3.1.0.2/src/debug.cc:776: undefined reference to
 `std::basic_ostream<char, std::char_traits<char> >&
 std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char,
 std::char_traits<char> >&, char const*, int)'
 debug.o: In function `operator<<':
 /usr/include/c++/4.3/ostream:517: undefined reference to
 `std::basic_ostream<char, std::char_traits<char> >&
 std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char,
 std::char_traits<char> >&, char const*, int)'
 /usr/include/c++/4.3/ostream:517: undefined reference to
 `std::basic_ostream<char, std::char_traits<char> >&
 std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char,
 std::char_traits<char> >&, char const*, int)'
 debug.o:/home/remy/rnd/squid-3.1.0.2/src/debug.cc:778: more undefined
 references to `std::basic_ostream<char, std::char_traits<char> >&
 std::__ostream_insert<char, std::char_traits<char> >(std::basic_ostream<char,
 std::char_traits<char> >&, char const*, int)' follow
 collect2: ld returned 1 exit status
 make[1]: *** [cf_gen] Error 1
 make[1]: Leaving directory `/root/squid-3.1.0.2/src'
 make: *** [all-recursive] Error 1
 
 I need help.
 
 What should I do next to solve the above errors?
 
 Regards
 Remy



  



Re: [squid-users] compilation errors on rhel 5

2008-11-26 Thread Mario Remy Almeida
Thanks JD,

After installing compat-libstdc++-* the debug.o errors were resolved and I was
able to compile it, but what about the errors about missing files in config.log?

conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
conftest.cpp:22:28: error: ac_nonexistent.h: No such file or directory
conftest.cpp:22:28: error: ac_nonexistent.h: No such file or directory
conftest.c:92:18: error: sasl.h: No such file or directory
conftest.c:59:18: error: sasl.h: No such file or directory
conftest.c:67:28: error: ac_nonexistent.h: No such file or directory
conftest.c:105:21: error: bstring.h: No such file or directory
conftest.c:72:21: error: bstring.h: No such file or directory
conftest.c:113:23: error: gnumalloc.h: No such file or directory
conftest.c:80:23: error: gnumalloc.h: No such file or directory
conftest.c:114:23: error: ip_compat.h: No such file or directory
conftest.c:81:23: error: ip_compat.h: No such file or directory
conftest.c:114:27: error: ip_fil_compat.h: No such file or directory
conftest.c:81:27: error: ip_fil_compat.h: No such file or directory
conftest.c:114:20: error: ip_fil.h: No such file or directory
conftest.c:81:20: error: ip_fil.h: No such file or directory
conftest.c:114:20: error: ip_nat.h: No such file or directory
conftest.c:81:20: error: ip_nat.h: No such file or directory
conftest.c:114:17: error: ipl.h: No such file or directory
conftest.c:81:17: error: ipl.h: No such file or directory
conftest.c:114:18: error: libc.h: No such file or directory
conftest.c:81:18: error: libc.h: No such file or directory
conftest.c:118:19: error: mount.h: No such file or directory
conftest.c:85:19: error: mount.h: No such file or directory
conftest.c:121:35: error: netinet/ip_fil_compat.h: No such file or
directory
conftest.c:88:35: error: netinet/ip_fil_compat.h: No such file or
directory
conftest.c:140:23: error: sys/bswap.h: No such file or directory
conftest.c:107:23: error: sys/bswap.h: No such file or directory
conftest.c:140:24: error: sys/endian.h: No such file or directory
conftest.c:107:24: error: sys/endian.h: No such file or directory
conftest.c:144:21: error: sys/md5.h: No such file or directory
conftest.c:111:21: error: sys/md5.h: No such file or directory
conftest.c:162:18: error: glib.h: No such file or directory
conftest.c:129:18: error: glib.h: No such file or directory
conftest.c:165:24: error: nss_common.h: No such file or directory
conftest.c:132:24: error: nss_common.h: No such file or directory
conftest.c:168:28: error: sys/capability.h: No such file or directory
conftest.c:135:28: error: sys/capability.h: No such file or directory
conftest.c:171:44: error: linux/netfilter_ipv4/ip_tproxy.h: No such file
or directory
conftest.c:194:31: error: netinet/ip_compat.h: No such file or directory
conftest.c:194:28: error: netinet/ip_fil.h: No such file or directory
conftest.c:195:28: error: netinet/ip_nat.h: No such file or directory
conftest.c:195:25: error: netinet/ipl.h: No such file or directory
conftest.c:195:23: error: net/pfvar.h: No such file or directory
conftest.c:177:27: error: libxml/parser.h: No such file or directory
conftest.c:144:27: error: libxml/parser.h: No such file or directory
conftest.c:177:27: error: libxml/parser.h: No such file or directory
conftest.c:144:27: error: libxml/parser.h: No such file or directory

Second, any idea why I am failing to compile it on Ubuntu 8.10?

Errors:

 g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -I.
-I. -I../include -I. -I. -I../include -I../include
-I../lib/libTrie/include -I../lib -I../lib -I/usr/include/libxml2
-Werror -Wall -Wpointer-arith -Wwrite-strings -Wcomments
-DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -DDEFAULT_SQUID_DATA_DIR=
\/share/squid\ -g -O2 -MT ICAP/AsyncJob.lo -MD -MP -MF
ICAP/.deps/AsyncJob.Tpo -c ICAP/AsyncJob.cc  -fPIC -DPIC -o
ICAP/.libs/AsyncJob.o
cc1plus: warnings being treated as errors
ICAP/AsyncJob.cc: In member function ‘virtual const char*
AsyncJob::status() const’:
ICAP/AsyncJob.cc:158: error: format not a string literal and no format
arguments
make[1]: *** [ICAP/AsyncJob.lo] Error 1
make[1]: Leaving directory `/home/remy/rnd/squid-3.1.0.2/src'
make: *** [all-recursive] Error 1

Regards,
Remy

On Wed, 2008-11-26 at 02:12 -0800, John Doe wrote:
 Maybe missing a c++ lib?
 But I never tried Squid 3.x, only 2.x...
 On my CentOS I have those ( rpm -qa | grep ++ ):
 
   libstdc++-4.1.2-42.el5
   libstdc++-devel-4.1.2-42.el5
   libsigc++20-2.0.17-1.el5.rf
   gcc-c++-4.1.2-42.el5
   compat-libstdc++-296-2.96-138
   compat-libstdc++-33-3.2.3-61
 
 JD
 
 
 
 - Original Message 
  From: Mario Remy Almeida [EMAIL PROTECTED]
  To: Squid Users squid-users@squid-cache.org
  Sent: Wednesday, November 26, 2008 6:42:08 AM
  Subject: [squid-users] compilation errors on rhel 5
  
  Hi All
  
  Since compilation failed on Ubuntu 8.10 thought to give a try on rhel 5
  32bit
  
  but no luck
  
  had all the No such file or 

[squid-users] scanning request with squid

2008-11-26 Thread Mario Remy Almeida
Hi All,

Can someone tell me how I can scan HTTP, HTTPS and FTP requests for viruses
etc... with Squid 3.1.x?

Is it possible without DG?

Regards,
Mario 



Re: [squid-users] compilation errors on rhel 5

2008-11-26 Thread John Doe
 after installing compat-libstdc++-* the debug.o errors were solved
 was able to compile it
 but what about the errors of missing files in config.log
 
 conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
 conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
 conftest.cpp:22:28: error: ac_nonexistent.h: No such file or directory
 conftest.cpp:22:28: error: ac_nonexistent.h: No such file or directory
 conftest.c:92:18: error: sasl.h: No such file or directory
 conftest.c:59:18: error: sasl.h: No such file or directory
 conftest.c:67:28: error: ac_nonexistent.h: No such file or directory
 conftest.c:105:21: error: bstring.h: No such file or directory
 conftest.c:72:21: error: bstring.h: No such file or directory
 conftest.c:113:23: error: gnumalloc.h: No such file or directory
 conftest.c:80:23: error: gnumalloc.h: No such file or directory
 conftest.c:114:23: error: ip_compat.h: No such file or directory
 conftest.c:81:23: error: ip_compat.h: No such file or directory
 conftest.c:114:27: error: ip_fil_compat.h: No such file or directory
 conftest.c:81:27: error: ip_fil_compat.h: No such file or directory
 conftest.c:114:20: error: ip_fil.h: No such file or directory
 conftest.c:81:20: error: ip_fil.h: No such file or directory
 conftest.c:114:20: error: ip_nat.h: No such file or directory
 conftest.c:81:20: error: ip_nat.h: No such file or directory
 conftest.c:114:17: error: ipl.h: No such file or directory
 conftest.c:81:17: error: ipl.h: No such file or directory
 conftest.c:114:18: error: libc.h: No such file or directory
 conftest.c:81:18: error: libc.h: No such file or directory
 conftest.c:118:19: error: mount.h: No such file or directory
 conftest.c:85:19: error: mount.h: No such file or directory
 conftest.c:121:35: error: netinet/ip_fil_compat.h: No such file or
 directory
 conftest.c:88:35: error: netinet/ip_fil_compat.h: No such file or
 directory
 conftest.c:140:23: error: sys/bswap.h: No such file or directory
 conftest.c:107:23: error: sys/bswap.h: No such file or directory
 conftest.c:140:24: error: sys/endian.h: No such file or directory
 conftest.c:107:24: error: sys/endian.h: No such file or directory
 conftest.c:144:21: error: sys/md5.h: No such file or directory
 conftest.c:111:21: error: sys/md5.h: No such file or directory
 conftest.c:162:18: error: glib.h: No such file or directory
 conftest.c:129:18: error: glib.h: No such file or directory
 conftest.c:165:24: error: nss_common.h: No such file or directory
 conftest.c:132:24: error: nss_common.h: No such file or directory
 conftest.c:168:28: error: sys/capability.h: No such file or directory
 conftest.c:135:28: error: sys/capability.h: No such file or directory
 conftest.c:171:44: error: linux/netfilter_ipv4/ip_tproxy.h: No such file
 or directory
 conftest.c:194:31: error: netinet/ip_compat.h: No such file or directory
 conftest.c:194:28: error: netinet/ip_fil.h: No such file or directory
 conftest.c:195:28: error: netinet/ip_nat.h: No such file or directory
 conftest.c:195:25: error: netinet/ipl.h: No such file or directory
 conftest.c:195:23: error: net/pfvar.h: No such file or directory
 conftest.c:177:27: error: libxml/parser.h: No such file or directory
 conftest.c:144:27: error: libxml/parser.h: No such file or directory
 conftest.c:177:27: error: libxml/parser.h: No such file or directory
 conftest.c:144:27: error: libxml/parser.h: No such file or directory

Not sure if that is normal (as in, used to detect available libraries) or not... 
but on my system, I found these:

# rpm -qf /usr/include/linux/netfilter_ipv4/ip_nat.h
kernel-headers-2.6.18-92.1.18.el5
# rpm -qf /usr/include/sasl/sasl.h
# rpm -qf /usr/include/sasl/md5.h
cyrus-sasl-devel-2.1.22-4
# rpm -qf /usr/include/libxml2/libxml/parser.h
libxml2-devel-2.6.26-2.1.2.7
# rpm -qf /usr/include/glib-2.0/glib.h
glib2-devel-2.12.3-2.fc6
# rpm -qf /usr/include/endian.h
glibc-headers-2.5-24
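
If you actually want those optional features, the matching -devel packages can
be pulled in with something like this (names as on my CentOS 5 box; pick only
the ones you need):

  yum install cyrus-sasl-devel libxml2-devel glib2-devel libcap-devel kernel-headers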

JD


  



[squid-users] Re: [NoCat] Non-Authenticating Splash Page

2008-11-26 Thread Wilson Hernandez - MSD, S. A.

Colin,

I tried to add some PHP code but it doesn't get embedded; I guess it is 
because the gateway runs on Perl. I don't know Perl, otherwise it would be 
easier to use Perl instead of PHP in this case. How can I embed PHP code in 
the splash page to build a dynamic page?


Thanks again.



Colin A. White wrote:
Have you tried editing the splash page to embed your own content, and 
changing the submit button to say "Dismiss" or "Ignore"?




On Nov 25, 2008, at 4:55 PM, Wilson Hernandez - MSD, S. A. 
[EMAIL PROTECTED] wrote:




I am currently running NoCat in open mode but, the only way it works in open
mode is if I have a splash page with a submit button accepting the agreement
(see below). I don't really want users to do that. All I want is a page that
shows up every 60 minutes and lets users either click on the splash page's
contents or proceed with whatever they were doing.

<form method="GET" action="$action">
   <input type="hidden" name="redirect" value="$redirect">
   <input type="hidden" name="accept_terms" value="yes">
   <input type="hidden" name="mode_login">
   <input type="submit" class="button" value="Enter">
</form>


Thanks for replying.



Colin White wrote:

Set the time out to 60 mins and run the gateway in Open mode.
 On Tue, Nov 25, 2008 at 10:47 AM, Wilson Hernandez - MSD, S. A. 
[EMAIL PROTECTED] wrote:

   Hello.
   I would like to know if there is a way of redirecting users to a
   splash page every hour and having the user continue browsing the
   internet without authenticating or accepting a user agreement?
   Thanks.
   -- ___
   NoCat mailing list
   [EMAIL PROTECTED]
   http://lists.nocat.net/mailman/listinfo/nocat
--
Colin A. White
P : +1 605 940 5863


--
*Wilson Hernandez*
Presidente
829.848.9595
809.766.0441
www.msdrd.com
Conservando el medio ambiente


--
*Wilson Hernandez*
Presidente
829.848.9595
809.766.0441
www.msdrd.com
Conservando el medio ambiente

___
NoCat mailing list
[EMAIL PROTECTED]
http://lists.nocat.net/mailman/listinfo/nocat





--
*Wilson Hernandez*
Presidente
829.848.9595
809.766.0441
www.msdrd.com
Conservando el medio ambiente


Re: [squid-users] Squid vs httpd mod_cache

2008-11-26 Thread Kinkie
 I would like to know how good Squid's cache management (i.e. pruning) is. I
 get the impression that mod_cache in Apache 2.2 is not very mature - some of
 the cache management features don't even seem to be implemented yet. I
 assume that Squid is a much more mature product, and thus I'd hope that it
 has cache management pretty much down pat.

 How does Squid manage its disk cache? Does it consume a lot of disk io when
 doing it?

The workload you describe is not high by any means (at least for
squid). Squid deployed as a reverse proxy routinely handles 800
requests/second on any decent hardware (I think I've seen reports
floating around speaking of 2000).

 Has anybody else here migrated from using Apache's mod_cache to Squid, and
 if so do you have any insights?

 Lastly, if I do decide to use Squid, is the O'Reilly book from 2004 still
 relevant, or is it out of date now? I know there's a lot of stuff online,
 but I like to have a handy book reference, plus a well-written book often
 has a good intro to the tool. This book seems to get only 5-star reviews on
 Amazon. Is it still up to date?

It's mostly relevant. Some configuration parameters have changed, but
the basic principles still apply.
You can use that as a basic guide, and then refer to the wiki (
http://wiki.squid-cache.org/ ) to drill into specific details.
For any doubts, just ask on the squid-users mailing-list.



-- 
/kinkie


Re: [squid-users] scanning request with squid

2008-11-26 Thread Kinkie
On Wed, Nov 26, 2008 at 12:28 PM, Mario Remy Almeida
[EMAIL PROTECTED] wrote:
 Hi All,

 Can someone tell me how can I scan http,https and ftp request for virus
 etc... with squid 3.1.x

The best option is to use an ICAP-based AV scanner.
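
Roughly, the squid.conf side of that looks like the sketch below (the service
name and URI are placeholders for whatever ICAP scanner you run, e.g. a
c-icap/squidclamav setup; double-check the exact option syntax against the
3.1 documentation):

  icap_enable on
  icap_service av_resp respmod_precache bypass=1 icap://127.0.0.1:1344/avscan
  adaptation_access av_resp allow all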


-- 
/kinkie


Re: [squid-users] scanning request with squid

2008-11-26 Thread malmeida
Can you suggest any help site where I can start?

Regards,
Remy

On Wed, 26 Nov 2008 15:21:06 +0100, Kinkie [EMAIL PROTECTED] wrote:
 On Wed, Nov 26, 2008 at 12:28 PM, Mario Remy Almeida
 [EMAIL PROTECTED] wrote:
 Hi All,

 Can someone tell me how can I scan http,https and ftp request for virus
 etc... with squid 3.1.x
 
 The best option is to use an ICAP-based AV scanner.
 
 




Re: [squid-users] compilation errors on rhel 5

2008-11-26 Thread Christos Tsantilas
Hi Remy,

 
 2nd any idea where am i failing to compile it in ubuntu 8.10

 errors

  g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -I.
 -I. -I../include -I. -I. -I../include -I../include
 -I../lib/libTrie/include -I../lib -I../lib -I/usr/include/libxml2
 -Werror -Wall -Wpointer-arith -Wwrite-strings -Wcomments
 -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -DDEFAULT_SQUID_DATA_DIR=
 \/share/squid\ -g -O2 -MT ICAP/AsyncJob.lo -MD -MP -MF
 ICAP/.deps/AsyncJob.Tpo -c ICAP/AsyncJob.cc  -fPIC -DPIC -o
 ICAP/.libs/AsyncJob.o
 cc1plus: warnings being treated as errors
 ICAP/AsyncJob.cc: In member function ‘virtual const char*
 AsyncJob::status() const’:
 ICAP/AsyncJob.cc:158: error: format not a string literal and no format
 arguments
 make[1]: *** [ICAP/AsyncJob.lo] Error 1
 make[1]: Leaving directory `/home/remy/rnd/squid-3.1.0.2/src'
 make: *** [all-recursive] Error 1

I think this is bug 2527:
http://www.squid-cache.org/bugs/show_bug.cgi?id=2527
There is a small patch for this bug; can you try it?
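
Applying it is usually just something like this from the source directory
(bug2527.patch being whatever you save the attachment as; the -p level depends
on how the patch was generated, so try --dry-run first):

  cd squid-3.1.0.2
  patch -p0 --dry-run < bug2527.patch
  patch -p0 < bug2527.patch
  make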


Regards,
 Christos

 Regards,
 Remy





Re: [squid-users] scanning request with squid

2008-11-26 Thread Kinkie
On Wed, Nov 26, 2008 at 3:54 PM,  [EMAIL PROTECTED] wrote:
 Can you suggest any help site where I can start?

 Regards,
 Remy

 On Wed, 26 Nov 2008 15:21:06 +0100, Kinkie [EMAIL PROTECTED] wrote:
 On Wed, Nov 26, 2008 at 12:28 PM, Mario Remy Almeida
 [EMAIL PROTECTED] wrote:
 Hi All,

 Can someone tell me how can I scan http,https and ftp request for virus
 etc... with squid 3.1.x

 The best option is to use an ICAP-based AV scanner.

http://wiki.squid-cache.org/Features/ICAP



-- 
/kinkie


Re: [squid-users] scanning request with squid

2008-11-26 Thread John Doe
 Can you suggest any help site where I can start?
 
 Regards,
 Remy
 
 On Wed, 26 Nov 2008 15:21:06 +0100, Kinkie wrote:
  On Wed, Nov 26, 2008 at 12:28 PM, Mario Remy Almeida
  wrote:
  Hi All,
 
  Can someone tell me how can I scan http,https and ftp request for virus
  etc... with squid 3.1.x
  
  The best option is to use an ICAP-based AV scanner.

 http://www.google.fr/search?q=squid+antivirus
Maybe the 4th result would help you: "Setup Squid with Clamav antivirus"

JD


  



[squid-users] Can I Force Connections To All or Some Sites To Traverse using HTTP 1.1?

2008-11-26 Thread wiskbroom


Greetings;

I have a proxy-to-proxy setup (without ICP) and it is working wonderfully, with 
the exception of cases where IE users attempt to connect to a remote Citrix 
server.  The odd thing is that the errors encountered do not seem to happen at 
all when users use Firefox. 

When IE initiates traffic to the Citrix site, it uses HTTP 1.0; somewhere along 
the way, the Citrix site (or the other proxy, over which I have no control nor 
ability to see into) returns HTTP 1.1 traffic.  At this point, the 1.1 traffic 
arrives back at my proxy, which converts it back to the original HTTP 1.0 format 
before passing the traffic back to the IE user. 

So it appears that my Squid proxy tries to convert to HTTP 1.0, but only for IE 
sessions, as Firefox users never have these issues; also, Firefox uses 1.1 
anyhow, thus not requiring any conversions.

Any thoughts?  Is there any way to force or preserve the HTTP protocol version 
to 1.1 on all connections, or preferably on a destination basis?

Thanks all,

.vp



Re: [squid-users] Recommended Store Size

2008-11-26 Thread Nyamul Hassan
Thank you Chris for your valuable info.  Sorry about asking the "should I 
increase store size" question.  It was a bit on the duh side.  :)


Is there a measurement inside the squid counters that tells me the bytes of 
data transferred?  Like you said it was 150GB for a day for your setup.  I 
was wondering where to see this in Squid.


Regards
HASSAN



- Original Message - 
From: Chris Robertson [EMAIL PROTECTED]

To: Squid Users squid-users@squid-cache.org
Sent: Wednesday, November 26, 2008 03:13
Subject: Re: [squid-users] Recommended Store Size



Nyamul Hassan wrote:
Thx Chris.  Cost of hardware does not become a big factor here, as it is 
directly related to the amount of BW that we save, and also the customer 
experience of getting pages faster from the cache.


After looking at many of the threads here, I've found that some guys are 
using cache stores measured in terabytes.  I was wondering if a bigger 
store was going to improve the byte hit ratio, which seems to give the 
idea of how much BW was saved.


It won't reduce it.  :o)  If you want to increase the byte hit ratio 
change your cache_replacement_policy to heap LFUDA and increase your 
maximum_object_size.  Be sure your squid is compiled with the 
--enable-removal-policies= option specifying heap as one of the choices. 
My compilation options are below...


-bash-3.2$ /usr/local/squid/sbin/squid -v
Squid Cache: Version 2.7.STABLE5
configure options:  '--enable-stacktraces' '--enable-snmp' 
'--enable-removal-policies=heap,lru' '--enable-storeio=aufs,null,ufs' 
'--with-pthreads' '--enable-err-languages=English'
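
For reference, the squid.conf side of that advice is roughly (the 1GB value is 
just an example; tune maximum_object_size to your own traffic):

cache_replacement_policy heap LFUDA
maximum_object_size 1 GB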




If I wanted to increase my store size by adding a JBOD of 12 disks using 
eSATA, and put another 12 x 160 GB sata disks, and also putting 130GB on 
each disk, making a total 2 TB cache store, would that improve the hit 
ratio?


Be aware.  For each TB of disk space, you might need up to 10GB of RAM to 
track the objects.  I'm pretty sure that calculation is based on a 20KB 
(or so) mean object size, but it's something to keep in mind.  Don't go 
all-out on increasing the storage size without keeping an eye on the 
associated memory usage.
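
(To put rough numbers on that: 1TB of cache at a 20KB mean object size is 
about 50 million objects, and 10GB of RAM per TB works out to roughly 200 
bytes of in-memory index per object - so a 2TB store could want on the order 
of 20GB of RAM for the index alone, before cache_mem.)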




I understand that patterns of user behavior greatly change the hit 
ratio, as we ourselves see it drop during off-peak hours (late into the 
night), as users who are online probably visit more and more diverse web 
content.  I just wanted to check how all the guys out here who are using 
Squid as a forward proxy are doing in terms of saving BW, and, for 
regular broadband internet users, how much BW they were saving with how 
big of a cache store.


As for myself, I currently have one main cache with 8GB of RAM and ~720GB 
of store dir (180GB on 4 Seagate ST3500320AS  spindles, 630GB used).  The 
Squid process size is 3.8GB (according to top).  I suffer a pretty 
serious 50% wait state and load spikes above 8, but my hit response time 
is sub 30ms under peak load (misses hover around 120ms).  On a typical 
weekday my cache passes around 150GB of traffic to clients, and about 10GB 
each day of the weekend, for a weekly total of just under 800GB of traffic 
per week.  Using heap LFUDA for my cache_replacement_policy with a maximum 
object size of 1GB I saw a 25% request hit rate and 24% byte hit rate on 
yesterday's traffic.  Friday's traffic was at 25% and 21% respectively, 
and seems more typical.




Thanks once again for your response, and hope you and the guys running 
squid as I am would share some of their experiences.


Regards
HASSAN


Chris






RE: [squid-users] Recommended Store Size

2008-11-26 Thread Gregori Parker
Of course you can :)

The trick to making these adjustments is having a means to gauge their
benefit/detriment... personally, each of my squid servers has all its
metrics graphed in Cacti and generates a Calamaris report each night, so
I get good hard data that can be compared to a historical baseline.  Put
this kind of monitoring in place (especially Cacti, IMO), and you won't be
tied to rules of thumb.
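
For what it's worth, a nightly Calamaris run can be as simple as something
like this (flags from memory, so check the man page; paths are placeholders):

cat /var/log/squid/access.log.0 | calamaris -a -F html > /var/www/html/squid-report.html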


-Original Message-
From: Stand H [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 26, 2008 12:40 AM
To: Squid Users
Subject: Re: [squid-users] Recommended Store Size

Hi Chris,

 The rule of thumb I've read previously is storage
 equivalent to a week's traffic.  If you pass an
 average of 30GB per day, a storage size of 210GB is a good
 start.

I have two squid servers. Each processes around 120GB a day with about
43% request hit ratio and 25% byte hit ratio. The cache size is 300GB
with 6GB memory. Per rule of thumb, can I increase my cache size?

Thank you.

Stand


  


[squid-users] I need help to find my error

2008-11-26 Thread Mariel Sebedio

Hello, I have squid-2.6.STABLE16-2.fc8 on RHEL 5.1.

I need to access this Macromedia Flash video, but my squid 
configuration does not permit it.


This is the URL: http://wireless.agilent.com/vcentral/viewvideo.aspx?vid=349

When I test the page without squid, the access opens ports 80 and 1935 
to the source.


When I connect through squid, the page does not open the video, but the 
access.log entries for the page show no errors.


I have copied the access.log and my squid port configuration below.

Thanks a lot for your help.

Mariel

squid.conf

acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 81  #
acl Safe_ports port 21  # sftp
acl Safe_ports port 23  # ftp
acl Safe_ports port 161 # SNMP Send
acl Safe_ports port 162 # SNMP Recived
acl Safe_ports port 443 # https
acl Safe_ports port 563 #
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
*
Access.log

1227720982.612  1 125.1.1.249 TCP_HIT/200 734 GET 
http://wireless.agilent.com/vcentral/images/vtabs/tab-hover-right.gif - 
NONE/- image/gif
1227720982.613  1 125.1.1.249 TCP_HIT/200 730 GET 
http://wireless.agilent.com/vcentral/images/vtabs/tab-hover-left.gif - 
NONE/- image/gif
1227720982.615  1 125.1.1.249 TCP_HIT/200 689 GET 
http://wireless.agilent.com/vcentral/images/vtabs/tab-hover.gif - NONE/- 
image/gif
1227720983.336451 125.1.1.249 TCP_REFRESH_HIT/304 394 GET 
http://wireless.agilent.com/vcentral/styles/vcentral.css - 
DIRECT/wireless.agilent.com -
1227720983.343454 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/WebResource.axd? - 
DIRECT/wireless.agilent.com -
1227720983.345456 125.1.1.249 TCP_REFRESH_HIT/304 394 GET 
http://wireless.agilent.com/vcentral/styles/vtabs.css - 
DIRECT/wireless.agilent.com -
1227720983.540670 125.1.1.249 TCP_REFRESH_HIT/304 416 GET 
http://cp.home.agilent.com/agilent/css2/master.css - 
DIRECT/cp.home.agilent.com text/css
1227720983.551   1145 125.1.1.249 TCP_MISS/200 25713 GET 
http://wireless.agilent.com/vcentral/viewvideo.aspx? - 
DIRECT/wireless.agilent.com text/html
1227720983.585700 125.1.1.249 TCP_REFRESH_HIT/304 416 GET 
http://cp.home.agilent.com/agilent/css2/not_nn4.css - 
DIRECT/cp.home.agilent.com text/css
1227720983.823237 125.1.1.249 TCP_REFRESH_HIT/304 392 GET 
http://wireless.agilent.com/vcentral/images/agilentlogo4.gif - 
DIRECT/wireless.agilent.com -
1227720983.828213 125.1.1.249 TCP_REFRESH_HIT/304 345 GET 
http://cp.home.agilent.com/agilent/images2/frame/99_topright.gif - 
DIRECT/cp.home.agilent.com image/gif
1227720983.845227 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/WebResource.axd? - 
DIRECT/wireless.agilent.com -
1227720983.850254 125.1.1.249 TCP_REFRESH_HIT/304 345 GET 
http://cp.home.agilent.com/agilent/images2/frame/99_topleft.gif - 
DIRECT/cp.home.agilent.com image/gif
1227720984.085470 125.1.1.249 TCP_REFRESH_HIT/304 345 GET 
http://cp.home.agilent.com/agilent/images2/frame/navbar_topleft.gif - 
DIRECT/cp.home.agilent.com image/gif
1227720984.123238 125.1.1.249 TCP_REFRESH_HIT/304 432 GET 
http://cp.home.agilent.com/agilent/scripts/s_code_remote.js - 
DIRECT/cp.home.agilent.com application/x-javascript
1227720984.446300 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720984.682235 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720985.166484 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720985.630463 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720986.095465 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720986.550454 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720987.012462 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720987.469456 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720987.927457 125.1.1.249 TCP_MISS/304 435 GET 
http://wireless.agilent.com/vcentral/ScriptResource.axd? - 
DIRECT/wireless.agilent.com -
1227720988.385458 125.1.1.249 TCP_REFRESH_HIT/304 393 GET 
http://wireless.agilent.com/vcentral/images/banner.gif - 

Re: [squid-users] Recommended Store Size

2008-11-26 Thread Chris Robertson

Nyamul Hassan wrote:
Thank you Chris for your valuable info.  Sorry about asking the 
"should I increase store size" question.  It was a bit on the duh 
side.  :)


Is there a measurement inside the squid counters that tells me the 
bytes of data transferred?


I use the SCALAR log analyzer (http://scalar.risk.az/) as part of my log 
rotation.  It's quite processor intensive, but gives lots of useful info.


  Like you said it was 150GB for a day for your setup.  I was 
wondering where to see this in Squid.


Regards
HASSAN 


Chris



Re: [squid-users] URGENT : How to limit some ext

2008-11-26 Thread Chris Robertson

░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░ wrote:

How do I limit .zip and .swf only, from squid.conf, in 2 options:

1. Global rule (I mean all users will get this rule - limit on zip and swf)
2. Individual rule (only certain people that are listed)

thx b4

It's urgent, ASAP :(

It's about life and death :(
  


There's a whole FAQ section on ACLs...

http://wiki.squid-cache.org/SquidFaq/SquidAcl
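
As a starting point, a rough sketch of the two cases (assuming "limit" means
deny; the regex and addresses below are placeholders to adapt):

acl archives urlpath_regex -i \.(zip|swf)$
acl exempt src 192.168.0.10 192.168.0.11
http_access deny archives !exempt
# ...followed by your normal http_access rules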

Chris


Re: [squid-users] NTLM Auth and not authenticated pages

2008-11-26 Thread Chris Robertson

Matias Chris wrote:

Hello All,

I'm currently in the process of changing the way we authenticate users
from LDAP to NTLMSSP. We are now in the test phase, and while NTLM auth is
working fine and allowing all users that are already logged in to the AD
domain to access the web without asking for their credentials, I'm
seeing a lot of denied attempts in the log.
It's as if, for every page visited, I now have two log entries: one
denied, and the other one allowed.
  


That's due to the design of NTLM.  See 
http://devel.squid-cache.org/ntlm/client_proxy_protocol.html



Is there any way to tweak squid to avoid doing this? The AD DC is on the
same physical LAN.
  


I suppose you could refrain from logging 407 responses...
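
If the noise really bothers you, something along these lines should do it on
Squid 2.6/2.7 (and, I believe, 3.1) - untested, so treat it as a sketch:

acl auth_challenge http_status 407
log_access deny auth_challenge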


1227614260.463  0 127.0.0.1 TCP_DENIED/407 2083 POST
http://mail.google.com/a/matiaschris.com.ar/channel/bind? - NONE/-
text/html
1227614261.218188 127.0.0.1 TCP_MISS/200 351 POST
http://mail.google.com/a/matiaschris.com.ar/channel/bind? mchrist
DIRECT/66.102.9.18 text/html

Any help will be much appreciated. Thanks.
  


Chris


RE: [squid-users] URGENT : How to limit some ext

2008-11-26 Thread Gregori Parker
And if every post is going to be 'life and death', urgent, asap, etc...you 
really need to get a test lab / virtual environment :)

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 26, 2008 12:23 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] URGENT : How to limit some ext

░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░ wrote:
 how to limit .zip .swf only from squid.conf in 2 option

 1. Global Rule ( i mean all user will get this rule - limit on zip and swf )
 2. Individual Rule ( only certain ppl that listed )

 thx b4

 in urgent ASAP :(

 it's about dead and live :(
   

There's a whole FAQ section on ACLs...

http://wiki.squid-cache.org/SquidFaq/SquidAcl

Chris


Re: [squid-users] Re: IMAP support

2008-11-26 Thread Chris Robertson

RW wrote:

On Wed, 12 Nov 2008 14:28:51 -0200
Leonardo Rodrigues Magalhães [EMAIL PROTECTED] wrote:

  

julian julian wrote:


Because all my traffic to the internet is managed by squid. Do you have
any suggestions? 
  

No, it's not. Only http/https/ftp/gopher can be handled by squid.

And it won't help to keep sending messages asking about IMAP
support... squid can't do that. Period.



Whilst Squid cannot proxy IMAP directly, it can proxy arbitrary TCP
connections through an HTTP CONNECT request.


Amusingly enough, this point was raised by Amos the first time he 
responded to this question...



  IIRC there are patches
that allow stunnel to connect through squid in this way. I'd try that
first if gmail supports imap over ssl/tls (I'm not sure if stunnel has
any starttls support). Otherwise you can probably find some other
utility to connect a localhost port to gmail.

I don't know why people have been so dismissive of this question; it
sounds like a sensible thing to do, particularly if you want to use
squid delay pools to manage download bandwidth.  
  


Perhaps it's because Squid is an HTTP caching server, as spelled out 
quite clearly in the FAQ 
(http://wiki.squid-cache.org/SquidFaq/AboutSquid), not a generic 
bandwidth shaper.


Perhaps it's because people who don't read the FAQ (and don't check the 
list archives) throw a question similar to this on the mailing list 
(what seems like) every week.


Perhaps in this case it was due to the sheer persistence of the original 
requester.  Six different people told julian that Squid is not capable 
of handling the IMAP protocol[1] in response to five separate email 
messages[2].  What does it take to get the message across?


If you want to use a screwdriver to pound nails, don't expect much 
support from the screwdriver community.  On the other hand, if you 
figure out,  on your own, a novel method to drive nails with a 
screwdriver, share it and you might receive praise and recognition.  
Then again, you might be questioned about not using a hammer in the 
first place.


If you want to throttle random traffic traversing your internet 
connection, there are free applications that work in Windows, tc on 
Linux, and pf under BSD.  If you are willing to spend a little cash, 
there are any number of appliance vendors and software consultants who 
would be happy to help you.


Then again, maybe I just need more rest.

Chris

[1]
1: Amos: 
http://www.squid-cache.org/mail-archive/squid-users/200811/0275.html
2: Joel: 
http://www.squid-cache.org/mail-archive/squid-users/200811/0283.html
3: Dieter: 
http://www.squid-cache.org/mail-archive/squid-users/200811/0284.html
4: Leonardo: 
http://www.squid-cache.org/mail-archive/squid-users/200811/0286.html
5: Mark: 
http://www.squid-cache.org/mail-archive/squid-users/200811/0285.html
6: Jakob: 
http://www.squid-cache.org/mail-archive/squid-users/200811/0287.html


[2]
1: http://www.squid-cache.org/mail-archive/squid-users/200811/0274.html
2: http://www.squid-cache.org/mail-archive/squid-users/200811/0277.html
3: http://www.squid-cache.org/mail-archive/squid-users/200811/0281.html 
- A repost of #2

4: http://www.squid-cache.org/mail-archive/squid-users/200811/0278.html
5: http://www.squid-cache.org/mail-archive/squid-users/200811/0282.html 
- A repost of #4


[squid-users] Can squid acted as a application SSL proxy

2008-11-26 Thread 李春

I have a client/server application program and want to add an SSL module to it
to secure the data transferred on the network. I wonder whether I can use
squid as an SSL proxy between the client and the server. Squid would be
configured as a reverse proxy and located in the application server's
environment. The client and squid communicate over an SSL connection. Just
like this:

  Server <--(no SSL)--> Squid <--(SSL)--> client

I know squid can act as a web proxy like this using https_port. But I am
curious whether I can make use of squid like this.
Thanks very much!

Pickup.Li
_
MSN 中文网,最新时尚生活资讯,白领聚集门户。
http://cn.msn.com


[squid-users] Question about Squid 3 reverse proxy and SSL

2008-11-26 Thread Tom Williams

Ok, I'm adding SSL support to my Squid 3 reverse proxy configuration.

Here are the configuration directives:

http_port 8085 accel defaultsite=www.mydomain.com vhost
https_port 4433 accel cert=/etc/ssl/cert/www_mydomain_com.crt 
key=/etc/ssl/private/private.key  defaultsite=www.mydomain.com vhost
cache_peer 192.168.1.7 parent 80 0 no-query originserver login=PASS 
name=web2Accel
cache_peer 192.168.1.7 parent 443 0 no-query originserver ssl login=PASS 
name=web2SSLAccel


Here is the error I get when I try to connect:

clientNegotiateSSL: Error negotiating SSL connection on FD 13: 
error:1407609C:SSL routines:SSL23_GET_CLIENT_HELLO:http request (1/-1)


What does this error mean?

Thanks!

Peace...

Tom


Re: [squid-users] compilation errors on rhel 5

2008-11-26 Thread Mario Remy Almeida
Thanks Christos.

After applying the patch I managed to install it.

Regards,
Remy

On Wed, 2008-11-26 at 10:14 -0500, Christos Tsantilas wrote:
 Hi Remy,
 
  
  2nd any idea where am i failing to compile it in ubuntu 8.10
 
  errors
 
   g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -I.
  -I. -I../include -I. -I. -I../include -I../include
  -I../lib/libTrie/include -I../lib -I../lib -I/usr/include/libxml2
  -Werror -Wall -Wpointer-arith -Wwrite-strings -Wcomments
  -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\ -DDEFAULT_SQUID_DATA_DIR=
  \/share/squid\ -g -O2 -MT ICAP/AsyncJob.lo -MD -MP -MF
  ICAP/.deps/AsyncJob.Tpo -c ICAP/AsyncJob.cc  -fPIC -DPIC -o
  ICAP/.libs/AsyncJob.o
  cc1plus: warnings being treated as errors
  ICAP/AsyncJob.cc: In member function ‘virtual const char*
  AsyncJob::status() const’:
  ICAP/AsyncJob.cc:158: error: format not a string literal and no format
  arguments
  make[1]: *** [ICAP/AsyncJob.lo] Error 1
  make[1]: Leaving directory `/home/remy/rnd/squid-3.1.0.2/src'
  make: *** [all-recursive] Error 1
 
 I think this is the bug 2527:
 http://www.squid-cache.org/bugs/show_bug.cgi?id=2527
 There is a small patch for this bug can you try it?
 
 
 Regards,
  Christos
 
  Regards,
  Remy
 
 
 



Re: [squid-users] error compiling squid-3.1.0.2 on ubuntu 8.10

2008-11-26 Thread Amos Jeffries
 Hi All,

 tried to compile squid squid-3.1.0.2 on ubuntu 8.10

 with the following options
 ./configure \
   --prefix=/usr \
   --localstatedir=/var \
   --libexecdir=${prefix}/lib/squid \
   --srcdir=. \
   --datadir=${prefix}/share/squid \
   --sysconfdir=/etc/squid \
   --with-default-user=prox \
   --with-logdir=/var/log \
   --enable-arp-acl

 below are all the errors found in config.log file


Many of these config.log 'errors' are simply the Squid ./configure script
probing your system to locate various files. Many are expected to be
logged.
The important bit is whether anything is displayed when it halts.

snip


 executed make and got this error

 de -I../lib/libTrie/include -I../lib -I../lib-Werror -Wall
 -Wpointer-arith -Wwrite-strings -Wcomments  -DDEFAULT_SQUID_CONFIG_DIR=
 \/etc/squid\ -DDEFAULT_SQUID_DATA_DIR=\/share/squid\ -g -O2 -MT
 ICAP/AsyncJob.lo -MD -MP -MF $depbase.Tpo -c -o ICAP/AsyncJob.lo
 ICAP/AsyncJob.cc; \
   then mv -f $depbase.Tpo $depbase.Plo; else rm -f $depbase.Tpo;
 exit 1; fi
 mkdir ICAP/.libs
  g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\/etc/squid/squid.conf\ -I.
 -I. -I../include -I. -I. -I../include -I../include
 -I../lib/libTrie/include -I../lib -I../lib -Werror -Wall -Wpointer-arith
 -Wwrite-strings -Wcomments -DDEFAULT_SQUID_CONFIG_DIR=\/etc/squid\
 -DDEFAULT_SQUID_DATA_DIR=\/share/squid\ -g -O2 -MT ICAP/AsyncJob.lo
 -MD -MP -MF ICAP/.deps/AsyncJob.Tpo -c ICAP/AsyncJob.cc  -fPIC -DPIC -o
 ICAP/.libs/AsyncJob.o
 cc1plus: warnings being treated as errors
 ICAP/AsyncJob.cc: In member function ‘virtual const char*
 AsyncJob::status() const’:
 ICAP/AsyncJob.cc:158: error: format not a string literal and no format
 arguments
 make[1]: *** [ICAP/AsyncJob.lo] Error 1
 make[1]: Leaving directory `/home/remy/rnd/squid-3.1.0.2/src'
 make: *** [all-recursive] Error 1

 can someone tell me what other packages are required to fix the above
 errors

I'm a bit braindead from a conference right now, but IIRC this was found
and fixed a while back.

Can you check whether the latest daily snapshot still has this or any
other problems please?

Cheers
Amos




Re: [squid-users] Cache_dir more than 10GB

2008-11-26 Thread Nyamul Hassan

Hi,

This is most interesting!  1.5 TB!!  :)

Considering it is a 64-bit system, did it require 22 GB of RAM for the squid 
process alone (14MB per GB)?  Can you share the following info:


- Total RAM on the system
- Size of your cache_mem
- Peak Requests Per Second
- Byte Hit Ratio
- Cache Replacement Policy

I apologize beforehand if this is of any trouble for you.

Regards
HASSAN



- Original Message - 
From: Henrik Nordstrom [EMAIL PROTECTED]

To: Itzcak Pechtalt [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org
Sent: Monday, October 06, 2008 17:05
Subject: Re: [squid-users] Cache_dir more than 10GB




[squid-users] squid reverse-proxy for videos

2008-11-26 Thread Ken DBA
We have some web servers for video playback (the FLV format, like YouTube).
Could we deploy squid to act as a reverse proxy for this application?
What's the recommended configuration for squid? Thanks.

Ken.


  


[squid-users] 2 squid server

2008-11-26 Thread ░▒▓ ɹɐzǝupɐɥʞ ɐzɹıɯ ▓▒░
Hi all,
I have a problem here.

Server A: 192.168.222.111, squid port 2210
Server B: 192.168.222.100, squid port 2012

When I put this line on server A (on the first line):
cache_peer 192.168.222.100 parent 2012 0 no-query no-digest default

and I put this line at server B (first line):
cache_peer 192.168.222.111 sibling 2012 0 no-query no-digest default

and on the lines after that:
acl manager proto cache_object
acl all src 0.0.0.0/0.0.0.0
acl localhost src 127.0.0.1
acl SSL_ports port 443 563
acl Safe_ports port 21 80 81 53 143 2443 443 563 70 210 1025-65535
acl Safe_ports port 280
acl Safe_ports port 488
acl Safe_ports port 591
acl Safe_ports port 777
acl CONNECT method CONNECT

I put:
cache_peer_access 192.168.222.111 allow all



Somehow it doesn't work.
Server B's log says 192.168.222.111 DENIED.

My questions:
1. Why does it say denied?
2. If I have user rules at server A, will those rules still work, or
must I put the rules again at server B?

~~~ it's urgent but not about life ^^