Re: [squid-users] squid with mysql and squid_db_auth as program

2010-04-25 Thread Amos Jeffries

Ami Choksi wrote:

Hi Friends,
I am configuring Squid with MySQL using squid_db_auth as the authentication
program. I have done everything according to the following website:
http://wiki.squid-cache.org/ConfigExamples/Authenticate/Mysql
But I am not able to log in from the browser. Can anyone please help me?
Ami


Please describe:
  what version of Squid is this?
  did the troubleshooting tests described at the end of the example work?
  how does your browser connect to the proxy?
  what happens when you try to log in?
  what do your full http_access rules look like?

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.1


Re: [squid-users] storeUpdateCopy Error issue

2010-04-25 Thread Amos Jeffries

Kris wrote:
Yes, I'm using a new 1.5 TB hard disk but only 300 GB of it as the cache dir, with a
ReiserFS filesystem. My machine restarts every day using crond.




Why are you restarting the whole system daily?


Jeff Pang wrote:

On Sat, Apr 24, 2010 at 3:06 PM, Kris christ...@wanxp.com wrote:
 

Dear,

I run my proxy well with 100-200 hits per second. It runs smoothly, but after a few
days my users complained the proxy had become slow. In cache.log I see a bunch of
errors like this:

storeUpdateCopy: Error at 280 (-1)
storeUpdateCopy: Error at 384 (-1)

As a temporary fix I removed the whole cache and built a new one. It has happened to
me twice in 2 weeks. Any suggestion what causes this problem?




It seems to be a disk writing error.
Are you sure the hard disk and filesystem are alright?


  





--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.1


Re: [squid-users] Only logging DENIED or MISS actions

2010-04-25 Thread Amos Jeffries

CactusCo wrote:
On Debian Squeeze 
# squid -v

Squid Cache: Version 2.7.STABLE8
configure options:  '--prefix=/usr' '--exec_prefix=/usr'
'--bindir=/usr/sbin' '--sbindir=/usr/sbin' '--libexecdir=/usr/lib/squid'
'--sysconfdir=/etc/squid' '--localstatedir=/var/spool/squid'
'--datadir=/usr/share/squid' '--enable-async-io' '--with-pthreads'
'--enable-storeio=ufs,aufs,coss,diskd,null' '--enable-linux-netfilter'
'--enable-arp-acl' '--enable-epoll' '--enable-removal-policies=lru,heap'
'--enable-snmp' '--enable-delay-pools' '--enable-htcp'
'--enable-cache-digests' '--enable-underscores' '--enable-referer-log'
'--enable-useragent-log' '--enable-auth=basic,digest,ntlm,negotiate'
'--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-carp'
'--enable-follow-x-forwarded-for' '--with-large-files' '--with-maxfd=65536'
'i386-debian-linux' 'build_alias=i386-debian-linux'
'host_alias=i386-debian-linux' 'target_alias=i386-debian-linux'
'CFLAGS=-Wall -g -O2' 'LDFLAGS=' 'CPPFLAGS='

squid.conf tag: 


access_log /var/log/squid/access.log squid

Logging ONLY registers actions with DENIED or MISS, like this:

DENIED/504 1739 GET http://www.site.com/ - NONE/- text/html
MISS/504 1907 GET http://www.site.com/ - DIRECT/200.xxx.xxx.xxx text/html
MISS/504 2074 GET http://www.site.com/ - DIRECT/190.xxx.xx.xx text/html

Any idea?? 
Regards

Rick



That does not look like the squid native format. Please show the bits
you erased as well.


a) you configured it to.
   Look for other access_log lines, log_access lines, and cache deny lines.

b) you're using a patched proxy.
   Is this the Debian package, or a self-built proxy?

c) those are actually the requests happening.

FWIW; SITE systems Inc. have their website hosted at 66.113.131.78 right 
now. Your DNS cache may have been poisoned.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.1


[squid-users] squid with mysql and squid_db_auth as program

2010-04-25 Thread Ami Choksi
What I feel is: access is denied because the remote user may not be able to
access the database. I think when any user logs in through the proxy,
the effective user is squid. I also gave the squid user permission for
SELECT, i.e.:
grant select on squid.* to sq...@localhost identified by 'squid';
Is the problem that the squid user doesn't have login shell permission?

Answers are inline with your questions:
 Please describe:
  what version of Squid is this?
squid-3.1.0.14
  did the troubleshooting tests described at the end of the example work?
Yes, whatever commands are given at the end of the example work perfectly OK.

  how does your browser connect to the proxy?
The browser prompts for a username and password.

  what happens when you try to log in?
After entering the username and password it asks for them again and again.
Squid's access log shows lines like:
1272114337.269  0 172.16.1.108 TCP_DENIED/407 4297 GET http://sb.google.com/safebrowsing/update? - NONE/- text/html

  what do your full http_access rules look like?
acl localnet src 172.16.1.0/24
auth_param basic program /usr/lib/squid/squid_db_auth --user someuser
--password  --plaintext --persist
auth_param basic children 5
auth_param basic realm Web-Proxy
auth_param basic credentialsttl 2 hours
auth_param basic casesensitive off


acl db-auth proxy_auth REQUIRED
http_access allow db-auth
http_access allow localhost
http_access allow localnet
http_access deny all
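
For reference, ACLs listed on a single http_access line are ANDed together and the
lines are checked top to bottom, first match wins. A minimal sketch (not necessarily
the fix for this thread) of an arrangement that asks only LAN clients for credentials:

acl localnet src 172.16.1.0/24
acl db-auth proxy_auth REQUIRED
http_access allow localhost
http_access allow localnet db-auth
http_access deny all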


Re: [squid-users] squid with mysql and squid_db_auth as program

2010-04-25 Thread Luis Daniel Lucio Quiroz
On Sunday 25 April 2010 02:37:57, Ami Choksi wrote:
 what i feel is : access is denied as remote user may not be able to
 access the database. i think when any user does login through proxy,
 the effective user is squid. i gave permission to squid user also for
 select.
 i.e.
 grant select on squid.* to sq...@localhost identified by 'squid';
 is it the problem because squid user doesn't have login shell permission?
 
 answers are with your questions:
  Please describe:
   what version of Squid this is?
 
 squid-3.1.0.14
 
   did the troubleshooting tests described at the end of example work?
 
 yes, at the end of example whatever command is given works perfectly ok.
 
   how does your browser connect to the proxy?
 
 browser prompts for username and password.
 
   what happens when you try to login?
 
 after entering username and password it again and again asks for the same.
 squid's access log shows like
 1272114337.269  0 172.16.1.108 TCP_DENIED/407 4297 GET
 http://sb.google.com/safebrowsing/update? - NONE/- text/html
 
   what do your full http_access rules look like?
 
 acl localnet src 172.16.1.0/24
 auth_param basic program /usr/lib/squid/squid_db_auth --user someuser
 --password  --plaintext --persist
 auth_param basic children 5
 auth_param basic realm Web-Proxy
 auth_param basic credentialsttl 2 hours
 auth_param basic casesensitive off
 
 
 acl db-auth proxy_auth REQUIRED
 http_access allow db-auth
 http_access allow localhost
 http_access allow localnet
 http_access deny all

No problem about the squid user's forbidden login permissions: MySQL users are not
related to system users.

Try it first by hand, on the command line, and paste your results.
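
For example, Squid basic auth helpers read one "username password" pair per line on
stdin and answer OK or ERR, so the helper from the config above can be exercised
directly (a sketch; testuser/testpass and the DB password are placeholders):

echo "testuser testpass" | /usr/lib/squid/squid_db_auth --user someuser --password dbpass --plaintext --persist
# prints OK for a valid user, ERR otherwise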


Re: [squid-users] Re: Joomla DB authentication support hits Squid! :)

2010-04-25 Thread Luis Daniel Lucio Quiroz
On Friday 23 April 2010 00:20:13, Amos Jeffries wrote:
 Luis Daniel Lucio Quiroz wrote:
  On Thursday 22 April 2010 20:09:57, Amos Jeffries wrote:
  Luis Daniel Lucio Quiroz wrote:
  On Thursday 22 April 2010 15:49:55, Luis Daniel Lucio Quiroz wrote:
  HI all
  
  As a requirement of one client, he wants to use joomla user database
  to let squid authenticate.
  
  I did patch squid_db_auth that Henrik has written in order to support
  joomla hash conditions.
  
  I did add one usefull option to script
  
  --joomla
  
  in order to activate joomla hashing.  Other options are identical.
  Please test :)
  
  Ammos, I'd like if you can include this in 3.1.2
  
  Mumble.
  
  How do other users feel about it? Useful enough to cross the security
  bugs and regressions only freeze?
  
  LD
  
  I have a typo in
  my salt
  
  should be
  my $salt
  
  sorry
  
  Can you make the option --md5 instead please?
  
Possibilities are not limited to Joomla and they may change someday.
  
  The option needs to be added to the documentation sections of the helper
  as well.
  
  Amos
  
  I dont get you about cross the security,
 
 3.1 is under feature freeze. Anything not a security fix or regression
 needs to have some good reasons to be committed.
 
 I'm trying to stick to the freeze a little more with 3.1 than with 3.0,
 to get back into the habit of it. Particularly since we look like having
 a good foothold on the track for 12-month releases now.
 
  what i did is that --joomla flag do diferent sql request and because
  joomla hass is like this:
  hash:salt
  i did split and compare.  by default joomla uses md5 (i'm not a joomla
  master, i dont know when joomla uses other hashings)
 
 I intend to use this auth helper myself for other systems, and there are
 others who ask about a DB helper occasionally.
 
 
 Taking a better look at your changes ...
 
  The first one: db_cond = "block = 0"  seems to be useless. All it does
  is hard-code a different default value for the --cond option.
  
 For Joomla the squid.conf should instead contain:
    --cond "block=0"
 
 
 Which leaves the salted/non-salted hash change.
 Adding this:
 
--salt-delimiter D
 
 To configure character(s) between the hash and salt values.  Will not to
 lock people into the specific Joomla syntax of colon.  There are
 examples and tutorials out there for app design that use other delimiters.
 
 Doing both of those changes Joomla would be configured with:
 
 ... --cond "block=0" --salt-delimiter :
 
  if you want, latter i may add also --md5 to store md5 password, and
  --digest- auth to support diggest authentication :) but later jejeje
 
 Amos

A little hack, because Perl was warning about the first disconnect on the Perl DBH.

:S
--- helpers/basic_auth/DB/squid_db_auth.in	2010-03-29 12:02:56.0 +0200
+++ helpers/basic_auth/DB/squid_db_auth.in.dlucio	2010-04-25 09:57:42.0 +0200
@@ -1,8 +1,9 @@
 #...@perl@
-use strict;
+#use strict;
 use DBI;
 use Getopt::Long;
 use Pod::Usage;
+use Digest::MD5 qw(md5 md5_hex md5_base64);
 $|=1;
 
 =pod
@@ -22,6 +23,8 @@
 my $db_cond = "enabled = 1";
 my $plaintext = 0;
 my $persist = 0;
+my $isjoomla = 0;
+my $debug = 0;
 
 =pod
 
@@ -62,6 +65,7 @@
 =item	B<--cond>
 
 Condition, defaults to enabled=1. Specify 1 or "" for no condition
+If you use --joomla flag, this condition will be changed to block=0
 
 =item	B<--plaintext>
 
@@ -71,6 +75,10 @@
 
 Keep a persistent database connection open between queries. 
 
+=item	B<--joomla>
+
+Tell helper that the user database is a Joomla DB, so salt hashing is understood.
+
 =back
 
 =cut
@@ -85,13 +93,17 @@
 	'cond=s' => \$db_cond,
 	'plaintext' => \$plaintext,
 	'persist' => \$persist,
+	'joomla' => \$isjoomla,
+	'debug' => \$debug,
 	);
 
-my ($_dbh, $_sth);
+$db_cond = "block = 0" if $isjoomla;
+
 
 sub close_db()
 {
 return if !defined($_dbh);
+$_sth->finish();
 $_dbh->disconnect();
 undef $_dbh;
 undef $_sth;
@@ -113,10 +125,17 @@
 {
 my ($password, $key) = @_;
 
-return 1 if crypt($password, $key) eq $key;
+if ($isjoomla){
+my $salt;
+my $key2;
+($key2,$salt) = split (/:/, $key);
+return 1 if md5_hex($password.$salt).':'.$salt eq $key;
+}
+else{
+return 1 if crypt($password, $key) eq $key;
 
-return 1 if $plaintext && $password eq $key;
-
+return 1 if $plaintext && $password eq $key;
+}
 return 0;
 }
 
@@ -155,8 +174,9 @@
 =head1 COPYRIGHT
 
 Copyright (C) 2007 Henrik Nordstrom hen...@henriknordstrom.net
+Copyright (C) 2010 Luis Daniel Lucio Quiroz dlu...@okay.com.mx (Joomla support)
+
 This program is free software. You may redistribute copies of it under the
 terms of the GNU General Public License version 2, or (at your option) any
 later version.
-
 =cut
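
For reference, a squid.conf sketch of how the patched helper might be invoked against
a Joomla user table. The DSN, database credentials, table and column names below are
illustrative assumptions based on the stock squid_db_auth options and the default
Joomla jos_users schema; they are not values taken from this thread:

auth_param basic program /usr/lib/squid/squid_db_auth --joomla --dsn DBI:mysql:database=joomla --user dbuser --password dbpass --table jos_users --usercol username --passwdcol password --persist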


[squid-users] checked with nothing to match against!!

2010-04-25 Thread Сухоруков Александр

Hello all,
every time I take a look at the logs I get:

root /usr/local/etc/squid # tail -f /cache/cache.log
2010/04/24 23:54:57| SECURITY ERROR: ACL 0x28fe20a4 checked with nothing to match against!!
2010/04/24 23:54:57| SECURITY ERROR: ACL 0x28fe20a4 checked with nothing to match against!!
2010/04/24 23:54:57| SECURITY ERROR: ACL 0x28fe20a4 checked with nothing to match against!!
2010/04/24 23:54:57| SECURITY ERROR: ACL 0x28fe20a4 checked with nothing to match against!!
2010/04/24 23:54:58| SECURITY ERROR: ACL 0x28fe20a4 checked with nothing to match against!!
2010/04/24 23:54:58| SECURITY ERROR: ACL 0x28fe20a4 checked with nothing to match against!!
2010/04/24 23:54:58| SECURITY ERROR: ACL 0x28fe20a4 checked with nothing to match against!!


What does this error mean? Squid works, but I don't like living with errors :)

thank you

P.S.
root /usr/local/etc/squid # uname -a
FreeBSD mail.emoxam.ru 7.3-RELEASE FreeBSD 7.3-RELEASE #0: Fri Apr  2 00:53:43 MSD 2010 emo...@mail.emoxam.ru:/usr/obj/usr/src/sys/CUSTOM731  i386
root /usr/local/etc/squid # pkg_version -v | grep squid
squid-3.0.25=   up-to-date with port
root /usr/local/etc/squid #


Re: [squid-users] Youtube Caching

2010-04-25 Thread Amos Jeffries

Khemara Lyn wrote:

Dear All,

I know this topic have been around every now and then; but i'm desperate 
now in finding any fix.


I tried the configuration as mentioned in this page:

Caching YouTube Content
http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube?highlight=%28ConfigExamples%2FIntercept%29|%28ConfigExamples%2FAuthenticate%29|%28ConfigExamples%2FChat%29|%28ConfigExamples%2FStreams%29|%28ConfigExamples%2FReverse%29|%28ConfigExamples%2FStrange%29 



FYI,
  Chudy has just updated the 3xx bug patch.

I'm not sure if it's related to your problem, but it's worth a try.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.1


[squid-users] R: [squid-users] Is there a way to get transparent proxy to work with Squid 2.7 stable 8 on Windows 2003 Server?

2010-04-25 Thread Guido Serassio
Hi,

On Windows a transparent interception driver is missing.

But if you can use some L3/L4 device able to redirect the HTTP requests (like a
firewall or an L3 switch) to the Windows Squid box, then yes, it should work.

Regards

Guido Serassio
Acme Consulting S.r.l.
Microsoft Gold Certified Partner
Via Lucia Savarino, 110098 - Rivoli (TO) - ITALY
Tel. : +39.011.9530135   Fax. : +39.011.9781115
Email: guido.seras...@acmeconsulting.it
WWW: http://www.acmeconsulting.it



 -Original Message-
 From: Milan [mailto:compguy030...@gmail.com]
 Sent: Thursday 22 April 2010 14:59
 To: squid-users@squid-cache.org
 Subject: [squid-users] Is there a way to get transparent proxy to work
 with Squid 2.7 stable 8 on Windows 2003 Server?
 
 We have a squid 2.7 stable 8 running on Windows 2003 server on a VM.
 Is it possible to get transparent proxy working on this version or is
 still impossible for windows?


[squid-users] david facon

2010-04-25 Thread Ilo Lorusso
http://Effycient.com/default/index.php


[squid-users] squid_kerb_ldap/squid_kerb_auth in Single Forest Multidomains Active Directory.

2010-04-25 Thread GIGO .

Dear All,
 
The problem under discussion is a continuation of the SPN creation / Single Forest
MultiDomain (Active Directory) topic.
 
@ Markus
Yes, my infrastructure is Active Directory based (forest root domain A with
two child domains B (80% of users) & C (20% of users) in their own trees). Only the
Squid proxy is installed, on CentOS, and it is not joined to any domain. Markus, you
are right: I observed that the clients in the child domain are able to use the
Squid proxy without any changes required in the krb5.conf file (no need to
define a [capaths] section). I take it that, by design of the Active Directory
forest, where parent domains and child domains have two-way transitive trusts,
the Active Directory/DNS infrastructure manages itself... and the clients in any
domain are able to find which domain a service principal is in, so they can acquire a
service ticket from that domain. Right?
 
 
 

If the Unix server (proxy) does not belong to any domain, then the default_realm
setting does not matter and I can choose any of my domains as default_realm. As
I understand it, the default_realm tag is compulsory to define, so it couldn't be left
blank. Similarly, if I am not going to use any other kerberised service from
my Squid proxy Unix server, then the .linux.home entry will be unimportant; otherwise
it is a must. Right?
 
 
 
 
// krb5.conf for Active Directory single forest, multi domain; it's working
correctly
[libdefaults]
 default_realm = A.COM.PK
 dns_lookup_realm = false
 dns_lookup_kdc = false
 default_keytab_name = /etc/krb5.keytab

; for windows 2003 encryption type configuration.
default_tgs_enctypes = rc4-hmac des-cbc-crc des-cbc-md5
default_tkt_enctypes = rc4-hmac des-cbc-crc des-cbc-md5
permitted_enctypes = rc4-hmac des-cbc-crc des-cbc-md5
[realms]
 A.COM.PK = {
   kdc = dc1.a.com.pk
   admin_server = dc1.a.com.pk
  }
 b.A.COM.PK = {
   kdc = childdc.b.a.com.pk
   admin_server = childdc.b.a.com.pk
}
[domain_realm]
.linux.home = A.COM.PK
.a.com.pk = A.COM.PK
a.com.pk = A.COM.PK
.b.a.com.pk = b.A.COM.PK
b.a.com.pk = b.A.COM.PK
[logging]
kdc = FILE:/var/log/kdc.log
admin_server = FILE:/var/log/kadmin.log
default = FILE:/var/log/kdc.log
\\
Any suggestions/guidance required??
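 
One quick sanity check of a krb5.conf like this from the proxy box (a sketch; the
user and the proxy SPN below are illustrative placeholders) is to request tickets
with the standard MIT tools:

kinit someuser@A.COM.PK              # obtain a TGT from the parent realm
kvno HTTP/proxy.a.com.pk@A.COM.PK    # obtain a service ticket for the proxy SPN
klist                                # list the tickets that were issued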
 
 
 
 
Here is the portion of my squid.conf related to authentication/authorization, along
with my questions.
 
auth_param negotiate program /usr/libexec/squid/squid_kerb_auth
auth_param negotiate children 10
auth_param negotiate keep_alive on
# basic auth ACL controls to make use of it are.
#acl auth proxy_auth REQUIRED
#http_access deny !auth
#http_access allow auth
 
 
I think the above commented directives are now not required, as squid_kerb_ldap has
taken over. Right?
 
 
 
#external_acl_type squid_kerb1 ttl=3600 negative_ttl=3600 %LOGIN /usr/libexec/squid/squid_kerb_ldap -g gro...@a.com.pk:gro...@a.com.pk:gro...@a.com.pk:g...@b.a.com.pk:gro...@b.a.com.pk:gro...@b.a.com.pk

external_acl_type g1_parent ttl=3600 negative_ttl=3600 %LOGIN /usr/libexec/squid/squid_kerb_ldap -g gro...@a.com.pk
 
external_acl_type g2_parent ttl=3600 negative_ttl=3600 %LOGIN /usr/libexec/squid/squid_kerb_ldap -g gro...@a.com.pk
 
external_acl_type g2_child ttl=3600 negative_ttl=3600 %LOGIN /usr/libexec/squid/squid_kerb_ldap -g gro...@a.b.com.pk
 
 
 
Although the commented single-liner was working properly for me and looked more
appropriate, I had to split it into multiple lines; nothing else came to mind for how
to handle the ACLs based on user group membership. Please guide me if there is a
better way to do that, as it feels that I am now calling the helper multiple times
instead of a single time.
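 
Each external_acl_type line starts its own pool of helper processes, so at minimum
the per-type pool size can be capped with the children= option (a sketch re-using one
of the lines above; the value 5 is only an example):

external_acl_type g1_parent ttl=3600 negative_ttl=3600 children=5 %LOGIN /usr/libexec/squid/squid_kerb_ldap -g gro...@a.com.pk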
 
 
 
(There are other expected groups from the child domains and parent domains, so I am
worried this will affect performance.)
 
 
acl ldap_group_check1 external g1_parent
acl ldap_group_check2 external g2_parent
acl ldap_group_check3 external g2_child
 
 
Definition of YouTube.
## The videos come from several domains
acl youtube_domains dstdomain .youtube.com .googlevideo.com .ytimg.com

http_access deny  ldap_group_check1 youtube_domains
http_access allow ldap_group_check2
http_access allow ldap_group_check1
http_access allow ldap_group_check3
http_access deny  all

 

As I understand it, squid.conf is parsed from top to bottom, and once a matching
statement/ACL is met no further rules are checked; so putting the statements in an
order where the groups containing most of the users come first will improve
performance. Can an if-else structure be used in squid.conf, and how? I am
not sure, please guide...
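 
There is no if-else construct in squid.conf; the equivalent effect comes from the
top-to-bottom, first-match-wins evaluation plus the fact that ACLs on one http_access
line are ANDed together. A sketch repeating the rules above, with the largest allow
group placed first:

# evaluated top to bottom, first match wins; ACLs on one line are ANDed
http_access deny  ldap_group_check1 youtube_domains   # group 1 AND a YouTube domain
http_access allow ldap_group_check2                   # put the largest group first
http_access allow ldap_group_check1
http_access allow ldap_group_check3
http_access deny  all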
 
 
 
 
Thanking you 
 

 
regards,
 
 
Bilal
 
 
 
  

[squid-users] Dual Core CPU, etc.

2010-04-25 Thread Matt
Will Squid benefit much from moving to a dual core CPU?  Currently I
have it running on a single core 1.8ghz socket 775 with 8g of ram on
centos 5.x 64 bit.  Also, is there a way to tell squid to do most of
its cache clean up during the off peak hours of like 1am to 6am?

Matt


RE: [squid-users] Only logging DENIED or MISS actions

2010-04-25 Thread Amos Jeffries
On Sun, 25 Apr 2010 17:00:16 -0300, CactusCo cactusan...@gmail.com
wrote:
 Hi Amos, thanks...
 
 That does not look like the squid native format. Please how the bits 
 you erased as well.
 
 This Squid comes from a native Debian Lenny squid that I upgraded. I changed
 sources.list from stable to testing. After I did aptitude update...
 safe-upgrade... full-upgrade, the whole system became testing, or
 Squeeze... And now Squeeze is the system running... Maybe this upgrade
 did something wrong...
 

Um. What I meant was that when logging the squid format, the log file contains
things like:
  1272237554.366 304 192.0.20.11 TCP_MISS/200 1385 GET http://example.com/ someuser FIRSTUP_PARENT/sting text/plain

I was asking for the rest of it, since the bits you cut off are
probably important to your answer as well.

snip not a
 
 b) you're using a patched your proxy.
 Is this the Debian package? or a self-built proxy?
 
 I replied in point a)
 
 c) those are actually the requests happening.
 
 FWIW; SITE systems Inc. have their website hosted at 66.113.131.78
right 
 now. Your DNS cache may have been poisoned.
 
 The www.site.com, www.sites.com are only for show, they're not real
 URLs. DNS is working fine and it comes from my ISP.

Please make a habit of using the proper example.com / example.org /
example.net names for that.

 site.com and domain.com and mydomain.com are real websites.

Amos


Re: [squid-users] Dual Core CPU, etc.

2010-04-25 Thread Amos Jeffries
On Sun, 25 Apr 2010 16:47:02 -0500, Matt lm7...@gmail.com wrote:
 Will Squid benefit much from moving to a dual core CPU?  Currently I
 have it running on a single core 1.8ghz socket 775 with 8g of ram on
 centos 5.x 64 bit.  Also, is there a way to tell squid to do most of
 its cache clean up during the off peak hours of like 1am to 6am?
 
 Matt

Yes. Dual-core boxes can offer a whole CPU to Squid while leaving the
other for system and other background things which would otherwise compete
with Squid for time.

As for timing of the cache cleanup; no, you can't set a particular
time-of-day for it. When the overload thresholds kick in they really need
to be run right then. The regular maintenance garbage-collection cycle can be tuned
to happen more or less often as suits you.

Amos
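
For the disk cache, the related squid.conf knobs are the cache_swap watermarks;
a minimal sketch of tuning them (the percentages are only example values):

# replacement starts once disk usage passes the low watermark and
# becomes increasingly aggressive as it approaches the high watermark
cache_swap_low 85
cache_swap_high 90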



[squid-users] SQUID 3.0 STABLE20 +DANSGUARDIAN transparent mode (file uploads brokens)

2010-04-25 Thread David Touzeau

Dear

I'm using Squid + dansguardian in transparent mode.
Squid and dansguardian are installed on the same computer.

When using DansGuardian and uploading files larger than 8 MB, after several
seconds the uploads are broken and browsers display a broken page error.


Files under 8 MB are correctly uploaded.


Did anyone encounter the same problem ?


Here is the squid.conf:

auth_param basic credentialsttl 2 hour
authenticate_ttl 1 hour
authenticate_ip_ttl 60 seconds
cache_effective_user squid
cache_effective_group squid
#- TWEEKS PERFORMANCES
memory_pools off
quick_abort_min 0 KB
quick_abort_max 0 KB
log_icp_queries off
client_db off
buffered_logs on
half_closed_clients off

#- acls
acl malware_block_list url_regex -i /etc/squid3/malwares.acl
acl blockedsites url_regex /etc/squid3/squid-block.acl
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl CONNECT method CONNECT
acl office_network src 192.168.1.0/24


#- MAIN RULES...
follow_x_forwarded_for allow localhost
# - SAFE ports
acl Safe_ports port 80  #http
acl Safe_ports port 21  #ftp
acl Safe_ports port 22  #ssh
acl Safe_ports port 443 563 #https, snews
acl Safe_ports port 1863#msn
acl Safe_ports port 70  #gopher
acl Safe_ports port 210 #wais
acl Safe_ports port 1025-65535  #unregistered ports
acl Safe_ports port 280 #http-mgmt
acl Safe_ports port 488 #gss-http
acl Safe_ports port 591 #filemaker
acl Safe_ports port 777 #multiling http
acl Safe_ports port 631 #cups
acl Safe_ports port 873 #rsync
acl Safe_ports port 901 #SWAT#
http_access deny malware_block_list
http_access deny blockedsites
http_access allow localhost
http_access deny !Safe_ports
http_access deny all
# - ident_lookup_access
hierarchy_stoplist cgi-bin ?

# - General settings
visible_hostname proxyweb


# - time-out
dead_peer_timeout 10 seconds
dns_timeout 2 minutes
peer_connect_timeout 3 minutes
connect_timeout 1600 seconds
persistent_request_timeout 3 minutes
pconn_timeout 1600 seconds

# - Objects limits
request_body_max_size 500 MB
reply_body_max_size 0
request_header_max_size 10 KB
maximum_object_size 300 MB
minimum_object_size 0 KB
maximum_object_size_in_memory 8 KB
# - timeouts



#http ports
http_port 23296 transparent


# - Caches
#cache_replacement_policy heap LFUDA
cache_mem 8 MB
cache_swap_high 90
cache_swap_low 95
# - DNS and ip caches
ipcache_size 1024
ipcache_low 90
ipcache_high 95
fqdncache_size 1024


# - SPECIFIC DNS SERVERS
debug_options ALL,1
refresh_pattern ^ftp:       1440   20%   10080
refresh_pattern ^gopher:    1440   0%    1440
refresh_pattern .           0      20%   4320
icp_port 3130


#Logs-
emulate_httpd_log on
coredump_dir    /var/squid/cache
cache_store_log /var/log/squid/store.log
cache_log       /var/log/squid/cache.log
pid_filename    /var/run/squid.pid
access_log      /var/log/squid/access.log

cache_dir   ufs /var/cache/squid 2000 16 256
# - OTHER CACHES


Here is the main dansguardian configuration file:

reportinglevel = 3
groupname = 'Default rule'
languagedir = '/etc/dansguardian/languages'
language = 'ukenglish'
loglevel = 3
logexceptionhits = 2
logfileformat = 2
loglocation = '/var/log/dansguardian/access.log'
statlocation = '/var/log/dansguardian/stats'
#
#routing to squid proxy port : 23296 but local port is 3128
filterip =
filterport =  3128
proxyip = 127.0.0.1
proxyport = 23296
originalip = off
#
accessdeniedaddress = 'http://YOURSERVER.YOURDOMAIN/cgi-bin/dansguardian.pl'
nonstandarddelimiter = on
usecustombannedimage = on
custombannedimagefile = '/etc/dansguardian/transparent1x1.gif'
filtergroups = 1
bannediplist = '/etc/dansguardian/bannediplist'
exceptioniplist = '/etc/dansguardian/exceptioniplist'
banneduserlist = '/etc/dansguardian/banneduserlist'
exceptionuserlist = '/etc/dansguardian/exceptionuserlist'
exceptionphraselist = '/etc/dansguardian/lists/exceptionphraselist'
exceptionsitelist = '/etc/dansguardian/lists/exceptionsitelist'
showweightedfound = on
weightedphrasemode = 2
urlcachenumber = 1000
urlcacheage = 900
scancleancache = on
phrasefiltermode = 2
preservecase = 0
hexdecodecontent = off
forcequicksearch = off
reverseaddresslookups = off
reverseclientiplookups = off
logclienthostnames = off
createlistcachefiles = on
maxuploadsize = -1
maxcontentfiltersize = 256
maxcontentramcachescansize = 2000
maxcontentfilecachescansize = 2
filecachedir = '/tmp'
deletedownloadedtempfiles = on
initialtrickledelay = 20
trickledelay = 10
#downloadmanager = '/etc/dansguardian/downloadmanagers/fancy.conf'
downloadmanager = '/etc/dansguardian/downloadmanagers/default.conf'
#downloadmanager = '/etc/dansguardian/downloadmanagers/trickle.conf'

#- AV/ICAP
contentscanner = '/etc/dansguardian/contentscanners/clamdscan.conf'
contentscannertimeout = 60
contentscanexceptions = off

recheckreplacedurls = off
forwardedfor = on

Re: [squid-users] Dual Core CPU, etc.

2010-04-25 Thread Luis Daniel Lucio Quiroz
On Sunday 25 April 2010 18:32:43, Amos Jeffries wrote:
 On Sun, 25 Apr 2010 16:47:02 -0500, Matt lm7...@gmail.com wrote:
  Will Squid benefit much from moving to a dual core CPU?  Currently I
  have it running on a single core 1.8ghz socket 775 with 8g of ram on
  centos 5.x 64 bit.  Also, is there a way to tell squid to do most of
  its cache clean up during the off peak hours of like 1am to 6am?
  
  Matt
 
 Yes. Dual-core boxes can offer a whole CPU to Squid while leaving the
 other for system and other background things which would otherwise compete
 with Squid for time.
 
 As for timing of the cache cleanup; no you can't set particular
 time-of-day for it. When the overload thresholds kick in they really need
 to be run right then. The regular maintenance garbaging cycle can be tunned
 to happen more or less often as suits you.
 
 Amos

If you want to drop objects there is a utility on the Squid page, squid-purge,
that could be used from a cron job to drop objects.
I've tested it with 3.0 :)


Re: [squid-users] Dual Core CPU, etc.

2010-04-25 Thread Chris Woodfield
To split the hairs a bit further, squid's core is mostly single threaded, but 
does have disk i/o processes that are spawned out. So going from single to dual 
core will give you gains for that reason in addition to the benefit of other 
system processes no longer competing with squid for timeslices.

That said, due to the single-threaded core, there are diminishing returns in 
going to quad CPUs or greater when running a single instance of squid, unless 
you have helper processes (rewriters, auth helpers, etc) which consume lots of 
cpu. At my last company we solved that scaling issue by simply running multiple 
squids on each server (There's even a nifty hack to bind them all to port 80 at 
the same time so you don't need multiple IP addresses - search the archives for 
it).
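
Running several instances that way is mostly a matter of giving each one its own
config file with a distinct http_port, cache_dir and pid_filename (a sketch; the
paths and ports are placeholders):

squid -f /etc/squid/instance-a.conf    # e.g. http_port 3128, its own cache_dir and pid_filename
squid -f /etc/squid/instance-b.conf    # e.g. http_port 3129, its own cache_dir and pid_filename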

-C

On Apr 25, 2010, at 7:32 PM, Amos Jeffries wrote:

 On Sun, 25 Apr 2010 16:47:02 -0500, Matt lm7...@gmail.com wrote:
 Will Squid benefit much from moving to a dual core CPU?  Currently I
 have it running on a single core 1.8ghz socket 775 with 8g of ram on
 centos 5.x 64 bit.  Also, is there a way to tell squid to do most of
 its cache clean up during the off peak hours of like 1am to 6am?
 
 Matt
 
 Yes. Dual-core boxes can offer a whole CPU to Squid while leaving the
 other for system and other background things which would otherwise compete
 with Squid for time.
 
 As for timing of the cache cleanup; no you can't set particular
 time-of-day for it. When the overload thresholds kick in they really need
 to be run right then. The regular maintenance garbaging cycle can be tunned
 to happen more or less often as suits you.
 
 Amos
 



Re: [squid-users] Youtube Caching

2010-04-25 Thread Khemara Lyn

Thank you, Amos, for your comments and sorry for my late response.

I don't use any auth methods. In fact, I am using WCCP2 with a Cisco router.

I don't know if the patch you mentioned is compatible with version
2.7.STABLE7.1, which I'm using now.


BTW, I am considering using cachevideos (http://cachevideos.com/) to
help cache the videos. Could someone with experience with that software
comment on it and suggest alternatives?


Thanks & regards,
Khem

On 04/26/2010 02:09 AM, Luis Daniel Lucio Quiroz wrote:

On Sunday 25 April 2010 06:11:09, Amos Jeffries wrote:
   

Khemara Lyn wrote:
 

Dear All,

I know this topic have been around every now and then; but i'm desperate
now in finding any fix.

I tried the configuration as mentioned in this page:

Caching YouTube Content
http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube?highlig
ht=%28ConfigExamples%2FIntercept%29|%28ConfigExamples%2FAuthenticate%29|%
28ConfigExamples%2FChat%29|%28ConfigExamples%2FStreams%29|%28ConfigExampl
es%2FReverse%29|%28ConfigExamples%2FStrange%29
   

FYI,
Chudy has just updated the 3xx bug patch.

I'm not sure if its related to your problem, but worth a try.

Amos
 

Also
do you use digest auth? as far as i remember, youtube videos are not
compatible with digest auth.