RE: [squid-users] torrent

2009-08-12 Thread SSCR Internet Admin
If your torrent client has an option to connect through the proxy port, by all means use it.

Hope it works.

-Original Message-
From: Kevin Kimani [mailto:kevinkim...@gmail.com] 
Sent: Wednesday, August 12, 2009 5:38 PM
To: squid-users@squid-cache.org
Subject: [squid-users] torrent

Hi,

I am trying to configure a server so that I can download files through
torrent. I have tried setting acl SSL_ports port 5050, acl SSL_ports port
5222, acl SSL_ports port 6969, acl SSL_ports port 6881, and acl SSL_ports
port , but I have not had any success; I just get a message from KTorrent
that the download is stalled.

I would really appreciate some help here

Regards
Kevin
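For reference, ACL lines like those Kevin lists only take effect together with matching http_access rules; a sketch of how they might fit into squid.conf (the tracker ports are examples only, and Squid still cannot proxy the BitTorrent peer protocol itself):

```
# Sketch: let CONNECT requests through to assumed tracker ports
acl SSL_ports port 443
acl SSL_ports port 5050 5222 6969 6881
acl Safe_ports port 6881-6999    # example tracker/peer port range
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
```

Even with rules like these in place, torrent clients usually only work through Squid if they are explicitly configured to use the proxy, as the reply above suggests.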

--- 
This message  is solely  intended  to the person(s) 
indicated on the  header  and  has been scanned for 
viruses and  dangerous  content by  MailScanner. If 
any  malware detected on  this transmission, please 
email the postmaster at ad...@sscrmnl.edu.ph.

Providing Quality Catholic Education for the Masses
for more info visit us at http://www.sscrmnl.edu.ph

__ Information from ESET NOD32 Antivirus, version of virus signature
database 4328 (20090812) __

The message was checked by ESET NOD32 Antivirus.

http://www.eset.com


 




[squid-users] Blocking port 443 and let some secured site to be accessed (ie yahoo.com email)

2009-08-09 Thread SSCR Internet Admin
Hi,

Can anyone give me a hint on how to block port 443 while letting some other
secure sites be excluded from the block?

TIA


 




RE: [squid-users] Blocking port 443 and let some secured site to be accessed (ie yahoo.com email)

2009-08-09 Thread SSCR Internet Admin
Thanks Amos, I hope this can at least partially stop UltraSurf... crossing fingers...


-Original Message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz] 
Sent: Monday, August 10, 2009 10:35 AM
To: SSCR Internet Admin
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Blocking port 443 and let some secured site to be 
accessed (ie yahoo.com email)

On Mon, 10 Aug 2009 10:24:04 +0800, SSCR Internet Admin
ad...@sscrmnl.edu.ph wrote:
 Hi,
 
 Can anyone give me a hint as to block 443 and let some other secured site
 be
 excluded from the block? 

Depends on what you want to block there...

I assume that you actually mean you want to block HTTPS traffic except to
some certain sites.

Squid default controls have ACLs called SSL_ports and CONNECT. With this
configuration line:
http_access deny CONNECT !SSL_ports

To restrict further and only allow certain websites to use port 443/HTTPS,
create an ACL listing their domain names and change the access line like so:

acl httpsSites dstdomain .example.com
http_access deny CONNECT !SSL_ports !httpsSites

Amos
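Putting the two pieces together, a minimal sketch of the resulting squid.conf fragment (example domain only; SSL_ports and CONNECT come from Squid's default configuration; the split rules below are a slightly stricter variant that denies CONNECT to any unlisted site):

```
acl httpsSites dstdomain .example.com
http_access deny CONNECT !SSL_ports
http_access deny CONNECT !httpsSites
```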





[squid-users] When will SSLBump be included on the production version?

2009-08-09 Thread SSCR Internet Admin
Hello,

I would like to ask when SSLBump will be included in the mainstream stable
version of Squid. SSLBump could help us admins, especially in schools where
UltraSurf is run in some laboratories (from USB memory sticks), mostly by
WiFi users.

Regards
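For context, once SSLBump appeared in Squid 3.1 the basic configuration looked roughly like the sketch below (the certificate path is a placeholder, and the option spelling changed between releases, e.g. sslBump vs ssl-bump):

```
# Sketch: Squid 3.1-era SSLBump, decrypting CONNECT traffic for filtering
http_port 3128 sslBump cert=/etc/squid/proxyCA.pem
ssl_bump allow all
```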
 




[squid-users] Detect Workstation's IP Address inside a firewall from squid

2009-07-28 Thread SSCR Internet Admin
Hi,

Is it possible for Squid to see the actual workstation IP address that is
making requests from inside a firewall?

LAN  FIREWALL --- SQUID 

Thank you.
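If the firewall is itself a proxy that adds an X-Forwarded-For header, Squid can be told to trust it with follow_x_forwarded_for; a sketch (the firewall address is assumed). If the firewall only performs NAT, the original workstation address never reaches Squid:

```
# Trust X-Forwarded-For only when it comes from the firewall
acl firewall src 192.168.1.1      # assumed firewall address
follow_x_forwarded_for allow firewall
follow_x_forwarded_for deny all
```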
 




Re: [squid-users] Should i enable mikrotik bandwidth sharing or leave it to Squid?

2009-06-25 Thread SSCR Internet Admin
On Mon, 2009-07-20 at 03:05 +0200, Mark Lodge wrote:
 My wireless setup is as follows
 
 Client PC --- Mikrotik-Squid Cache Server--Internet
 
 Do you suggest that I should configure PCQ (equal/fair bandwidth 
 sharing/distribution) on the Mikrotik routerboard or should i leave the 
 bandwidth sharing to Squid as suggested by Amos?
 
 Thanks
 Mark
 

I think it's better to set up your bandwidth controls in Squid; it's
straightforward and simple.


-- 
This message has been scanned for viruses and
dangerous content by MailScanner, and is
believed to be clean.



Re: [squid-users] Possible to setup a broadband connection on Debian?

2009-06-25 Thread SSCR Internet Admin
Mark,

You may check out delay_pools in squid.conf for more details on
bandwidth control.
Nats
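A minimal delay_pools sketch along those lines (all numbers are examples, in bytes per second):

```
# One class-2 pool: ~256 kB/s aggregate, ~32 kB/s per client IP
acl lan src 192.168.100.0/24
delay_pools 1
delay_class 1 2
delay_parameters 1 262144/262144 32768/32768
delay_access 1 allow lan
delay_access 1 deny all
```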

On Mon, 2009-07-20 at 02:56 +0200, Mark Lodge wrote:
 My current setup is:
 
 Squid Proxy Server-
 Client PC---   ADSL Router --- Internet
 Client PC---
 Client PC---
 
 I want the Client PC to use the Squid Proxy Server for internet.
 
 There are 2 ways i have in mind
 
 1st option: Set the ADSL Router into bridge mode and connect the Squid 
 Proxy Server to the internet via a dial up broadband connection with 
 username and password.
 
 2nd option:Use another server as SUSE router and use that between the 
 client and Squid Proxy Server
 
 I want to go with the 1st option but i don't know how to do it. How do 
 you setup a broadband connection on Debian?
 
 Thanks
 Mark
 
 





Re: [squid-users] Possible to setup a broadband connection on Debian?

2009-06-24 Thread SSCR Internet Admin
On Mon, 2009-07-20 at 02:56 +0200, Mark Lodge wrote:
 My current setup is:
 
 Squid Proxy Server-
 Client PC---   ADSL Router --- Internet
 Client PC---
 Client PC---
 
 I want the Client PC to use the Squid Proxy Server for internet.
 
 There are 2 ways i have in mind
 
 1st option: Set the ADSL Router into bridge mode and connect the Squid 
 Proxy Server to the internet via a dial up broadband connection with 
 username and password.
 
 2nd option:Use another server as SUSE router and use that between the 
 client and Squid Proxy Server
 
 I want to go with the 1st option but i don't know how to do it. How do 
 you setup a broadband connection on Debian?
 
 Thanks
 Mark
 
 
Why don't you just set it up as

CLIENT PC --- SQUID PROXY SERVER --- ADSL ROUTER --- INTERNET

?





[squid-users] Include IP Address on extension_method

2009-01-14 Thread SSCR Internet Admin
Hi,

I would like to know if it's possible to include the client's IP address in
the clientParseRequestMethod message, since I see this in my cache.log:

2009/01/15 10:52:11| clientParseRequestMethod: Unsupported method: This is not a bug. see squid.conf extension_methods
2009/01/15 10:52:11| clientParseRequestMethod: Unsupported method in request 'PASS Virus__'
2009/01/15 10:52:11| clientProcessRequest: Invalid Request

These entries are bombarding my logs and I am experiencing latency.
Does anyone have an idea? Thank you.
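The directive the log message points at registers extra request methods so Squid parses them instead of rejecting them; a sketch (whether you actually want to accept a method named PASS, which here looks like a scanner or malware probe, is another question):

```
# squid.conf (Squid 2.x): treat PASS as a known extension method
extension_methods PASS
```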







RE: [squid-users] ULTRASURF (anti-filtering program) problem

2008-01-22 Thread SSCR Internet Admin
Yes I agree, but I have set up the firewall with proxy transparency, and I
even redirect all ports above 1024 to 3128, but it still won't work. I'll
try redirecting all the lower ports too and see the result...

-Original Message-
From: Marcus Kool [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, January 22, 2008 10:38 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] ULTRASURF (anti-filtering program) problem

HTTPS tunneling is blocked by the Squid redirector ufdbGuard.
It works with all versions of Squid and was first implemented
in version 1.10 of ufdbGuard (released in Nov 2006).

Of course one needs a proper firewall that blocks direct internet
from PCs so that the use of the Squid proxy is mandatory.

Marcus


Amos Jeffries wrote:
 Amos Jeffries wrote:
 SSCR Internet Admin wrote:
 Hi,

 This is an off topic, but here it goes...

 I would like to ask if anyone from squid mailing list has stumble upon
 ultrasurf that can bypass any filtering products such as squidguard.  
 I have
 setup a test pc with ip being blocked on squidguard. But to my 
 surprise it
 bypass everything ive setup and with ultrasurf running on my test pc, IE
 internet setting has been changed to use 127.0.0.1 using port 9666.

 I know that this is a kernel level issue and I havent successfully 
 blocked
 9666 via iptables, maybe someone could try it out and maybe come up 
 with a
 solution, before young students could have this program since you 
 don't need
 to install this on a PC, just run u.exe and youre done bypassing.


 Thank you and God bless...


 Never heard of them. But going by the documentation they are 
 HTTPS-tunneling all traffic from the localhost outbound.

 You and most would naturally allow HTTPS CONNECT requests through 
 without filters for all the banking and secure sites that need it.
 
 And a read of the code confirms it. Seems to be interfacing with PuTTY, 
 stunnel, and several HTTP CONNECT methods.
 

 If I'm right about it using HTTPS-tunnels you will need squid 3.1 with 
 SSLBump to filter this programs traffic properly. We are just awaiting 
 some of Alex's time for the SSLBump to be integrated fully into the 
 daily snapshots.
 
 Amos




[squid-users] ULTRASURF (anti-filtering program) problem

2008-01-21 Thread SSCR Internet Admin
Hi,

This is an off topic, but here it goes...

I would like to ask if anyone on the squid mailing list has stumbled upon
UltraSurf, which can bypass filtering products such as squidGuard. I set up
a test PC whose IP was blocked in squidGuard, but to my surprise UltraSurf
bypassed everything I had set up; with it running on my test PC, the IE
internet settings were changed to use 127.0.0.1 on port 9666.

I know that this is a kernel-level issue, and I haven't successfully blocked
port 9666 via iptables. Maybe someone could try it out and come up with a
solution before young students get hold of this program, since you don't
even need to install it on a PC; just run u.exe and you're done bypassing.


Thank you and God bless...





RE: [squid-users] For admins that wanted to stop torrent downloads

2007-10-23 Thread SSCR Internet Admin
From what I am observing now, 90% of torrent downloads are not
connecting... I will keep trying for several days to see whether the
torrents still actively connecting can actually connect... In my opinion
this is somewhat useful for now; it cuts down torrent access...

-Original Message-
From: Amos Jeffries [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, October 23, 2007 3:22 PM
To: SSCR Internet Admin
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] For admins that wanted to stop torrent downloads

SSCR Internet Admin wrote:
 Hi,
 
 I am experimenting on how to stop torrent downloads, but when a torrent
 client already established a connection, it don't drop the packets at all.
 I hope someone could share a thought or two about my approach
 
 1. Run squid on transparent mode
 2. I run this iptables command...
 
 #Reroute all ports to port 3128
 $IPT -t nat -I PREROUTING -i $INT -p tcp --dport 80 -j DNAT  --to
 192.168.100.1:3128

Target to use is REDIRECT not DNAT.
Or on systems with appropriately patched kernel TPROXY target is available.
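A sketch of the corrected rule using the same variables as the original script (REDIRECT hands the packets to the local Squid port while preserving the original destination for it):

```
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 80 -j REDIRECT --to-ports 3128
```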

snip remaining list of ports

 
 4. I have found this logs on cache.log
 
 2007/10/23 13:47:42| parseHttpRequest: Requestheader contains NULL
 characters
 2007/10/23 13:47:42| parseHttpRequest: Unsupported method 'BitTorrent'
 2007/10/23 13:47:42| clientReadRequest: FD 137 (192.168.100.61:3907)
Invalid
 Request
 2007/10/23 13:47:43| parseHttpRequest: Requestheader contains NULL
 characters
 2007/10/23 13:47:43| parseHttpRequest: Unsupported method 'BitTorrent'
 2007/10/23 13:47:43| clientReadRequest: FD 89 (192.168.100.61:3908)
Invalid
 Request
 2007/10/23 13:47:43| parseHttpRequest: Requestheader contains NULL
 characters
 2007/10/23 13:47:43| parseHttpRequest: Unsupported method 'BitTorrent'
 2007/10/23 13:47:43| clientReadRequest: FD 152 (192.168.100.61:3909)
Invalid
 
 
 I don't know if these experiment also exist, but it's a good way, maybe
 someone could make a patch that blocks torrents or p2p apps based on the
 cache.log results.
 

Better yet. The dev team is looking for somebody interested in adding 
full Torrent support to squid.
That would entail adding settings and ACL to configure access/denial 
properly.

Amos


__ NOD32 2608 (20071023) Information __

This message was checked by NOD32 antivirus system.
http://www.eset.com










[squid-users] For admins that wanted to stop torrent downloads

2007-10-22 Thread SSCR Internet Admin
Hi,

I am experimenting with how to stop torrent downloads, but when a torrent
client has already established a connection, the packets are not dropped at
all. I hope someone can share a thought or two about my approach:

1. Run squid on transparent mode
2. I run this iptables command...

#Reroute all ports to port 3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 80 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 1024:1135 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 1137:1233 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 1235:3477 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 3480:4999 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 5002:5049 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 5051:5099 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 5101:5221 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 5224:7776 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 7778:8079 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 8082:8342 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 8344:8482 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 8484:9989 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 9992:9997 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 10001:1 -j DNAT  --to
192.168.100.1:3128
$IPT -t nat -I PREROUTING -i $INT -p tcp --dport 20001:65535 -j DNAT  --to
192.168.100.1:3128

4. I found these logs in cache.log:

2007/10/23 13:47:42| parseHttpRequest: Requestheader contains NULL characters
2007/10/23 13:47:42| parseHttpRequest: Unsupported method 'BitTorrent'
2007/10/23 13:47:42| clientReadRequest: FD 137 (192.168.100.61:3907) Invalid Request
2007/10/23 13:47:43| parseHttpRequest: Requestheader contains NULL characters
2007/10/23 13:47:43| parseHttpRequest: Unsupported method 'BitTorrent'
2007/10/23 13:47:43| clientReadRequest: FD 89 (192.168.100.61:3908) Invalid Request
2007/10/23 13:47:43| parseHttpRequest: Requestheader contains NULL characters
2007/10/23 13:47:43| parseHttpRequest: Unsupported method 'BitTorrent'
2007/10/23 13:47:43| clientReadRequest: FD 152 (192.168.100.61:3909) Invalid

I don't know if this experiment has been done elsewhere, but it seems a good
approach; maybe someone could make a patch that blocks torrent or P2P apps
based on these cache.log results.


Thanks.







RE: [squid-users] anonymous proxying sites

2007-10-16 Thread SSCR Internet Admin
For my part, I used the squidGuard expressions feature; I just added the
word proxy, and it helps a bit...
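The squidGuard expression match described here looks roughly like the sketch below (file locations vary by installation):

```
# squidGuard.conf sketch: block URLs whose text matches an expression list
dest proxies {
    expressionlist proxies/expressions   # file containing e.g.: proxy
}
acl {
    default {
        pass !proxies all
    }
}
```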

-Original Message-
From: Adrian Chadd [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, October 17, 2007 8:26 AM
To: Chuck Kollars
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] anonymous proxying sites

On Tue, Oct 16, 2007, Chuck Kollars wrote:

 The list on http://proxy.org is the most complete one
 I know of. If you can figure out a way to
 automatically suck up their entire list _every_day_,
 remove duplicates, and add all those to your banned
 list, you can stop _much_ (but not anywhere near
 _all_) of the illicit activity. 

If people would like to see these sorts of features included
in Squid then please let us know. If it can be done free then it
will be; but in reality things like this generally cost money
to implement. It's why companies like IronPort do so well.
You couldn't do what IronPort does with their mail filtering for
free; and IronPort are making/have made a web appliance which
will probably do this..

The trouble for Squid at the moment, of course, is non-port-80 traffic..



Adrian









[squid-users] Cant access internal webserver when using squid 3128

2007-08-10 Thread SSCR Internet Admin
Hi,

My network intercepts port 80 to 3128 (transparent proxy) for internet
access. Lately I have set up an internal webserver and am now redirecting
traffic from outside to it. If I use Squid explicitly (i.e., configuring the
proxy in Firefox), it seems that I am blocked or denied by Squid. If I use
the transparent proxy, everything works perfectly.

What part of squid.conf can I tweak for this? Thanks and more power


TIA
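One common cause is the default http_access rules denying explicitly configured browsers or the internal destination; a sketch of the squid.conf pieces to check (addresses and names are placeholders):

```
# Allow the LAN when browsers point at the proxy explicitly,
# and make sure the internal server is not caught by an earlier deny
acl localnet src 192.168.100.0/24        # assumed LAN range
acl intranet dstdomain www.example.com   # internal webserver (placeholder)
http_access allow intranet
http_access allow localnet
```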





[squid-users] squidguard website?

2007-07-26 Thread SSCR Internet Admin
Is there a new maintainer for squidGuard? If yes, what is the URL for it?

TIA

Nats





RE: [squid-users] Automatic switching of squid to a second internet link?

2007-07-04 Thread SSCR Internet Admin
Hi,

It seems that you need the iproute2 package; try looking at www.lartc.org.
I don't remember exactly, but I think there's a guide there on how to
accomplish this. Hope that helps.

Nats
-Original Message-
From: Danish Siddiqui [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, July 04, 2007 3:37 PM
To: squid-users@squid-cache.org
Cc: Tek Bahadur Limbu
Subject: Re: [squid-users] Automatic switching of squid to a second internet
link?



Tek Bahadur Limbu wrote:
 Danish Siddiqui wrote:

 Hi,
 Ive got squid proxy server running on a CentOS 4.4 machine. This proxy
 server is connected to the internet through a Sonicwall PRO3060
 firewall machine.

 We have got three different ISP lines, one of which is used by squid.
 All the three lines terminate at the firewall. One of these links then
 goes to the squid server.
 Many a times it happens that the internet link on the squid line goes
 down, because of which we have to switch the squid server on to one of
 the remaining ISP lines.

 Hi Danish Siddiqui,

 When the 1st ISP goes down, does that mean that you actually have to 
 switch the cable from your squid box to the 2nd or 3rd ISP link on 
 your Sonicwall machine?

No, the only cable that is connected to the squid box is from the 
Sonicwall firewall.

 I was planning a setup in which an extra NIC would be attached to the
 squid server. This NIC would be connected to a different ISP line, so
 that when one link goes down, the squid proxy server automatically
 switches on to the next line, wherein the LAN users dont get to feel
 the difference while browsing. Also, when the original link gets
 restored, the squid server automatically switches back on to the
 original link

 If your Sonicwall firewall and routing policy allows you to access all 
 3 ISPs lines from your Squid box, I think that you can use the 
 tcp_outgoing_address parameter to switch to either the 2nd or 3rd 
 ISP connection when the 1st ISP goes down.

 Of course, you must have a small script in Crontab to check for 
 internet connectivity to your 1st ISP at regular intervals, say every 
 2 minutes.

How will the script go. Can you give me some pointers till the time I 
look around for it.
 If the 1st ISP gets internet connectivity again, then let the script 
 restore connectivity from the 2nd or 3rd ISP back to the 1st ISP again.

 But again, adding 2 extra NIC cards to your Squid box will provide you 
 more control and fail over. In my opinion, it will be a very 
 interesting option.

Seems interesting to me too
 If your Squid box is running on Linux with a kernel greater than 
 2.4.20, then you can apply traffic and routing rules.
Its running on a CentOS 4.4 with kernel 2.6.9-42.ELsmp

 Please see the following link:

 http://lartc.org/howto/lartc.rpdb.multiple-links.html

 This guys really seem to perform some kind of magic with advanced 
 routing and traffic control!



 My current setup requires me to deny access to the squid server till
 the time it is up again.

 I suppose that you can't access all 3 ISPs lines from your Squid box?
Ill have to go according to your suggestions. But at the moment the 
squid box can access only 1 ISP line


 Is this setup possible? And if yes, can you please tell me how or
 point me to the necessary resources.

 I definitely think it is possible. Let's wait and get more help and 
 input from other experts and professionals from the Squid mailing list.


 Thanking you...


 Thanks
 Danish









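A minimal sketch of the cron-driven check discussed above, assuming placeholder addresses and a Squid version whose squid.conf can include the generated file (otherwise the script would have to rewrite squid.conf itself):

```shell
#!/bin/sh
# Run from cron every 2 minutes: switch tcp_outgoing_address to ISP2
# while ISP1's gateway is unreachable, and back again when it returns.
GW1=203.0.113.1                 # ISP1 gateway (placeholder)
IP1=203.0.113.10                # our address on ISP1 (placeholder)
IP2=198.51.100.10               # our address on ISP2 (placeholder)
CONF=/etc/squid/outgoing.conf   # file included from squid.conf

if ping -c 3 -W 2 "$GW1" >/dev/null 2>&1; then
    ADDR=$IP1
else
    ADDR=$IP2
fi

# Only rewrite the file and reconfigure when the address changes
if ! grep -q "$ADDR" "$CONF" 2>/dev/null; then
    echo "tcp_outgoing_address $ADDR" > "$CONF"
    squid -k reconfigure
fi
```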





[squid-users] increasing fds

2007-06-28 Thread SSCR Internet Admin
Hi,

I have already set the maximum number of open files (fs.file-max = 65535) in
sysctl.conf. Is it also necessary to edit part of the Squid source code so
that it can use the new value (i.e., __FD_SETSIZE)? If so, where do I edit
it, and in which source file?

TIA
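For Squid 2.x the compiled-in descriptor limit is normally raised at build time rather than by editing __FD_SETSIZE directly; roughly as follows (option names varied across versions, so treat this as a sketch):

```
# Raise the shell limit, then rebuild Squid with a larger FD limit
ulimit -HSn 65535
./configure --with-maxfd=65535    # Squid 3.x uses --with-filedescriptors=65535
make && make install
```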







RE: [squid-users] squid time restrictions

2007-05-22 Thread SSCR Internet Admin
It sounds like your boss wants a time accounting system.  Its impossible for
squid to do that even if using squidguard with it.  Time based ACL are
static.  Im not a programmer but I guess, it should be nice if theres a
program on their pc that every time they launched the browser, a plugin will
be invoked that will decide that its time for browsing if not, close the
browser.. :)
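For completeness, the static time-based ACL being ruled out here looks like this (times are examples):

```
# Allow browsing only 12:00-12:15 on weekdays; a fixed slot, not a countdown
acl lunchslot time MTWHF 12:00-12:15
http_access deny !lunchslot
```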

-Original Message-
From: Jason Staudenmayer [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, May 23, 2007 5:07 AM
To: Ilya Vishnyakov
Cc: Odhiambo WASHINGTON; squid-users@squid-cache.org
Subject: RE: [squid-users] squid time restrictions

You might be able to do it with the auth plugins. Only auth the user for
fifteen minutes at a time.

Jason
..·º


 -Original Message-
 From: Ilya Vishnyakov [mailto:[EMAIL PROTECTED] 
 Sent: Tuesday, May 22, 2007 4:30 PM
 To: Jason Staudenmayer
 Cc: Odhiambo WASHINGTON; squid-users@squid-cache.org
 Subject: Re: [squid-users] squid time restrictions
 
 
  
 Jason Staudenmayer wrote:
  A set fifteen minute time period (i.e.. 12:00pm - 12:15pm)
 Imagine user accessing the web @ 12:16pm for his/her 15 
 unused minutes?
 
 
 or just any fifteen minute span (12:20pm - 12:35pm or 1:23 - 1:38)?
 The goal is not to do a schedule for the users where the users will
 try to fit in it. It has to be something more advanced where users
 gets online and countdown begins. i don't know if it's possible with
 squid or not. I'm not that advanced user.
 
 
  Jason ..·º
 
 
  -Original Message- From: Ilya Vishnyakov
  [mailto:[EMAIL PROTECTED] Sent: Tuesday, May 22, 2007 3:36 PM To:
  Odhiambo WASHINGTON Cc: squid-users@squid-cache.org Subject: Re:
  [squid-users] squid time restrictions
 
 
  Odhiambo WASHINGTON wrote:
  * On 22/05/07 15:07 -0400, Ilya Vishnyakov wrote: |
  -BEGIN PGP SIGNED MESSAGE- | Hash: SHA1 | | Hello
  Squid Gurus! | We are running squid 2.15.14_2 on freeBSD 6.2
  Recently,
  my boss asked
  | me to limit the usage of time spent web browsing on
  certain ips. Did
  | anyone come across of a good piece of documentation which
  could point
  | me to the right direction? Simply shutting down squid at
  the certain
  | time won't be elegant. He wants me to do something more
  complicated
  | than this. He wants me to limit the web browsing to 15
  minutes an | hour. I googled and googled but everything that
  I found wasn't not | very helpful yet. Thank you in advance.
 
  The funny thing with Google is that without knowing the
  right keywords
  to search for, it does not lead you to what you want:-)
 
  Try this:
 
  http://www.nomoa.com/bsd/squid.htm#squidEXTIME
 
 
  Ok I read it. But what I would like to do is to limit the web
  browsing time to 15 minutes an hour.
 
  -Wash
 
  http://www.netmeister.org/news/learn2quote.html
 
  DISCLAIMER: See http://www.wananchi.com/bms/terms.php
 
  --
 
  +=
  =+
  |\  _,,,---,,_ | Odhiambo Washington
  [EMAIL PROTECTED]
  Zzz /,`.-'`'-.  ;-;;,_ | Wananchi Online Ltd.
  www.wananchi.com |,4-  ) )-,_. ,\ (  `'-'| Tel: +254 20
  313985-9  +254 20 313922 '---''(_/--'  `-'\_) | GSM: +254
  722 743223   +254 733 744121
 
  +=
  =+
  What does it mean if there is no fortune for you?
 
 
 
 
 
 







[squid-users] SMF Behind a reverse proxy

2007-05-22 Thread SSCR Internet Admin
Hi, 

 

I just wanted to ask if someone has run an SMF board behind a reverse proxy
using Squid. Currently, when someone posts, the board should record that
person's IP address, but it shows 127.0.0.1 as the user's IP.

Where in the config can I turn this on or off?

 

TIA
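On the Squid side, the relevant directive is forwarded_for, which makes the reverse proxy add the real client address in an X-Forwarded-For header; the board then has to be configured to read that header instead of the connecting address (an SMF-side setting, not verified here):

```
# squid.conf (reverse proxy): pass the original client IP to the backend
forwarded_for on
```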

 






RE: [squid-users] Wiki help for WPAD/PAC stuff (was Re: [squid-users] proxy.pac config)

2007-05-15 Thread SSCR Internet Admin
However, if the browser is not configured to use a PAC
file but a PAC file is delivered it brings up a
Security Alert because the browser never requested it.
I know the old Netscape browsers did this but am not
sure about IE.

Well, I'm sure local users will happily accept it by clicking OK; if not, they
don't have access. :)

-Original Message-
From: Jeff Smith [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, May 16, 2007 7:56 AM
To: squid-users@squid-cache.org
Subject: RE: [squid-users] Wiki help for WPAD/PAC stuff (was Re:
[squid-users] proxy.pac config)

It has been a few years since I played with PAC files
in browsers. I think redirecting  a request from
browser to automatically configure the browser will
only work if the browser is first configured to use a
PAC file. When the browser starts up and it is
configured to use a PAC file, its first request goes
to the URL the PAC file is located at and the file is
downloaded. Subsequent requests use the information
contained in the PAC file to go DIRECT or to a PROXY
etc. 

However, if the browser is not configured to use a PAC
file but a PAC file is delivered it brings up a
Security Alert because the browser never requested it.
I know the old Netscape browsers did this but am not
sure about IE.

Jeff Smith



--- SSCR Internet Admin [EMAIL PROTECTED] wrote:

 That is great Adrian.  Ill keep visiting you wiki,
 and lets see what I could
 help out.  Anyway about your Q about redirecting
 port 80 to a site, iptables
 will redirect all browsers connecting to port 80 to
 a local site where a
 script can be fired automatically to configure the
 browser to use the PAC.
 (of course it should check if it's a valid ip).  I
 don't know if Php or
 javascript can do this.
 
 Regards
 
 -Original Message-
 From: Adrian Chadd [mailto:[EMAIL PROTECTED] 
 Sent: Saturday, May 12, 2007 4:47 PM
 To: squid-users@squid-cache.org
 Subject: [squid-users] Wiki help for WPAD/PAC stuff
 (was Re: [squid-users]
 proxy.pac config)
 
 I've started building the WPAD and ProxyPac sections
 in the Wiki and
 I'd really, really appreciate any help I can get in
 fleshing out the
 content.
 I've implemented both of them enough in a
 small-sized network to know
 they mostly work but I've not got the operational
 experience some of
 you have.
 
 I'd really appreciate some help here. I might even
 organise the helpers to
 get sent some CafePress Squid shirts when its done.
 
 
 
 
 Adrian
 
 
 
 



 






RE: [squid-users] Wiki help for WPAD/PAC stuff (was Re: [squid-users] proxy.pac config)

2007-05-14 Thread SSCR Internet Admin
That is great, Adrian.  I'll keep visiting your wiki, and let's see what I can
help out with.  Anyway, about your question on redirecting port 80 to a site:
iptables can redirect all browsers connecting to port 80 to a local site, where
a script could be fired automatically to configure the browser to use the PAC
(of course it should check that it's a valid IP).  I don't know whether PHP or
JavaScript can do this.

Regards

-Original Message-
From: Adrian Chadd [mailto:[EMAIL PROTECTED] 
Sent: Saturday, May 12, 2007 4:47 PM
To: squid-users@squid-cache.org
Subject: [squid-users] Wiki help for WPAD/PAC stuff (was Re: [squid-users]
proxy.pac config)

I've started building the WPAD and ProxyPac sections in the Wiki and
I'd really, really appreciate any help I can get in fleshing out the
content.
I've implemented both of them enough in a small-sized network to know
they mostly work but I've not got the operational experience some of
you have.

I'd really appreciate some help here. I might even organise the helpers to
get sent some CafePress Squid shirts when its done.




Adrian





RE: [squid-users] proxy.pac config

2007-05-12 Thread SSCR Internet Admin
Hi Adrian,

Maybe a VB script or active X that will configure browsers...

Regards...

-Original Message-
From: Adrian Chadd [mailto:[EMAIL PROTECTED] 
Sent: Saturday, May 12, 2007 4:49 PM
To: SSCR Internet Admin
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] proxy.pac config

On Sat, May 12, 2007, SSCR Internet Admin wrote:

 Last night when in bed thinking over this, ive come up an idea.  When a
user
 try to browse directly (port 80), iptables should redirect those traffic
to
 a specific part on your site where it magically configures the browsers to
 use PAC.  So no user intervention or manual config will occur, I guess
 firefox can be configured automatically.. 
 
 Just my two cents idea, who knows someone has already done this (not me, I
 only understand programming algo but not into coding). 

Hm, how do you magically configure a browser to use a proxy.pac file from
one port 80 access?

It's easy to set up a port 80 redirect to a web page which shows the user how
to set up their proxy server settings.




Adrian





RE: [squid-users] proxy.pac config

2007-05-11 Thread SSCR Internet Admin
That's really informative, and I'll try this one out.  At least 75% of my
network uses IE, so I have to manually edit the 25% which uses Firefox and
Safari (Mac users who are Spanish; better review my Spanish 101, hehe).

Last night in bed, thinking this over, I came up with an idea.  When a user
tries to browse directly (port 80), iptables should redirect that traffic to a
specific page on your site which magically configures the browser to use the
PAC.  So no user intervention or manual configuration would occur; I guess
Firefox can be configured automatically.

Just my two cents; who knows, someone may already have done this (not me, I
only understand programming algorithms but am not into coding).
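
A minimal sketch of the interception half of this idea, assuming a Linux
gateway.  The LAN interface name and the local web-server port below are
placeholders, not from this thread, and these rules only redirect the traffic;
they do not by themselves reconfigure any browser:

```shell
# Sketch only: on the gateway, send direct port-80 traffic from the LAN
# to a local web server (port 8080 here) that serves the how-to/PAC page.
# "eth0" and "8080" are placeholders for your own setup.
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 \
         -j REDIRECT --to-port 8080
```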

-Original Message-
From: K K [mailto:[EMAIL PROTECTED] 
Sent: Saturday, May 12, 2007 2:04 AM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] proxy.pac config

On 5/11/07, Adrian Chadd [EMAIL PROTECTED] wrote:
 You can turn that cache behaviour off. I'll hunt around for the
instructions
 to tell IE not to cache proxy.pac lookups and add it to the documentation.

That'd be handy.

  (P.S. Have you heard about the magical PAC refresh option in Microsoft's
  IEAK?)

 Nope! Please tell.

Inside Internet Explorer Administration Kit, you can build a custom
installer for IE6 or IE7 and tune just about everything remotely
related to IE.  Great for a corporate deployment, or for the OP's
question about forcing PAC settings to all desktops.

One of the options you can control is Connections Customization.
When you check this in the first menu, after going through a dozen or
so dialogs, deep in Stage 4 you will reach Connection Settings.
This gives you the option to Import the current connection settings
from this machine, and a button for Modify Settings.  If you use
this button, it will open the connections menu, just like under IE,
but there are extra options visible which never normally appear,
including an Advanced button next to the PAC url.

This reveals new options for PAC, including refresh time; changes here
are effective immediately on your local machine.  Once you exit IEAK,
the Advanced button vanishes from the control panel, but the
settings remain in effect -- if you set a proxy URL and refresh time
in the Brigadoon Advanced tab then choosing a new URL in the normal
connection setting window is ineffective.

There's probably a registry hack you could find to accomplish the same
results, and then just push down a .REG file to all the clients.
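
Such a registry push might look like the following .REG fragment.  The
AutoConfigURL value name is the standard per-user IE setting for a PAC URL,
but the URL shown is only an example:

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
; Points IE at the PAC file; the URL below is a placeholder, not from the thread.
"AutoConfigURL"="http://proxy.example.com/proxy.pac"
```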

Kevin




[squid-users] proxy.pac config

2007-05-10 Thread SSCR Internet Admin
Hi,

 

I wanted to ask if this is possible.  I've just installed a second Squid
server and was wondering if I could create some load balancing, without using
a TCP load balancer or HA, by using a proxy.pac that is capable of detecting a
busy/failed server and connecting to the next available proxy server.

 

Squid 1

Internet--+-Workstation(with proxy.pac)

Squid 2

 

 

If you have any idea or experience, can you share it with me?  

 

TIA

 

 

Nats

 






RE: [squid-users] proxy.pac config

2007-05-10 Thread SSCR Internet Admin
Thanks, Adrian, it works!  I could see that it shifted to the other server
when I manually shut down Squid.

Now, this could be harder (for a noob like me).  What if I have 500
workstations?  I would have to configure each browser to use my new PAC file.
Is there a way to eventually force all browsers to use the PAC, like blindly
installing the PAC in their browser when they go directly to port 80?

Thanks

-Original Message-
From: Adrian Chadd [mailto:[EMAIL PROTECTED] 
Sent: Friday, May 11, 2007 9:37 AM
To: SSCR Internet Admin
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] proxy.pac config

On Fri, May 11, 2007, SSCR Internet Admin wrote:
 Hi,
 
  
 
 I wanted to ask if this is possible.  Ive just installed a second squid
 server and was wondering if I could create somewhat a loadbalancing
without
 using TCP-loadbalancer or HA by using a proxy.pac that is capable of
 detecting a busy/failed server and connect to the next available proxy
 server.
 
  
 
 Squid 1
 
 Internet--+-Workstation(with proxy.pac)
 
 Squid 2

There's plenty of examples of proxy.pac file based load balancing and
failover.
Failover is easy; just return a list of entries, ie:

return "PROXY proxy1:3128; PROXY proxy2:3128";

And to fail over to direct, try:

return "PROXY proxy1:3128; PROXY proxy2:3128; DIRECT";

let me know if this doesn't work.
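
Wrapped in a complete PAC file, a failover return like the one described above
looks like the sketch below.  The proxy names are placeholders, and
isPlainHostName is normally a builtin provided by the browser's PAC
environment; it is re-defined here only so the function can be exercised
outside a browser:

```javascript
// Minimal stand-in for the PAC builtin of the same name, so this file
// can also be run outside a browser for testing.
function isPlainHostName(host) { return host.indexOf(".") === -1; }

function FindProxyForURL(url, host) {
  // Unqualified intranet names bypass the proxies entirely.
  if (isPlainHostName(host))
    return "DIRECT";
  // The browser tries each entry left to right: if proxy1 is down or
  // refuses the connection, it falls back to proxy2, then goes direct.
  return "PROXY proxy1:3128; PROXY proxy2:3128; DIRECT";
}
```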



Adrian






RE: [squid-users] Porn sites links found on the mirror page

2006-12-14 Thread SSCR Internet Admin
Well, take a look at it; they all seem like keywords to me.  Why not use them
as expressions to block porn?  Just a thought. :)

-Original Message-
From: Adrian Chadd [mailto:[EMAIL PROTECTED] 
Sent: Thursday, December 14, 2006 7:24 PM
To: Kashif Ali Bukhari
Cc: squid-users@squid-cache.org
Subject: Re: [squid-users] Porn sites links found on the mirror page

On Thu, Dec 14, 2006, Kashif Ali Bukhari wrote:
 Oh yeah, I can see it.
 Then the Squid owners must be thinking about it?

Slowly but surely..




Adrian


--
All messages that are coming from this domain is certified to be virus and
spam free.  If ever you have received any virus infected content or spam,
please report it to the internet administrator of this domain
[EMAIL PROTECTED]





RE: [squid-users] anonymous again...

2006-12-08 Thread SSCR Internet Admin
In connection with this, I tried --enable-follow-x-forwarded-for at configure
time, then disabled it in squid.conf (both acl_uses_indirect_client off and
follow_x_forwarded_for deny all).  I even added -DFOLLOW_X_FORWARDED_FOR to
Squid's startup script, but monip.org still gives out the header information;
I don't know why.
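
For reference, the combination usually needed in Squid 2.5/2.6 to keep the
client address out of the outgoing request is sketched below.  header_access
is the 2.5-era directive name, and this assumes no other header rules in the
config override it:

```conf
# squid.conf fragment (sketch): suppress the headers that reveal the
# original client and the proxy itself.
forwarded_for off
header_access X-Forwarded-For deny all
header_access Via deny all
```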

TIA


-Original Message-
From: Marc-Olivier Meunier [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, December 06, 2006 9:11 PM
To: squid-users@squid-cache.org
Subject: [squid-users] anonymous again...

Hi all,

Had a look at the Archives before asking but I can't find my answer.

I'm living in Finland and I'm trying to access a French VOD site (because I'm
French), but the site won't let me in.

So I've set up a Squid proxy on a box located in France with a French IP.

I've used the forwarded_for off option, but the site still detects that I'm
using a proxy, and since it doesn't know where the request comes from, it
doesn't let me in.


When using a site like  http://www.monip.org

it says:

IP : 88.191.16.144
--- oktober.momeunier.fr ---
1.1  oktober.momeunier.fr:3128 (squid/2.5.STABLE12)


---
Proxy detecté / Proxy detected
---


ORG_IP : 82.181.61.165
---  cs181061165.pp.htv.fi ---
Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.8.1) Gecko/20061010
Firefox/2.0


Here is my squid.conf:

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl SSL_ports port 443 563 1
acl Safe_ports port 80
acl Safe_ports port 21
acl Safe_ports port 443 563
acl Safe_ports port 70
acl Safe_ports port 210
acl Safe_ports port 1025-65535
acl Safe_ports port 280
acl Safe_ports port 488
acl Safe_ports port 591
acl Safe_ports port 777
acl Safe_ports port 901
acl purge method PURGE
acl CONNECT method CONNECT
acl LocalNet src 192.168.0.0/255.255.255.0
acl marco src 82.181.61.165
forwarded_for off
http_access allow manager localhost
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow LocalNet
http_access allow marco
http_access deny all
icp_access allow all
log_fqdn on


Any ideas?

Maybe I can change the IP in X-Forwarded-For?

--
Marc-Olivier Meunier
http://blog.momeunier.fr
+358 50 4840036




[squid-users] transparent or not to transparent

2006-11-15 Thread SSCR Internet Admin
Hi,

I am just confused by this site's address:
https://www.ecensus.com.ph/Secure/frmIndex.asp

When I go transparent I get an error displaying the page, but when I use my
proxy via my browser's Internet settings, it works perfectly fine.  Any ideas
why?

TIA

Nats





[squid-users] strange cache entry

2006-10-09 Thread SSCR Internet Admin
Hi,

Could someone explain this entry I have seen in my cache.log?  I'm just curious:

 storeLocateVary: Not our vary marker object,
D7C7767F3961D5FA9CBD9C4EEC72D9E3 =
'http://images.friendster.com/200609C/css/REV01/globnav.css',
'accept-encoding=gzip,deflate'/'gzip,deflate'

Thanks

Nathaniel





[squid-users] urlparse problem

2006-10-09 Thread SSCR Internet Admin
Hi,

I have another problem, different from
http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.49.  This time my
workstations are behind a transparent proxy, and I have lots of this in my cache.log:

2006/10/10 08:26:35| urlParse: Illegal hostname
'.update.toolbar.yahoo.com'
2006/10/10 08:26:38| urlParse: Illegal hostname
'.update.toolbar.yahoo.com'

and from it, only the Yahoo toolbar update appears to be affected.


Thanks

Nathaniel





RE: [squid-users] any special config for SNMP for squid-2.6STABLE3?

2006-09-17 Thread SSCR Internet Admin
Hi Henrik, 

Thanks for that.  It is already working. :)

Regards
Nats

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Monday, September 18, 2006 8:45 AM
To: SSCR Internet Admin
Cc: Squid Users
Subject: RE: [squid-users] any special config for SNMP for squid-2.6STABLE3?

Mon 2006-09-18 at 08:35 +0800, SSCR Internet Admin wrote:
 yes

index 1 (OID: 1.3.6.1.4.1.3495.1.5.2.1.2) SNMPv1_Session (remote
  host: 123.123.123.12 [123.123.123.12].161)

Hmm wait a minute.. thats
squid.cacheMesh.cacheClientTable.cacheClientEntry.cacheClientHttpRequests,
the per-client HTTP requests counter. You haven't specified which client you
want to query for so there is nothing to return.. (the cacheClientTable is
indexed by the client IP address)

I think you are looking for
squid.cachePerf.cacheProtoStats.cacheProtoAggregateStats.cacheProtoClientHtt
pRequests (.1.3.6.1.4.1.3495.1.3.2.1.1).
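
That OID can be queried with net-snmp's snmpget.  The host, community string
and port below are placeholders (3401 is Squid's usual SNMP port default,
although the output in this thread shows 161):

```shell
# Query the aggregate client HTTP request counter on a Squid SNMP agent.
# Adjust host, community string and port for your own setup.
snmpget -v1 -c public 123.123.123.12:3401 .1.3.6.1.4.1.3495.1.3.2.1.1.0
```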

Regards
Henrik





[squid-users] any special config for SNMP for squid-2.6STABLE3?

2006-09-15 Thread SSCR Internet Admin
Hi,

I am just wondering if squid-2.6.STABLE3 needs some new tweaking for SNMP.  I
had a working MRTG page with squid-2.5.STABLE11; I just copied my working SNMP
config from the old version into the newly installed version, but I got
this:

SNMP Error:
Received SNMP response with error code
  error status: noSuchName
  index 1 (OID: 1.3.6.1.4.1.3495.1.5.2.1.2)
SNMPv1_Session (remote host: 123.123.123.12 [123.123.123.12].161)
  community: public
 request ID: 1006737219
PDU bufsize: 8000 bytes
timeout: 2s
retries: 5
backoff: 1)
 at /usr/local/mrtg-2/bin/../lib/mrtg2/SNMP_util.pm line 490
SNMPGET Problem for cacheClientHttpRequests cacheClientHttpRequests
cacheUptime  cacheSoftware
cacheVersionId on [EMAIL PROTECTED]::v4only
 at /usr/local/mrtg-2/bin/mrtg line 2034


Thank you very much.


Nats





[squid-users] i got Failed to select source... at cache.log

2006-09-14 Thread SSCR Internet Admin
Hi, 

I just installed Squid 2.6.STABLE3, but got this line in cache.log: "Failed to
select source".  If I'm not mistaken, Squid is looking for a parent but
couldn't find one.  But I don't have parent caches; I just enabled
"always_direct allow our_network" to compensate.  Strange, 2.5 didn't have this
problem.  Just a thought...


Nats





[squid-users] strange, dont have cache peers but installer adds it to the default configuration

2006-09-01 Thread SSCR Internet Admin
Hi,

I just updated my Squid to squid-2.6.1, but strangely it adds a default
cache_peer entry pointing at itself.  Could someone help me out?  I can't
figure it out during configure...

heres my squid configure switches

./configure --build=x86_64-redhat-linux-gnu --host=x86_64-redhat-linux-gnu
--target=x86_64-redhat-linux-gnu \
--enable-arp-acl --enable-delay-pools --program-prefix= --prefix=/usr
--exec-prefix=/usr --bindir=/usr/bin \
--sbindir=/usr/sbin --sysconfdir=/etc --datadir=/usr/share
--includedir=/usr/include --libdir=/usr/lib64 \
--libexecdir=/usr/libexec --localstatedir=/var --sharedstatedir=/usr/com
--mandir=/usr/share/man \
--infodir=/usr/share/info --exec_prefix=/usr --bindir=/usr/sbin
--libexecdir=/usr/lib64/squid --localstatedir=/var \
--sysconfdir=/etc/squid --enable-snmp --enable-poll
--enable-removal-policies=heap,lru \
--with-pthreads --enable-storeio=aufs,ufs --enable-ssl
--with-openssl=/usr/kerberos --enable-delay-pools \
--enable-ssl --enable-linux-netfilter --enable-useragent-log
--enable-referer-log --disable-dependency-tracking  \
--enable-cachemgr-hostname=proxy.sscrmnl.edu.ph --disable-ident-lookups
--enable-truncate --enable-underscores \
--enable-err-languages=English --enable-htcp --datadir=/usr/share
--enable-icmp

TIA

Nats





[squid-users] Any Slackers running on current version with squid current version just a survey

2006-07-26 Thread SSCR Internet Admin
Hi,

I would like to know if there are any Slackers running the current version of
Slackware with the latest version of Squid.  I haven't touched Slackware since
1992 (it was frustrating to install); this is to venture into the unknown
again. :)  Thanks for replying; if you know where I could start with Slackware
besides their site, it would be much appreciated.  Thanks in advance.


Jose Nathaniel G. Nengasca





RE: [squid-users] Re: httpd_accel in Squid 2.6.STABLE1 problem

2006-07-09 Thread SSCR Internet Admin
Hi,

This is my config on 2.6 for transparent proxy

http_port 3128 transparent defaultsite=virtual vhost vport

Ciao,

Nathaniel

-Original Message-
From: news [mailto:[EMAIL PROTECTED] On Behalf Of peter S
Sent: Monday, July 10, 2006 5:12 AM
To: squid-users@squid-cache.org
Subject: [squid-users] Re: httpd_accel in Squid 2.6.STABLE1 problem

I am having trouble with the Squid config in 2.6.STABLE1.  They have taken out
httpd_accel_port and httpd_accel_host and replaced them with the http_port
defaultsite= and cache_peer originserver options.  When I put the name of my
server on the http_port defaultsite= line and the port that I am using in the
cache_peer option, Squid returns an error saying that it doesn't understand
the host name or port.  I had to go back to another version of Squid.  Does
anyone have a Squid HTTP accelerator config example for 2.6?








Re: [squid-users] Error during make when using --enable-snmp on squid-25.STABLE14

2006-06-30 Thread SSCR Internet Admin

Henrik Nordstrom wrote:

Thu 2006-06-29 at 15:07 +0800, SSCR Internet Admin wrote:

  
I have a working current stable release of squid when i decided to 
recompile to use snmp feature, but i bumped on this error during make,


snmp_core.o(.text+0x72): In function `snmpConnectionOpen':
/downloads/squid-2.5.STABLE14/src/snmp_core.c:357: undefined reference 
to `theInSnmpConnection'

..snippet



FreeBSD using distcc / ccache to compile?

Try compile without any distcc / ccache wrappers.

Regards
Henrik
  

No, I'm using CentOS 4.3 64-bit






Re: [squid-users] Error during make when using --enable-snmp on squid-25.STABLE14

2006-06-30 Thread SSCR Internet Admin

Henrik Nordstrom wrote:

Fri 2006-06-30 at 14:01 +0800, SSCR Internet Admin wrote:

  

No, Im using CentOS 4.3 64bit



Are you using distcc / ccache?

Regards
Henrik
  
I don't know if I am using it; I'm not into programming.  Can you tell me
how to identify it, and if I am, what should I do?


Thanks





Re: [squid-users] Error during make when using --enable-snmp on squid-2.5.STABLE14

2006-06-30 Thread SSCR Internet Admin

Henrik Nordstrom wrote:

Sat 2006-07-01 at 08:30 +0800, SSCR Internet Admin wrote:

  
I dont know if i am using it, im not into programming.. can you tell me 
how to identify it? if i am what should i do?



What does the compile line used by make look like? I.e. the few lines
above the error...

Regards
Henrik
  
gcc  -g -O2 -Wall -D_REENTRANT  -g -o squid  access_log.o acl.o asn.o 
authenticate.o cache_cf.o CacheDigest.o cache_manager.o carp.o cbdata.o 
client_db.o client_side.o comm.o comm_select.o debug.o delay_pools.o 
disk.o dns_internal.o errorpage.o ETag.o event.o external_acl.o fd.o 
filemap.o forward.o fqdncache.o ftp.o gopher.o helper.o  http.o 
HttpStatusLine.o HttpHdrCc.o HttpHdrRange.o HttpHdrContRange.o 
HttpHeader.o HttpHeaderTools.o HttpBody.o HttpMsg.o HttpReply.o 
HttpRequest.o icmp.o icp_v2.o icp_v3.o ident.o internal.o ipc.o 
ipcache.o  logfile.o main.o mem.o MemPool.o MemBuf.o mime.o multicast.o 
neighbors.o net_db.o Packer.o pconn.o peer_digest.o peer_select.o 
redirect.o referer.o refresh.o send-announce.o snmp_core.o snmp_agent.o 
ssl.o ssl_support.o stat.o StatHist.o String.o stmem.o store.o 
store_io.o store_client.o store_digest.o store_dir.o store_key_md5.o 
store_log.o store_rebuild.o store_swapin.o store_swapmeta.o 
store_swapout.o tools.o unlinkd.o url.o urn.o useragent.o wais.o wccp.o 
whois.o  repl_modules.o auth_modules.o store_modules.o globals.o 
string_arrays.o -L../lib repl/libheap.a repl/liblru.a fs/libaufs.a 
fs/libufs.a auth/libbasic.a -lcrypt -L../snmplib -lsnmp 
-L/usr/kerberos/lib -L/usr/kerberos/lib64 -lssl -lcrypto -lgssapi_krb5 
-lkrb5 -lcom_err -lk5crypto -lresolv -ldl -lz -lmiscutil -lpthread -lm 
-lresolv -lbsd -lnsl

snmp_core.o(.text+0x72): In function `snmpConnectionOpen':
/downloads/squid-2.5.STABLE14/src/snmp_core.c:357: undefined reference 
to `theInSnmpConnection'
snmp_core.o(.text+0x7d):/downloads/squid-2.5.STABLE14/src/snmp_core.c:364: 
undefined reference to `theInSnmpConnection'







[squid-users] Error during make when using --enable-snmp on squid-25.STABLE14

2006-06-29 Thread SSCR Internet Admin

Hi,

I had a working current stable release of Squid when I decided to
recompile to enable the SNMP feature, but I ran into this error during make:


snmp_core.o(.text+0x72): In function `snmpConnectionOpen':
/downloads/squid-2.5.STABLE14/src/snmp_core.c:357: undefined reference 
to `theInSnmpConnection'

..snippet

Does anyone have the same problem? My config is

./configure --build=x86_64-redhat-linux-gnu 
--host=x86_64-redhat-linux-gnu --target=x86_64-redhat-linux-gnu \
--enable-arp-acl --enable-delay-pools --program-prefix= --prefix=/usr 
--exec-prefix=/usr --bindir=/usr/bin \
--sbindir=/usr/sbin --sysconfdir=/etc --datadir=/usr/share 
--includedir=/usr/include --libdir=/usr/lib64 \
--libexecdir=/usr/libexec --localstatedir=/var --sharedstatedir=/usr/com 
--mandir=/usr/share/man \
--infodir=/usr/share/info --exec_prefix=/usr --bindir=/usr/sbin 
--libexecdir=/usr/lib64/squid --localstatedir=/var \
--sysconfdir=/etc/squid --enable-snmp --enable-poll 
--enable-removal-policies=heap,lru \
--enable-storeio=aufs,coss,diskd,null,ufs --enable-ssl 
--with-openssl=/usr/kerberos --enable-delay-pools \
--enable-linux-netfilter --with-pthreads --enable-useragent-log 
--enable-referer-log --disable-dependency-tracking  \
--enable-cachemgr-hostname=proxy.sscrmnl.edu.ph --disable-ident-lookups 
--enable-truncate --enable-underscores \

--datadir=/usr/share

Everything works except for the SNMP feature.


TIA








RE: [squid-users] problem in opening specific website

2005-11-15 Thread SSCR Internet Admin
Looks fine from here in the Philippines...

-Original Message-
From: Jigar Raval [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 16, 2005 1:42 PM
To: squid-users@squid-cache.org
Subject: [squid-users] problem in opening specific website

Hello,

I have configured a Squid proxy server and it works fine.  But for the last
few days I have been facing a problem opening the website below:

   http://www.cost723.org

I can open it successfully from another network (without the proxy), but when
trying to open it behind the proxy, it says timed out, remote host may be
down, etc.

What could be the reason?







RE: [squid-users] slower connections using squid (squid is slowing down all connections)

2005-09-23 Thread SSCR Internet Admin
I'm no expert, but I guess we'll have to dive into your firewall rules; it
happened to me once, you know.  Try loading the default firewall rules (let
everything through the firewall and see what happens).

-Original Message-
From: Chris Robertson [mailto:[EMAIL PROTECTED] 
Sent: Saturday, September 24, 2005 7:14 AM
To: squid-users@squid-cache.org
Subject: RE: [squid-users] slower connections using squid (squid is slowing
down all connections)

 -Original Message-
 From: Alex [mailto:[EMAIL PROTECTED]
 Sent: Thursday, September 22, 2005 10:26 PM
 To: squid-users@squid-cache.org
 Subject: Re: [squid-users] slower connections using squid (squid is 
 slowing down all connections)
 
 
 Hi Chris,
 
   Also, i have:
   Last 5 minutes:
   client_http.requests = 2.723098/sec client_http.hits = 
   0.779933/sec
 
  Well, the box is not being stressed... What is the output
 of hdparm -t
  /dev/sda (I think SATA shows up as SCSI in linux)? 
 
 # cat /proc/partitions
 major minor  #blocks  name
 
8 0  120059840 sda
8 1 104391 sda1
8 2  1 sda2
8 52008093 sda5
8 6  117941166 sda6
 
 # hdparm -t /dev/sda
 
 /dev/sda:
  Timing buffered disk reads:  170 MB in  3.03 seconds =  56.15 MB/sec
 [EMAIL PROTECTED] ~]# hdparm -tT /dev/sda
 
 # hdparm -tT /dev/sda
 
 /dev/sda:
  Timing cached reads:   3792 MB in  2.00 seconds = 1896.29 MB/sec
  Timing buffered disk reads:  156 MB in  3.02 seconds =  51.73 MB/sec
 [EMAIL PROTECTED] ~]#
 
 I am using 2 sata HDD drives mounted in RAID1 hardware controller 
 (3ware - 2
 ports)
 
 Timings above are ok or not?

They look okay to me...

 
  If I set the BIOS to
  use the SATA drives in Enhanced (both SATA and PATA
 drives are available)
  or SATA Only, I get very poor performance.  Only if I set them as 
  Combined (the SATA drives act like a PATA channel) do I
 get decent
  performance.
 
 I can't access BIOS settings now (require to stop our services)... 
 Only tonight, when our network will be free... I will do it only if 
 speed above is poor 

I no longer think this could be causing your problem.

 
  I see nothing in here that would be causing the horrid
 performance that you
  are reporting.  What do the Median Service Times (under the
 General Runtime
  Info) look like?
 
 I increased cache_mem from 32 MB to 256 MB and nothing seems to change 
 my life... as you said, the box is not being stressed anyway. Now, 
 I have only 138 clients accessing the cache, and squid is working worse 
 than a dial-up connection. Here comes General Runtime Info, including 
 Median Service Times too...
 
 Connection information for squid:
  Number of clients accessing cache: 135
  Number of HTTP requests received: 82681
  Number of ICP messages received: 0
  Number of ICP messages sent: 0
  Number of queued ICP replies: 0
  Request failure ratio: 0.00
  Average HTTP requests per minute since start: 62.4
  Average ICP messages per minute since start: 0.0
  Select loop called: 22219278 times, 3.579 ms avg
 Cache information for squid:
  Request Hit Ratios: 5min: 34.3%, 60min: 38.8%
  Byte Hit Ratios: 5min: 4.4%, 60min: 24.5%
  Request Memory Hit Ratios: 5min: 1.0%, 60min: 5.9%
  Request Disk Hit Ratios: 5min: 39.3%, 60min: 32.4%
  Storage Swap size: 4053340 KB
  Storage Mem size: 43480 KB
  Mean Object Size: 17.73 KB
  Requests given to unlinkd: 0
 Median Service Times (seconds):   5 min   60 min
  HTTP Requests (All):   0.01955  0.02317
  Cache Misses:  0.44492  0.35832
  Cache Hits:0.00865  0.01035
  Near Hits: 0.28853  0.22004
  Not-Modified Replies:  0.01164  0.01035
  DNS Lookups:   0.08717  0.16304
  ICP Queries:   0.0  0.0

Okay... These look pretty good.  Hits are fast, misses are okay, DNS
requests are about what I would expect...  I'm a bit perplexed.  What are
the symptoms of slow connections?  Is it throughput on large downloads,
pages with lots of connections (msn.com with its thousands of images), does
it just take forever for a connection to get started, or is it something
else entirely?

 Resource usage for squid:
  UP Time: 79515.851 seconds
  CPU Time: 387.250 seconds
  CPU Usage: 0.49%
  CPU Usage, 5 minute avg: 0.24%
  CPU Usage, 60 minute avg: 0.17%
  Process Data Segment Size via sbrk(): 79920 KB  Maximum Resident 
 Size: 0 KB  Page faults with physical i/o: 0 Memory usage for squid 
 via mallinfo():
  Total space in arena:   79920 KB
  Ordinary blocks:79544 KB   3201 blks
  Small blocks:   0 KB  0 blks
  Holding blocks:  1784 KB  3 blks
  Free Small blocks:  0 KB
  Free Ordinary blocks: 375 KB
  Total in use:   81328 KB 100%
  Total free:   375 KB 0%
  Total size: 81704 KB
 Memory accounted for:
  Total accounted:65164 KB
  memPoolAlloc calls: 11051188
  memPoolFree calls: 10437330
 File descriptor usage for squid:
  Maximum number of file descriptors:   1024
  

RE: [squid-users] Using null fs

2003-08-14 Thread SSCR Internet Admin
Yeah, that's true. But IMO, using the null fs and holding those objects in RAM
should give better Squid performance, or at least a good hit rate, since
Squid functions the same way with a null fs as with a big cache_dir,
except without the I/O-bound problems (which are the hindrance).  I asked this because
I have seen in cachemgr only a 10%-19% hit rate using the null fs (with
250MB cache_mem), compared to 25%-45% using a cache_dir (with 250MB cache_mem).

So, as you can see, only the store type is different. Why would the null fs have a
lower hit rate compared to aufs or diskd, since holding those objects in
RAM is the same as holding them on disk, except that with the null fs they are
eventually lost on reboot?

Hope you can enlighten me a bit.

-Original Message-
From: Adam Aube [mailto:[EMAIL PROTECTED]
Sent: Tuesday, August 05, 2003 6:13 AM
To: 'squid-mailing list'
Subject: RE: [squid-users] Using null fs


 will null fs with a cache_mem of 250MB could give me a
 higher hit rate compared to a 250MB cache_mem with a
 3gig cache_dir, or with a 250MB with a 100MB cache_dir
 since i want to lessen I/O bound operations on squid.

You can usually hold more cache on disk than you can in
memory. Using no disk cache will reduce your hit rate.

Also, every time you restart Squid it will dump its
memory. Since you have no disk cache, Squid will have
to fully rebuild its cache every time it restarts. This
will also reduce your hit rate.
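For comparison, here is roughly what the two setups under discussion look like in squid.conf (a sketch only; the cache directory path is an assumption, and the sizes are the ones from the question — only one cache_dir line would be active at a time):

```
# Memory-only: null store, the whole cache is lost on every restart
cache_mem 250 MB
cache_dir null /tmp

# Memory plus disk: a 3 GB ufs store keeps objects across restarts
cache_mem 250 MB
cache_dir ufs /usr/local/squid/var/cache 3000 16 256
```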

In fairness, you will rarely have to restart Squid.

Adam
---
Outgoing mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.237 / Virus Database: 115 - Release Date: 3/7/2001


--
This message has been scanned for viruses and
dangerous contents on SSCR Email Scanner Server, and is
believed to be clean.





[squid-users] Using null fs

2003-08-08 Thread SSCR Internet Admin
I have a new machine to test Squid on.  I only have a 40 GB hard drive
with 750MB RAM. Will a null fs with a cache_mem of 250MB give me a
higher hit rate compared to a 250MB cache_mem with a 3 GB cache_dir, or to
250MB cache_mem with a 100MB cache_dir? I want to lessen I/O-bound operations
on Squid.

Jose Nathaniel G. Nengasca
Internet Systems Administrator
San Sebastian College-Recoletos
Claro M. Recto Ave., Quiapo, Manila
Ph +63 2 7348931 local 300
Mo +63 9164582925
Website : http://www.sscrmnl.edu.ph





[squid-users] Configs

2003-08-01 Thread SSCR Internet Admin
Hi,

Is there any knowledge base with recommended Squid configurations based on
what you have: this kind of hardware setup, this much bandwidth used,
and this number of users?


Jose Nathaniel G. Nengasca
Internet Systems Administrator
San Sebastian College-Recoletos
Claro M. Recto Ave., Quiapo, Manila
Ph +63 2 7348931 local 300
Mo +63 9164582925
Website : http://www.sscrmnl.edu.ph





[squid-users] Slow access on local network webserver when using squid.

2003-07-31 Thread SSCR Internet Admin

Dear all,

A newly installed web server running IIS 4.0 on NT on our local network is to be
used for an e-learning web-based application. To give it a decent domain
name, I added a new domain to my DNS server (another box, which
only has a registered IP) and pointed it at 192.168.50.201.  Now, when clients
access IIS through my Squid proxy (another box), it automatically downloads
websvm.exe (a virtual machine, the way I look at it), and all
workstations here are blocked from downloading files from the Internet. So I
added an ACL to allow them to download that specific file:

acl elearning urlpath_regex -i websvm.exe
acl file_ext urlpath_regex -i "/etc/squid/file_ext.txt"

http_access allow elearning lanstations
http_access deny file_ext lanstations

So it works, but they are complaining that it is slow, whether or not
they have the proxy set in their IE 5.0 browsers. If they don't use my DNS
server and my proxy, and just type the e-learning hostname into the browser,
they have no problem with it... Could it be that IIS has a problem
that Squid can't comprehend?

Btw, I have transparent proxying configured so that no one can bypass my Squid;
everyone is redirected back to port 3128.
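Not from the thread, but one quick thing to test: exempt the internal IIS box from the interception rule so LAN clients reach it directly instead of looping through the proxy. A hedged iptables sketch, with the interface name an assumption:

```
# Let traffic to the internal IIS server bypass Squid entirely
iptables -t nat -A PREROUTING -i eth0 -d 192.168.50.201 -p tcp --dport 80 -j ACCEPT
# Intercept everything else on port 80 into Squid
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128
```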

Jose Nathaniel G. Nengasca
Internet Systems Administrator
San Sebastian College-Recoletos
Claro M. Recto Ave., Quiapo, Manila
Ph +63 2 7348931 local 300
Mo +63 9164582925
Website : http://www.sscrmnl.edu.ph








[squid-users] Understand Cache Manager Results

2003-07-21 Thread SSCR Internet Admin
Hi all,

I just want to read an explanation of each cache manager result on every
page. Can someone point me the right way?

Thanks

Jose Nathaniel G. Nengasca
Internet Systems Administrator
San Sebastian College-Recoletos
Claro M. Recto Ave., Quiapo, Manila
Ph +63 2 7348931 local 300
Mo +63 9164582925
Website : http://www.sscrmnl.edu.ph





RE: [squid-users] Understand Cache Manager Results

2003-07-21 Thread SSCR Internet Admin
I just wanted to know what these results mean:

1. Select loop called: 6069985 times, 12.885 ms avg

Especially this one.  Also, what is the difference between the following
ratios, and which of them are the actual cache hits?
2. Cache information for squid:
Request Hit Ratios: 5min: 53.2%, 60min: 58.5%
Byte Hit Ratios:5min: 23.5%, 60min: 27.2%
Request Memory Hit Ratios:  5min: 1.4%, 60min: 3.0%
Request Disk Hit Ratios:5min: 29.2%, 60min: 26.1%

Thanks


-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
Sent: Monday, July 21, 2003 7:12 PM
To: SSCR Internet Admin; squid-mailing list
Subject: Re: [squid-users] Understand Cache Manager Results


On Tuesday 22 July 2003 18.50, SSCR Internet Admin wrote:

 I just want to read the explanation of each cache manager results
 on every page. Can someone point me to the right way?

There is no complete detailed documentation on the cachemgr. A good
rule of thumb is to read the values you understand, ignore the other.
If you have a compelling need to understand what the value is about
then try asking about the specific value or menu here on squid-users.

Most values have a title trying to explain what the value is about.
Not always very successful however..

The Squid FAQ has some documentation/mentions on cachemgr values in
various chapters. So does the Squid Users Guide.

Regards
Henrik

--
Donations welcome if you consider my Free Squid support helpful.
https://www.paypal.com/xclick/business=hno%40squid-cache.org

If you need commercial Squid support or cost effective Squid or
firewall appliances please refer to MARA Systems AB, Sweden
http://www.marasystems.com/, [EMAIL PROTECTED]






RE: [squid-users] extract the IPAddress and the MAC Address of the machine from which request came

2003-07-17 Thread SSCR Internet Admin
True, because routers replace the headers of packets passing through them
with their own MAC address. But if you want to see those IP and MAC entries, you can simply
type arp -a or ip neigh show.  And if you want to allow clients
based only on their MAC address, you should enable ARP ACL support in Squid during the
configure session.
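A minimal sketch of such a MAC-based ACL (the MAC address is made up, and the exact configure flag name — --enable-arp-acl in some Squid versions — is an assumption to verify against your version):

```
# Allow only this workstation, identified by MAC address
acl lab_pc arp 00:11:22:33:44:55
http_access allow lab_pc
```

This only works when the client is on the same network segment as the proxy, for the reason fooler gives.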

Nats

-Original Message-
From: fooler [mailto:[EMAIL PROTECTED]
Sent: Thursday, July 17, 2003 7:01 PM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: Re: [squid-users] extract the IPAddress and the MAC Address of
the machine from which request came
Importance: High


- Original Message -
From: Reena Panwar [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Thursday, July 17, 2003 7:21 PM
Subject: [squid-users] extract the IPAddress and the MAC Address of the
machine from which request came


 Squid is running in the transparent mode. Now whatever request comes to
squid it has to extract the IPAddress and the MAC Address of the machine
from which the request came.
 How can this be achieved.


Both the MAC and IP address are significant when the client and the proxy server
are on the same network segment, but when the client is on a different
network segment, only the IP address is significant. With regard to the
transparent setup: even if your router, proxy server, and client are on
the same network segment and the router is the one hijacking the HTTP
packets, it is the MAC address of the router and the IP address of the client
that are significant.

fooler.










RE: [squid-users] deny downloads fr webmails

2003-06-30 Thread SSCR Internet Admin
If you like, you can try matching on MIME types and denying them with
http_reply_access, using application/x-msdos-program as the MIME type to block.
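A minimal sketch of that approach (the ACL name is made up):

```
# Deny replies whose Content-Type is application/x-msdos-program
acl msdos_exe rep_mime_type application/x-msdos-program
http_reply_access deny msdos_exe
```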

-Original Message-
From: Peña, Botp [mailto:[EMAIL PROTECTED]
Sent: Sunday, June 29, 2003 8:12 PM
To: [EMAIL PROTECTED]
Subject: RE: [squid-users] deny downloads fr webmails


Adam Aube [mailto:[EMAIL PROTECTED] wrote:

 I'd like to deny downloading of files fr common webmails like
 yahoo/hotmail. It's the webmail downloads I cannot catch.
 
 I only get this kind of log:
 
 1056815851.164934 10.1.1.1 TCP_MISS/200 11237 GET
 http://us.f138.mail.yahoo.com/ym/ShowLetter? - DEFAULT_PARENT/10.
 254.254.6
 application/x-msdos-program
 
 Is it possible in squid to deny such download?

 Check more log entries for these downloads - do all of them come
 from a mail.yahoo.com server and use the ShowLetter script?
 Is ShowLetter
 used only when downloading files?

 If so, you could try using a url_regex acl - something like this:

 acl YahooFiles url_regex -i mail\.yahoo\.com.+ShowLetter
 http_access deny YahooFiles

but this in itself is not enough. I only want to block downloads of executable
files; surely, I would still allow them to download text files.

 If it were possible, it would be far easier to simply block the sites

sorry, cannot do that.

Does anyone know of any squid add-in that does filtering of files in
webmails?

 Adam

kind regards -botp






[squid-users] testing

2003-06-08 Thread SSCR Internet Admin




RE: [squid-users] Antivirus / per-user ACLs

2003-05-31 Thread SSCR Internet Admin
I have configured that using the DansGuardian patch, which works with
Sophos, F-Secure, or whatever. Just go to the DansGuardian website for more
in-depth detail.

-Original Message-
From: Christoph Haas [mailto:[EMAIL PROTECTED]
Sent: Friday, May 30, 2003 12:05 PM
To: [EMAIL PROTECTED]
Subject: Re: [squid-users] Antivirus / per-user ACLs


On Fri, May 30, 2003 at 02:11:34PM +0200, [EMAIL PROTECTED] wrote:
 1. Can Squid be configured to use 'whatever' antivirus product (F-Secure,
 Sophos, ...) to scan the stuff it passes to its clients? I only found
 something for OpenAntiVirus project...

I don't think so. But you can use any anti-virus software which acts as
a proxy. Just build a proxy chain and you are set. I use this setup with
TrendMicro.

 2. Can Squid handle per-user ACLs with user authentication?

Yes. See acl aclname proxy_auth.
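For the archives, a minimal per-user setup might look like this (helper path, password file, and realm text are assumptions; any authenticator speaking Squid's basic helper protocol works):

```
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic realm Squid proxy
acl authed proxy_auth REQUIRED
http_access allow authed
http_access deny all
```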

 Christoph

--






RE: [squid-users] make pinger error

2003-03-24 Thread SSCR Internet Admin
Well, I just installed and compiled Squid as root, and I had --enable-icmp
during configure, but pinger still exits when Squid is running. That's
why I tried to make pinger... but no luck.

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] Behalf Of
Henrik Nordstrom
Sent: Sunday, March 23, 2003 11:49 PM
To: Marc Elsen
Cc: SSCR Internet Admin; squid-mailing list
Subject: Re: [squid-users] make pinger error


Marc Elsen wrote:

 SSCR Internet Admin wrote:
 
  I just run make pinger on the src tree.. and have this result
 
  [EMAIL PROTECTED] src]# make pinger
  gcc  -g -O2 -Wall  -g -o pinger  pinger.o debug.o
  globals.o -L../lib -lmiscutil -lpthread -ldl -lm -lresolv -lbsd -lnsl
  /usr/bin/ld: cannot find -lmiscutil
  collect2: ld returned 1 exit status
  make: *** [pinger] Error 1
 
  where shall i begun looking?

  Did you execute an appropriate configure command for squid ?


You also need to first build Squid.

Note: pinger is automatically built and installed when ICMP pinging is
enabled by configure, but you will need to finish the installation as
root. See the Squid FAQ for pinger installation instructions.

Regards
Henrik






[squid-users] transparent proxyng works but...

2003-03-24 Thread SSCR Internet Admin
I have already set up transparent proxying on my Squid server. Workstations' IP
addresses are masqueraded by iptables and invisibly redirected to Squid on 3128
if anyone tries to bypass it, so those workstations can already
connect to the Internet without specifying port 3128 in their browsers. But
workstations that are 2 to 3 hops away from my proxy/firewall
server can't connect to the Internet directly, and aren't even redirected to port
3128, unlike the workstations that are 1 hop away from my server. What's
happening? Is there a bug in iptables, or something I have to tweak in
Squid?
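For reference, the usual interception rule on the Squid box looks something like this (interface and network are assumptions, not taken from the thread):

```
# Intercept port-80 traffic from the LAN into Squid
iptables -t nat -A PREROUTING -i eth1 -s 192.168.0.0/16 -p tcp --dport 80 \
  -j REDIRECT --to-port 3128
```

Note that REDIRECT can only catch packets that are actually routed through this box, which is worth verifying for the subnets 2-3 hops away.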

Thanks.





[squid-users] make pinger error

2003-03-23 Thread SSCR Internet Admin
I just ran make pinger in the src tree and got this result:

[EMAIL PROTECTED] src]# make pinger
gcc  -g -O2 -Wall  -g -o pinger  pinger.o debug.o
globals.o -L../lib -lmiscutil -lpthread -ldl -lm -lresolv -lbsd -lnsl
/usr/bin/ld: cannot find -lmiscutil
collect2: ld returned 1 exit status
make: *** [pinger] Error 1

Where shall I begin looking?

Nats





RE: [squid-users] make pinger error

2003-03-23 Thread SSCR Internet Admin
Yes, I have just re-compiled my Squid with the new parameter.

-Original Message-
From: Marc Elsen [mailto:[EMAIL PROTECTED]
Sent: Sunday, March 23, 2003 11:23 PM
To: SSCR Internet Admin
Cc: squid-mailing list
Subject: Re: [squid-users] make pinger error




SSCR Internet Admin wrote:
 
 I just run make pinger on the src tree.. and have this result
 
 [EMAIL PROTECTED] src]# make pinger
 gcc  -g -O2 -Wall  -g -o pinger  pinger.o debug.o
 globals.o -L../lib -lmiscutil -lpthread -ldl -lm -lresolv -lbsd -lnsl
 /usr/bin/ld: cannot find -lmiscutil
 collect2: ld returned 1 exit status
 make: *** [pinger] Error 1
 
 where shall i begun looking?

 Did you execute an appropriate configure command for squid ?

 M.

 
 
 Nats

-- 

 'Time is a consequence of Matter thus
 General Relativity is a direct consequence of QM
 (M.E. Mar 2002)






[squid-users] RE: redirect authenticate and surf

2003-03-19 Thread SSCR Internet Admin
What if RADIUS handles the authentication itself? When a user is not
authenticated via RADIUS, it does not pass the request on to Squid; if the user is
authenticated, then RADIUS passes it on to Squid...

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] Behalf Of James Ambursley
Sent: Wednesday, March 19, 2003 10:44 AM
To: Rick Matthews
Cc: Squid List; [EMAIL PROTECTED] Org
Subject: RE: redirect authenticate and surf


Could you please send me sample squid.conf and squidGuard.conf files which
redirect and include authentication normally?



-Original Message-
From: Rick Matthews [mailto:[EMAIL PROTECTED]
Sent: Wednesday, March 19, 2003 10:44 AM
To: James Ambursley
Cc: Squid List; [EMAIL PROTECTED] Org
Subject: RE: redirect authenticate and surf


James Ambursley writes:

 I would like non authenticated users to be rediercted to an internal
 site and have no access.  I would like users when they initially
 connect to be redirected to a internal web site.  Then they will
 authenticate with a link on the web site via radius and then surf.
 Squidguard redirects users to the internal site.  After the users
 authenticate they should be allowed to surf freely.

I'm not the best source of information on authentication (since I've
never used it), but it doesn't look like the people who do use
authentication are going to help you.

I think what you are trying to set up is authentication, plain and
simple.  I think the main difference between your authentication
and the authentication used by others here is you are using radius.
I did a little reading on squid/radius authentication
http://selm.www.cistron.nl/authtools/, and I don't think radius
changes the overall authentication process.  Sure, the authentication
program/helpers are different, but I think the process remains the
same.

It is not uncommon for squid to receive a request from a user who
has not yet been authenticated, but whose acl requires authentication.
I'll never say never, but I've been reading the squidGuard mailing
list for going on 3 years, and I can't remember a single time where
someone was using squidGuard to redirect to the authenticator.  I
think you are trying to manually construct something that can
happen automatically.

Hopefully someone familiar with authentication will jump in and
straighten us out.

Rick


 -Original Message-
 From: Rick Matthews [mailto:[EMAIL PROTECTED]
 Sent: Monday, March 17, 2003 10:00 PM
 To: James Ambursley
 Cc: [EMAIL PROTECTED] Org
 Subject: RE: redirect authenticate and surf


 James Ambursley writes:
 
  Please help, I would like to have users authenticate, via radius
  and redirect to a page.  Only authenticated users can surf freely.
  All users are redirected to the page, then authenticate, then surf
  freely.

 If I am interpreting correctly, the two important points from your
 statements above are:

  users authenticate via radius
  Only authenticated users can surf freely.

 Is that meant to be different from Only authenticated users are
 allowed access?  Do you want to allow limited access to
 non-authenticated users?

 What part do you want squidGuard to play in this?

 Does your squid configuration authenticate properly without
 squidGuard? (comment out redirect_program)

  I have tried many combinations, and none work.

 I guess I need to better understand what you are trying to do.  Squid
 can handle authentication and access control.  Get that part working
 first before you factor squidGuard into the mix.

  My redirector is squidguard.  I have been able to redirect, but
  users at the page are only able to surf to pages which I preset in
  my list file.

 That's exactly what you told squidGuard to do in squidGuard.conf:
 - You haven't defined any source groups, so everyone is processed
   under the default acl.
 - The default acl is pass test none.  This says to allow access
   to the test destination group, which you said includes 4 domains.
 - If the incoming request is not for one of those 4 domains, you've
   told squidGuard to redirect anybody and everybody to:
   http://10.190.1.86/?;

 Please provide additional information so that we can help.

 Rick





  _
  My squid.conf is below:
 
  http_port 80
  icp_port 80
  httpd_accel_host 10.190.1.86
  acl acceleratedHost dst 127.0.0.1/255.255.255.255
  httpd_accel_port 80
  httpd_accel_host 127.0.0.1
  udp_incoming_address 0.0.0.0
  udp_outgoing_address 255.255.255.255
  hierarchy_stoplist cgi-bin ?
  #acl QUERY urlpath_regex cgi-bin \?
  acl QUERY url_regex cgi-bin \?
  no_cache deny QUERY
  no_cache deny acceleratedHost
  #requests for localhost not going to peer
  always_direct allow acceleratedHost
  cache_dir ufs /usr/local/squidtest/var/cache 100 16 256
  cache_access_log /usr/local/squidtest/var/logs/access.log
  cache_log /usr/local/squidtest/var/logs/cache.log
  log_ip_on_direct on
  pid_filename /usr/local/squidtest/var/logs/squid.pid
  hosts_file 

RE: [squid-users] block downloading extensions...!!

2003-03-11 Thread SSCR Internet Admin
I have this in my config, and it works great:

acl dl-filter urlpath_regex -i "/etc/squid/file_ext.block"

http_access deny dl-filter

and the contents of file_ext.block are:

\.dll$
\.ini$
\.asf$
\.exe$
\.avi$
\.mp3$
\.mpeg$
\.mpg$

and so on... etc.


-Original Message-
From: Prasanta kumar Panda [mailto:[EMAIL PROTECTED]
Sent: Monday, March 10, 2003 6:38 AM
To: Henrik Nordstrom; NetSnake
Cc: [EMAIL PROTECTED]
Subject: RE: [squid-users] block downloading extensions...!!



With the first method you will get problems with URLs that carry query
strings, e.g.

http://test.one.com/download.exe?cache

will not get blocked by -i \.exe$.

The problem with MIME types, on the other hand, is that mime.conf gets overridden by
the response from the server. For example, per Squid's mime.conf, .exe is
application/octet-stream, but a rule blocking application/octet-stream won't
work when you download from the MS site, as it returns the MIME type
application/x-msdownload.
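Prasanta's first point is easy to check with grep, whose regexes behave the same way for this case:

```shell
url='http://test.one.com/download.exe?cache'
# Unanchored pattern: matches, since ".exe" appears somewhere in the URL
echo "$url" | grep -qi '\.exe' && echo "unanchored: blocked"
# Anchored pattern: no match, because the URL actually ends in "?cache"
echo "$url" | grep -qi '\.exe$' || echo "anchored: slips through"
```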

Reg.
Prasanta




-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]
Sent: Monday, March 10, 2003 7:29 PM
To: NetSnake
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] block downloading extensions...!!


Mon 2003-03-10 at 12.05, NetSnake wrote:
 It's very simple:

 acl MyDenyMIME urlpath_regex -i \.exe \.mov \.mpg \.mp?

you also need $ after the extension, and it is not at all related to MIME
types, only to file extensions in the requested URL.

acl MyDenyExtensions urlpath_regex -i \.exe$ \.mov$ \.mpg$ \.mp?$

 acl deny MyDenyMIME

http_access deny MyDenyExtensions


If you want to block based on MIME types then see the rep_mime_type acl and
http_reply_access.

Regards
Henrik

--
Henrik Nordstrom [EMAIL PROTECTED]
MARA Systems AB, Sweden


--
This message has been scanned for viruses and
dangerous contents on SSCR Email Scanner Server, and is
believed to be clean.


---
Incoming mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.459 / Virus Database: 258 - Release Date: 2/25/2003

---
Outgoing mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.459 / Virus Database: 258 - Release Date: 2/25/2003





RE: [squid-users] MSN Messenger

2003-03-07 Thread SSCR Internet Admin
You can; just set up either IE or MSN Messenger itself to point to the proxy's
IP + port number...

-Original Message-
From: Rangel, Luciano [mailto:[EMAIL PROTECTED]
Sent: Friday, March 07, 2003 11:17 AM
To: '[EMAIL PROTECTED]'
Subject: [squid-users] MSN Messenger


HI,


How do I use MSN Messenger with squid?


Thanks for help

Luciano




RE: [squid-users] inquiry

2003-03-03 Thread SSCR Internet Admin
Oh, I get it.. thanks, but what is "Connection information for squid"? Is this
also a part of the client_db setting?
BTW, I forgot that I'm on the other side of the world (Philippines); it's
4:00pm here and you guys are still sleeping.. hehe

Connection information for squid:
 Number of clients accessing cache:  0
 Number of HTTP requests received:   11893
 Number of ICP messages received:0
 Number of ICP messages sent:0
 Number of queued ICP replies:   0
 Request failure ratio:   0.00%
 Average HTTP requests per minute since start:   9.3
 Average ICP messages per minute since start:0.0
 Select loop called: 5762010 times, 13.383 ms avg


-Original Message-
From: Marc Elsen [mailto:[EMAIL PROTECTED]
Sent: Sunday, March 02, 2003 11:49 PM
To: SSCR Internet Admin
Cc: squid-mailing list
Subject: Re: [squid-users] inquiry




SSCR Internet Admin wrote:

 This is the result from which cache manager reports on General Runtime
 Information

 Squid Object Cache: Version 2.5.STABLE1

 Start Time: Sun, 02 Mar 2003 03:36:04 GMT
 Current Time: Mon, 03 Mar 2003 01:01:15 GMT

 Connection information for squid:
 Number of clients accessing cache:  0
 Number of HTTP requests received:   11893
 Number of ICP messages received:0
 Number of ICP messages sent:0
 Number of queued ICP replies:   0
 Request failure ratio:   0.00%
 Average HTTP requests per minute since start:   9.3
 Average ICP messages per minute since start:0.0
 Select loop called: 5762010 times, 13.383 ms avg

 And this one is on Cache Client List

 Cache Clients:
 TOTALS
 ICP : 0 Queries, 0 Hits (  0%)
 HTTP: 0 Requests, 0 Hits (  0%)

 The question is why am getting this? Does it mean that my squid is not
 caching it on disk? but i got lotta TCP_HITS here on my access.log... IF
 this is something to look at, where should i start looking? Since i have
 something like on my cache.log

 2003/03/02 11:37:09| Finished rebuilding storage from disk.
 2003/03/02 11:37:09|343204 Entries scanned
 2003/03/02 11:37:09| 0 Invalid entries.
 2003/03/02 11:37:09| 0 With invalid flags.
 2003/03/02 11:37:09|343202 Objects loaded.
 2003/03/02 11:37:09| 0 Objects expired.
 2003/03/02 11:37:09| 0 Objects cancelled.
 2003/03/02 11:37:09| 0 Duplicate URLs purged.
 2003/03/02 11:37:09| 2 Swapfile clashes avoided.
 2003/03/02 11:37:09|   Took 61.4 seconds (5592.8 objects/sec).
 2003/03/02 11:37:09| Beginning Validation Procedure
 2003/03/02 11:37:09|262144 Entries Validated so far.
 2003/03/02 11:37:09|   Completed Validation Procedure
 2003/03/02 11:37:09|   Validated 343202 Entries
 2003/03/02 11:37:09|   store_swap_size = 3433648k

 And why is it Cache Client is ZERO percent...

 What is your  'client_db' setting in squid.conf ?

 M.




--

 'Time is a consequence of Matter thus
 General Relativity is a direct consequence of QM
 (M.E. Mar 2002)






RE: [squid-users] Multiple Accelerators.

2003-03-03 Thread SSCR Internet Admin
In addition to his question, Brian: is it possible for squid to save all cached
objects from RAM to disk before a system shutdown, and to load all cached
objects back from disk into RAM at startup? Because you have /dev/null as the
cache directory...

-Original Message-
From: Brian [mailto:[EMAIL PROTECTED]
Sent: Sunday, March 02, 2003 11:55 PM
To: Allan; [EMAIL PROTECTED]
Subject: Re: [squid-users] Multiple Accelerators.


On Saturday 01 March 2003 05:55 pm, Allan wrote:
 Hello,

 we are currently using a single server (squid 2.4-STABLE7, Linux
 RedHat 7.3, 1.4 GHz Pentium III, 2 Gb Ram, 4096 FileDescriptors) as
 reverse-proxy for a small site (approx 2,5K objects, about 150 Mb).

To extract the most out of this box, try:
* Squid ufs over Linux tmpfs, or the null fs and large cache_mem &
   maximum_object_size_in_memory
* If this uses an Intel NIC, replace it with a 3Com
* A kernel update -- the 2.4 jam patches are very nice.

 Due to growing load (400 hits/sec), we are experiencing loads of about
 30% user and 65% system time

 Is this alarming? Should we consider buying web switches and adding
 further servers?

The system time seems a bit high for a squid server that should be 100%
in memory.  The NIC and the filesystem are the likely culprits.  At 400
req/sec, I would consider a trio of servers just for smoothing over
upgrades or other downtime.  Obviously the budget is not always so
flexible.

 As the site is quite small, would it be a good idea to build a
 mini-linux booting of floppy/CD with no harddrives/filesystems?

Not really necessary -- with that much RAM, you shouldn't be touching
the hard drive anyway.  I would go with null storage or tmpfs for the
squid cache, though.

-- Brian






RE: [squid-users] Redirector not working in squid rpm Redhat 7.3

2003-03-03 Thread SSCR Internet Admin
So this means that you are actually browsing locally as 127.0.0.1 from the
command shell. Now my question is: have you set it so that it will not connect
directly to the Internet, i.e. so that it connects to port 3128, where squid
is listening?

-Original Message-
From: Anthony Giggins [mailto:[EMAIL PROTECTED]
Sent: Monday, March 03, 2003 2:17 PM
To: [EMAIL PROTECTED]
Subject: RE: [squid-users] Redirector not working in squid rpm Redhat
7.3


The only thing I can see in the squidguard.log is where I've tested it
locally and db updates
so this further confirms my theory that squid isn't redirecting to it?

Try checking squidguard.log; I'm sure you have some problems with your
(squidGuard) config file, and that's why it is going into emergency mode (let
all pass). And it is still running in the background, because if it were not
running, no one could surf, since squid passes all requests to squidGuard.

-Original Message-
From: Anthony Giggins [mailto:[EMAIL PROTECTED]
Sent: Sunday, March 02, 2003 8:22 PM
To: [EMAIL PROTECTED]
Subject: [squid-users] Redirector not working in squid rpm Redhat 7.3


 the rpm version is Version 2.4.STABLE6

The redirector is squidGuard, and squidGuard's localised tests are working,
but when going through squid it basically ignores it. The config, i.e.
squidGuard.conf, was copied from a working server and the blacklists were
downloaded from the squidGuard website.

I can see the squidGuard redirector children are started by squid

 has anyone else noticed this?








RE: [squid-users] Commercial Cache server

2003-03-03 Thread SSCR Internet Admin
Hmmm, squid and squidGuard work great here... so there's no sense in buying
when you can get FREE products.

-Original Message-
From: Alireza Naderi [mailto:[EMAIL PROTECTED]
Sent: Monday, March 03, 2003 1:44 PM
To: [EMAIL PROTECTED]
Subject: [squid-users] Commercial Cache server


Hi Guys

I know that this mailing list is just for squid, but I have a question
about cache servers.
Does anyone work with one of them?
Which of them do you think is better and has more ability and performance?
How about filtering with them?
Is it necessary to install other products such as
SmartFilter & Websense?

Any suggestion,recomendation  experience is
Appreciated.

Regards
Hedieh









RE: [squid-users] mimeLoadIconFile?

2003-03-02 Thread SSCR Internet Admin
These are the icon files squid uses for FTP directory listings.

-Original Message-
From: Amy Anderson [mailto:[EMAIL PROTECTED]
Sent: Sunday, March 02, 2003 3:19 PM
To: [EMAIL PROTECTED]
Subject: [squid-users] mimeLoadIconFile?


I am not sure why these images are trying to load. I do see them in the
errors of the squid dir, but I am not sure why they are trying to load. Is
there any way I can stop them from trying to load?
 
Here is the squid cache_log it is actually 2 times as long as this.  I
will also include my squid.conf so you can see where I went wrong. 
Thanks so much.
 
cache_log
!--StartFragment--2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-image.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-text.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-dirup.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-dir.gif:
(2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-link.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-dir.gif:
(2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-text.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-dir.gif:
(2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-image.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-sound.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-movie.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-portal.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-box.gif:
(2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-unknown.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-text.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-box.gif:
(2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-unknown.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-unknown.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-unknown.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-unknown.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-ps.gif:
(2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-ps.gif:
(2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-ps.gif:
(2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-unknown.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-compressed.gif: (2) No such file or directory
2003/03/02 11:25:23| mimeLoadIconFile:
/usr/share/icons/anthony-unknown.gif: (2) No such file or directory
 
squid.conf
 
# make sure you use rc.firewall
http_port 3128
icp_port 0
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_mem 8 MB
cache_dir diskd /var/spool/squid 100 16 256
cache_access_log /var/log/squid/access.log
cache_log /var/log/squid/cache.log
cache_store_log /var/log/squid/store.log
debug_options ALL,1 33,2
#redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
half_closed_clients off
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl mynetwork src 192.168.0.0/255.255.255.0
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow mynetwork
http_access deny all
icp_access allow all
miss_access allow all
#proxy_auth_realm MandrakeSoft Squid proxy-caching web server
visible_hostname xxx.xxx.st
httpd_accel_host virtual
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
append_domain .xxx.st
err_html_text [EMAIL PROTECTED]
memory_pools off
cache_effective_user squid
cache_effective_group squid
#deny_info ERR_CUSTOM_ACCESS_DENIED all
memory_pools_limit 8 MB
#cache_mem_high 80

RE: [squid-users] mimeLoadIconFile?

2003-03-02 Thread SSCR Internet Admin
Yes, the other one has already loaded the icon files, and it doesn't matter
whether you run FTP or not: squid uses those icon files for file listings when
someone on your network accesses an outside FTP server.  Try 'updatedb' then
'locate anthony-image.gif' to find where they reside, and copy or move all the
related gif files that squid wants to /usr/share/icons/; this should work when
you restart squid...

-Original Message-
From: Amy Anderson [mailto:[EMAIL PROTECTED]
Sent: Sunday, March 02, 2003 4:15 PM
To: SSCR Internet Admin
Cc: [EMAIL PROTECTED]
Subject: RE: [squid-users] mimeLoadIconFile?


I dont have FTP on this server, the port is also closed in my MNF, I
have another machine almost identical, and it does not try to load the
images.
any ideas?
Amy
On Mon, 2003-03-03 at 10:49, SSCR Internet Admin wrote:
 these are the icon files for the FTP session to be used by squid

 -Original Message-
 From: Amy Anderson [mailto:[EMAIL PROTECTED]
 Sent: Sunday, March 02, 2003 3:19 PM
 To: [EMAIL PROTECTED]
 Subject: [squid-users] mimeLoadIconFile?


 I am not sure why these images are trying to load, I do see them in the
 errors of the squid dir, but not sure why they are trying to load. is
 there anyway i can stop them from trying to load?

 Here is the squid cache_log it is actually 2 times as long as this.  I
 will also include my squid.conf so you can see where I went wrong.
 Thanks so much.

 cache_log
 !--StartFragment--2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-image.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-text.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-dirup.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-dir.gif:
 (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-link.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-dir.gif:
 (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-text.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-dir.gif:
 (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-image.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-sound.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-movie.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-portal.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-box.gif:
 (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-unknown.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-text.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-box.gif:
 (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-unknown.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-unknown.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-unknown.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-unknown.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-ps.gif:
 (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-ps.gif:
 (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile: /usr/share/icons/anthony-ps.gif:
 (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-unknown.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-compressed.gif: (2) No such file or directory
 2003/03/02 11:25:23| mimeLoadIconFile:
 /usr/share/icons/anthony-unknown.gif: (2) No such file or directory

 squid.conf

 # make sure you use rc.firewall
 http_port 3128
 icp_port 0
 hierarchy_stoplist cgi-bin ?
 acl QUERY urlpath_regex cgi-bin \?
 no_cache deny QUERY
 cache_mem 8 MB
 cache_dir diskd /var/spool/squid 100 16 256
 cache_access_log /var/log/squid/access.log
 cache_log /var/log/squid/cache.log
 cache_store_log /var/log/squid/store.log
 debug_options ALL,1 33,2
 #redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
 half_closed_clients off
 acl all src 0.0.0.0/0.0.0.0
 acl manager proto cache_object
 acl localhost src 127.0.0.1/255.255.255.255
 acl mynetwork src 192.168.0.0/255.255.255.0
 acl SSL_ports port 443 563
 acl Safe_ports port 80 # http
 acl Safe_ports port 21 # ftp
 acl Safe_ports port 443 563 # https, snews
 acl Safe_ports port 70 # gopher
 acl

RE: [squid-users] Spyware block (Not Squidguard)?

2003-03-02 Thread SSCR Internet Admin
Hmmm, I wonder why you want to switch to another program? SquidGuard is
sufficient and works great.

-Original Message-
From: Rob Poe [mailto:[EMAIL PROTECTED]
Sent: Sunday, March 02, 2003 5:20 PM
To: Henk-Jan (squid)
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] Spyware block (Not Squidguard)?


I use the url_regex acl with a badsites.txt, and have the site names
of most of the spyware providers (plus a few porn sites) and a lot
of banner ad servers.

seems to not make too much of a difference in surfing

r


Henk-Jan (squid) wrote:

 Does anybody know if there is a blacklist that can be easily
 read/implemented that would block spyware in our proxy server?
 Is there such an initiative going on? Or am I stuck with squidguard?

 (Or is there another proxy server that could do this?)

 Henk-Jan






RE: [squid-users] Blocking Yahoo! & MSN Messengers thru Squid.

2003-03-02 Thread SSCR Internet Admin
Have you tried blocking it with iptables? Port 5101 is the one that msn msgr
is using...

  -Original Message-
 From: Georges J. Jahchan, Eng. [mailto:[EMAIL PROTECTED] 
 Sent: Saturday, March 01, 2003 4:20 PM
 To:   Squid-Users
 Subject:  [squid-users] Blocking Yahoo! & MSN Messengers thru Squid.
 
 Trying to block Yahoo! Messenger & MSN Messenger without disallowing
 connections to TCP ports 443 & 563, or blocking Yahoo! Mail and HotMail, in
 squid v2.5 stable1.
 Following is the interesting part in squid.conf:
 acl SSL_ports port 443 563
 acl CONNECT method CONNECT
 http_access deny CONNECT !SSL_ports
 In the access log, I see the following:
 1046394571.141   9423 127.0.0.1 TCP_MISS/200 2755 CONNECT
 loginnet.passport.com:443 - DIRECT/65.54.228.253 -
 1045515709.636  12619 127.0.0.1 TCP_MISS/200 15952 CONNECT
 login.yahoo.com:443 - DIRECT/64.58.76.98 -
 My guess is the Messengers would connect through any open port, but I have
 not tested it.
 Obviously, both are working. Blocking access to port 443 is out of the
 question as this disables access to secure web sites through squid.
 Blocking loginnet.passport.com and login.yahoo.com would mean login
 becomes impossible to HotMail and Yahoo! Mail.
 Any ideas on how to surgically block the Messengers without blocking
 Yahoo! Mail and HotMail in squid?
 TIA
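The rules quoted above can be modeled outside squid to see why the Messengers still get through. This is a rough Python sketch (my assumption of how the quoted `http_access deny CONNECT !SSL_ports` line evaluates, not squid code):

```python
# Models the quoted rules:
#   acl SSL_ports port 443 563
#   acl CONNECT method CONNECT
#   http_access deny CONNECT !SSL_ports
SSL_PORTS = {443, 563}

def allowed(method, port):
    """Deny CONNECT to any port outside SSL_ports; let everything else fall through."""
    if method == "CONNECT" and port not in SSL_PORTS:
        return False
    return True

print(allowed("CONNECT", 443))   # True: loginnet.passport.com:443 passes, as seen in the log
print(allowed("CONNECT", 5050))  # False: CONNECT to a non-SSL port is denied
print(allowed("GET", 80))        # True: plain HTTP is untouched by this rule
```

This illustrates the problem Georges describes: as long as the Messengers tunnel their login over port 443, a port-based CONNECT rule cannot distinguish them from legitimate HTTPS.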
 
 



[squid-users] {Suspected Spam} Follow-up

2003-02-28 Thread SSCR Internet Admin
Just want to follow up on my question about the result from my cache manager.

Cache Clients:
TOTALS
ICP : 0 Queries, 0 Hits (  0%)
HTTP: 0 Requests, 0 Hits (  0%)

Why is it 0, since I have lots of TCP_HITs in my access.log?





[squid-users] Question

2003-02-27 Thread SSCR Internet Admin
Is it normal that I have

Cache Clients:
TOTALS
ICP : 0 Queries, 0 Hits (  0%)
HTTP: 0 Requests, 0 Hits (  0%)

in my cache manager page? This seems impossible, since I get lots of TCP_HITs
all the time...





RE: [squid-users] SIGSEGV

2003-02-21 Thread SSCR Internet Admin
I already posted the new bug...

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
Sent: Friday, February 21, 2003 12:11 AM
To: SSCR Internet Admin; squid-mailing list
Subject: Re: [squid-users] SIGSEGV


Please get the full backtrace of the segmentation fault and register
a bug for the problem:
http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.19

Squid should never segmentation fault.

The first two messages are from invalid requests: the first with a space
character in the middle of the host name, the second due to the user
typing ',' instead of '.'.

Regards
Henrik


On Friday 21 February 2003 17.51, SSCR Internet Admin wrote:
 I have this on my newly installed squid-2.5STABLE1 which i ran gdb

 2003/02/20 17:43:52| urlParse: Illegal character in hostname
 'www.ultra passwords.com'
 2003/02/20 17:46:40| urlParse: Illegal character in hostname
 'hot,ail.com' ^[[A2003/02/20 18:12:04| NETDB state saved; 0
 entries, 0 msec 2003/02/20 18:52:06| NETDB state saved; 0 entries,
 13 msec 2003/02/20 19:33:46| NETDB state saved; 0 entries, 13 msec

 Program received signal SIGSEGV, Segmentation fault.
 [Switching to Thread 8192 (LWP 8242)]
 malloc (bytes=4104) at dlmalloc.c:2090
 2090  victim_size = chunksize(victim);

 Regards
 Nats




RE: [squid-users] Unsupported Methods Found on cache.log

2003-02-20 Thread SSCR Internet Admin
I am sorry to tell you that I have upgraded my squid-2.4.STABLE7 to
squid-2.5.STABLE1, which really fixed the problem.  I don't know if I'll meet
this again in the near future, and I have not set the coredump_dir, tsk tsk..
I'll see to it that I send a bug report once this problem occurs again.

-Original Message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
Sent: Thursday, February 20, 2003 12:01 AM
To: SSCR Internet Admin; squid-mailing list
Subject: Re: [squid-users] Unsupported Methods Found on cache.log


On Thursday 20 February 2003 18.16, SSCR Internet Admin wrote:
 Hi,

 Im having lots of this on cache.log - unsupported method
 'MachineId:.. blahblah...' and followed by ClientReadRequest : FD36
 Invalid Request,

Looks like a client sending a malformed request to Squid.

Are you running Squid as a normal proxy or transparently intercepting
port 80?

 and if it continues i got FATAL : Recevied
 Segmentation Fault.. Dying...

Please get a stack trace of the segmentation fault and report a Squid
bug in our bug management tool (bugzilla). Squid should not
segmentation fault.

For details on how get a stack trace and to register a bug see Squid
FAQ 11.19 Sending in Squid bug reports
url:http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.19

Regards
Henrik




[squid-users] SIGSEGV

2003-02-20 Thread SSCR Internet Admin
I have this on my newly installed squid-2.5STABLE1 which i ran gdb

2003/02/20 17:43:52| urlParse: Illegal character in hostname 'www.ultra
passwords.com'
2003/02/20 17:46:40| urlParse: Illegal character in hostname 'hot,ail.com'
^[[A2003/02/20 18:12:04| NETDB state saved; 0 entries, 0 msec
2003/02/20 18:52:06| NETDB state saved; 0 entries, 13 msec
2003/02/20 19:33:46| NETDB state saved; 0 entries, 13 msec

Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 8192 (LWP 8242)]
malloc (bytes=4104) at dlmalloc.c:2090
2090  victim_size = chunksize(victim);

Regards
Nats




[squid-users] Continuation on squid-2.5STABLE1 bug ( i hope its not )

2003-02-20 Thread SSCR Internet Admin
Continued: I have backtraced to where gdb received the segmentation fault.

(gdb) backtrace
#0  malloc (bytes=4104) at dlmalloc.c:2090
#1  0x080c3b33 in calloc (n=1, elem_size=135222552) at dlmalloc.c:2791
#2  0x080c7c6b in xcalloc (n=1, sz=4104) at util.c:557
#3  0x0808a762 in memPoolAlloc (pool=0x1008) at MemPool.c:254
#4  0x080a0045 in stmemAppend (mem=0xa6ad698,
data=0x8133e70 \221\235Ô
\020@çn \017(]Ó%É­^¾\026KkXɹÙÅuÇÒå\iÅ-H\031Ø*76æ\237MǬ\032\211¼a\212ü\216
\235,[\025ÿ, len=702) at stmem.c:109
#5  0x080a0fc2 in storeAppend (e=0xa94c868,
buf=0x81339c0
\201Vã_\177XÚ\205«o'\231\177\203Úú\026\210\031\226V­\201ÐFjEÑ2D\032(\221\20
7Ùt¢W\\yÅ\017ò¥î\017BT\221ý/\017X¶T¢\v±\227\\§\177\211Ðç$Âs-m\034\203ó\215S÷
\002\021±n\213Aé³e\212BÕê;µ\f=)Z\220Aø\211aÂ\231B\006£'ʱã\227-ýúÂn-Çr6cj\20
5®¹ú~\203)ê\nñ}\022\23336\025#8Ø»\205më\227êÓMÈ\004A¥\017$S|\tBõ¥¶Q\217(\216
\217HÁòs\177\fÜĬÜÚt»ë\v¹è\aÊO\225Ìt!Zx:ëµR\216ì4¿..., len=1902)
at store.c:527
#6  0x0807d2bb in httpReadReply (fd=125, data=0xaa465b8) at http.c:642
#7  0x0806763a in comm_poll (msec=10) at comm_select.c:445
#8  0x08088fbb in main (argc=2, argv=0xba94) at main.c:738
#9  0x420158d4 in __libc_start_main () from /lib/i686/libc.so.6

Regards
Nats




RE: [squid-users] SQUID.PID

2003-02-16 Thread SSCR Internet Admin
Can't you try 'service squid restart'?

-Original Message-
From: Anil K P [mailto:[EMAIL PROTECTED]]
Sent: Sunday, February 16, 2003 7:35 PM
To: [EMAIL PROTECTED]
Subject: [squid-users] SQUID.PID


Hi all,

I am having Squid V 2.4 Stable 1 on RH 7.3 working as a Transparent Proxy.
Squid.pid file is not getting updated. I have to kill squid manually with the
kill command and then restart it.
Thanks in advance
Regds
Anil K P
--
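When the init script's restart target misbehaves because squid.pid is stale, squid's own signal interface (present since at least 2.4) talks to the running process directly; a sketch, assuming squid is on the PATH:

```
squid -k parse        # validate squid.conf syntax
squid -k check        # verify a running squid process (sends signal 0)
squid -k reconfigure  # re-read squid.conf without a restart
squid -k shutdown     # clean shutdown
```

Note that -k reads the pid from the pid_filename named in squid.conf, so if that file is wrong these commands fail the same way the init script does.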




RE: [squid-users] IP based access control through restricting password reuse

2003-02-14 Thread SSCR Internet Admin
You can create an acl for it, like:

acl privilege_ip src "/etc/squid/ip_add"

The contents of ip_add would be one address per line:
156.160.1.1/32
156.160.45.5/32
and so on.

then

http_access allow privilege_ip
http_access deny all


Nats

-Original Message-
From: Mr. Singh [mailto:[EMAIL PROTECTED]]
Sent: Friday, February 14, 2003 2:20 AM
To: [EMAIL PROTECTED]
Subject: [squid-users] IP based access control through restricting
password reuse



Hi Users

 My local network  ip address is as follows(however fictitious)

156.160.1.1 to 156.160.45.255 .  I have configured user authentication
too. Now  What I am planning is to allow a  user  to browse the
internet  from a particular range of computers only. Can I achieve this
arrangement through access control list ?? If so what is the way to
achieve this? 

T. Singh
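Worth noting for the question above: squid's src acl accepts explicit address ranges as well as subnets, so the per-range restriction can be sketched directly (addresses are the fictitious ones from the post; combine with your proxy_auth rules as needed):

acl lab_range src 156.160.1.1-156.160.45.255
http_access allow lab_range
http_access deny all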



-- 




RE: [squid-users] Ignore

2003-02-14 Thread SSCR Internet Admin
Hmm, that sounds interesting. Henrik, can you provide step-by-step code for
this? This is for non-programmers like me...

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On Behalf Of
Henrik Nordstrom
Sent: Friday, February 14, 2003 5:27 PM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] Ignore


You can't without modifying the source.

Regards
Henrik


Richard StClair wrote:

 How can you get squid to ignore sites that have the 'Cache-Control:
no-cache'
 option set in the initial HTTP packets so that they'll cache anyway??

 --
 Regards,
 Richard Saint Clair,
 Co-Founder Technical Manager
 Internet Users Society Niue
 Chairman, Pacific Island Chapter ISOC
 
 [EMAIL PROTECTED] www.niue.nu
 Voice (68 3) 4630 Fax (68 3) 4237
 Internet Service Provider, Niue Island
 
 ISP/C, ISOC, APIA, NCOC, ISOCNZ, PICISOC, ARRL
 
 Niue Island South Pacific 169 West 19 South
 
 Don't forget, Nuts feed the squirrels.
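On the question quoted above: later Squid releases (2.6 and newer) grew refresh_pattern override flags that approximate this without patching the source; a hedged squid.conf sketch, with an illustrative pattern:

# Forces caching despite no-cache / missing Expires. This violates HTTP
# and can serve stale content, so scope the pattern narrowly.
refresh_pattern -i ^http://www\.example\.com/ 1440 20% 10080 ignore-no-cache override-expire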




RE: [squid-users] question concerning php-sites and caching -still some questions

2003-02-13 Thread SSCR Internet Admin
Some sites don't want their pages to be cached, so I guess squid will
eventually reload those pages.

-Original Message-
From: alp [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, February 12, 2003 11:01 PM
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] question concerning php-sites and caching
-still some questions


thanks marc,

I knew this page already; it's a really nice one.
But my problem is: does squid never cache an object that lacks validation
headers (Expires, max-age, Last-Modified, ...)?
If I have a refresh_pattern like
refresh_pattern . 0 20% 5
such an object should stay at most 5 minutes in the cache, shouldn't it?
Or is refresh_pattern only applied when an object has validation headers?

thx in advance,
alp

- Original Message -
From: Marc Elsen [EMAIL PROTECTED]
To: alp [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Wednesday, February 12, 2003 5:05 PM
Subject: Re: [squid-users] question concerning php-sites and caching




 alp wrote:
 
  hi,
  i have on my webserver a simple php site which i query via squid 2.5.
  this works (of course) and i see that no last_modified or expiry-header
is
  replied, which is correct for dynamic sites, too, as far as i know
  i have no cache_deny for php-sites and only the usual refresh_patterns
of
  default squid.conf.
 
  squid does not cache this php side (also ok), but my question is: why?
  is it hardcoded into squid not to cache php-sites, or is the missing of
  expiry and last_mod headers the reason for this?

   Most probably. You may, for instance, check objects (URLs) with:

   http://www.ircache.net/cgi-bin/cacheability.py

   M.

 
  thx in advance,
  alp

 --

  'Time is a consequence of Matter thus
  General Relativity is a direct consequence of QM
  (M.E. Mar 2002)





RE: [squid-users] Latency

2003-02-07 Thread SSCR Internet Admin
I have this in my squid.conf:

acl porn1 dstdom_regex -i /etc/squid/banned
acl porn2 dstdom_regex -i /etc/squid/banned1
acl exe-filter urlpath_regex -i /etc/squid/file_ext.block

on /etc/squid/...

-rw-r--r--1 root root18041 Feb  5 09:27 banned
-rw-r--r--1 root root  1807580 Nov 14 09:25 banned1
-rw-r--r--1 root root  512 Jan  7 12:13 file_ext.block

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On Behalf Of
Henrik Nordstrom
Sent: Friday, February 07, 2003 4:58 PM
To: SSCR Internet Admin
Cc: [EMAIL PROTECTED]
Subject: Re: [squid-users] Latency


SSCR Internet Admin wrote:
 
 I just wanted to know if having a big list of banned sites in
 /etc/squid/banned_site will actually contribute to internet sluggishness
 or network latency.  I have a top result with


How big? And using what kind of acl type?


A very big regex list will be noticeable in CPU performance.

A very big dst or dstdomain should not make much of a noticeable
impact..

Regards
Henrik
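Expanding on Henrik's point: the dstdom_regex lists above can usually be converted to dstdomain ACLs, which match by domain-suffix lookup instead of running every regex against every request, and scale far better; a sketch (file name is illustrative):

# /etc/squid/banned.domains -- one entry per line; a leading dot
# matches the domain and all its subdomains, e.g. ".example.com"
acl banned_doms dstdomain "/etc/squid/banned.domains"
http_access deny banned_doms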