[squid-users] Configuration Problems

2011-05-30 Thread patric . glazar
Hello!

We are using Squid to handle bandwidth regulation for our patch management.

What is our purpose:
- we have limited bandwidth to our central PM server, so each Squid has a 
connection bandwidth of max. 20 KB/s!
  therefore we are using delay pools, class 1:
 delay_pools 1
 delay_class 1 1
 delay_parameters 1 2/2 2/2
 delay_access 1 allow localnet

- the clients in the internal LAN should get the download from Squid as 
fast as possible,
  identical requests should be handled as one - first fill the cache, then 
serve the download to the clients
  the cache should only be cleaned after 1 year!

hierarchy_stoplist cgi-bin ?

refresh_pattern ^ftp:             1440  20%  10080
refresh_pattern ^gopher:          1440   0%   1440
refresh_pattern -i (/cgi-bin/|\?)    0   0% 525600
refresh_pattern .                    0  20%   4320
 
 range_offset_limit -1
 collapsed_forwarding on 
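
If the goal is a cache that fills once and then serves clients locally for up 
to a year, a squid.conf sketch along these lines may help (values are 
assumptions and untested; tune maximum_object_size and the file-extension 
list to your actual patch files):

```
# Sketch only -- values are assumptions, verify against your Squid version.
maximum_object_size 512 MB     # allow large patch files into the cache
range_offset_limit -1          # fetch whole objects on range requests
quick_abort_min -1 KB          # finish downloads even if a client aborts
collapsed_forwarding on        # merge identical concurrent requests (2.7)
# keep patch files cacheable for up to a year (525600 minutes):
refresh_pattern -i \.(cab|exe|msi|msp|psf)$ 525600 100% 525600 ignore-reload
```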

Right now it looks like all clients are sharing the 20 KB/s, and therefore 
no one is getting the update.
- the cache is staying empty, and I don't know why
- a trace on the central PM server shows that the Squid servers are 
downloading a huge amount of data


Mit freundlichen Grüßen / Best regards
Patric Glazar 

Allgemeines Rechenzentrum GmbH 
PC Infrastructure Management 
Team Security 

A-6020 Innsbruck, Anton Melzerstr.11 
Tel.: +43 / (0)504009-1309 
Fax: +43 / (0)504009-71309 
E-Mail: patric.gla...@arz.at 
http://www.arz.co.at 
DVR: 0419427

Disclaimer:
This message is only for informational purposes and is intended solely for 
the use of the addressee.




Re: [squid-users] Configuration Problems

2011-05-30 Thread Amos Jeffries

On 30/05/11 23:38, patric.gla...@arz.at wrote:

Hello!

We are using Squid to handle bandwidth regulation for our patch management.

What is our purpose:
- we have limited bandwidth to our central PM server, so each Squid has a
connection bandwidth of max. 20 KB/s!
   therefore we are using delay pools, class 1:
  delay_pools 1
  delay_class 1 1
  delay_parameters 1 2/2 2/2
  delay_access 1 allow localnet

- the clients in the internal LAN should get the download from Squid as
fast as possible,
   identical requests should be handled as one - first fill the cache, then
serve the download to the clients
   the cache should only be cleaned after 1 year!

hierarchy_stoplist cgi-bin ?

refresh_pattern ^ftp:             1440  20%  10080
refresh_pattern ^gopher:          1440   0%   1440
refresh_pattern -i (/cgi-bin/|\?)    0   0% 525600
refresh_pattern .                    0  20%   4320

  range_offset_limit -1
  collapsed_forwarding on

Right now it looks like all clients are sharing the 20 KB/s, and therefore
no one is getting the update.


Yes. Assuming this is 2.7 (where collapsed forwarding works), one client 
will be downloading, and the others waiting for the same URL will share 
the trickle as it arrives.



- the cache is staying empty, and I don't know why


If these are publicly accessible URLs you can plug one of them into 
redbot.org. It will tell you if there are any caching problems that would 
hinder Squid.


Otherwise you will have to capture and inspect the headers (both 
request and reply) manually.
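
As a rough aid for that manual check, a small script can flag the usual 
suspects in a reply. This is an illustration, not part of Squid; the header 
set is a common subset of the RFC 7234 caching rules, not everything Squid 
evaluates:

```python
# Flag response headers that commonly prevent a shared cache like Squid
# from storing an object. Illustrative subset of RFC 7234, not exhaustive.

def caching_blockers(headers):
    """Return a list of reasons the response is likely uncacheable."""
    reasons = []
    cc = headers.get("Cache-Control", "").lower()
    for directive in ("no-store", "no-cache", "private"):
        if directive in cc:
            reasons.append(f"Cache-Control: {directive}")
    if "max-age=0" in cc.replace(" ", ""):
        reasons.append("Cache-Control: max-age=0")
    if headers.get("Pragma", "").lower() == "no-cache":
        reasons.append("Pragma: no-cache")
    if "Set-Cookie" in headers:
        reasons.append("Set-Cookie present (often treated as uncacheable)")
    if headers.get("Expires") in ("0", "-1"):
        reasons.append("Expires: already expired")
    return reasons

print(caching_blockers({"Cache-Control": "private, max-age=0"}))
```

Feed it the reply headers captured from the PM server; an empty list means 
the problem is probably elsewhere (e.g. refresh_pattern or object size).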


Not being able to cache these objects could be part of the problem below...


- a trace on the central PM server shows that the Squid servers are
downloading a huge amount of data


Check what that is. You can expect all clients to download many objects 
in parallel. But they should still be bandwidth-limited by Squid to within 
the delay pool limits.
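
On the delay-pool side, note that a class 1 pool takes a single restore/max 
pair of byte counts, so the `2/2 2/2` in the original config is class 2 
syntax and would in any case mean about 2 bytes/s. A sketch for a 20 KB/s 
aggregate cap (the `localnet` ACL is assumed to be defined elsewhere):

```
delay_pools 1
delay_class 1 1
# one aggregate bucket, in bytes/s: 20480/20480 is roughly 20 KB/s total
delay_parameters 1 20480/20480
delay_access 1 allow localnet
delay_access 1 deny all
```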


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.12
  Beta testers wanted for 3.2.0.7 and 3.1.12.1


[squid-users] Configuration problems attempting to cache Google Earth/dynamic content

2009-11-18 Thread Jeremy LeBeau
I am trying to set up a server running SUSE SLES 11 as a Squid
proxy to help cache Google Earth content in a low-bandwidth
environment.  I have tried following the steps in this article:
http://wiki.squid-cache.org/Features/StoreUrlRewrite?action=recall&rev=7
but I am not having any luck getting it to work.  In fact, when I
try those steps, Squid automatically stops about 15 seconds after
start.  The system is running version 2.7 STABLE, as installed by
YaST.

Can anyone offer some help, or a configuration file that would
work with this?
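
For the StoreUrlRewrite setup being attempted, a minimal helper sketch may 
clarify the moving parts. The `kh\d+` mirror pattern is an assumption based 
on how Google Earth spreads identical tiles across numbered hosts; verify it 
against real traffic before use:

```python
# Sketch of a storeurl_rewrite_program helper for Squid 2.7.
# Squid writes one request per line ("URL [extras...]") on the helper's
# stdin and expects the normalized store-URL back on stdout.
import re
import sys

# Hypothetical pattern for numbered Google Earth tile mirrors (assumption).
TILE_HOST = re.compile(r'^http://kh\d+\.google\.com/', re.IGNORECASE)

def store_url(url):
    # Collapse numbered tile mirrors onto one canonical store-URL so the
    # same tile fetched from kh0/kh1/kh2... hits a single cache entry.
    return TILE_HOST.sub('http://kh.google.com.SQUIDINTERNAL/', url)

def main():
    # In the real helper, call main() from a __main__ guard; Squid keeps
    # the process alive and feeds it one request per line.
    for line in sys.stdin:
        url = line.split(' ', 1)[0].strip()
        sys.stdout.write(store_url(url) + '\n')
        sys.stdout.flush()
```

In squid.conf this would be wired up with `storeurl_rewrite_program` plus a 
`storeurl_access` ACL. A crash 15 seconds after start is more likely a 
config-parse or helper-spawn failure, which cache.log should show.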


RE: [squid-users] Configuration problems attempting to cache Google Earth/dynamic content

2009-11-18 Thread Mike Marchywka





 Date: Wed, 18 Nov 2009 12:02:40 -0600
 From:
 To: squid-users@squid-cache.org
 Subject: [squid-users] Configuration problems attempting to cache Google 
 Earth/dynamic content

 I am trying to set up a server running SUSE SLES 11 as a Squid
 proxy to help cache Google Earth content in a low-bandwidth
 environment. I have tried following the steps in this article:
 http://wiki.squid-cache.org/Features/StoreUrlRewrite?action=recall&rev=7
 but I am not having any luck getting it to work. In fact, when I
 try those steps, Squid automatically stops about 15 seconds after
 start. The system is running version 2.7 STABLE, as installed by
 YaST.

Why does it stop? There should be some logs to check, and if you invoke
it from the command line in the foreground you can get quick feedback.
Do you want it to cache in contradiction to the server response headers?
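
The foreground check can look like this (standard Squid command-line flags; 
binary and config paths vary by distro):

```
squid -k parse   # check squid.conf for syntax errors
squid -N -d1     # run in the foreground with debug output on stderr
```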


 Anyone who could offer some help or a configuration file that would
 work with this?
  
_
Windows 7: It works the way you want. Learn more.
http://www.microsoft.com/Windows/windows-7/default.aspx?ocid=PID24727::T:WLMTAGL:ON:WL:en-US:WWL_WIN_evergreen:112009v2

Re: [squid-users] Configuration problems for ClubPenguin.com, blocked ports?

2007-09-02 Thread Henrik Nordstrom
On Sat, 2007-09-01 at 19:59 -0400, Albert E. Whale wrote:

 The 2.6 version (which has DansGuardian and squidGuard installed as
 well) has a problem because the site indicates that the firewall is not
 permitting connections via TCP Ports:  3724 6112 9339 9875

What does access.log say?

Regards
Henrik
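
When access.log seems to have nothing, it can still help to confirm what a 
matching entry would look like. Here is a fabricated sample line (timestamp, 
IP, and hostname are invented) run through the same filter you would apply 
to the real /var/log/squid/access.log:

```shell
# Fabricated access.log line for illustration; on a real system replace
# the printf with: cat /var/log/squid/access.log
printf '1188700000.123 45 10.0.0.5 TCP_DENIED/403 0 CONNECT play.clubpenguin.com:3724 - NONE/- -\n' \
  | grep -E 'CONNECT .*:(3724|6112|9339|9875)'
```

No output at all would mean the game traffic never reaches Squid, pointing 
at the browser/system proxy settings rather than the proxy itself.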




Re: [squid-users] Configuration problems for ClubPenguin.com, blocked ports?

2007-09-02 Thread Albert E. Whale
Henrik Nordstrom wrote:
 On Sat, 2007-09-01 at 19:59 -0400, Albert E. Whale wrote:

  The 2.6 version (which has DansGuardian and squidGuard installed as
  well) has a problem because the site indicates that the firewall is not
  permitting connections via TCP ports:  3724 6112 9339 9875

 What does access.log say?

 Regards
 Henrik
Unfortunately the access logs said nothing about this.

I also tested the connection with another laptop, and I now suspect that
the first laptop was infected with spyware and is not behaving as it
should.  It may be time to switch the home laptops to Linux as well.

I am re-installing Windows first to verify the issue.

Thanks for the reply.

-- 
Albert E. Whale, CHS CISA CISSP
Sr. Security, Network, Risk Assessment and Systems Consultant

ABS Computer Technology, Inc. http://www.ABS-CompTech.com - Email,
Internet and Security Consultants
SPAMZapper http://www.Spam-Zapper.com - No-JunkMail.com
http://www.No-JunkMail.com - *True Spam Elimination*.


[squid-users] Configuration problems for ClubPenguin.com, blocked ports?

2007-09-01 Thread Albert E. Whale
I have an older Squid server (2.5) through which I can connect easily to
ClubPenguin.

The 2.6 version (which has DansGuardian and squidGuard installed as
well) has a problem because the site indicates that the firewall is not
permitting connections via TCP Ports:  3724 6112 9339 9875

Has anyone seen an issue like this?  How do I configure Squid to permit
this access to ClubPenguin?
