[squid-users] ACLs Implementation help

2010-11-11 Thread Edmonds Namasenda
Much appreciation for the previous help.
Some more clarification on the in-line requests below.
On Wed, Nov 10, 2010 at 2:38 PM, Amos Jeffries squ...@treenet.co.nz wrote:

 On 09/11/10 20:25, Edmonds Namasenda wrote:

 Dear all.
 Using openSuse 11.2 and Squid 3.0 Stable 18

 Besides commenting out anything to do with 'localnet', below is all that
 I added or edited on squid.conf

 # Authentication Program
 auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/squid_passwd

 # Start ACLs (bottom of ACL section defaults)
 acl passt proxy_auth REQUIRED        # Authentication file to be used: passt
 acl net_ed src 10.100.10.0/24 192.168.7.0/24 10.208.6.0/24        # My networks
 acl dove src 10.100.10.248-10.100.10.255        # Unrestricted Internet access I.P range
 acl whrs1 time MTWHF 9:00-12:59        # Morning work shift
 acl whrs2 time MTWHF 13:00-16:59        # Afternoon work shift

meant to be ...
acl whrs2 time MTWHF 14:00-16:59

 acl nowww dstdomain /etc/squid/noWWW        # Inaccessible URLs file path
 acl nodwnld urlpath_regex /etc/squid/noDWNLD        # Unavailable downloads file path

 # End ACLs

 # Start http_access Edits (top of http_access section defaults)
 http_access allow dove        # Internet access without authentication, denied URLs or download restrictions
 http_access deny nowww whrs1 whrs2        # Deny URLs during work shifts

 Um, this means that when the clock says simultaneously that it is both 
 morning AND afternoon...

 ... to deny with an OR, combine the time periods into one ACL name or split 
 the http_access into two lines.

http_access deny nowww whrs1
http_access deny nodwnld whrs1
http_access deny nowww whrs2
http_access deny nodwnld whrs2
... works great so far as tested.

 Amos

How do I enforce password authentication ONLY ONCE for users to access
the internet using file passt?
http_access allow passt net_ed  ?!


--
Thank you and kind regards,

I.P.N Edmonds

Cel:    +256 70 227 3374
       +256 71 227 3374

Y! / MSN: zibiced | GMail: namasenda | Skype: edsend


Re: [squid-users] ACLs Implementation help

2010-11-11 Thread Edmonds Namasenda
Yeah, I guess I am getting there.
Please look in-line...


 How do I enforce password authentication ONLY ONCE for users to

 What do you mean by ONLY ONCE? A user can be authenticated or not, there is 
 no multiple about it.
No continuous authentication should be required with every URL accessed or
redirection once the first log-in is accepted.

 internet access using file passt?
 http_access allow passt net_ed  ?!

 With the above Squid will pull the auth details sent by the browser out of
 the request. If there are none it will skip the access line.

 You place the ACL of type proxy_auth (in this case passt) last on the line  
 to make Squid request credentials from the browser.

acl passt proxy_auth REQUIRED # Last ACL line; passt = ncsa authentication file
?!

http_access allow passt net_ed # Last http_access line; net_ed = my network
?!
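
A minimal sketch of the ordering Amos describes, assuming the passt and net_ed
ACLs defined above: keeping the proxy_auth ACL last on the http_access line
means Squid only challenges clients that already matched the source networks.

acl passt proxy_auth REQUIRED
acl net_ed src 10.100.10.0/24 192.168.7.0/24 10.208.6.0/24
http_access allow net_ed passt    # source check first, auth challenge last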

 Amos
 --
 Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.9
  Beta testers wanted for 3.2.0.3




-- 
Thank you and kind regards,

I.P.N Edmonds


Re: [squid-users] ACLs Implementation help

2010-11-11 Thread Edmonds Namasenda
Thank you all.

On Thu, Nov 11, 2010 at 4:19 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 On 12/11/10 01:22, Edmonds Namasenda wrote:

 No continuous authentication required with every URL accessed or
 re-directions once the first log-in is accepted.

 Understood. That is not possible.

 HTTP is by design stateless. Each single TCP connection can be used
 identically by either a single end-user browser or a middleware proxy
 serving multiple users. Even if you believe your end-users are all browsers,
 you will likely be wrong at some point.

That means every URL accessed will prompt the users for a password.
Then password authentication by Squid is not advisable for corporate
end users... it is an inconvenience.

 Amos

I believe I am a better squid administrator than when I joined. Throw me a bone!


-- 
Thank you and kind regards,

I.P.N Edmonds

Cel:    +256 70 227 3374
       +256 71 227 3374

Y! / MSN: zibiced | GMail: namasenda | Skype: edsend


Re: [squid-users] ACLs Implementation help

2010-11-11 Thread Edmonds Namasenda
Amos, thank you for the responses always.

On Thu, Nov 11, 2010 at 6:56 PM, Amos Jeffries squ...@treenet.co.nz wrote:

 On 12/11/10 04:08, Edmonds Namasenda wrote:

 I believe I am a better squid administrator than when I joined. Throw me a 
 bone!


 Switch "users" with "browsers" and you have it right. There is a whole layer 
 of software between squid and the people at the screen.

 The browser is supposed to remember these things once the person has entered 
 them. Or as in the case of Kerberos, to locate the credentials without 
 bothering the person at all.


 If you are seeing a browser repeatedly asking for login then there is a 
 problem with the browser. Browsers can occasionally be hit by something they do 
 not like coming back from Squid. When that happens some network forensics are 
 needed to figure out what's going on.

I know Firefox asks whether to keep authentication details. I am not
sure about MS IE.
Assuming they are using Firefox and the log-in details are kept by the
browser, are you implying there will not be continuous requests for
logging in with each accessed page forever?
That is, the browser sends authentication details to Squid and Squid
allows them access accordingly.



--
Thank you and kind regards,

I.P.N Edmonds

Cel:    +256 70 227 3374
       +256 71 227 3374

Y! / MSN: zibiced | GMail: namasenda | Skype: edsend


[squid-users] Default Home Page

2010-12-01 Thread Edmonds Namasenda
Hello Members.

Can squid in transparent mode force all local (proxy LAN) http
requests to a certain default page? Possibly a locally made web
application or any other page!
How can this be done?
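
One hedged way to sketch this in squid.conf, assuming a hypothetical local
portal page at http://portal.example.lan/ : allow only the portal, deny
everything else, and let deny_info turn the denial into a redirect.

acl portal dstdomain portal.example.lan
http_access allow portal                  # let the portal itself load
http_access deny all                      # everything else is denied ...
deny_info http://portal.example.lan/ all  # ... and the denial becomes a redirect to the portal

deny_info with a URL makes Squid answer the matching deny with an HTTP
redirect instead of an error page; whether blocking all other browsing is
acceptable is a separate policy question.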

--
Thank you and kind regards,

I.P.N Edmonds
ICT Practitioner & Consultant

Cel:    +256 70 227 3374
       +256 71 227 3374

P.O.    Box 22249, Kampala UGANDA

Y! / MSN: zibiced | GMail: namasenda | Skype: edsend

COMPUTER NETWORKS: WIRELESS; CABLED; VPNs | UNIX SERVERS: MAIL; FILE;
PROXY; WEB; VoIP | WEBSITE DESIGN: STATIC; FLASH; DYNAMIC | CREATIVE
GRAPHICS & IDENTITY MANAGEMENT | I.T SUPPORT & CONSULTANCY |
ANTI-VIRUS


[squid-users] Dedicate Bandwidth to IP Address

2011-02-22 Thread Edmonds Namasenda
Dear all.

I would like to have a video conference call on my LAN using a
particular I.P Address. This is going to be for a limited time and I
want a clear connection.
We are already running Squid in transparent proxy mode with some ACLs
limiting HTTP access, downloads, streaming to a particular group of
I.P Addresses. However the I.P Address I want to use is among the
admin addresses with open (unrestricted) access to anything.

How can I allocate 512K of my bandwidth to that particular I.P Address
for a test call? I can then adjust (increase or decrease) the
bandwidth to test the effects.
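
Squid cannot reserve bandwidth, and (as Eliezer notes in the follow-up) a
Skype-style call normally does not pass through an HTTP proxy at all; what
Squid can do is cap the HTTP traffic of everyone else so the conference host
keeps headroom. A rough delay-pools sketch, with 10.0.0.50 standing in for
the conference IP and the byte rate purely illustrative:

acl confhost src 10.0.0.50            # hypothetical conference machine
delay_pools 1
delay_class 1 1                       # one aggregate bucket for everybody else
delay_access 1 deny confhost          # the conference host is never throttled
delay_access 1 allow all
delay_parameters 1 131072/131072      # cap the rest at roughly 1 Mbit/s aggregate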

--
Thank you and kind regards,

I.P.N Edmonds
ICT Practitioner & Consultant


Re: [squid-users] Dedicate Bandwidth to IP Address

2011-02-23 Thread Edmonds Namasenda
Eliezer, these are general video calls; Skype, Sonix etc

On Wed, Feb 23, 2011 at 12:45 PM, Eliezer elie...@ec.hadorhabaac.com wrote:
 Well it depends on the protocol that is used on the video conference
 and not necessarily related to  the cache proxy.


 Regards Eliezer


 On 23/02/2011 09:17, Edmonds Namasenda wrote:

 Dear all.

 I would like to have a video conference call on my LAN using a
 particular I.P Address. This is going to be for a limited time and I
 want a clear connection.
 We are already running Squid in transparent proxy mode with some ACLs
 limiting HTTP access, downloads, streaming to a particular group of
 I.P Addresses. However the I.P Address I want to use is among the
 admin addresses with open (unrestricted) access to anything.

 How can I allocate 512K of my bandwidth to that particular I.P Address
 for a test call? I can then adjust (increase or decrease) the
 bandwidth to test the effects.

 --
 Thank you and kind regards,

 I.P.N Edmonds
 ICT Practitioner & Consultant




[squid-users] Explicit Allow / Deny

2011-04-26 Thread Edmonds Namasenda
Hello there.
Is there a possibility in Squid 3.0 on openSuSe to explicitly allow or
deny ACLs?

### Example
acl admins src 10.0.0.245-10.0.0.255
acl updates dstdomain -i /path/to_file_with/updates_domain/urls
http_access allow admins
http_access allow updates
reply_body_max_size 10 MB !admins
reply_body_max_size 10 MB !updates

With the above set-up example, the reply_body_max_size still affects
the admins group. Systems with I.P Addresses in the admin range fail
to download files bigger than 10 MB.
How can I explicitly allow them? As well, if I want to explicitly deny,
what can I use?
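
A possible adjustment, assuming Squid applies the first reply_body_max_size
line whose ACLs all match: with two separate lines, an admin request to a
non-updates domain still matches the !updates line and gets the 10 MB limit.
Combining the exemptions on one line avoids that:

reply_body_max_size 10 MB !admins !updates

Requests from the admins range, or to the updates domains, then match no size
line at all and stay unlimited.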

--
Thank you and kind regards,

I.P.N Edmonds
ICT Practitioner & Consultant

Cel:    +256 70 227 3374
       +256 71 227 3374

P.O.    Box 22249, Kampala UGANDA

Y! / MSN: zibiced | GMail: namasenda | Skype: edsend


Re: [squid-users] Need advise about Squid statistics.

2011-10-31 Thread Edmonds Namasenda
Siur,
Look for Squid Analyzer on freshmeat.net. Easy to install and
customize. It will do the magic for you.

# Edmonds Namasenda.

On Sun, Oct 30, 2011 at 2:56 AM, siur siur@gmail.com wrote:

 Hello!
 I've got 500 archived squid log files. Now I need to analyze all of
 them and make a statistics report (top visited sites, per-user
 statistic, all that stuff).
 What's the best way to do it?


[squid-users] Squid URL / Network Free Access

2011-11-24 Thread Edmonds Namasenda
Dear Friends,
I am using Squid 3.1 in transparent mode.
How can I stop Squid from scanning and logging traffic to particular
URLs or networks?
Something like users can connect to the URLs or networks freely
without Squid's interception.

There is an official video access portal which seems to be eating up
my log space, and access to it is then slowed, somehow.
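
A hedged sketch of the Squid side of this, assuming Squid 3.1 and a
hypothetical domain list in /etc/squid/video-portal.txt; taking the portal
out of interception entirely has to be done in the firewall REDIRECT rule,
not in squid.conf.

acl videoportal dstdomain "/etc/squid/video-portal.txt"
log_access deny videoportal      # keep these requests out of access.log
cache deny videoportal           # never store the video objects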

Thank you in advance,
Edmonds.


Re: [squid-users] Make Squid in interception mode completely

2011-12-05 Thread Edmonds Namasenda
Hai,
Seems your network set-up is what might be ruining your connection
expectations or the default gateway needs a rule (possibly using a
firewall) to direct all HTTP traffic to the squid box rather than to
the internet.

Otherwise, think of the set-up below (with the Squid box the same as
the Gateway)

Internet Router   Eth0 |- Squid box & Default Gateway -| Eth1   Switch   LAN
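
The kind of rule meant above, as a sketch assuming a Linux gateway using
iptables, the LAN on eth1 and Squid listening on port 3128:

iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-port 3128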

# Edz.

On Mon, Dec 5, 2011 at 5:14 PM, Nguyen Hai Nam nam...@nd24.net wrote:

 Hi Amos,

 You're right, switch is not really true.

 But I still can't find the way on Solaris-like system like 
 /proc/sys/net/bridge


 On Mon, Dec 5, 2011 at 7:25 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 
 
  Like a switch? Or did you really mean like a bridge?
 
  * switch ... no solution. Switches do not perform the NAT operations
  required for interception. They also don't run software like Squid, so I
  think this is a bad choice of word in your description.
 
  * bridge ... requires dropping packets out of the bridge into the routing
  functionality. See the bridge section at
  http://wiki.squid-cache.org/Features/Tproxy4#ebtables_on_a_Bridging_device
 
  Amos


Re: [squid-users] Make Squid in interception mode completely

2011-12-06 Thread Edmonds Namasenda
Your diagram or illustration differs from mine.
If you believe they are the same and are still getting the header fields
shown, look through your firewall and squid acls.

# Edz.

On Tue, Dec 6, 2011 at 5:05 PM, Nguyen Hai Nam nam...@nd24.net wrote:
 Hi Edmonds,

 That's really like my setup right now. But, as Amos said, the traffic
 just passes from eth0 to eth1 and doesn't come to Squid, because it's
 bridged. Actually, when watching the IP nat table, I still found some nat
 rules show up, but at the client side it still looks like direct access. And
 stranger still, if I use another linux box from the LAN to check with
 curl -I http://something.com/ it returns header fields that include
 Via: 1.1 (squid 3.2). I have no idea why.

 At this moment, I still can't find more documentation on IPFilter
 for deeper discovery.

 ~ Neddie

 On Tue, Dec 6, 2011 at 12:03 PM, Edmonds Namasenda namase...@gmail.com 
 wrote:
 Hai,
 Seems your network set-up is what might be ruining your connection
 expectations or the default gateway needs a rule (possibly using a
 firewall) to direct all HTTP traffic to the squid box rather than to
 the internet.

 Otherwise, think of the set-up below (with the Squid box the same as
 the Gateway)

 Internet Router       Eth0 |- Squid box & Default Gateway -| Eth1   Switch       LAN

 # Edz.

 On Mon, Dec 5, 2011 at 5:14 PM, Nguyen Hai Nam nam...@nd24.net wrote:



[squid-users] Connectivity Choke With Squid 3.1

2012-02-09 Thread Edmonds Namasenda
Hello Group,

I am using Squid 3.1 in transparent mode with basic settings enforced
by a redirect rule in Shoreline Firewall.
I have two LANs; 172.x.x.x for LAN servers not controlled by the
proxy, and 10.x.x.x for LAN users. The squid server has 2 NICs to
create the users LAN & routing.

My main problem is that the squid server seems to be choking or limiting
connections before reaching the allocated cap. We have 20MBs up and down but
we only reach 5MBs. Replacing the proxy with routers raises the throughput
to expectation.

How can I improve this connectivity issue? I really do not want to get rid
of my OpenSuSe 11.4 box running my apps and squid.

# Edmonds


Re: [squid-users] whitelisted IP problem

2012-03-19 Thread Edmonds Namasenda
Vijay,
Just a quick look has shown me you did not specify your network and there are a 
few typo errors.
Re-adjust, test, and fill us in some more.

I.P.N Edmonds
Systems | Networks | ICTs
UgM: +256 71 227 3374 | TzM: +255 68 422 1561
# 22249, Kampala Uganda.

-Original Message-
From: Vijay S vi...@reactmedia.com
Date: Mon, 19 Mar 2012 22:28:03 
To: squid-users@squid-cache.org
Subject: [squid-users] whitelisted IP problem
Hi

I have a server box hosting apache and squid on a centos machine.
When I send my request for clients' feeds directly it works, as they have
whitelisted my IP address, but when I make the call via squid it gives
me an invalid IP. I checked the access log for more information and found
out that instead of sending my IP address it is sending the localhost IP
address (127.0.0.1).

I googled a little and found that using the tcp_outgoing_address directive
I can control the outgoing IP address, but to my bad luck this didn't
work.

My configuration file is as follows

acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/32
acl SSL_ports port 443
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

http_access allow localhost
http_access deny all

icp_access allow all

http_port 3128

visible_hostname loclahost
debug_options ALL,1 33,2 28,9
tcp_outgoing_address 122.166.1.184

Can somebody help me with the configuration for my servers? It would be
of great help.

Thanks & Regards
Vijay


Re: [squid-users] whitelisted IP problem

2012-03-19 Thread Edmonds Namasenda
You might need a firewall of sorts.
And you need to specify your LAN's network(s) in the Squid conf.
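
For example, something along these lines (assuming the LAN behind your box
is 192.168.1.0/24, as your config suggests):

acl localnet src 192.168.1.0/24
http_access allow localnet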

I.P.N Edmonds
Systems | Networks | ICTs
UgM: +256 71 227 3374 | TzM: +255 68 422 1561
# 22249, Kampala Uganda.

-Original Message-
From: Vijay S vi...@reactmedia.com
Date: Mon, 19 Mar 2012 23:22:30 
To: namase...@gmail.com; squid-users@squid-cache.org
Subject: Re: [squid-users] whitelisted IP problem

Do I have to do any iptables configuration for this as well?

On Mon, Mar 19, 2012 at 10:57 PM, Vijay vi...@reactmedia.com wrote:
 I am still a beginner. I googled some sites and found this configuration;
 initially it was this:


 #
 # Recommended minimum configuration:
 #
 acl manager proto cache_object
 acl server src 192.168.1.10
 acl localhost src 192.168.1.0/32 ::1
 acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1


 # Example rule allowing access from your local networks.
 # Adapt to list your (internal) IP networks from where browsing
 # should be allowed
 acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
 acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
 acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
 acl localnet src fc00::/7       # RFC 4193 local private network range
 acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines

 acl SSL_ports port 443
 acl Safe_ports port 80          # http
 acl Safe_ports port 21          # ftp
 acl Safe_ports port 443         # https
 acl Safe_ports port 70          # gopher
 acl Safe_ports port 210         # wais
 acl Safe_ports port 1025-65535  # unregistered ports
 acl Safe_ports port 280         # http-mgmt
 acl Safe_ports port 488         # gss-http
 acl Safe_ports port 591         # filemaker
 acl Safe_ports port 777         # multiling http
 acl CONNECT method CONNECT

 #
 # Recommended minimum Access Permission configuration:
 #
 # Only allow cachemgr access from localhost
 http_access allow manager localhost server
 http_access deny manager

 # Deny requests to certain unsafe ports
 http_access deny !Safe_ports

 # Deny CONNECT to other than secure SSL ports
 http_access deny CONNECT !SSL_ports

 # We strongly recommend the following be uncommented to protect innocent
 # web applications running on the proxy server who think the only
 # one who can access services on localhost is a local user
 #http_access deny to_localhost

 #
 # INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
 #

 # Example rule allowing access from your local networks.
 # Adapt localnet in the ACL section to list your (internal) IP networks
 # from where browsing should be allowed
 http_access allow localnet
 http_access allow localhost server

 # And finally deny all other access to this proxy
 http_access deny all

 # Squid normally listens to port 3128
 http_port 3128

 # We recommend you to use at least the following line.
 hierarchy_stoplist cgi-bin ?

 # Uncomment and adjust the following to add a disk cache directory.
 #cache_dir ufs /var/spool/squid 100 16 256

 # Leave coredumps in the first cache dir
 coredump_dir /var/spool/squid

 # Add any of your own refresh_pattern entries above these.
 refresh_pattern ^ftp:           1440    20%     10080
 refresh_pattern ^gopher:        1440    0%      1440
 refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
 refresh_pattern .               0       20%     4320


 visible_hostname reactmedia.com

 debug_options ALL,1 33,2 28,9

 tcp_outgoing_address 122.166.1.184



 Thanks & Regards
 Vijay


 -Original Message-
 From: Edmonds Namasenda [mailto:namase...@gmail.com]
 Sent: Monday, March 19, 2012 10:33 PM
 To: Vijay S; squid-users@squid-cache.org
 Subject: Re: [squid-users] whitelisted IP problem

 Vijay,
 Just a quick look has shown me you did not specify your network and there
 are a few typo errors.
 Re-adjust, test, and fill us in some more.

 I.P.N Edmonds
 Systems | Networks | ICTs
 UgM: +256 71 227 3374 | TzM: +255 68 422 1561 # 22249, Kampala Uganda.

 -Original Message-
 From: Vijay S vi...@reactmedia.com
 Date: Mon, 19 Mar 2012 22:28:03
 To: squid-users@squid-cache.org
 Subject: [squid-users] whitelisted IP problem

 Hi

 I have a my server box hosting apache and squid on centos machine.
 When I send my request for clients feeds it works as they have whitelisted
 my IP address, and when I make the call via squid its give me invalid IP. I
 checked the access log for more information and found out instead of sending
 my IP address its sending the localhost IP address (127.0.0.1).

 I googled a little and found that using tcp_outgoing_address directive I can
 control the outgoing IP address  and to my bad luck this didn't work

 My configuration file is as follows

 acl all src all
 acl manager proto cache_object
 acl localhost src 127.0.0.1/255.255.255.255
 acl to_localhost dst 127.0.0.0/32
 acl SSL_ports port 443
 acl Safe_ports port 80          # http
 acl Safe_ports port 21          # ftp
 acl Safe_ports port 443

Re: [squid-users] Allowing downloads from certain sites

2012-06-29 Thread Edmonds Namasenda
Shastri, try the below

Assume...
1. Preventing Downloads File (nodowns.txt) has the following
\.msi$
\.exe$
\.zip$
\.etc$

2. Trusted Sites File (goodsites.txt) has the following
*.*microsoft*.com*
*.*windows*.com*
*.*etc*.com*.com*

3. Accompanying ACLs for files above
acl nodowns urlpath_regex -i "/path_to/nodowns.txt" # With quotation marks
acl goodsites dstdomain -i "/path_to/goodsites.txt" # With quotation marks

4. Controlling Rule
http_access deny nodowns !goodsites # Put it above any allow rule

The above is my thinking, and I could do with correction.

# Edmonds

On Fri, Jun 29, 2012 at 12:30 PM, Chaitanya Shastri
chait.shas...@gmail.com wrote:

 Hi Amos,

    I have acl rules for preventing downloads on client machines. So a
 client cannot download any file (for example, .exe, .zip .. etc ) on
 his/her machine.
 What I want is that all clients should be able to download any type of
 file from certain trusted domain.
 In short I want to allow a domain in my squid configuration from which
 any client can download any type of file.

 Thanks.

 On Fri, Jun 29, 2012 at 1:15 PM, Amos Jeffries squ...@treenet.co.nz
 wrote:
  On 29/06/2012 6:10 p.m., Chaitanya Shastri wrote:
 
  Hi list,
 
     Is it possible to allow downloads from certain trusted sites?  I
  tried using the url_regex acl to list certain trusted sites from which
  our users can download any file.
 
     Ex. acl allow_downloads url_regex -i ^http:\/\/example\.com
           http_reply_access allow allow_downloads localnet  # where
  localnet is my LAN range
 
     But its not working. Any ideas on how to get it work?
 
  Thanks.
 
 
  Any idea what is blocking them from working in the first place?
 
  Amos
 


Re: [squid-users] Allowing downloads from certain sites

2012-06-29 Thread Edmonds Namasenda
 Shastri, try the below

 Assume...
 1. Preventing Downloads File (nodowns.txt) has the following
 \.msi$
 \.exe$
 \.zip$
 \.etc$

The above is regex


 2. Trusted Sites File (goodsites.txt) has the following
 *.*microsoft*.com*
 *.*windows*.com*
 *.*etc*.com*.com*


 WTF? Does regex even accept that?

 *.*microsoft*.com*

Amos, the above is dstdomain. You must have missed the regex entries
before this.


  ==   (zero or more 'nothings')(zero or more characters)(the text
 microsof)(zero or more 't' characters)(any single character)(the text
 co)(zero or more 'm' characters)

 Don't you mean this?
  \.microsoft\.com
  \.windows\.com
  \.etc\.com\.com


Thanks for that insight.


 Or perhaps the better version:

  acl goodsites dstdomain .microsoft.com .windows.com .etc.com.com



Will that not be too much if you have a long list of sites?


 3. Accompanying ACLs for files above
 acl nodowns urlpath_regex -i "/path_to/nodowns.txt" # With quotation marks
 acl goodsites dstdomain -i "/path_to/goodsites.txt" # With quotation marks

 4. Controlling Rule
 http_access deny nodowns !goodsites # Put it above any allow rule

 The above is my thinking, and I could do with correction.

 # Edmonds


 Pretty much. The problem is that Chaitanya supplied no details about their
 config. Could be much simpler or much more complicated.

 Amos


 On Fri, Jun 29, 2012 at 12:30 PM, Chaitanya Shastri wrote:

 Hi Amos,

    I have acl rules for preventing downloads on client machines. So a
 client cannot download any file (for example, .exe, .zip .. etc ) on
 his/her machine.
 What I want is that all clients should be able to download any type of
 file from certain trusted domain.
 In short I want to allow a domain in my squid configuration from which
 any client can download any type of file.

 Thanks.

 On Fri, Jun 29, 2012 at 1:15 PM, Amos Jeffries wrote:

 On 29/06/2012 6:10 p.m., Chaitanya Shastri wrote:

 Hi list,

    Is it possible to allow downloads from certain trusted sites?  I
 tried using the url_regex acl to list certain trusted sites from which
 our users can download any file.

    Ex. acl allow_downloads url_regex -i ^http:\/\/example\.com
          http_reply_access allow allow_downloads localnet  # where
 localnet is my LAN range

    But its not working. Any ideas on how to get it work?

 Thanks.


 Any idea what is blocking them from working in the first place?

 Amos



Re: [squid-users] Allowing downloads from certain sites

2012-06-29 Thread Edmonds Namasenda
Shastri,
That is what we are trying to help you solve. Are we writing gibberish?!

At the http_access restricting downloads, add something like
"!\.example\.com" (without the quotation marks)

Else, provide more accurate information for faster troubleshooting if
that fails. Many people dump their whole configuration files for
analysis!

# Edmonds.

On Fri, Jun 29, 2012 at 3:36 PM, Chaitanya Shastri
chait.shas...@gmail.com wrote:
 Hi,
  My question is simple. I have blocked all the downloads on my LAN
 systems using acl rules. But there is a trusted domain, say
 example.com from which users on my LAN should be able to download
 any file. For example: zip or exe file. I do not have site restriction
 on that site. So all users can access the site, but they are not able
 to download through that site.
 I tried using \.microsoft\.com with the dstdomain acl type. But it's not
 working. The http_reply_access rule is denying the download.
 I have following configuration:

 acl allow_downloads dstdomain -i \.microsoft\.com
 http_reply_access allow  allow_downloads
 http_reply_access deny all

 I want to allow downloading from example.com domain while still
 restricting downloads from other domains.
 Any ideas?

 Thanks.

 On Fri, Jun 29, 2012 at 5:32 PM, Edmonds Namasenda namase...@gmail.com 
 wrote:
 Shastri, try the below

 Assume...
 1. Preventing Downloads File (nodowns.txt) has the following
 \.msi$
 \.exe$
 \.zip$
 \.etc$

 The above is regex


 2. Trusted Sites File (goodsites.txt) has the following
 *.*microsoft*.com*
 *.*windows*.com*
 *.*etc*.com*.com*


 WTF? Does regex even accept that?

 *.*microsoft*.com*

 Amos, the above is dstdomain. You must have missed the regex entries
 before this.


  ==   (zero or more 'nothings')(zero or more characters)(the text
 microsof)(zero or more 't' characters)(any single character)(the text
 co)(zero or more 'm' characters)

 Don't you mean this?
  \.microsoft\.com
  \.windows\.com
  \.etc\.com\.com


 Thanks for that insight.


 Or perhaps the better version:

  acl goodsites dstdomain .microsoft.com .windows.com .etc.com.com



 Will that not be too much if you have a long list of sites?


 3. Accompanying ACLs for files above
 acl nodowns urlpath_regex -i "/path_to/nodowns.txt" # With quotation marks
 acl goodsites dstdomain -i "/path_to/goodsites.txt" # With quotation marks

 4. Controlling Rule
 http_access deny nodowns !goodsites # Put it above any allow rule

 The above is my thinking, and I could do with correction.

 # Edmonds


 Pretty much. The problem is that Chaitanya supplied no details about their
 config. Could be much simpler or much more complicated.

 Amos


 On Fri, Jun 29, 2012 at 12:30 PM, Chaitanya Shastri wrote:

 Hi Amos,

    I have acl rules for preventing downloads on client machines. So a
 client cannot download any file (for example, .exe, .zip .. etc ) on
 his/her machine.
 What I want is that all clients should be able to download any type of
 file from certain trusted domain.
 In short I want to allow a domain in my squid configuration from which
 any client can download any type of file.

 Thanks.

 On Fri, Jun 29, 2012 at 1:15 PM, Amos Jeffries wrote:

 On 29/06/2012 6:10 p.m., Chaitanya Shastri wrote:

 Hi list,

    Is it possible to allow downloads from certain trusted sites?  I
 tried using the url_regex acl to list certain trusted sites from which
 our users can download any file.

    Ex. acl allow_downloads url_regex -i ^http:\/\/example\.com
          http_reply_access allow allow_downloads localnet  # where
 localnet is my LAN range

    But its not working. Any ideas on how to get it work?

 Thanks.


 Any idea what is blocking them from working in the first place?

 Amos



[squid-users] DSTDOMAIN Wildcards and Multiple http_port

2012-07-04 Thread Edmonds Namasenda
Hello Team,

Please bear with me if this was resolved before, but I am asking out
of curiosity and need to test it soon.
What are the differences and implications of the following as dstdomain entries?

# Assuming namasenda.com is a registered domain, and the outcome is
controlling any domain with the word namasenda like
hatenamasenda.com or .net
a) .\namasenda\.
b) .namasenda.
c) .namasenda.com
d) .\namasenda\.com

Are a) & b) correct, anyway?

I want one squid instance to listen on ports 80, 8080, & 3128.
Will just http_port 3128 8080 80 transparent work?

Thank you,

# Edmonds


Re: [squid-users] DSTDOMAIN Wildcards and Multiple http_port

2012-07-04 Thread Edmonds Namasenda
 # Assuming namasenda.com is a registered domain, and the outcome is
 controlling any domain with the word namasenda like
 hatenamasenda.com or .net
 a) .\namasenda\.
 b) .namasenda.
 c) .namasenda.com
 d) .\namasenda\.com

  Are a) & b) correct, anyway?


 No they are not. Neither is (d).

 (c) is correct dstdomain syntax but the wildcard is label-based, not
 character-based.


Is it possible to have / write a label-based dstdomain syntax?
How?

 .namasenda.com  will match namasenda.com, www.namasenda.com,
 www2.namasenda.com, other.namasenda.com
 but not:  anamasenda.com or hatenamasenda.com or anything else with a
 different 2nd-tier label than namasenda.



  I want one squid instance to listen on ports 80, 8080, & 3128.
  Will just http_port 3128 8080 80 transparent work?


 No, one line per http_port entry.

 Amos
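
So, spelled out as a sketch, the question as asked becomes three separate
lines; whether each port should carry the transparent/intercept flag depends
on whether NAT-redirected traffic actually arrives on it.

http_port 3128 transparent
http_port 8080 transparent
http_port 80 transparent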


Thank you,

# Edmonds.


Re: [squid-users] Time based access

2012-07-20 Thread Edmonds Namasenda
Try the below,

acl urlsFB dstdomain -i .facebook.com .fbcdn.net     # Sample Facebook domains
acl timedFB time SMTWHFA 12:30-13:29                 # Accessible time for Facebook

# Place the below above all access rules
http_access deny urlsFB !timedFB                     # Rule to deny Facebook unless it is time

My 2Cents.
# Edmonds.

On Fri, Jul 20, 2012 at 1:55 PM, Amos Jeffries squ...@treenet.co.nz wrote:

 On 20/07/2012 9:41 p.m., a bv wrote:

 Hi ,

 I tried to write a time-based acl for testing at a special time of
 day but it didn't seem to work. I tried to block access to CNN
 using dstdomain .cnn.com.

 My aim is to allow facebook at special times of the day at lunch time
 for example.

 Can someone provide me a working rule which will allow Facebook
 access at 12:30-13:30?


 Regards


 Maybe it had something to do with what you configured. Or what software
 you were configuring to achieve it.

 ... details. Please!


Re: [squid-users] Deny URLs with Transparent proxy

2012-08-31 Thread Edmonds Namasenda
acl sites_bloqueados url_regex -i /etc/squid/acls/sites_bloqueados.txt


acl sites_bloqueados dstdomain -i /etc/squid/acls/sites_bloqueados.txt


http_access deny sites_bloqueados

http_access allow lan

In the sites_bloqueados.txt file there is the URL facebook.com but
the proxy doesn't DENY it.


Make sure sites_bloqueados.txt has the following

.fbcdn.net
.facebook.com

Put the http_access deny on top of any http_access allow

# Edmonds


Re: [squid-users] Time spent on website?

2013-05-09 Thread Edmonds Namasenda
Matt,

Take time and install / use Darold's Squid Analyzer.
It might give you close to or more than what you are looking for. So far, I am
not complaining and he is often available for assistance.

On Mon, May 6, 2013 at 5:55 PM, matthew vassallo mat...@hotmail.com wrote:
 Hi,

 I am currently using squid to gather information regarding users such as on 
 which website they have been, tracking their IP address etc... Kindly would 
 it be possible to know the total time that a user spent browsing a specific 
 website please? So if user X went on Facebook and spent 15 minutes, would I 
 be able to know such information through squid or other integrated software 
 please? Thanks

 Regards,
 Matthew



--
Thank you and kind regards,

I.P.N Edmonds
ICTs Practitioner: Systems | Networks | Applications
Mob: +256 71 227 3374 / +256 75 327 3374 | Tel: +256 41 466 3066
Skype: edsend


Re: [squid-users] Squid monitering Tool

2013-07-30 Thread Edmonds Namasenda
Squid Analyzer on http://squidanalyzer.darold.net did me good thanks to Darold.

On Tue, Jul 30, 2013 at 2:24 PM, javed_samtiah
javed.iq...@northbaysolutions.net wrote:
 Hi,

 Is there any squid monitoring tool?
 1. I want to monitor web traffic: which IP or PC is accessing which
 website.
 2. How much bandwidth is utilized by each PC.
 3. Want to monitor on a real-time basis.






 --
 View this message in context: 
 http://squid-web-proxy-cache.1019090.n4.nabble.com/Squid-monitering-Tool-tp4661330.html
 Sent from the Squid - Users mailing list archive at Nabble.com.



-- 
Thank you and kind regards,

I.P.N Edmonds
ICTs Practitioner: Systems | Networks | Applications
Mob: +256 71 227 3374 / +256 75 327 3374 | Tel: +256 41 466 3066
Skype: edsend | P.O. Box 22249, Kampala UGANDA


[squid-users] Squidblacklists Now Commercial ONLY?!

2013-08-01 Thread Edmonds Namasenda
Hello Nichols (Squid Blacklists Team) & Fellow Squid Users,

Please enlighten me as I seem to be confused.
A couple of weeks back I noticed accumulating 401.shtml files in my
Squid Blacklists update script folder, only to check the website and find
that I must subscribe (and pay).

When did this take effect, without warning, given that Squid Users (this
forum) contributed to the project?!

# Edmonds


Re: [squid-users] Re: squidblacklist.org

2013-09-02 Thread Edmonds Namasenda
Ricardo,

Did you ask a question and answer it yourself?!

Anyway, the ideas of the SquidBlacklists are good & I used it
initially. Quite good and developing. Yes, some URLs had issues as
well as contradictions when reloading squid & reading the related
ACLs.

However, I stopped testing them when it all went commercial in a blink
of an eye without warning like when they (idea developers) asked us to
contribute (and test) initially.

# Edmonds

On Sat, Aug 31, 2013 at 8:12 PM, Ricardo Klein klein@gmail.com wrote:
 Does their proxy list cover a good amount of proxies?
 Do they have any address range of UltraSurf?
 --
 Att...

 Ricardo Felipe Klein
 klein@gmail.com


 On Sat, Aug 31, 2013 at 9:36 AM, Ahmad ahmed.za...@netstream.ps wrote:
 hi ,

 i use squidblacklist ,

 it  is very strong acl and is updated   at least every week ,


 regards



 -
 Mr.Ahmad
 --
 View this message in context: 
 http://squid-web-proxy-cache.1019090.n4.nabble.com/squidblacklist-org-tp4661852p4661865.html
 Sent from the Squid - Users mailing list archive at Nabble.com.


[squid-users] Access Groups Problem

2013-10-16 Thread Edmonds Namasenda
Hello All,

We use one openSuSe 11.4 server to manage access for five networks, four
of which are connected through VPN.
The initial configuration used the in-built Squid (3.0 Stable 18) in
transparent mode. We recently ran an upgrade and got Squid 3.2 on the
same oS 11.4.

We realized some admin IP addresses are blocked from access, and branch
users have to add proxy settings in their browsers / apps to connect
to the internet.
I experienced the former after the upgrade, while I am not sure about the
latter. I am a consultant (hands-on) to the team from time to time, not in-house.

How can the above be rectified?
Attached is the current conf with a few alterations and a question on http_port

-- 
Thank you and kind regards,

# Edmonds
#
# Recommended minimum configuration:
#
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
#acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
#acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
#acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
#acl localnet src fc00::/7       # RFC 4193 local private network range
#acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines

acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
#acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

# USUL Connection ACLs
acl usul src 10.40.1.0/24 10.40.2.0/24 10.40.3.0/24 10.40.4.0/24 10.40.5.0/24

acl noaccess src /etc/squid/noaccess.txt
acl admin src /etc/squid/admin.txt
acl a37 src /etc/squid/one37.txt
acl srvips src /etc/squid/srvips.txt
acl mgrs src /etc/squid/mgrs.txt
acl clerix src /etc/squid/clerix.txt

# USUL Connectivity Time-Frames
acl NoGenNet time MTWHFA 08:00-12:59
acl NoGenNet time MTWHFA 13:59-16:59
acl NoGenNet time S 07:00-12:59
acl NoGenNet time SMTWHFA 19:00-23:59
acl NoGenNet time SMTWHFA 00:00-06:59

## You Tube
acl YouTube time SMTWHFA 19:00-23:59
acl YouTube time SMTWHFA 00:00-07:59

# USUL Streaming Restrictions
acl nommq req_mime_type -i /etc/squid/nommq.txt

# USUL File  URL Restrictions
acl donot urlpath_regex -i /etc/squid/donot.txt
#acl nowords url_regex -i /etc/squid/nowords.txt
acl srvurls dstdomain -i /etc/squid/srvurls.txt
acl fewurls dstdomain -i /etc/squid/fewww.txt
acl one37 dstdomain -i /etc/squid/url37.txt
acl malice dstdomain -i /etc/squid/malware.acl
acl porn dstdomain -i /etc/squid/xxx.acl
acl ads dstdomain -i /etc/squid/ads.acl
acl tubeyou dstdomain -i /etc/squid/utube.txt
#acl blackout dstdomain -i /etc/squid/blackout.txt

#
# Recommended minimum Access Permission configuration:
#
#http_access deny usul all
# Only allow cachemgr access from localhost
http_access allow manager localhost

# USUL HTTP Access Rules

http_access allow srvurls all
http_access allow fewurls all
http_access allow admin mgrs all

http_access allow one37 a37

http_access deny tubeyou !YouTube

http_access deny malice all
http_access deny porn all
http_access deny ads all

#http_access deny nowords all

http_access deny noaccess
http_access deny srvips !srvurls all

#http_access allow fewurls
http_access deny NoGenNet clerix all

#http_access deny pmhr clerix
#http_access deny sday clerix
#http_access deny night_s clerix
#http_access deny night_e clerix

http_access deny donot !admin
http_access deny nommq !admin !mgrs
http_access allow usul all

http_access deny manager noaccess

# Deny requests to certain unsafe ports
http_access deny !Safe_ports

# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on localhost is a local user
http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#

# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
#http_access allow localnet
#http_access allow localhost

# allow localhost always proxy functionality
http_access allow localhost

# And finally deny all other access to this proxy
http_access deny all

error_directory /usr/share/squid/errors/en
#deny_info PORN_DENIED blackout

icp_access allow usul
icp_access deny all

htcp_access allow usul
htcp_access deny all

# Squid normally listens to port 3128
#http_port 3128

http_port 3128 intercept

#http_port 80 intercept
#http_port 8080 intercept

#http_port all intercept # Best each port above or this?

[squid-users] Access Groups Problem

2013-10-16 Thread Edmonds Namasenda
Hello All,

We use one openSuSe 11.4 server to manage access for five networks, four
of which are connected through VPN.
The initial configuration used the in-built Squid (3.0 Stable 18) in
transparent mode. We recently ran an upgrade and got Squid 3.1.23 on the
same oS 11.4.

We realized some admin IP addresses are blocked from access, and branch
users have to add proxy settings in their browsers / apps to connect
to the internet.
I experienced the former after the upgrade, while I am not sure about the
latter. I am a consultant (hands-on) to the team from time to time, not in-house.

How can the above be rectified?
Below is the current conf with a few alterations

## Start Conf ##
#
# Recommended minimum configuration:
#
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
#acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
#acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
#acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
#acl localnet src fc00::/7       # RFC 4193 local private network range
#acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines

acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
#acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

# USUL Connection ACLs
acl usul src 10.40.1.0/24 10.40.2.0/24 10.40.3.0/24 10.40.4.0/24 10.40.5.0/24

acl noaccess src /etc/squid/noaccess.txt
acl admin src /etc/squid/admin.txt
acl a37 src /etc/squid/one37.txt
acl srvips src /etc/squid/srvips.txt
acl mgrs src /etc/squid/mgrs.txt
acl clerix src /etc/squid/clerix.txt

# USUL Connectivity Time-Frames
acl NoGenNet time MTWHFA 08:00-12:59
acl NoGenNet time MTWHFA 13:59-16:59
acl NoGenNet time S 07:00-12:59
acl NoGenNet time SMTWHFA 19:00-23:59
acl NoGenNet time SMTWHFA 00:00-06:59

## You Tube
acl YouTube time SMTWHFA 19:00-23:59
acl YouTube time SMTWHFA 00:00-07:59

# USUL Streaming Restrictions
acl nommq req_mime_type -i /etc/squid/nommq.txt

# USUL File  URL Restrictions
acl donot urlpath_regex -i /etc/squid/donot.txt
#acl nowords url_regex -i /etc/squid/nowords.txt
acl srvurls dstdomain -i /etc/squid/srvurls.txt
acl fewurls dstdomain -i /etc/squid/fewww.txt
acl one37 dstdomain -i /etc/squid/url37.txt
acl malice dstdomain -i /etc/squid/malware.acl
acl porn dstdomain -i /etc/squid/xxx.acl
acl ads dstdomain -i /etc/squid/ads.acl
acl tubeyou dstdomain -i /etc/squid/utube.txt
#acl blackout dstdomain -i /etc/squid/blackout.txt

#
# Recommended minimum Access Permission configuration:
#
#http_access deny usul all
# Only allow cachemgr access from localhost
http_access allow manager localhost

# USUL HTTP Access Rules

http_access allow srvurls all
http_access allow fewurls all
http_access allow admin mgrs all

http_access allow one37 a37

http_access deny tubeyou !YouTube

http_access deny malice all
http_access deny porn all
http_access deny ads all

#http_access deny nowords all

http_access deny noaccess
http_access deny srvips !srvurls all

#http_access allow fewurls
http_access deny NoGenNet clerix all

#http_access deny pmhr clerix
#http_access deny sday clerix
#http_access deny night_s clerix
#http_access deny night_e clerix

http_access deny donot !admin
http_access deny nommq !admin !mgrs
http_access allow usul all

http_access deny manager noaccess

# Deny requests to certain unsafe ports
http_access deny !Safe_ports

# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on localhost is a local user
http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#

# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
#http_access allow localnet
#http_access allow localhost

# allow localhost always proxy functionality
http_access allow localhost

# And finally deny all other access to this proxy
http_access deny all

error_directory /usr/share/squid/errors/en
#deny_info PORN_DENIED blackout

icp_access allow usul
icp_access deny all

htcp_access allow usul
htcp_access deny all

# Squid normally listens to port 3128
#http_port 3128

http_port 3128 intercept

#http_port 80 intercept
#http_port 8080 intercept

#http_port all intercept # Best each port above or this?

# We recommend you to use at least the 

[squid-users] Config Analysis for VPN Access

2013-10-17 Thread Edmonds Namasenda
Hi Squid Users / Admins,

Below is my squid.conf of 3.1.23 running on openSuSe 11.4 32-bit using
transparent mode and Shoreline Firewall.
The server controls five networks, four of them through VPNs.

Problems...
- Failure to access Internet on VPNs unless proxy settings are added
- Failure to reach DNS and HQ public IP Address
- Some admin IP addresses are randomly (time) blocked from accessing
the Internet

Where could we be going wrong? How can the set-up be improved?

## Start squid.conf

acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1

#acl localnet src fe80::/10  # RFC 4291 link-local (directly plugged) machines

acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

acl usul src 10.40.1.0/24 10.40.2.0/24 10.40.3.0/24 10.40.4.0/24 10.40.5.0/24

acl noaccess src /etc/squid/noaccess.txt
acl admin src /etc/squid/admin.txt
acl a37 src /etc/squid/one37.txt
acl srvips src /etc/squid/srvips.txt
acl mgrs src /etc/squid/mgrs.txt
acl clerix src /etc/squid/clerix.txt

acl NoGenNet time MTWHFA 08:00-12:59
acl NoGenNet time MTWHFA 13:59-16:59
acl NoGenNet time S 07:00-12:59
acl NoGenNet time SMTWHFA 19:00-23:59
acl NoGenNet time SMTWHFA 00:00-06:59

acl YouTube time SMTWHFA 19:00-23:59
acl YouTube time SMTWHFA 00:00-07:59

acl nommq req_mime_type -i /etc/squid/nommq.txt

acl donot urlpath_regex -i /etc/squid/donot.txt
acl srvurls dstdomain -i /etc/squid/srvurls.txt
acl fewurls dstdomain -i /etc/squid/fewww.txt
acl one37 dstdomain -i /etc/squid/url37.txt
acl malice dstdomain -i /etc/squid/malware.acl
acl porn dstdomain -i /etc/squid/xxx.acl
acl ads dstdomain -i /etc/squid/ads.acl
acl tubeyou dstdomain -i /etc/squid/utube.txt

http_access allow manager localhost

http_access allow srvurls all
http_access allow fewurls all
http_access allow admin mgrs all

http_access allow one37 a37

http_access deny tubeyou !YouTube

http_access deny malice all
http_access deny porn all
http_access deny ads all

http_access deny noaccess
http_access deny srvips !srvurls all

http_access deny NoGenNet clerix all

http_access deny donot !admin
http_access deny nommq !admin !mgrs
http_access allow usul all

http_access deny manager noaccess

http_access deny !Safe_ports

http_access deny CONNECT !SSL_ports

http_access deny to_localhost

http_access allow localhost


http_access deny all

error_directory /usr/share/squid/errors/en

icp_access allow usul
icp_access deny all

htcp_access allow usul
htcp_access deny all


http_port 3128 intercept

#http_port 80 intercept      ## ?!
#http_port 8080 intercept    ## ?!

#http_port all intercept     ## Best each port above or this?

hierarchy_stoplist cgi-bin ?

cache_mem 400 MB

cache_dir ufs /var/cache/squid 2 16 256

coredump_dir /var/cache/squid

access_log /var/log/squid/access.log squid

minimum_object_size 512 KB
maximum_object_size 4 MB
maximum_object_size_in_memory 6 MB

refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

dns_nameservers 41.##.##.# 41.##.##.#

visible_hostname ##
icp_port 3130
cache deny YouTube tubeyou

## End squid.conf

-- 
Thank you and kind regards,

I.P. Edmonds


Re: [squid-users] You don't have permission to access /squidreport/ on this server.

2013-12-03 Thread Edmonds Namasenda
Check the ownership and/or mode of the squidreport directory / folder.
Otherwise, you are missing some entries for that directory in your httpd.conf.
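
For instance (a sketch, assuming Apache httpd 2.2 and a report directory at
the hypothetical path /var/www/squidreport):

# make the reports readable by the Apache user
chmod -R a+rX /var/www/squidreport

# httpd.conf entry for the directory
Alias /squidreport /var/www/squidreport
<Directory "/var/www/squidreport">
    Order allow,deny
    Allow from all
</Directory>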

# Edmonds


Re: Re: [squid-users] Issues with Opensuse12.3 with squid

2014-03-26 Thread Edmonds Namasenda
Back up your squid config file or any other associated files with
custom info before stopping and uninstalling the old squid. Someone
suggested copying the running squid binary too. Follow the steps below

* This is assuming you are using the CLI
* This is no official guide & I am not responsible for any consequences

- Log-in as root
- Type yast2
- Go to Software repositories
- Select Add. Choose links (from http://en.opensuse.org/Package_repositories) & add appropriately
- Make sure the repos you have added are enabled & select Ok
- Use YAST to reinstall and it should then be able to find the latest version available
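
The same steps can be done from the shell, as a rough sketch (the repository
URL is a placeholder; pick the matching one from the page linked above):

zypper ar -f http://download.opensuse.org/repositories/<repo-path>/ squid-repo
zypper refresh
zypper install squid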

Good luck.

# Edmonds

On Wed, Mar 26, 2014 at 2:43 PM, Edmonds Namasenda namase...@gmail.com wrote:
 Back up your squid config file or any other associated files with custom
 info before stopping and uninstalling the old squid. Someone suggested
 copying the running squid binary too. Follow the steps below

 * This is assuming you are using the CLI
 * This is no official guide  I am not responsible for any consequences

 - Log-in as root
 - Type yast2
 - Go to Software repositories
 - Select Add. Choose links (from
 http://en.opensuse.org/Package_repositories)  add appropriately
 - Make sure the repos you have added are enabled  select Ok
 - Use YAST to reinstall and it should then be able to find the latest
 version available

 Good luck.

 # Edmonds


 On Wed, Mar 26, 2014 at 9:26 AM, Oluseyi Akinboboye
 seyiakinbob...@gmail.com wrote:

 Thank you for your response.

 But already we have ACLs on the squid, and also on the mikrotik we have a
 series of permitted users who have access to these ports, so opening them
 should not really be a problem.

 When I say slow I mean that the pages being fetched from the cache
 slow down considerably! I have restarted it this morning and would
 like to watch it handle requests over peak periods and will report back to
 you.

 But please, I would like to know how to update my version of squid from
 3.2.11 to the current 3.4.4. I would appreciate it if you would tell me in
 layman's terms how to do so.

 Thanks

  Hello,
 
  I am using Opensuse 12.3 with squid as a gateway & squid, though it's
  not transparent! I would like to know how to do the following:
  1.   Allow the following ports to have access in the network: ports 25,
  110, 465, 995 and other specific ports required for specific mail servers
  to work.
 
 Two problems with this.
 
 1) Squid is an HTTP proxy not an email server.
 
 2) Opening these ports in any way through Squid turns it into an open
 proxy and permits spamming or other email abuse through your server.
 
 
  2.   The squid seems to slow down after a few hours on the job;
  although
  there are a few clients on transparent and a few on non-transparent
  proxies here! Is there any script or such that will make the squid box
  refresh itself every few hours or so?
 
 What? please define slow.
 
  3.   The clients who are not using the proxy are doing so due to the
  fact that they won’t be able to pull and or push their emails due to
  firewall restrictions from the squid!
 
 Squid is not a firewall.
 
 Check the actual firewall settings on the box Squid is running on.
  Perhaps that is what is getting in their way.
 
  4.   After a few days can we release the contents of the squid so as
  not
  to have a filled up squid?
 
 What contents and why?
 
 
 Amos


[squid-users] Duration Access Limits

2014-04-03 Thread Edmonds Namasenda
Hello All,

Is there a possibility for Squid to drop connections of particular IP
Addresses after a specific period of time, and only enable those
addresses after a specified period of time? NO authentication
required. Just DHCP & Squid.

For example: a device connects for 30 minutes, then that IP Address is
denied access for another 30 minutes before being allowed to
reconnect.

What is the format of such an ACL?

Thanks in advance

# Edmonds


Re: [squid-users] Duration Access Limits

2014-04-03 Thread Edmonds Namasenda
Thanks Yanier.

However my requirement is different.
Squid will need to count down the connected IP's duration. After it
runs out, that IP is blocked for some time before it is allowed to
connect again.
The ACL you suggested allows access during a specific period of time.
The one I am asking for should allow access at any time / day of the
week, but only for a specific duration before reconnecting.

# Edmonds



On Fri, Apr 4, 2014 at 1:00 AM, Yanier Salazar Sánchez
yan...@eleccav.une.cu wrote:
 acl nameacl time MTWHF beginhour-endhour



 Examples



 acl work_days time MTW 08:00-16:00



 http_access allow work_days who_access



 -Original Message-

 From: Edmonds Namasenda [mailto:namase...@gmail.com]

 Sent: Thursday, April 03, 2014 9:08 AM

 To: squid-users

 Subject: [squid-users] Duration Access Limits



 Hello All,



 Is there a possibility for Squid to drop connections of particular IP

 Addresses after a specific period of time, and only enable those addresses

 after a specified period of time? NO authentication required. Just DHCP &

 Squid



 For example ..a device connects for 30 minutes then that IP Address is
 denied

 access for another 30 minutes before being allowed to reconnect.



 What is the format of such an ACL?



 Thanks in advance



 # Edmonds




Re: [squid-users] Re: Duration Access Limits

2014-04-04 Thread Edmonds Namasenda
Okay!
Unless I am getting it wrong ...are you telling me to find (or
propose) a solution to my problem?

Else, what is required of me for the helper? Must it be external? Such
helpers tend to be slow.
I want to offer free WiFi internet to people at an eatery, but it
should disable their connection after some time. Otherwise, people
might pitch camp (or move their offices in) on realizing the free internet.

Is there another open source solution I can implement besides
tinkering with the existing squid installation?

# Edmonds

On Fri, Apr 4, 2014 at 11:58 AM, babajaga augustus_me...@yahoo.de wrote:
 I could think about a custom external auth helper, checking the IP,
 maintaining its own DB regarding the connect times, and allowing/disallowing
 access to squid.
 However, this helper has to be provided by you.



 --
 View this message in context: 
 http://squid-web-proxy-cache.1019090.n4.nabble.com/Duration-Access-Limits-tp4665424p4665435.html
 Sent from the Squid - Users mailing list archive at Nabble.com.
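
For the record, the Squid side of the hook babajaga describes would look
roughly like this; the helper itself (here a hypothetical
/usr/local/bin/quota-check, reading one client IP per line and answering OK
or ERR) is what would have to be written, and is where the 30-minute
bookkeeping lives:

external_acl_type timequota ttl=60 %SRC /usr/local/bin/quota-check
acl within_quota external timequota
http_access deny !within_quota

Squid caches the helper's verdict for ttl seconds, which is what keeps an
external check from being slow in practice.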


Re: [squid-users] Re: Duration Access Limits

2014-04-05 Thread Edmonds Namasenda
Thanks Meyer.
While I always want to get my hands dirty, I have not yet written any
squid helpers.
Point me somewhere for starters.
And please note that because the WiFi will be a free service, I do not
want to incur more costs on equipment (Mikrotik) unless it is the only
alternative. Any specifics on Mikrotik you can recommend?

Amos, other users,

Is there nothing else I can do with squid to achieve my goal?

# Edmonds

On Fri, Apr 4, 2014 at 2:17 PM, babajaga augustus_me...@yahoo.de wrote:
Unless I am getting it wrong ...are you telling me to find (or
 propose) a solution to my problem?
 I proposed a possible solution using squid; however, it must be implemented
 (programmed) by yourself, as it is not available AFAIK.

 Must it be external? Such
 tend to be slow.
 Not necessarily, as the result of the auth helper might be cached within
 squid, so the number of accesses to this helper is reduced.

Is there another open source solution I can implement besides
 tinkering with the existing squid installation? 
 You might have a look at mikrotik.com. Their hotspot system within RoS
 should be able to do what you want. Their hardware is really cheap, and not
 so bad. As there is also a huge forum regarding scripts, you should find
 something suitable on the spot.
 BTW: You might use squid as an upstream caching proxy for your MT-box, if
 you want. Simple to implement.



 --
 View this message in context: 
 http://squid-web-proxy-cache.1019090.n4.nabble.com/Duration-Access-Limits-tp4665424p4665438.html
 Sent from the Squid - Users mailing list archive at Nabble.com.


[squid-users] Time-Based Download Restrictions

2015-11-30 Thread Edmonds Namasenda
Greetings.

I want to deny access to certain downloads (in str-med.txt) during "WorkHrs".
This is failing miserably, as the restriction is not achieved.

Please look through my files (squid.conf and str-med.txt) below for
pointers to rectify this. Thanks in advance

### Start squid.conf ###
acl office-net src 10.10.2.0/24

acl SSL_ports port 443
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

acl WorkHrs time MTWHF 08:29-12:59
acl WorkHrs time MTWHFA 14:00-16:59

## Wrong Files and URLS
acl malice dstdomain -i "/etc/squid/malware.acl"
acl porn dstdomain -i "/etc/squid/xxx.acl"
acl ads dstdomain -i "/etc/squid/ads.acl"
acl proxies dstdomain -i "/etc/squid/proxies.acl"

acl nostr urlpath_regex -i "/etc/squid/str-med.txt"

http_access deny nostr WorkHrs
http_reply_access deny nostr WorkHrs

http_access deny !Safe_ports
http_access deny ads
http_access deny porn
http_access deny malice
http_access deny proxies

http_access deny CONNECT !SSL_ports

http_access allow localhost manager
http_access deny manager

http_access allow office-net all

# Allow localhost always proxy functionality
http_access allow localhost

# And finally deny all other access to this proxy
http_access deny all

error_directory /usr/share/squid/errors/en

icp_access allow office-net
icp_access deny all

htcp_access allow office-net
htcp_access deny all

http_port 10.10.2.10:3128 intercept
http_port 127.0.0.1:3127

hierarchy_stoplist cgi-bin ?

cache_mem 400 MB

cache_dir aufs /var/cache/squid 2 16 256

coredump_dir /var/cache/squid

access_log /var/log/squid/access.log squid

minimum_object_size 512 bytes
maximum_object_size_in_memory 10 MB

refresh_pattern http://.*\.windowsupdate\.microsoft\.com/ 0 80% 20160 reload-into-ims
refresh_pattern http://.*\.update\.microsoft\.com/ 0 80% 20160 reload-into-ims
refresh_pattern http://download\.microsoft\.com/ 0 80% 20160 reload-into-ims
refresh_pattern http://windowsupdate\.microsoft\.com/ 0 80% 20160 reload-into-ims
refresh_pattern http://office\.microsoft\.com/ 0 80% 20160 reload-into-ims
refresh_pattern http://.*\.office\.net/ 0 80% 20160 reload-into-ims
refresh_pattern http://.*\.windowsupdate\.com/ 0 80% 20160 reload-into-ims

refresh_pattern http://.*\.youtube\.com/ 0 80% 20160 reload-into-ims
refresh_pattern http://.*\.espnfc\.com/ 0 80% 20160 reload-into-ims

refresh_pattern http://.*\.kaspersky\.com/ 0 80% 20160 reload-into-ims

refresh_pattern http://.*\.mozilla\.net/ 0 80% 20160 reload-into-ims
refresh_pattern http://.*\.mozilla\.org/ 0 80% 20160 reload-into-ims

refresh_pattern -i \.(iso|deb|rpm|zip|tar|tgz|ram|rar|bin|ppt|doc)$ 10080 90% 43200 ignore-no-cache ignore-auth store-stale
refresh_pattern -i \.(zip|gz|arj|lha|lzh)$ 10080 100% 43200 override-expire ignore-no-cache ignore-auth store-stale
refresh_pattern -i \.(rar|tgz|tar|exe|bin)$ 10080 100% 43200 override-expire ignore-no-cache ignore-auth ignore-reload ignore-no-cache store-stale
refresh_pattern -i \.(hqx|pdf|rtf|doc|swf)$ 10080 100% 43200 override-expire ignore-no-cache ignore-auth store-stale
refresh_pattern -i \.(inc|cab|ad|txt|dll)$ 10080 100% 43200 override-expire ignore-no-cache ignore-auth store-stale

logfile_rotate 7
debug_options rotate=1

quick_abort_min -1 KB

maximum_object_size 4 GB

acl youtube dstdomain .youtube.com
cache allow youtube

refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0 0%  0
refresh_pattern .   0   0%  4320

dns_nameservers 8.8.8.8 8.8.4.4

visible_hostname TheOffice
icp_port 3130

### End squid.conf ###

### Start str-med.txt

\.flv(\?.*)?$
\.(avi|mp4|mov|m4v|mkv|flv)(\?.*)?$
\.(mpg|mpeg|mp3|avi|mov|flv|wmv|mkv|rmvb)(\?.*)?$
\.exe(\?.*)$
\.(msi|cab|mar)(\?.*)$
\.torrent(\?.*)$
\.txt(\?.*)$
\.(afx|asf)(\?.*)?$
\.swf(\?.*)?$

### End str-med.txt
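
One thing worth checking, a guess rather than a diagnosis: the first three
patterns make the query string optional with (\?.*)?$ , but the
exe/msi/torrent/txt lines use (\?.*)$ and therefore only match URLs that
actually carry a query string. If the intent was to catch plain file URLs
too, those lines would read:

\.exe(\?.*)?$
\.(msi|cab|mar)(\?.*)?$
\.torrent(\?.*)?$
\.txt(\?.*)?$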

-- 
Namasenda I. P. Edmonds
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users