Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-20 Thread Amos Jeffries
On 19/08/2014 3:42 a.m., nuhll wrote:
 Just to clarify my problem: I don't use it as a transparent proxy! I
 distribute the proxy with my DHCP server and a .pac file, so it gets used on
 all machines with proxy auto-detection.
 

Your earlier config file posted contained:

  http_port 192.168.0.1:3128 transparent

Transparent/intercept mode ports are incompatible with WPAD and PAC
configuration. You need a regular forward-proxy port (without the
"transparent" flag) for receiving that type of traffic.
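
A minimal sketch, reusing the address and port from your earlier config
(the commented line is only relevant if you also intercept traffic; in
Squid 3.2+ the flag is named "intercept"):

  # plain forward-proxy port for WPAD/PAC and manually configured clients
  http_port 192.168.0.1:3128

  # optional: a second, separate port reserved for intercepted traffic
  # http_port 192.168.0.1:3129 transparent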

This is probably a good hint as to what your problem actually is. The
logs you posted in the other email show what could be a side effect
of this misunderstanding. I will reply to that email with details.

Amos



Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-18 Thread Alex Crow


http://www.squid-cache.org/Doc/config/cache/

On 03/08/14 10:25, nuhll wrote:

Seems like acl all src all fixed it. Thanks!

One problem is left. Is it possible to only cache certain websites, while
the rest are just redirected?




Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-17 Thread Amos Jeffries
On 16/08/2014 8:02 a.m., nuhll wrote:
 I got nearly everything working, except Battle.net. This problem seems to be
 known, but I don't know how to fix it.
 
 http://stackoverflow.com/questions/24933962/squid-proxy-blocks-battle-net

That post displays a perfectly working proxy transaction. No sign of an
error anywhere.


 https://forum.pfsense.org/index.php?topic=72271.0
 

Contains three solutions, all of which essentially amount to turning on
UPnP at the router.

Amos


Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-12 Thread Amos Jeffries
On 12/08/2014 7:57 a.m., nuhll wrote:
 Thanks for your help.
 
 But I'm going crazy. =)
 
 The Internet is slow as fuck. I don't see any errors in the logs, and some
 services (Battle.net) are not working.
 
 /etc/squid3/squid.conf
 debug_options ALL,1 33,2
 acl domains_cache dstdomain /etc/squid/lists/domains_cache
 cache allow domains_cache
 acl localnet src 192.168.0.0
 acl all src all
 acl localhost src 127.0.0.1
 cache deny all
 
 #access_log daemon:/var/log/squid/access.test.log squid
 
 http_port 192.168.0.1:3128 transparent
 
 cache_dir ufs /daten/squid 10 16 256
 
 range_offset_limit 100 MB windowsupdate
 maximum_object_size 6000 MB
 quick_abort_min -1
 
 
 # Add one of these lines for each of the websites you want to cache.
 
 refresh_pattern -i
 microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
 reload-into-ims
 
 refresh_pattern -i
 windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
 432000 reload-into-ims
 
 refresh_pattern -i
 windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
 reload-into-ims
 
 #kaspersky update
 refresh_pattern -i
 geo.kaspersky.com/.*\.(cab|dif|pack|q6v|2fv|49j|tvi|ez5|1nj|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip)
 4320 80% 432000 reload-into-ims
 
 #nvidia updates
 refresh_pattern -i
 download.nvidia.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
 432000 reload-into-ims
 
 #java updates
 refresh_pattern -i
 sdlc-esd.sun.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
 432000 reload-into-ims
 
 # DONT MODIFY THESE LINES
 refresh_pattern ^ftp:           1440    20%     10080
 refresh_pattern ^gopher:        1440    0%      1440
 refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
 refresh_pattern .               0       20%     4320
 
 #kaspersky update
 acl kaspersky dstdomain geo.kaspersky.com
 
 acl windowsupdate dstdomain windowsupdate.microsoft.com
 acl windowsupdate dstdomain .update.microsoft.com
 acl windowsupdate dstdomain download.windowsupdate.com
 acl windowsupdate dstdomain redir.metaservices.microsoft.com
 acl windowsupdate dstdomain images.metaservices.microsoft.com
 acl windowsupdate dstdomain c.microsoft.com
 acl windowsupdate dstdomain www.download.windowsupdate.com
 acl windowsupdate dstdomain wustat.windows.com
 acl windowsupdate dstdomain crl.microsoft.com
 acl windowsupdate dstdomain sls.microsoft.com
 acl windowsupdate dstdomain productactivation.one.microsoft.com
 acl windowsupdate dstdomain ntservicepack.microsoft.com
 
 acl CONNECT method CONNECT
 acl wuCONNECT dstdomain www.update.microsoft.com
 acl wuCONNECT dstdomain sls.microsoft.com
 
 http_access allow kaspersky localnet
 http_access allow CONNECT wuCONNECT localnet
 http_access allow windowsupdate localnet
 
 #test
 http_access allow localnet
 http_access allow all
 http_access allow localhost
 
 
 /etc/squid/lists/domains_cache
 microsoft.com
 windowsupdate.com
 windows.com
 #nvidia updates
 download.nvidia.com
 
 #java updates
 sdlc-esd.sun.com
 #kaspersky
 geo.kaspersky.com
 
 /var/log/squid3/access.log
 1407786051.567  17909 192.168.0.125 TCP_MISS/000 0 GET
 http://dist.blizzard.com.edgesuite.net/hs-pod/beta/EU/4944.direct/base-Win-deDE.MPQ
 - DIRECT/dist.blizzard.com.edgesuite.net -
 1407786051.567  17909 192.168.0.125 TCP_MISS/000 0 GET
 http://llnw.blizzard.com/hs-pod/beta/EU/4944.direct/base-Win.MPQ -
 DIRECT/llnw.blizzard.com -

The blizzard.com servers did not produce a response for these requests.
Squid waited almost 18 seconds and nothing came back.

TCP window scaling, ECN, Path-MTU discovery, and ICMP blocking are things
to look for here. Any one of them could be preventing the connection from
transmitting or receiving properly.

The rest of the log shows working traffic, even for battle.net. I
suspect battle.net uses non-80 ports, right? I doubt those are being
intercepted in your setup.

 /var/log/squid3/cache.log
 2014/08/11 21:51:29| Squid Cache (Version 3.1.20): Exiting normally.
 2014/08/11 21:53:04| Starting Squid Cache version 3.1.20 for
 x86_64-pc-linux-gnu...

Hmm. Which version of Debian (or derived OS) are you using? And can you
update it to the latest stable? The squid3 package has been at 3.3.8 for
most of a year now.

 2014/08/11 21:53:04| Process ID 32739
 2014/08/11 21:53:04| With 65535 file descriptors available
 2014/08/11 21:53:04| Initializing IP Cache...
 2014/08/11 21:53:04| DNS Socket created at [::], FD 7
 2014/08/11 21:53:04| DNS Socket created at 0.0.0.0, FD 8
 2014/08/11 21:53:04| Adding nameserver 8.8.8.8 from squid.conf
 2014/08/11 21:53:04| Adding nameserver 8.8.4.4 from squid.conf
 2014/08/11 21:53:05| Unlinkd pipe opened on FD 13
 2014/08/11 21:53:05| Local cache digest enabled; rebuild/rewrite every
 3600/3600 sec
 2014/08/11 21:53:05| Store logging disabled
 2014/08/11 21:53:05| Swap maxSize 10240 + 262144 KB, estimated 7897088
 objects
 2014/08/11 21:53:05| Target number of buckets: 394854
 2014/08/11 21:53:05| Using 524288 Store buckets
 2014/08/11 21:53:05| 

Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-06 Thread Igor Novgorodov

Well, English is not my native language either, but that does not hurt much :)

1. Define an access list (a text file with the domains you want to cache, one
domain per line):

acl domains_cache dstdomain /etc/squid/lists/domains_cache.txt

2. Define a parameter that will allow cache for these domains, while 
denying all others:

cache allow domains_cache
cache deny all

That's all, wasn't that difficult :)
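
For instance, /etc/squid/lists/domains_cache.txt could contain (example
domains only; a leading dot also matches subdomains):

.windowsupdate.com
.microsoft.com
download.nvidia.com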

P.S.
The always_direct directive is for something a little different; it's used
with parent proxies, so just use cache.


On 06.08.2014 21:33, nuhll wrote:

Thanks for your answer.

I'll try to get it working, but I'm not sure how. I don't understand this ACL
system. I know there are a lot of tutorials out there, but not in my mother
language, so I'm not able to fully understand such expert things.

Could you maybe show me at least one example of how to get it working? Also,
maybe there are things I can remove?

Here's my current config:

acl localnet src 192.168.0.0
acl all src all
acl localhost src 127.0.0.1

#access_log daemon:/var/log/squid/access.test.log squid

http_port 192.168.0.1:3128 transparent

cache_dir ufs /daten/squid 10 16 256

range_offset_limit 100 MB windowsupdate
maximum_object_size 6000 MB
quick_abort_min -1


# Add one of these lines for each of the websites you want to cache.

refresh_pattern -i
microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
reload-into-ims

refresh_pattern -i
windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
432000 reload-into-ims

refresh_pattern -i
windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
reload-into-ims

#kaspersky update
refresh_pattern -i
geo.kaspersky.com/.*\.(cab|dif|pack|q6v|2fv|49j|tvi|ez5|1nj|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip)
4320 80% 432000 reload-into-ims

#nvidia updates
refresh_pattern -i
download.nvidia.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
432000 reload-into-ims

#java updates
refresh_pattern -i
sdlc-esd.sun.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
432000 reload-into-ims

# DONT MODIFY THESE LINES
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

#kaspersky update
acl kaspersky dstdomain geo.kaspersky.com

acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsupdate dstdomain sls.microsoft.com
acl windowsupdate dstdomain productactivation.one.microsoft.com
acl windowsupdate dstdomain ntservicepack.microsoft.com
acl CONNECT method CONNECT
acl wuCONNECT dstdomain www.update.microsoft.com
acl wuCONNECT dstdomain sls.microsoft.com

http_access allow kaspersky localnet
http_access allow CONNECT wuCONNECT localnet
http_access allow windowsupdate localnet

#test
http_access allow localnet
http_access allow all
http_access allow localhost
  





Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-05 Thread Igor Novgorodov

Piece of cake:

always_direct deny acl_not_direct
always_direct allow all
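
Here acl_not_direct stands for a dstdomain ACL naming the sites you do want
Squid to handle itself; a hypothetical definition would be:

acl acl_not_direct dstdomain .example.com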

On 05.08.2014 23:19, nuhll wrote:

Thanks, but it's not possible to make a list of all the websites I might
visit but don't want to cache xD.

Is there no way to send ALL websites direct EXCEPT only some websites?




Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-04 Thread Amos Jeffries
On 3/08/2014 9:25 p.m., nuhll wrote:
 Seems like acl all src all fixed it. Thanks!
 
 One problem is left. Is it possible to only cache certain websites, while
 the rest are just redirected?

The cache directive is used to tell Squid which transactions are to be
denied storage (deny matches). The rest (allow matches) are cached (or
not) as per the HTTP specification. http://www.squid-cache.org/Doc/config/cache/
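
In configuration form, that is (assuming a dstdomain ACL listing the sites
to cache, as in your config):

  acl domains_cache dstdomain /etc/squid/lists/domains_cache
  cache allow domains_cache
  cache deny all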

Redirection is done with a url_rewrite_program helper, or with a deny_info
entry producing a 30x status and an alternative URL for the client to be
redirected to. Although I guess you used the word redirected to mean
something other than HTTP redirection - so this may not be what you want
to do.
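
For reference, a deny_info sketch (the domain and target URL are
placeholders only); requests matching the ACL are denied and the client is
sent a 302 to the given URL:

  acl redirect_these dstdomain .example.com
  http_access deny redirect_these
  deny_info http://landing.example.net/ redirect_these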

Amos



Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-04 Thread Igor Novgorodov

always_direct directive

On 04.08.2014 22:15, nuhll wrote:

Hello,
you are right. I don't mean a redirect like a 301.

I mean Squid should not touch the website or connection and should just send
it direct to the website, except for some websites which I want to cache.

How do I achieve this?




Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-04 Thread Igor Novgorodov
You should create an access list with the sites that you don't want to cache,
like:

always_direct allow acl_direct_sites

always_direct allow all will make ALL requests go direct, bypassing the cache.
Also see the cache deny directive.
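
A fuller sketch of the above (the domain is a placeholder):

acl acl_direct_sites dstdomain .example.com
always_direct allow acl_direct_sites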


On 04.08.2014 22:25, nuhll wrote:

always_direct allow all
and then my other code, or do I need to add it before?




Re: [squid-users] Re: ONLY Cache certain Websites.

2014-08-03 Thread Amos Jeffries
On 3/08/2014 3:07 a.m., nuhll wrote:
 im not able to fix it.
 
 Normal websites work. But i cant get it to cache (or even allow access to
 Windows Update or Kaspersky).
 
 Whats i am doin wrong?
 
 2014/08/02 17:05:35| The request GET
 http://dnl-16.geo.kaspersky.com/updaters/updater.xml is DENIED, because it
 matched 'localhost'
 2014/08/02 17:05:35| The reply for GET
 http://dnl-16.geo.kaspersky.com/updaters/updater.xml is ALLOWED, because it
 matched 'localhost'
 
 
 2014/08/02 17:06:32| The request CONNECT 62.128.100.41:443 is DENIED,
 because it matched 'localhost'
 2014/08/02 17:06:32| The reply for CONNECT 62.128.100.41:443 is ALLOWED,
 because it matched 'localhost'
 
 
 2014/08/02 17:07:07| The request CONNECT sls.update.microsoft.com:443 is
 DENIED, because it matched 'localhost'
 2014/08/02 17:07:07| The reply for CONNECT sls.update.microsoft.com:443 is
 ALLOWED, because it matched 'localhost'
 

So which access.log lines match these transactions?

 
 my config atm:
 debug_options ALL,1 33,2
 acl localnet src 192.168.0.0
 acl all src 0.0.0.0

1) You are defining the entire Internet to be the single IP address
0.0.0.0 ... which is invalid.

This should be:
   acl all src all

 acl localhost src 127.0.0.1
 
 access_log daemon:/var/log/squid/access.test.log squid
 
 http_port 192.168.0.1:3128 transparent
 
 cache_dir ufs /daten/squid 10 16 256
 
 range_offset_limit 100 MB windowsupdate
 maximum_object_size 6000 MB
 quick_abort_min -1
 
 
 # Add one of these lines for each of the websites you want to cache.
 
 refresh_pattern -i
 microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
 reload-into-ims
 
 refresh_pattern -i
 windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
 432000 reload-into-ims
 
 refresh_pattern -i
 windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
 reload-into-ims
 
 refresh_pattern -i
 geo.kaspersky.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
 432000 reload-into-ims
 
 # DONT MODIFY THESE LINES
 refresh_pattern ^ftp:           1440    20%     10080
 refresh_pattern ^gopher:        1440    0%      1440
 refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
 refresh_pattern .               0       20%     4320
 
 acl kaspersky dstdomain .kaspersky.com
 acl windowsupdate dstdomain windowsupdate.microsoft.com
 acl windowsupdate dstdomain .update.microsoft.com
 acl windowsupdate dstdomain download.windowsupdate.com
 acl windowsupdate dstdomain redir.metaservices.microsoft.com
 acl windowsupdate dstdomain images.metaservices.microsoft.com
 acl windowsupdate dstdomain c.microsoft.com
 acl windowsupdate dstdomain www.download.windowsupdate.com
 acl windowsupdate dstdomain wustat.windows.com
 acl windowsupdate dstdomain crl.microsoft.com
 acl windowsupdate dstdomain sls.microsoft.com
 acl windowsupdate dstdomain productactivation.one.microsoft.com
 acl windowsupdate dstdomain ntservicepack.microsoft.com
 
 acl CONNECT method CONNECT
 acl wuCONNECT dstdomain www.update.microsoft.com
 acl wuCONNECT dstdomain sls.microsoft.com
 
 http_access allow kaspersky localnet
 http_access allow CONNECT wuCONNECT localnet
 http_access allow windowsupdate localnet
 
 http_access allow localnet
 http_access allow localhost
 

The above rule set is equivalent to:
 http_access allow localhost
 http_access deny !localnet
 http_access allow all
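
Spelled out with an explicit final deny (a sketch matching the stock
squid.conf defaults, not necessarily the policy you want):

 http_access allow localhost
 http_access allow localnet
 http_access deny all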

Amos