Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2017-05-11 Thread Omid Kosari
Eliezer Croitoru wrote
> You can try to use the atime and not the mtime.

Each time the fetcher script runs, all of the request files are accessed and
their atime is refreshed.
I think the "request" directory should use "mtime" and the "body" directory
should use "atime".
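That distinction maps directly onto find(1)'s -mtime and -atime tests. A
minimal sketch, assuming the directory layout mentioned later in this thread
(the /cache1/request/v1 and /cache1/body/v1 paths are illustrative):

```shell
# Count request files not MODIFIED in 7 days (atime is useless here, since
# every fetcher run touches it), and body files not ACCESSED in 7 days.
# Missing directories simply yield a count of 0.
find /cache1/request/v1/ -type f -mtime +7 2>/dev/null | wc -l
find /cache1/body/v1/    -type f -atime +7 2>/dev/null | wc -l
```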


Eliezer Croitoru wrote
> It is possible that some fetchers will consume lots of memory and some of
> the requests are indeed un-needed but... don’t delete them.
> Try to archive them and only then remove from them some by their age or
> something similar.
> Once you have the request you have the option to fetch files and since
> it's such a small thing(max 64k per request) it's better to save and
> archive first and later wonder if some file request is missing.

But currently there are more than 23 files in the old request directory.
Maybe the Go garbage collector does not release the memory after
processing each file.
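Eliezer's archive-before-delete advice above can be sketched with GNU tar;
the directory, file names, and 30-day cutoff below are all illustrative:

```shell
# Stand-in for /cache1/request/v1: archive request files untouched for 30
# days into an xz tarball, removing them from the directory afterwards.
REQDIR=$(mktemp -d)
touch -d '40 days ago' "$REQDIR/req-old"   # sample stale request file
touch "$REQDIR/req-new"                    # sample fresh request file
cd "$REQDIR"
find . -maxdepth 1 -type f -mtime +30 -print0 \
  | tar --null -cJf requests-archive.tar.xz --files-from=- --remove-files
ls   # req-old is archived and removed; req-new and the tarball remain
```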



Eliezer Croitoru wrote
> * if you want me to test or analyze your archived requests archive them
> inside a xz and send them over to me.

I sent you the request directory in a previous private email.

Thanks




--
View this message in context: 
http://squid-web-proxy-cache.1019090.n4.nabble.com/Windows-Updates-a-Caching-Stub-zone-A-windows-updates-store-tp4678454p4682360.html
Sent from the Squid - Users mailing list archive at Nabble.com.
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2017-05-10 Thread Omid Kosari
I deleted and recreated the request directory and saw a huge decrease in the
memory usage of the fetcher process.

Did I do the right thing? Is there anything I should do after a while?





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2017-04-24 Thread Omid Kosari
Hello,

Thanks





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2017-04-18 Thread Eliezer Croitoru
Did you get my answer?
You should be able to dispatch more than one fetcher, but you should somehow
manage them and restrict their number and dispatch rate.
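One generic way to bound the number of concurrent fetchers is xargs -P; the
shard numbering and the echo stand-in below are illustrative, not the real
fetcher invocation:

```shell
# Dispatch 8 work items with at most 4 running at once. In a real setup each
# item would be a URL-list shard handed to a separate fetcher process.
seq 0 7 | xargs -P 4 -I{} sh -c 'echo "fetcher working on shard {}"'
```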

Eliezer


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il



-Original Message-
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On Behalf 
Of Omid Kosari
Sent: Saturday, April 15, 2017 2:52 PM
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Windows Updates a Caching Stub zone, A windows 
updates store.

Hello,

I sent the files you mentioned to your email two days ago.

A little more investigation shows that some big files (~2 GB) download slowly
(~100 KBytes/s) while others download much faster. The problem is related to
networking (BGP and IXP), and the fetcher script cannot solve that.

But is there a way to run more than one fetcher script at the same time, to
download in parallel rather than one by one? There is free bandwidth, but the
fetcher script takes a long time on some downloads.

Thanks again for your support





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2017-04-15 Thread Omid Kosari
Hello,

I sent the files you mentioned to your email two days ago.

A little more investigation shows that some big files (~2 GB) download slowly
(~100 KBytes/s) while others download much faster. The problem is related to
networking (BGP and IXP), and the fetcher script cannot solve that.

But is there a way to run more than one fetcher script at the same time, to
download in parallel rather than one by one? There is free bandwidth, but the
fetcher script takes a long time on some downloads.

Thanks again for your support





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2017-04-08 Thread Omid Kosari
Thanks for the reply.


Eliezer Croitoru wrote
> Also what is busy for you?

The fetcher script is always downloading. For example, right now I can see a
fetcher script that has been running for more than 3 days, downloading files
one by one.


Eliezer Croitoru wrote
> Also what is busy for you?
> Are you using a lock file? (It might be possible that your server is
> downloading in some loop, and this is what is causing the load.)

Yes. Everything looks fine in that regard.
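For reference, a lock-file guard usually looks like the flock(1) pattern
below; the lock path and the echo placeholder are illustrative, not the
actual fetch-task.sh contents:

```shell
# Take an exclusive, non-blocking lock; a second copy started while the lock
# is held exits immediately instead of entering a download loop.
(
  flock -n 9 || { echo "another fetcher instance is running"; exit 1; }
  echo "lock acquired, fetching..."   # the real fetcher would run here
) 9>/tmp/wu-fetcher.lock
```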


Eliezer Croitoru wrote
> Did you upgrade to the latest version?

Yes


I will send you the files.

Thanks






Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2017-04-06 Thread Eliezer Croitoru
I am not using it daily, but I know that MS updates support more than one
language, and it's pretty simple to verify the subject.
Just send me the tar of the request and header files privately, and I will
try to see whether something changed.
I do not believe that MS would change their system to such an extent, but I
have no issue looking at the subject and trying to verify what is causing
the load on your side.
Just so you know, it might take me some time to verify the issue.

Also, what does "busy" mean for you?
Are you using a lock file? (It might be possible that your server is
downloading in some loop, and this is what is causing the load.)
Did you upgrade to the latest version?

Eliezer


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il



-Original Message-
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On Behalf 
Of Omid Kosari
Sent: Thursday, April 6, 2017 5:39 PM
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Windows Updates a Caching Stub zone, A windows 
updates store.

Hey Eliezer,

Recently I have found that the fetcher script is very busy and is always
downloading. It seems that Microsoft changed something. I am not sure; it is
just a guess.

What's up at your servers?





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2017-04-06 Thread Omid Kosari
Hey Eliezer,

Recently I have found that the fetcher script is very busy and is always
downloading. It seems that Microsoft changed something. I am not sure; it is
just a guess.

What's up at your servers?





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-09-07 Thread Eliezer Croitoru
Hey Omid,

For now the software is restricted to Windows updates only, which are
protected and secured enough to sustain caching.
About Mozilla, I need to verify it before I do anything about it.
From my point of view, it is hosted on Akamai, and HSTS restricts a couple
of things on their service.
I will try to look at it later, without any promises.

Do you have any starting points other than the domain itself?
Have you tried to analyze some logs?

Eliezer 


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il


-Original Message-
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On Behalf 
Of Omid Kosari
Sent: Tuesday, September 6, 2016 5:48 PM
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Windows Updates a Caching Stub zone, A windows 
updates store.

Hey Eliezer,

According to these threads
http://squid-web-proxy-cache.1019090.n4.nabble.com/range-offset-limit-not-working-as-expected-td4679355.html

http://squid-web-proxy-cache.1019090.n4.nabble.com/TProxy-and-client-dst-passthru-td4670189.html

Is there any chance that you could implement something similar for other
(206 partial-content) popular sites like download.cdn.mozilla.net? I think
it has the same problem as Windows Update, with lots of uncacheable requests.

Thanks in advance.





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-09-06 Thread Omid Kosari
Hey Eliezer,

According to these threads
http://squid-web-proxy-cache.1019090.n4.nabble.com/range-offset-limit-not-working-as-expected-td4679355.html

http://squid-web-proxy-cache.1019090.n4.nabble.com/TProxy-and-client-dst-passthru-td4670189.html

Is there any chance that you could implement something similar for other
(206 partial-content) popular sites like download.cdn.mozilla.net? I think
it has the same problem as Windows Update, with lots of uncacheable requests.

Thanks in advance.





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-25 Thread Eliezer Croitoru
Hey Omid,

I will comment inline; there are a couple of details we need to clarify.


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il


-Original Message-
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On Behalf 
Of Omid Kosari
Sent: Monday, July 25, 2016 12:15 PM
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Windows Updates a Caching Stub zone, A windows 
updates store.

Hi,

Thanks for support .

Recently I have seen a problem with version beta 0.2: when the fetcher is
working, the kernel logs many of the following errors:
TCP: out of memory -- consider tuning tcp_mem

# To verify the actual status we need the output of:
$ free -m
$ cat /proc/sys/net/ipv4/tcp_mem
$ top -n1 -b
$ cat /proc/net/sockstat
$ cat /proc/sys/net/ipv4/tcp_max_orphans 
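On Linux, the two orphan-related numbers from that list can be compared in
one short sketch (reads /proc, so Linux-only):

```shell
# Current orphaned TCP sockets vs. the kernel's tcp_max_orphans limit.
# "TCP: out of memory" typically appears when the first approaches the second.
orphans=$(awk '/^TCP:/ { for (i = 1; i <= NF; i++) if ($i == "orphan") print $(i+1) }' /proc/net/sockstat)
max=$(cat /proc/sys/net/ipv4/tcp_max_orphans)
echo "orphans: $orphans / max: $max"
```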

I think the problem is the orphaned connections which I mentioned before.
I managed to try the new version to see what happens.

# If you have orphaned connections on the machine, with or without the MS
updates proxy, you should consider analyzing the machine's structure and
load in general.
If there are indeed orphan connections, we need to verify whether they come
from Squid, from my service, or from the combination of the two.


Also, I have a feature request: please provide a configuration file, for
example in /etc/foldername or even beside the binaries, with selective
options for both the fetcher and the logger.

# With what options for the logger and fetcher?

I have seen the following changelog entry:
beta 0.3 - 19/07/2016
+ Upgraded the fetcher to honour private and no-store cache-control headers
when fetching objects.

From my point of view, more hits are better, and there is no problem storing
private and no-store objects if it helps achieve more hits and more
bandwidth saving. So it would be nice to have an option in the mentioned
config file to change this myself.

# I understand your way of looking at things, but this is a very wrong way
to look at a cache and store.
The problem with storing private and no-store responses is very simple:
these files are temporary and exist for one request only (in most cases).
Specifically for MS this is true, and they do not use private files more
than once.
I do not wish to offend you or anyone by not honoring such a request, but
since it's a public service, this is its definition.
If you want to see the options of the fetcher and the service, just add the
"-h" option.

I have considered using a log file, but I have yet to settle on a specific
format that I want to work with.
I will try to see what can be done with log files and also what should be
done to handle log rotation.

Thanks again


## Resources
* http://blog.tsunanet.net/2011/03/out-of-socket-memory.html



Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-25 Thread Omid Kosari
Hi,

Thanks for support .

Recently I have seen a problem with version beta 0.2: when the fetcher is
working, the kernel logs many of the following errors:
TCP: out of memory -- consider tuning tcp_mem

I think the problem is the orphaned connections which I mentioned before.
I managed to try the new version to see what happens.

Also, I have a feature request: please provide a configuration file, for
example in /etc/foldername or even beside the binaries, with selective
options for both the fetcher and the logger.

I have seen the following changelog entry:
beta 0.3 - 19/07/2016
+ Upgraded the fetcher to honour private and no-store cache-control headers
when fetching objects.

From my point of view, more hits are better, and there is no problem storing
private and no-store objects if it helps achieve more hits and more
bandwidth saving. So it would be nice to have an option in the mentioned
config file to change this myself.

Thanks again





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-20 Thread Eliezer Croitoru
Hey Omid,

After inspecting more data, I have seen that there are a couple of cases
which result in disk space consumption.
Windows Update supports a variety of languages; when you have more than one
or two languages, the amount of cache changes rapidly.
To give some numbers:
- Each Windows version has multiple editions (Starter, Home, Professional,
Enterprise, ...)
- Each CPU arch requires its own updates (x86, x64)
- Each Windows version can have a big update for multiple languages,
depending on the locality of the system
- Each Windows product, such as Office, has its own language packs and
updates (some updates are huge)

Since I am not one of Microsoft's engineers or product/updates managers, I
cannot guarantee that my understanding of the subject is rock solid.
On the other hand, since I do have a background in HTTP and its structure,
I can offer some assurance that my research can be understood by most, if
not all, HTTP experts.

Squid by its nature honors specific caching rules, and these are very
general.
To my understanding, Squid was not built to satisfy every use case, but it
helps with many of them.
Since you also noticed that Windows updates can consume lots of disk space,
what you mentioned about last-accessed time seems pretty reasonable for a
cache.
You have the choice of how to manage your store/cache according to whatever
is required.
For example, the command:
find /cache1/body/v1/ -atime +7 -type f | wc -l

should give you some details about the files which were not accessed in the
last week.
We can try to enhance the above command/idea to calculate statistics in a
way that helps us get an idea of which files or updates are downloaded
periodically.
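That enhancement might look like the sketch below; the path is the one used
in this thread, and a missing directory simply reports zero:

```shell
# Count and total size (MB) of body files not accessed in the last week.
find /cache1/body/v1/ -atime +7 -type f -printf '%s\n' 2>/dev/null \
  | awk '{ n++; s += $1 } END { printf "%d files, %.1f MB stale for 7+ days\n", n, s/1048576 }'
```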
Currently, only through the request files can we understand which responses
belong to which requests.

Let me know if you want me to compose a script that will help you decide
which files to purge. (I will probably write it in Ruby.)
There is an option to "blacklist" a response from being fetched by the
fetcher or used by the web service, but you will need to update to the
latest version of the fetcher and use the right CLI option (I don't remember
it now), or run the command under a "true" pipe such as
"true | /location/fetcher ..." to avoid the "pause" it would otherwise cause.

Thanks,
Eliezer


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il


-Original Message-
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On Behalf 
Of Omid Kosari
Sent: Tuesday, July 19, 2016 1:59 PM
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Windows Updates a Caching Stub zone, A windows 
updates store.

Eliezer Croitoru-2 wrote
> Hey Omid,
> 
> Indeed my preference is that if you can ask, ask, and I will try to give
> you a couple more details on the service and the subject.

Hey Eliezer,


4. Current storage capacity is 500G and more than 50% of it is already full
and growing fast. Is there any mechanism for garbage collection in your
code? If not, is it a good idea to remove files based on last access time
(ls -ltu /cache1/body/v1/)? Should I also delete old files from the header
and request folders?






Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-19 Thread Amos Jeffries
On 19/07/2016 10:58 p.m., Omid Kosari wrote:
> Eliezer Croitoru-2 wrote
>> Hey Omid,
>>
>> Indeed my preference is that if you can ask, ask, and I will try to give
>> you a couple more details on the service and the subject.
> 
> Hey Eliezer,
> 
> 1. I have had refresh patterns since before your code. Currently I prefer
> not to store Windows updates in Squid's internal storage because of
> deduplication. Now what should I do? Delete this refresh pattern, or even
> create a pattern not to cache Windows updates?
> 
> refresh_pattern -i
> (microsoft|windowsupdate)\.com/.*?\.(cab|exe|dll|ms[iuf]|asf|wm[va]|dat|zip|iso|psf)$
> 10080 100% 172800 ignore-no-store ignore-reload ignore-private
> ignore-must-revalidate override-expire override-lastmod
> 

Either;
  cache deny ...

Or (if your Squid supports it)

  store_miss deny ...


The cache ACLs are again request-only ones. So based on dstdomain of WU
services.

The store_miss ACLs can be based on request or reply. So nice things
like reply Content-Type header etc. can be used.


If your refresh_pattern causes something to be a HIT in cache, then the
store_miss stuff will never happen of course.

Likewise, if the store_miss prevents something being added to cache the
refresh_pattern will not then be able to have any effect on its cache entry.
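Put together, the no-cache variant described here could look like the
fragment below; the ACL name is illustrative, and store_miss needs a Squid
version that supports it:

```
# Keep WU objects out of Squid's own cache so only the peer store holds them.
acl wu_nocache dstdomain .download.windowsupdate.com
cache deny wu_nocache
# On Squid releases with store_miss support:
store_miss deny wu_nocache
```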



> 2. Is the position of your squid config important to prevent logical
> conflicts? For example, should it come before the above refresh patterns
> to prevent deduplication?
> 
> acl wu dstdom_regex \.download\.windowsupdate\.com$
> acl wu-rejects dstdom_regex stats
> acl GET method GET
> cache_peer 127.0.0.1 parent 8080 0 proxy-only no-tproxy no-digest no-query
> no-netdb-exchange name=ms1
> cache_peer_access ms1 allow GET wu !wu-rejects
> cache_peer_access ms1 deny all
> never_direct allow GET wu !wu-rejects
> never_direct deny all


For these directives, ordering is relevant only with regard to other lines
of the same directive name.

The exception is cache_peer_access, where the peer name field defines which
lines form a sequential group, and the cache_peer definition line must come
first.


> 
> 3. Is it a good idea to change your squid config as below to get more
> hits? Or maybe it is a big mistake!
> 
> acl msip dst 13.107.4.50
> acl wu dstdom_regex \.download\.windowsupdate\.com$
> \.download\.microsoft\.com$
> acl wu-rejects dstdom_regex stats
> acl GET method GET
> cache_peer 127.0.0.1 parent 8080 0 proxy-only no-tproxy no-digest no-query
> no-netdb-exchange name=ms1
> cache_peer_access ms1 allow GET wu !wu-rejects
> cache_peer_access ms1 allow GET msip !wu-rejects
> cache_peer_access ms1 deny all
> never_direct allow GET wu !wu-rejects
> never_direct allow GET msip !wu-rejects
> never_direct deny all


Your question here is not clear. None of this config is directly related to
HITs. With Eliezer's setup, HITs are an intentional by-product of the
manipulation happening in the peer.
So you either use the peer and get whatever HITs it causes, or you don't.

> 
> 4. Current storage capacity is 500G and more than 50% of it is already
> full and growing fast. Is there any mechanism for garbage collection in
> your code? If not, is it a good idea to remove files based on last access
> time (ls -ltu /cache1/body/v1/)? Should I also delete old files from the
> header and request folders?
> 

I'll leave that to Eliezer to answer.

Amos



Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-19 Thread Omid Kosari
Eliezer Croitoru-2 wrote
> Hey Omid,
> 
> Indeed my preference is that if you can ask, ask, and I will try to give
> you a couple more details on the service and the subject.

Hey Eliezer,

1. I have had refresh patterns since before your code. Currently I prefer
not to store Windows updates in Squid's internal storage because of
deduplication. Now what should I do? Delete this refresh pattern, or even
create a pattern not to cache Windows updates?

refresh_pattern -i
(microsoft|windowsupdate)\.com/.*?\.(cab|exe|dll|ms[iuf]|asf|wm[va]|dat|zip|iso|psf)$
10080 100% 172800 ignore-no-store ignore-reload ignore-private
ignore-must-revalidate override-expire override-lastmod

2. Is the position of your squid config important to prevent logical
conflicts? For example, should it come before the above refresh patterns to
prevent deduplication?

acl wu dstdom_regex \.download\.windowsupdate\.com$
acl wu-rejects dstdom_regex stats
acl GET method GET
cache_peer 127.0.0.1 parent 8080 0 proxy-only no-tproxy no-digest no-query
no-netdb-exchange name=ms1
cache_peer_access ms1 allow GET wu !wu-rejects
cache_peer_access ms1 deny all
never_direct allow GET wu !wu-rejects
never_direct deny all

3. Is it a good idea to change your squid config as below to get more hits?
Or maybe it is a big mistake!

acl msip dst 13.107.4.50
acl wu dstdom_regex \.download\.windowsupdate\.com$
\.download\.microsoft\.com$
acl wu-rejects dstdom_regex stats
acl GET method GET
cache_peer 127.0.0.1 parent 8080 0 proxy-only no-tproxy no-digest no-query
no-netdb-exchange name=ms1
cache_peer_access ms1 allow GET wu !wu-rejects
cache_peer_access ms1 allow GET msip !wu-rejects
cache_peer_access ms1 deny all
never_direct allow GET wu !wu-rejects
never_direct allow GET msip !wu-rejects
never_direct deny all

4. Current storage capacity is 500G and more than 50% of it is already full
and growing fast. Is there any mechanism for garbage collection in your
code? If not, is it a good idea to remove files based on last access time
(ls -ltu /cache1/body/v1/)? Should I also delete old files from the header
and request folders?






Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-19 Thread Omid Kosari
Also, I have seen that someone else successfully did something like this
(not exactly the same) in this thread:
http://squid-web-proxy-cache.1019090.n4.nabble.com/cache-peer-hit-miss-and-reject-td4661928.html





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-18 Thread Omid Kosari
Alex Rousskov wrote
> On 07/18/2016 05:39 AM, Omid Kosari wrote:
> 
>> acl mshit rep_header X-SHMSCDN HIT
>> clientside_tos 0x30 mshit
> 
> You cannot use response-based ACLs like rep_header with clientside_tos.
> That directive is currently evaluated only at request processing time,
> before there is a response.
> 
>> 2016/07/18 16:26:31.927 kid1| WARNING: mshit ACL is used in context
>> without
>> an HTTP response. Assuming mismatch.
> 
> ... which is what Squid is trying to tell you.
> 
> 
> HTH,
> 
> Alex.
> 

Apart from that, can you confirm that we may use a custom header in
rep_header?
Also, the problem is that the mshit ACL never matches at all.





Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-18 Thread Alex Rousskov
On 07/18/2016 05:39 AM, Omid Kosari wrote:

> acl mshit rep_header X-SHMSCDN HIT
> clientside_tos 0x30 mshit

You cannot use response-based ACLs like rep_header with clientside_tos.
That directive is currently evaluated only at request processing time,
before there is a response.

> 2016/07/18 16:26:31.927 kid1| WARNING: mshit ACL is used in context without
> an HTTP response. Assuming mismatch.

... which is what Squid is trying to tell you.


HTH,

Alex.



Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-18 Thread Eliezer Croitoru
About the mismatch log output, I cannot say a thing since I have not
researched it.
As for an option to add a HIT header, you can use the following script:
https://gist.github.com/elico/ac58073812b8cad14ef154d8730e22cb

Eliezer


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il


-Original Message-
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On Behalf 
Of Omid Kosari
Sent: Monday, July 18, 2016 2:39 PM
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Windows Updates a Caching Stub zone, A windows 
updates store.

Dear Eliezer,

Unfortunately, no success. I will describe what I did; maybe I missed
something.

I ran the command
perl -pi -e '$/=""; s/\r\n\r\n/\r\nX-SHMSCDN: HIT\r\n\r\n/;' 
/cache1/header/v1/*

and verified that the text was injected correctly.

Squid config:

acl mshit rep_header X-SHMSCDN HIT
clientside_tos 0x30 mshit

But I repeatedly got the following log:
2016/07/18 16:26:31.927 kid1| WARNING: mshit ACL is used in context without an 
HTTP response. Assuming mismatch.
2016/07/18 16:26:31.927 kid1| 28,3| Acl.cc(158) matches: checked: mshit = 0

One more thing: as I am not so familiar with Perl, may I ask you to please
edit it to skip files which already contain the text?

Thanks






Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-18 Thread Omid Kosari
Dear Eliezer,

Unfortunately, no success. I will describe what I did; maybe I missed
something.

I ran the command
perl -pi -e '$/=""; s/\r\n\r\n/\r\nX-SHMSCDN: HIT\r\n\r\n/;' 
/cache1/header/v1/*

and verified that the text was injected correctly.

Squid config:

acl mshit rep_header X-SHMSCDN HIT
clientside_tos 0x30 mshit

But I repeatedly got the following log:
2016/07/18 16:26:31.927 kid1| WARNING: mshit ACL is used in context without
an HTTP response. Assuming mismatch.
2016/07/18 16:26:31.927 kid1| 28,3| Acl.cc(158) matches: checked: mshit = 0

One more thing: as I am not so familiar with Perl, may I ask you to please
edit it to skip files which already contain the text?

Thanks






Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-17 Thread Omid Kosari
Apart from my previous email: this may or may not be a bug, but the fetcher
does not release open files/sockets.
Its number of open files just grows. I have currently added 'ulimit 65535' at
line 4 of fetch-task.sh to see what happens; before that, the process was killed.
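To watch a descriptor leak like this, the count of open files for a PID can be read from /proc on Linux (a sketch; the current shell's PID $$ stands in for the fetcher's PID, which could come from e.g. pgrep):

```shell
# Count open file descriptors for a given PID via /proc (Linux only).
# For the fetcher, replace $$ with its PID, e.g. pid=$(pgrep -f fetch-task).
pid=$$
fd_count=$(ls "/proc/$pid/fd" | wc -l)
echo "open fds for $pid: $fd_count"
```

Sampling this in a loop while the fetcher runs would show whether the count grows without bound.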



--
View this message in context: 
http://squid-web-proxy-cache.1019090.n4.nabble.com/Windows-Updates-a-Caching-Stub-zone-A-windows-updates-store-tp4678454p4678536.html
Sent from the Squid - Users mailing list archive at Nabble.com.
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-17 Thread Omid Kosari
It looks like the person in that thread has the same request as I do. 

http://squid-web-proxy-cache.1019090.n4.nabble.com/cache-peer-communication-about-HIT-MISS-between-squid-and-and-non-squid-peer-td4600931.html



--
View this message in context: 
http://squid-web-proxy-cache.1019090.n4.nabble.com/Windows-Updates-a-Caching-Stub-zone-A-windows-updates-store-tp4678454p4678532.html
Sent from the Squid - Users mailing list archive at Nabble.com.
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-16 Thread Eliezer Croitoru
Hey Omid,

1. You should understand what you are doing and not blindly fetch downloads.
The estimate is that you will need a maximum of 100GB of storage for the whole 
"store" for a period of time.
This is also because the Microsoft Windows Update service will not 
download files without a need.
The fetcher should help you download periodic updates, but I assume that 
the updates have a limit... You should consider asking MS what is expected 
to be in the downloads, or when downloads happen.

2. If you need more than one location you should use a logical volume instead 
of spreading the store manually over more than one disk.
This is based on the basic understanding that the service is a "web service" 
which serves files, and you should treat it the same way as any other.
When I run a web service and need more than one disk I do not rush to 
"spread" it manually, but use OS-level tools.
I trust the OS and the logical volume management tools to do their work 
properly. If I ever lose my trust in them I will stop using this OS; it is 
as simple as that.
3. The HITs are counted, but I need to dig into the code to verify how a HIT is 
logged and how it can be counted manually.
QOS or TOS, marked by what? How?
The service has one way in and one way out.
If the requested file is in the store you will not see outgoing traffic for the 
file.
The right way to show a HIT in this service is to change the response-headers 
file to carry another header.
This could be done manually using a tiny script, but not as a part of the store 
software.
An example of such an addition would be:
# perl -pi -e '$/=""; s/\r\n\r\n/\r\nX-Store-Hit: HIT\r\n\r\n/;' /var/storedata/header/v1/fff8db4723842074ab8d8cc4ad20a0f97d47f6d849149c81c4e52abc727d43b5

This will change the response headers, and they can then be seen in a squid 
access.log using a custom log format.
I can think of other ways to report this, but a question:
if it works as expected, and is expected to always work, why would you want to 
see the HIT in a QOS or TOS mark?
Manipulating QOS or TOS at the socket level would require me to find a way to hack 
the simple web service, and I probably won't go that way.
I do know that you can manipulate QOS or TOS in squid if some 
header exists in the response.
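As a minimal squid.conf sketch of the access.log approach (the header name follows the example above; the log path and format name are assumptions):

```
# Custom access.log format that appends the injected reply header, so
# store HITs can be counted offline, e.g. with grep -c 'HIT' access.log.
logformat storehit %ts.%03tu %6tr %>a %Ss/%03>Hs %<st %rm %ru %{X-Store-Hit}<h
access_log /var/log/squid/access.log storehit
```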

I might be able to look at the subject if there is a real 
technical/functional need for it in long-term usage.

Eliezer


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il


-Original Message-
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On Behalf 
Of Omid Kosari
Sent: Friday, July 15, 2016 8:48 PM
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Windows Updates a Caching Stub zone, A windows 
updates store.

Hi,

Questions
1- What happens if the disk or partition becomes full?
2- Is there a way to use more than one location for the store?
3- Currently, hits served by your code cannot be counted. How can I use QoS
flows / TOS to mark those hits?



--
View this message in context: 
http://squid-web-proxy-cache.1019090.n4.nabble.com/Windows-Updates-a-Caching-Stub-zone-A-windows-updates-store-tp4678454p4678524.html
Sent from the Squid - Users mailing list archive at Nabble.com.
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users

___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-15 Thread Omid Kosari
Hi,

Questions
1- What happens if the disk or partition becomes full?
2- Is there a way to use more than one location for the store?
3- Currently, hits served by your code cannot be counted. How can I use QoS
flows / TOS to mark those hits?



--
View this message in context: 
http://squid-web-proxy-cache.1019090.n4.nabble.com/Windows-Updates-a-Caching-Stub-zone-A-windows-updates-store-tp4678454p4678524.html
Sent from the Squid - Users mailing list archive at Nabble.com.
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-14 Thread Eliezer Croitoru
Hey Omid,

The key concept is that it is possible, but not always worth the effort.
I have tested it to work for Windows 10 and a couple of other platforms, but I 
haven't verified how it will react to every version of Windows 7.
I have tested how things work with WSUSOFFLINE, and you will need to change the 
dstdomain ACL into the regex form:
acl wu dstdom_regex download\.windowsupdate\.com$ download\.microsoft\.com$

Now you need my latest updated version in order to avoid caching of MS 
AV updates, which are critical and should never be cached for more than 1 hour.

You can try to "seed" the cache using a client running WSUSOFFLINE, but 
to my understanding it's not required, since you would store more than you 
actually need.
If one user downloads an ancient or special update you don't need it 
stored, unless you can predict it will be used/downloaded a lot.

Let me know if you need some help with it.

Eliezer


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il


-Original Message-
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On Behalf 
Of Omid Kosari
Sent: Thursday, July 14, 2016 2:59 PM
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Windows Updates a Caching Stub zone, A windows 
updates store.

Hi,

Great idea. I was looking for something like this for years and I was too
lazy to start it myself ;)

I am going to test your code in a multi-thousand-client ISP.

It would be even better to use the experience of http://www.wsusoffline.net/,
especially for your fetcher. It is GPL.

Also, the IP address 13.107.4.50 is mainly used by Microsoft for its download
services. With services like
https://www.virustotal.com/en-gb/ip-address/13.107.4.50/information/ we have
found that other domains are also used for update/download services. It might
be worth creating special handling for this IP address.

Thanks in advance



--
View this message in context: 
http://squid-web-proxy-cache.1019090.n4.nabble.com/Windows-Updates-a-Caching-Stub-zone-A-windows-updates-store-tp4678454p4678492.html
Sent from the Squid - Users mailing list archive at Nabble.com.
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users

___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-14 Thread Omid Kosari
Hi,

Great idea. I was looking for something like this for years and I was too
lazy to start it myself ;)

I am going to test your code in a multi-thousand-client ISP.

It would be even better to use the experience of http://www.wsusoffline.net/,
especially for your fetcher. It is GPL.

Also, the IP address 13.107.4.50 is mainly used by Microsoft for its download
services. With services like
https://www.virustotal.com/en-gb/ip-address/13.107.4.50/information/ we have
found that other domains are also used for update/download services. It might
be worth creating special handling for this IP address.

Thanks in advance



--
View this message in context: 
http://squid-web-proxy-cache.1019090.n4.nabble.com/Windows-Updates-a-Caching-Stub-zone-A-windows-updates-store-tp4678454p4678492.html
Sent from the Squid - Users mailing list archive at Nabble.com.
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


[squid-users] Windows Updates a Caching Stub zone, A windows updates store.

2016-07-10 Thread Eliezer Croitoru
Windows Updates a Caching Stub zone
 

I have been working for quite some time trying to see if it is possible to
cache windows updates using Squid.
I have seen it is possible but to test a concept I wrote a small proxy and a
helper tool.
The tools are a Proof Of Concept and an almost full implementation of the
idea.
I consider it a Squid Helper tool.

Feel free to use the tool and if you need any help using it just contact me
here or off list.

Eliezer


Eliezer Croitoru  
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il
 

___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


RE: [squid-users] Windows Updates on 3.2.6

2013-01-31 Thread Dave Burkholder
Are there any comments here? I've tried adding the following options from 
http://wiki.squid-cache.org/SquidFaq/WindowsUpdate (even though I don't 
especially want to cache updates)

range_offset_limit -1
maximum_object_size 200 MB
quick_abort_min -1

No joy. I've tried transparent & standard proxy modes. Not using authentication 
anywhere. I've now tested on 4 LANs behind Squid 3.2.6 on CentOS 5 & 6 machines 
and WU isn't working on any of them.

On one machine I downgraded to 3.2.0.18 and was able to get WU to work. Was 
there a regression since 3.2.0.18?

Thanks,

Dave


-Original Message-
From: Dave Burkholder 
Sent: Wednesday, January 30, 2013 9:09 PM
To: squid-users@squid-cache.org
Subject: [squid-users] Windows Updates on 3.2.6

Hello everyone,

I've upgraded a number of machines from 3.1.12 to squid 3.2.6. Since then, 
Windows Updates haven't completed and I'm totally scratching my head.


Has anyone else experienced this problem? (I'm including my config file below.) 
Or have some ACLs or defaults changed in 3.2.x that might be triggering this?



Thanks,

Dave

 

#
# Recommended minimum configuration:
#
# webconfig: acl_start
acl webconfig_lan src 192.168.0.0/16 10.0.0.0/8
acl webconfig_to_lan dst 192.168.0.0/16 10.0.0.0/8
# webconfig: acl_end

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed

acl SSL_ports port 443
acl SSL_ports port 81 83 1 # Webconfig / Webmail / Webmin
acl Safe_ports port 80		# http
acl Safe_ports port 21		# ftp
acl Safe_ports port 443		# https
acl Safe_ports port 70		# gopher
acl Safe_ports port 210		# wais
acl Safe_ports port 1025-65535	# unregistered ports
acl Safe_ports port 280		# http-mgmt
acl Safe_ports port 488		# gss-http
acl Safe_ports port 591		# filemaker
acl Safe_ports port 777		# multiling http
acl Safe_ports port 81 83 1	# Webconfig / Webmail / Webmin
acl CONNECT method CONNECT

#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
http_access allow webconfig_to_lan

# Deny requests to certain unsafe ports
http_access deny !Safe_ports

# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on localhost is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS

# Example rule allowing access from your local networks.
# from where browsing should be allowed
http_access allow localhost

# And finally deny all other access to this proxy
http_access allow webconfig_lan
http_access deny all

# Squid normally listens to port 3128
http_port 3128

# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?

# Uncomment and adjust the following to add a disk cache directory.
cache_dir ufs /var/spool/squid 2048 16 256

# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid

follow_x_forwarded_for allow localhost

# Add any of your own refresh_pattern entries above these.
refresh_pattern ^ftp:		1440	20%	10080
refresh_pattern ^gopher:	1440	0%	1440
refresh_pattern -i (/cgi-bin/|\?)	0	0%	0
refresh_pattern .		0	20%	4320
redirect_program /usr/sbin/adzapper
maximum_object_size 51200 KB




Re: [squid-users] Windows Updates on 3.2.6

2013-01-31 Thread Eliezer Croitoru

Squid access logs?
What is the exact problem? Can you not download at all, or is there some other problem?
Please share your squid.conf and squid -v output.
Where did you get your RPM? From my repo?

Please share more info and if you can get tcpdump output this can really 
help to find your problem.
Note that I am using 3.2.6 + 3.3 + 3.HEAD on CentOS 6 and it works fine 
with Windows updates as of right now.


Regards,
Eliezer

On 1/31/2013 4:54 PM, Dave Burkholder wrote:

Are there any comments here? I've tried adding the following options 
from http://wiki.squid-cache.org/SquidFaq/WindowsUpdate (even though I don't 
especially want to cache updates)

range_offset_limit -1
maximum_object_size 200 MB
quick_abort_min -1

No joy. I've tried transparent & standard proxy modes. Not using authentication 
anywhere. I've now tested on 4 LANs behind Squid 3.2.6 on CentOS 5 & 6 machines and 
WU isn't working on any of them.

On one machine I downgraded to 3.2.0.18 and was able to get WU to work. Was 
there a regression since 3.2.0.18?

Thanks,

Dave


RE: [squid-users] Windows Updates on 3.2.6

2013-01-31 Thread Dave Burkholder
Hello Eliezer,

Thank you for your reply. My exact problem is that Windows Updates do not 
install or even download at all.

The squid RPMs were built by my partner in 2 architectures: Centos 5 i386 and 
Centos 6 x86_64. Same nonfunctioning behavior in both. 

I didn't realize you had a squid repo; I'd be glad to try your builds if 
they're compatible. Where is your repo hosted?


I had included the conf file in my first email, but a link would be better:

www.thinkwelldesigns.com/squid_conf.txt 


###
squid -v: (Centos 6 x86_64)
---
Squid Cache: Version 3.2.6
configure options:  '--host=x86_64-unknown-linux-gnu' 
'--build=x86_64-unknown-linux-gnu' '--program-prefix=' '--prefix=/usr' 
'--exec-prefix=/usr' '--bindir=/usr/bin' '--sbindir=/usr/sbin' 
'--sysconfdir=/etc' '--datadir=/usr/share' '--includedir=/usr/include' 
'--libdir=/usr/lib64' '--libexecdir=/usr/libexec' '--sharedstatedir=/var/lib' 
'--mandir=/usr/share/man' '--infodir=/usr/share/info' '--exec_prefix=/usr' 
'--libexecdir=/usr/lib64/squid' '--localstatedir=/var' 
'--datadir=/usr/share/squid' '--sysconfdir=/etc/squid' 
'--with-logdir=$(localstatedir)/log/squid' 
'--with-pidfile=$(localstatedir)/run/squid.pid' '--disable-dependency-tracking' 
'--enable-arp-acl' '--enable-follow-x-forwarded-for' '--enable-auth' 
'--enable-auth-basic=DB,LDAP,MSNT,MSNT-multi-domain,NCSA,NIS,PAM,POP3,RADIUS,SASL,SMB,getpwnam'
 '--enable-auth-ntlm=smb_lm,fake' '--enable-auth-digest=file,LDAP,eDirectory' 
'--enable-auth-negotiate=kerberos' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' 
'--enable-cache-digests' '--enable-cachemgr-hostname=localhost' 
'--enable-delay-pools' '--enable-epoll' '--enable-http-violations' 
'--enable-icap-client' '--enable-ident-lookups' '--enable-linux-netfilter' 
'--enable-referer-log' '--enable-removal-policies=heap,lru' '--enable-snmp' 
'--enable-ssl' '--enable-ssl-crtd' '--enable-storeio=aufs,diskd,ufs' 
'--enable-useragent-log' '--enable-wccpv2' '--enable-esi' '--enable-ecap' 
'--with-aio' '--with-default-user=squid' '--with-filedescriptors=16384' 
'--with-dl' '--with-openssl' '--with-pthreads' 
'build_alias=x86_64-unknown-linux-gnu' 'host_alias=x86_64-unknown-linux-gnu' 
'CFLAGS=-O2 -g -fpie' 'CXXFLAGS=-O2 -g -fpie' 
'PKG_CONFIG_PATH=/usr/lib64/pkgconfig:/usr/share/pkgconfig'

###
squid -v: (Centos 5 i386)
---
Squid Cache: Version 3.2.6
configure options:  '--host=i686-redhat-linux-gnu' 
'--build=i686-redhat-linux-gnu' '--target=i386-redhat-linux' 
'--program-prefix=' '--prefix=/usr' '--exec-prefix=/usr' '--bindir=/usr/bin' 
'--sbindir=/usr/sbin' '--sysconfdir=/etc' '--datadir=/usr/share' 
'--includedir=/usr/include' '--libdir=/usr/lib' '--libexecdir=/usr/libexec' 
'--sharedstatedir=/usr/com' '--mandir=/usr/share/man' 
'--infodir=/usr/share/info' '--exec_prefix=/usr' '--libexecdir=/usr/lib/squid' 
'--localstatedir=/var' '--datadir=/usr/share/squid' '--sysconfdir=/etc/squid' 
'--with-logdir=$(localstatedir)/log/squid' 
'--with-pidfile=$(localstatedir)/run/squid.pid' '--disable-dependency-tracking' 
'--enable-arp-acl' '--enable-follow-x-forwarded-for' '--enable-auth' 
'--enable-auth-basic=LDAP,MSNT,NCSA,PAM,SMB,YP,getpwnam,multi-domain-NTLM,SASL,DB,POP3,squid_radius_auth'
 '--enable-auth-ntlm=smb_lm,no_check,fakeauth' 
'--enable-auth-digest=password,ldap,eDirectory' '--enable-auth-negotiate=squid_kerb_auth' 
'--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group'
 '--enable-cache-digests' '--enable-cachemgr-hostname=localhost' 
'--enable-delay-pools' '--enable-epoll' '--enable-icap-client' 
'--enable-ident-lookups' '--with-large-files' '--enable-linux-netfilter' 
'--enable-referer-log' '--enable-removal-policies=heap,lru' '--enable-snmp' 
'--enable-ssl' '--enable-ssl-crtd' '--enable-storeio=aufs,diskd,ufs' 
'--enable-underscores' '--enable-useragent-log' '--enable-wccpv2' 
'--enable-esi' '--with-aio' '--with-default-user=squid' 
'--with-filedescriptors=16384' '--with-dl' '--with-openssl' '--with-pthreads' 
'--with-winbind-auth-challenge' '--enable-http-violations' 'CFLAGS=-march=i686' 
'CXXFLAGS=-march=i686' 'build_alias=i686-redhat-linux-gnu' 
'host_alias=i686-redhat-linux-gnu' 'target_alias=i386-redhat-linux' 
'LDFLAGS=-pie' 'PKG_CONFIG_PATH=/usr/lib/pkgconfig:/usr/share/pkgconfig' 
--enable-ltdl-convenience

I'll send access.log data in a few minutes.

Again, thank you so much for your reply.

Dave


-Original Message-
From: Eliezer Croitoru [mailto:elie...@ngtech.co.il] 
Sent: Thursday, January 31, 2013 10:04 AM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Windows Updates on 3.2.6

Squid access logs?
What is the exact problem? you can't download

RE: [squid-users] Windows Updates on 3.2.6

2013-01-31 Thread Dave Burkholder
Here are links to squid access.log

www.thinkwelldesigns.com/access_log.txt 

And tcpdump for 10.0.2.150

www.thinkwelldesigns.com/tcpdump.zip 

Thanks,

Dave

-Original Message-
From: Dave Burkholder 
Sent: Thursday, January 31, 2013 10:29 AM
To: Eliezer Croitoru; squid-users@squid-cache.org
Subject: RE: [squid-users] Windows Updates on 3.2.6

Hello Eliezer,

Thank you for your reply. My exact problem is that Windows Updates do not 
install or even download at all.

The squid RPMs were built by my partner in 2 architectures: Centos 5 i386 and 
Centos 6 x86_64. Same nonfunctioning behavior in both. 

I didn't realize you had a squid repo; I'd be glad to try your builds if 
they're compatible. Where is your repo hosted?


I had included the conf file in my first email, but a link would be better:

www.thinkwelldesigns.com/squid_conf.txt 


###
squid -v: (Centos 6 x86_64)
---
Squid Cache: Version 3.2.6
configure options:  '--host=x86_64-unknown-linux-gnu' 
'--build=x86_64-unknown-linux-gnu' '--program-prefix=' '--prefix=/usr' 
'--exec-prefix=/usr' '--bindir=/usr/bin' '--sbindir=/usr/sbin' 
'--sysconfdir=/etc' '--datadir=/usr/share' '--includedir=/usr/include' 
'--libdir=/usr/lib64' '--libexecdir=/usr/libexec' '--sharedstatedir=/var/lib' 
'--mandir=/usr/share/man' '--infodir=/usr/share/info' '--exec_prefix=/usr' 
'--libexecdir=/usr/lib64/squid' '--localstatedir=/var' 
'--datadir=/usr/share/squid' '--sysconfdir=/etc/squid' 
'--with-logdir=$(localstatedir)/log/squid' 
'--with-pidfile=$(localstatedir)/run/squid.pid' '--disable-dependency-tracking' 
'--enable-arp-acl' '--enable-follow-x-forwarded-for' '--enable-auth' 
'--enable-auth-basic=DB,LDAP,MSNT,MSNT-multi-domain,NCSA,NIS,PAM,POP3,RADIUS,SASL,SMB,getpwnam'
 '--enable-auth-ntlm=smb_lm,fake' '--enable-auth-digest=file,LDAP,eDirectory' 
'--enable-auth-negotiate=kerberos' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' 
'--enable-cache-digests' '--enable-cachemgr-hostname=localhost' 
'--enable-delay-pools' '--enable-epoll' '--enable-http-violations' 
'--enable-icap-client' '--enable-ident-lookups' '--enable-linux-netfilter' 
'--enable-referer-log' '--enable-removal-policies=heap,lru' '--enable-snmp' 
'--enable-ssl' '--enable-ssl-crtd' '--enable-storeio=aufs,diskd,ufs' 
'--enable-useragent-log' '--enable-wccpv2' '--enable-esi' '--enable-ecap' 
'--with-aio' '--with-default-user=squid' '--with-filedescriptors=16384' 
'--with-dl' '--with-openssl' '--with-pthreads' 
'build_alias=x86_64-unknown-linux-gnu' 'host_alias=x86_64-unknown-linux-gnu' 
'CFLAGS=-O2 -g -fpie' 'CXXFLAGS=-O2 -g -fpie' 
'PKG_CONFIG_PATH=/usr/lib64/pkgconfig:/usr/share/pkgconfig'

###
squid -v: (Centos 5 i386)
---
Squid Cache: Version 3.2.6
configure options:  '--host=i686-redhat-linux-gnu' 
'--build=i686-redhat-linux-gnu' '--target=i386-redhat-linux' 
'--program-prefix=' '--prefix=/usr' '--exec-prefix=/usr' '--bindir=/usr/bin' 
'--sbindir=/usr/sbin' '--sysconfdir=/etc' '--datadir=/usr/share' 
'--includedir=/usr/include' '--libdir=/usr/lib' '--libexecdir=/usr/libexec' 
'--sharedstatedir=/usr/com' '--mandir=/usr/share/man' 
'--infodir=/usr/share/info' '--exec_prefix=/usr' '--libexecdir=/usr/lib/squid' 
'--localstatedir=/var' '--datadir=/usr/share/squid' '--sysconfdir=/etc/squid' 
'--with-logdir=$(localstatedir)/log/squid' 
'--with-pidfile=$(localstatedir)/run/squid.pid' '--disable-dependency-tracking' 
'--enable-arp-acl' '--enable-follow-x-forwarded-for' '--enable-auth' 
'--enable-auth-basic=LDAP,MSNT,NCSA,PAM,SMB,YP,getpwnam,multi-domain-NTLM,SASL,DB,POP3,squid_radius_auth'
 '--enable-auth-ntlm=smb_lm,no_check,fakeauth' 
'--enable-auth-digest=password,ldap,eDirectory' '--enable-auth-negotiate=squid_kerb_auth' 
'--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group'
 '--enable-cache-digests' '--enable-cachemgr-hostname=localhost' 
'--enable-delay-pools' '--enable-epoll' '--enable-icap-client' 
'--enable-ident-lookups' '--with-large-files' '--enable-linux-netfilter' 
'--enable-referer-log' '--enable-removal-policies=heap,lru' '--enable-snmp' 
'--enable-ssl' '--enable-ssl-crtd' '--enable-storeio=aufs,diskd,ufs' 
'--enable-underscores' '--enable-useragent-log' '--enable-wccpv2' 
'--enable-esi' '--with-aio' '--with-default-user=squid' 
'--with-filedescriptors=16384' '--with-dl' '--with-openssl' '--with-pthreads' 
'--with-winbind-auth-challenge' '--enable-http-violations' 'CFLAGS=-march=i686' 
'CXXFLAGS=-march=i686' 'build_alias=i686-redhat-linux-gnu' 
'host_alias=i686-redhat-linux-gnu' 'target_alias=i386-redhat-linux' 
'LDFLAGS=-pie' 'PKG_CONFIG_PATH=/usr/lib/pkgconfig:/usr/share/pkgconfig' 
--enable-ltdl-convenience

I'll

Re: [squid-users] Windows Updates on 3.2.6

2013-01-31 Thread Eliezer Croitoru

On 1/31/2013 6:11 PM, Dave Burkholder wrote:

Here are links to squid access.log

www.thinkwelldesigns.com/access_log.txt

OK, this seems pretty normal to me from Squid's point of view.
I have the same lines for URLs which Windows tries to access and which don't exist.



And tcpdump for 10.0.2.150

www.thinkwelldesigns.com/tcpdump.zip
In what format is it? I have tried to read it with Wireshark and it 
seems corrupted or something.

I think I understand what the problem is from squid.conf.

range_offset_limit -1

Remove it;
try to keep the proxy as simple as possible.

The above can cause Windows to fail to fetch objects; when it fails, it tries to 
use SSL, which I don't know whether it can use.


Eliezer



Thanks,

Dave

-Original Message-
From: Dave Burkholder
Sent: Thursday, January 31, 2013 10:29 AM
To: Eliezer Croitoru; squid-users@squid-cache.org
Subject: RE: [squid-users] Windows Updates on 3.2.6

Hello Eliezer,

Thank you for your reply. My exact problem is that Windows Updates do not 
install or even download at all.

The squid RPMs were built by my partner in 2 architectures: Centos 5 i386 and 
Centos 6 x86_64. Same nonfunctioning behavior in both.

I didn't realize you had a squid repo; I'd be glad to try your builds if 
they're compatible. Where is your repo hosted?


I had included the conf file in my first email, but a link would be better:

www.thinkwelldesigns.com/squid_conf.txt


###
squid -v: (Centos 6 x86_64)
---
Squid Cache: Version 3.2.6
configure options:  '--host=x86_64-unknown-linux-gnu' 
'--build=x86_64-unknown-linux-gnu' '--program-prefix=' '--prefix=/usr' 
'--exec-prefix=/usr' '--bindir=/usr/bin' '--sbindir=/usr/sbin' 
'--sysconfdir=/etc' '--datadir=/usr/share' '--includedir=/usr/include' 
'--libdir=/usr/lib64' '--libexecdir=/usr/libexec' '--sharedstatedir=/var/lib' 
'--mandir=/usr/share/man' '--infodir=/usr/share/info' '--exec_prefix=/usr' 
'--libexecdir=/usr/lib64/squid' '--localstatedir=/var' 
'--datadir=/usr/share/squid' '--sysconfdir=/etc/squid' 
'--with-logdir=$(localstatedir)/log/squid' 
'--with-pidfile=$(localstatedir)/run/squid.pid' '--disable-dependency-tracking' 
'--enable-arp-acl' '--enable-follow-x-forwarded-for' '--enable-auth' 
'--enable-auth-basic=DB,LDAP,MSNT,MSNT-multi-domain,NCSA,NIS,PAM,POP3,RADIUS,SASL,SMB,getpwnam'
 '--enable-auth-ntlm=smb_lm,fake' '--enable-auth-digest=file,LDAP,eDirectory' 
'--enable-auth-negotiate=kerberos' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' 
'--enable-cache-digests' '--enable-cachemgr-hostname=localhost' 
'--enable-delay-pools' '--enable-epoll' '--enable-http-violations' 
'--enable-icap-client' '--enable-ident-lookups' '--enable-linux-netfilter' 
'--enable-referer-log' '--enable-removal-policies=heap,lru' '--enable-snmp' 
'--enable-ssl' '--enable-ssl-crtd' '--enable-storeio=aufs,diskd,ufs' 
'--enable-useragent-log' '--enable-wccpv2' '--enable-esi' '--enable-ecap' 
'--with-aio' '--with-default-user=squid' '--with-filedescriptors=16384' 
'--with-dl' '--with-openssl' '--with-pthreads' 
'build_alias=x86_64-unknown-linux-gnu' 'host_alias=x86_64-unknown-linux-gnu' 
'CFLAGS=-O2 -g -fpie' 'CXXFLAGS=-O2 -g -fpie' 
'PKG_CONFIG_PATH=/usr/lib64/pkgconfig:/usr/share/pkgconfig'

###
squid -v: (Centos 5 i386)
---
Squid Cache: Version 3.2.6
configure options:  '--host=i686-redhat-linux-gnu' 
'--build=i686-redhat-linux-gnu' '--target=i386-redhat-linux' 
'--program-prefix=' '--prefix=/usr' '--exec-prefix=/usr' '--bindir=/usr/bin' 
'--sbindir=/usr/sbin' '--sysconfdir=/etc' '--datadir=/usr/share' 
'--includedir=/usr/include' '--libdir=/usr/lib' '--libexecdir=/usr/libexec' 
'--sharedstatedir=/usr/com' '--mandir=/usr/share/man' 
'--infodir=/usr/share/info' '--exec_prefix=/usr' '--libexecdir=/usr/lib/squid' 
'--localstatedir=/var' '--datadir=/usr/share/squid' '--sysconfdir=/etc/squid' 
'--with-logdir=$(localstatedir)/log/squid' 
'--with-pidfile=$(localstatedir)/run/squid.pid' '--disable-dependency-tracking' 
'--enable-arp-acl' '--enable-follow-x-forwarded-for' '--enable-auth' 
'--enable-auth-basic=LDAP,MSNT,NCSA,PAM,SMB,YP,getpwnam,multi-domain-NTLM,SASL,DB,POP3,squid_radius_auth'
 '--enable-auth-ntlm=smb_lm,no_check,fakeauth' 
'--enable-auth-digest=password,ldap,eDirectory' '--enable-auth-negotiate=squid_kerb_auth' 
'--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group'
 '--enable-cache-digests' '--enable-cachemgr-hostname=localhost' 
'--enable-delay-pools' '--enable-epoll' '--enable-icap-client' 
'--enable-ident-lookups' '--with-large-files' '--enable-linux-netfilter' 
'--enable-referer-log' '--enable-removal-policies=heap,lru' '--enable-snmp' 
'--enable-ssl' '--enable-ssl-crtd

RE: [squid-users] Windows Updates on 3.2.6

2013-01-31 Thread Dave Burkholder
Well, I'm not too familiar with tcpdump. The command was: tcpdump -i any host 10.0.2.150 
-w tcpdump.txt

I'll be glad to remove the range_offset_limit -1 line. I'd added it because the 
Squid wiki page at http://wiki.squid-cache.org/SquidFaq/WindowsUpdate specifies 
it when wanting to cache updates. I wasn't after caching -- only after getting 
updates working -- so I thought I'd try it either way.



-Original Message-
From: Eliezer Croitoru [mailto:elie...@ngtech.co.il] 
Sent: Thursday, January 31, 2013 4:38 PM
To: squid-users@squid-cache.org
Subject: Re: [squid-users] Windows Updates on 3.2.6

On 1/31/2013 6:11 PM, Dave Burkholder wrote:
 Here are links to squid access.log

 www.thinkwelldesigns.com/access_log.txt
OK, this seems pretty normal to me from Squid's point of view.
I have the same lines for URLs which Windows tries to access and which don't exist.


 And tcpdump for 10.0.2.150

 www.thinkwelldesigns.com/tcpdump.zip
In what format is it? I have tried to read it with Wireshark and it seems 
corrupted or something.
I think I understand what the problem is from squid.conf.

range_offset_limit -1

Remove it;
try to keep the proxy as simple as possible.

The above can cause Windows to fail to fetch objects; when it fails, it tries to use 
SSL, which I don't know whether it can use.

Eliezer


 Thanks,

 Dave

 -Original Message-
 From: Dave Burkholder
 Sent: Thursday, January 31, 2013 10:29 AM
 To: Eliezer Croitoru; squid-users@squid-cache.org
 Subject: RE: [squid-users] Windows Updates on 3.2.6

 Hello Eliezer,

 Thank you for your reply. My exact problem is that Windows Updates do not 
 install or even download at all.

 The squid RPMs were built by my partner in 2 architectures: Centos 5 i386 and 
 Centos 6 x86_64. Same nonfunctioning behavior in both.

 I didn't realize you had a squid repo; I'd be glad to try your builds if 
 they're compatible. Where is your repo hosted?


 I had included the conf file in my first email, but a link would be better:

 www.thinkwelldesigns.com/squid_conf.txt


 ###
 squid -v: (Centos 6 x86_64)
 ---
 Squid Cache: Version 3.2.6
 configure options:  '--host=x86_64-unknown-linux-gnu' 
 '--build=x86_64-unknown-linux-gnu' '--program-prefix=' '--prefix=/usr' 
 '--exec-prefix=/usr' '--bindir=/usr/bin' '--sbindir=/usr/sbin' 
 '--sysconfdir=/etc' '--datadir=/usr/share' '--includedir=/usr/include' 
 '--libdir=/usr/lib64' '--libexecdir=/usr/libexec' '--sharedstatedir=/var/lib' 
 '--mandir=/usr/share/man' '--infodir=/usr/share/info' '--exec_prefix=/usr' 
 '--libexecdir=/usr/lib64/squid' '--localstatedir=/var' 
 '--datadir=/usr/share/squid' '--sysconfdir=/etc/squid' 
 '--with-logdir=$(localstatedir)/log/squid' 
 '--with-pidfile=$(localstatedir)/run/squid.pid' 
 '--disable-dependency-tracking' '--enable-arp-acl' 
 '--enable-follow-x-forwarded-for' '--enable-auth' 
 '--enable-auth-basic=DB,LDAP,MSNT,MSNT-multi-domain,NCSA,NIS,PAM,POP3,RADIUS,SASL,SMB,getpwnam'
  '--enable-auth-ntlm=smb_lm,fake' '--enable-auth-digest=file,LDAP,eDirectory' 
 '--enable-auth-negotiate=kerberos' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' 
 '--enable-cache-digests' '--enable-cachemgr-hostname=localhost' 
 '--enable-delay-pools' '--enable-epoll' '--enable-http-violations' 
 '--enable-icap-client' '--enable-ident-lookups' '--enable-linux-netfilter' 
 '--enable-referer-log' '--enable-removal-policies=heap,lru' '--enable-snmp' 
 '--enable-ssl' '--enable-ssl-crtd' '--enable-storeio=aufs,diskd,ufs' 
 '--enable-useragent-log' '--enable-wccpv2' '--enable-esi' '--enable-ecap' 
 '--with-aio' '--with-default-user=squid' '--with-filedescriptors=16384' 
 '--with-dl' '--with-openssl' '--with-pthreads' 
 'build_alias=x86_64-unknown-linux-gnu' 'host_alias=x86_64-unknown-linux-gnu' 
 'CFLAGS=-O2 -g -fpie' 'CXXFLAGS=-O2 -g -fpie' 
 'PKG_CONFIG_PATH=/usr/lib64/pkgconfig:/usr/share/pkgconfig'

 ##########
 squid -v: (CentOS 5 i386)
 ----------
 Squid Cache: Version 3.2.6
 configure options:  '--host=i686-redhat-linux-gnu' 
 '--build=i686-redhat-linux-gnu' '--target=i386-redhat-linux' 
 '--program-prefix=' '--prefix=/usr' '--exec-prefix=/usr' '--bindir=/usr/bin' 
 '--sbindir=/usr/sbin' '--sysconfdir=/etc' '--datadir=/usr/share' 
 '--includedir=/usr/include' '--libdir=/usr/lib' '--libexecdir=/usr/libexec' 
 '--sharedstatedir=/usr/com' '--mandir=/usr/share/man' 
 '--infodir=/usr/share/info' '--exec_prefix=/usr' 
 '--libexecdir=/usr/lib/squid' '--localstatedir=/var' 
 '--datadir=/usr/share/squid' '--sysconfdir=/etc/squid' 
 '--with-logdir=$(localstatedir)/log/squid' 
 '--with-pidfile=$(localstatedir)/run/squid.pid' 
 '--disable-dependency-tracking' '--enable-arp-acl' 
 '--enable-follow-x-forwarded

[squid-users] Windows Updates on 3.2.6

2013-01-30 Thread Dave Burkholder
Hello everyone,

I've upgraded a number of machines from 3.1.12 to squid 3.2.6. Since then, 
Windows Updates haven't completed and I'm totally scratching my head.


Has anyone else experienced this problem? (I'm including my config file below.) 
Or have some ACLs or defaults changed in 3.2.x that might be triggering this?



Thanks,

Dave

 

#
# Recommended minimum configuration:
#
# webconfig: acl_start
acl webconfig_lan src 192.168.0.0/16 10.0.0.0/8
acl webconfig_to_lan dst 192.168.0.0/16 10.0.0.0/8
# webconfig: acl_end

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed

acl SSL_ports port 443
acl SSL_ports port 81 83 1 # Webconfig / Webmail / Webmin
acl Safe_ports port 80         # http
acl Safe_ports port 21         # ftp
acl Safe_ports port 443        # https
acl Safe_ports port 70         # gopher
acl Safe_ports port 210        # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280        # http-mgmt
acl Safe_ports port 488        # gss-http
acl Safe_ports port 591        # filemaker
acl Safe_ports port 777        # multiling http
acl Safe_ports port 81 83 1    # Webconfig / Webmail / Webmin
acl CONNECT method CONNECT

#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
http_access allow webconfig_to_lan

# Deny requests to certain unsafe ports
http_access deny !Safe_ports

# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on localhost is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS

# Example rule allowing access from your local networks.
# from where browsing should be allowed
http_access allow localhost

# And finally deny all other access to this proxy
http_access allow webconfig_lan
http_access deny all

# Squid normally listens to port 3128
http_port 3128

# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?

# Uncomment and adjust the following to add a disk cache directory.
cache_dir ufs /var/spool/squid 2048 16 256

# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid

follow_x_forwarded_for allow localhost

# Add any of your own refresh_pattern entries above these.
refresh_pattern ^ftp:             1440 20% 10080
refresh_pattern ^gopher:          1440  0%  1440
refresh_pattern -i (/cgi-bin/|\?)    0  0%     0
refresh_pattern .                    0 20%  4320
redirect_program /usr/sbin/adzapper
maximum_object_size 51200 KB
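
If the goal is also to cache the update payloads themselves, the 
SquidFaq/WindowsUpdate wiki page suggests per-domain refresh rules. A hedged 
sketch (TTLs and domain patterns are illustrative, and this is not a confirmed 
fix for the 3.2.6 breakage above):

```
# Illustrative, based on the SquidFaq/WindowsUpdate wiki examples:
refresh_pattern -i windowsupdate.com/.*\.(cab|exe|msi|psf) 4320 100% 43200 reload-into-ims
refresh_pattern -i microsoft.com/.*\.(cab|exe|msi|psf)     4320 100% 43200 reload-into-ims
# Service packs can be hundreds of MB, far above the 4 MB default:
maximum_object_size 1024 MB
```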



Re: [squid-users] Windows Updates, YouTube and WoW

2010-11-09 Thread Kevin Wilcox
On 8 November 2010 22:43, Amos Jeffries squ...@treenet.co.nz wrote:
 On Mon, 8 Nov 2010 18:32:52 -0500, Kevin Wilcox kevin.wil...@gmail.com
 wrote:

snip

 I understand that it could be related to the partial content reply for
 the request and I understand that it could also be related to the
 URL/foo? style request. Is the best approach to just automatically
 pass anything for blizzard.com/worldofwarcraft.com straight through
 and not attempt to cache the updates? I've seen some comments where
 using

 acl QUERY urlpath_regex cgi-bin \?
 cache deny QUERY

 will cause those requests to not be cached (and I understand why that
 is) but I'm wondering if I should just ignore them altogether,
 especially given the third item - YouTube.

 Yes, don't use that QUERY stuff. The dynamic URLs which are cacheable will
 have expiry and control headers to make it happen. The others are caught
 and discarded properly by the new default refresh_pattern for cgi-bin and
 \?.
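
In config terms, dropping the QUERY ACL leaves the modern default pattern pair 
to do the work; a minimal sketch:

```
# Replaces "acl QUERY urlpath_regex cgi-bin \?" / "cache deny QUERY":
# dynamic URLs are cached only when their headers allow it,
# everything else falls through to the catch-all.
refresh_pattern -i (/cgi-bin/|\?) 0  0%    0
refresh_pattern .                 0 20% 4320
```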

Excellent! I'm still seeing a *huge* performance difference between having

acl warcraft dstdomain blizzard.vo.llnwd.net
acl warcraft dstdomain attdist.blizzard.com
cache deny warcraft

in my config. As in, orders of magnitude (literally, and much more
reliable than forgetting to change my comments when I limited Squid to
2GB ;)) different. If I'm going to ignore them altogether, and the P2P
downloaded portion won't be cached anyway, that seems to me to be the
cleanest way to do it.

 Caching youtube still currently requires the storeurl feature of 2.7, which
 has not been ported to 3.x.
 There are embedded visitor details and timestamps of when the video was
 requested in the YT URL, which cause the cache to fill up with large videos
 at URLs which will never be re-requested. This actively prevents totally
 unrelated web objects from using the cache space.

 It is a good idea to prevent the YT videos from being stored at all unless
 you can de-duplicate them.

I didn't realise that about the URL. I was thinking the request came
for the URL shown in the address bar -
http://youtube.com/watch?v=static_string, an address that can be
passed around and shared (how most of our folks share video). Good to
know that there is a unique request that's actually being sent for the
content.

Regarding storing - if put into production, the disk cache would be
sufficiently large to store a hundred gigs of videos,
and regularly clearing the disk cache isn't a problem. I don't care if
we manually clear before the page actually expires, I'm just looking
to make *some* impact in the amount of bandwidth used by the target
audience.

 # Cache Mem - ideal amount of RAM to use
 cache_mem 2048 MB

 # Maximum object size - default is 4MB, not nearly enough to be useful
 maximum_object_size 1024 MB

 # Maximum object size in memory - we have 4GB, we can handle larger
 objects
 maximum_object_size_in_memory 512 MB

 Um, no you have 2GB (cache_mem) in which to store these objects. All 4+ of
 them.

Good point, I should have updated my comments as I played around with
the cache_mem and max object values.

Thanks Amos!

kmw


[squid-users] Windows Updates, YouTube and WoW

2010-11-08 Thread Kevin Wilcox
Hi all.

This is currently a test environment so making changes isn't an issue.

Initially I had issues with hosts updating any flavour of Microsoft
Windows but solved that with the included squid.conf. I'm even
getting real cache hits on some of the Windows XP and Windows 7
updates in my test lab, so the amount of effort I've put in so far is
pretty well justified. Since the target audience won't have access to
a local WSUS, I can pretty well count it as a win, even if the rest of
this email becomes moot.

Then came the big issue - World of Warcraft installation via the
downloaded client. Things pretty well fell apart. It would install up
to 20% and crash. Then it would install up to 25% and crash. Then 30%
and crash. It did that, crashing further in the process each time,
until it finally installed the base game (roughly 15 crashes). Due to
clamping down on P2P I disabled that update mechanism and told the
downloader to use only direct download. I'm averaging 0.00KB/s with
bursts from 2KB/s to 64 KB/s. If I take squid out of the line I get
speeds between 1 and 3 MB/s+ and things just work - but that sort of
defeats the purpose in having a device that will cache
non-authenticated user content. Having one user download a new 1 GB
patch, and it being available locally for the other couple of hundred,
would be ideal. Still, it isn't a deal breaker.

I understand that it could be related to the partial content reply for
the request and I understand that it could also be related to the
URL/foo? style request. Is the best approach to just automatically
pass anything for blizzard.com/worldofwarcraft.com straight through
and not attempt to cache the updates? I've seen some comments where
using

acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY

will cause those requests to not be cached (and I understand why that
is) but I'm wondering if I should just ignore them altogether,
especially given the third item - YouTube.

The target population for this cache is rather large. Typically,
youtube is a huge culprit for bandwidth usage and a lot of the times
it's hundreds of people hitting the same videos. I've been looking at
how to cache those and it seems like it's required to either not use
the above ACL or it's to setup another ACL that specifically allows
youtube.

All of those comments and workarounds have been regarding the 2.x set
of squid, though. I'm curious if there is a cleaner way to go about
caching youtube (or, perhaps I should say, video.google.com) in 3.1.x,
or if it's possible to cache things like the WoW updates now? We're
looking to experiment with some proprietary devices that claim to be
able to cache Windows Updates, YouTube/Google Video, etc., but I'm
wondering if my woes are just because of my inexperience with squid or
if they're just that far ahead in terms of functionality?

Any hints, tips or suggestions would be more than welcome!

Relevant version information and configuration files:

fergie# squid -v
Squid Cache: Version 3.1.9
configure options:  '--with-default-user=squid'
'--bindir=/usr/local/sbin' '--sbindir=/usr/local/sbin'
'--datadir=/usr/local/etc/squid'
'--libexecdir=/usr/local/libexec/squid' '--localstatedir=/var/squid'
'--sysconfdir=/usr/local/etc/squid' '--with-logdir=/var/log/squid'
'--with-pidfile=/var/run/squid/squid.pid'
'--enable-removal-policies=lru heap' '--disable-linux-netfilter'
'--disable-linux-tproxy' '--disable-epoll' '--disable-translation'
'--enable-auth=basic digest negotiate ntlm'
'--enable-basic-auth-helpers=DB NCSA PAM MSNT SMB squid_radius_auth'
'--enable-digest-auth-helpers=password'
'--enable-external-acl-helpers=ip_user session unix_group
wbinfo_group' '--enable-ntlm-auth-helpers=smb_lm' '--without-pthreads'
'--enable-storeio=ufs diskd' '--enable-disk-io=AIO Blocking
DiskDaemon' '--disable-ipv6' '--disable-snmp' '--disable-htcp'
'--disable-wccp' '--enable-pf-transparent' '--disable-ecap'
'--disable-loadable-modules' '--enable-kqueue' '--with-large-files'
'--prefix=/usr/local' '--mandir=/usr/local/man'
'--infodir=/usr/local/info/' '--build=amd64-portbld-freebsd8.1'
'build_alias=amd64-portbld-freebsd8.1' 'CC=cc' 'CFLAGS=-O2 -pipe
-fno-strict-aliasing' 'LDFLAGS=' 'CPPFLAGS=' 'CXX=c++' 'CXXFLAGS=-O2
-pipe -fno-strict-aliasing' 'CPP=cpp'
--with-squid=/usr/ports/www/squid31/work/squid-3.1.9
--enable-ltdl-convenience

It's running in transparent mode on

fergie# uname -m -r -s -v
FreeBSD 8.1-RELEASE FreeBSD 8.1-RELEASE #0: Mon Jul 19 02:36:49 UTC
2010 r...@mason.cse.buffalo.edu:/usr/obj/usr/src/sys/GENERIC
amd64

which is basically a vanilla FreeBSD 8.1 install with squid installed
from ports.

My squid.conf:


################
# Recommended minimum configuration:
#

acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed

Re: [squid-users] Windows Updates, YouTube and WoW

2010-11-08 Thread Amos Jeffries
On Mon, 8 Nov 2010 18:32:52 -0500, Kevin Wilcox kevin.wil...@gmail.com
wrote:
 Hi all.
 
 This is currently a test environment so making changes isn't an issue.
 
 Initially I had issues with hosts updating any flavour of Microsoft
 Windows but solved that with the included squid.conf. I'm even
 getting real cache hits on some of the Windows XP and Windows 7
 updates in my test lab, so the amount of effort I've put in so far is
 pretty well justified. Since the target audience won't have access to
 a local WSUS, I can pretty well count it as a win, even if the rest of
 this email becomes moot.
 
 Then came the big issue - World of Warcraft installation via the
 downloaded client. Things pretty well fell apart. It would install up
 to 20% and crash. Then it would install up to 25% and crash. Then 30%
 and crash. It did that, crashing further in the process each time,
 until it finally installed the base game (roughly 15 crashes). Due to
 clamping down on P2P I disabled that update mechanism and told the
 downloader to use only direct download. I'm averaging 0.00KB/s with
 bursts from 2KB/s to 64 KB/s. If I take squid out of the line I get
 speeds between 1 and 3 MB/s+ and things just work - but that sort of
 defeats the purpose in having a device that will cache
 non-authenticated user content. Having one user download a new 1 GB
 patch, and it being available locally for the other couple of hundred,
 would be ideal. Still, it isn't a deal breaker.
 
 I understand that it could be related to the partial content reply for
 the request and I understand that it could also be related to the
 URL/foo? style request. Is the best approach to just automatically
 pass anything for blizzard.com/worldofwarcraft.com straight through
 and not attempt to cache the updates? I've seen some comments where
 using
 
 acl QUERY urlpath_regex cgi-bin \?
 cache deny QUERY
 
 will cause those requests to not be cached (and I understand why that
 is) but I'm wondering if I should just ignore them altogether,
 especially given the third item - YouTube.

Yes, don't use that QUERY stuff. The dynamic URLs which are cacheable will
have expiry and control headers to make it happen. The others are caught
and discarded properly by the new default refresh_pattern for cgi-bin and
\?.

 
 The target population for this cache is rather large. Typically,
 youtube is a huge culprit for bandwidth usage and a lot of the times
 it's hundreds of people hitting the same videos. I've been looking at
 how to cache those and it seems like it's required to either not use
 the above ACL or it's to setup another ACL that specifically allows
 youtube.
 
 All of those comments and workarounds have been regarding the 2.x set
 of squid, though. I'm curious if there is a cleaner way to go about
 caching youtube (or, perhaps I should say, video.google.com) in 3.1.x,
 or if it's possible to cache things like the WoW updates now? We're
 looking to experiment with some proprietary devices that claim to be
 able to cache Windows Updates, YouTube/Google Video, etc., but I'm
 wondering if my woes are just because of my inexperience with squid or
 if they're just that far ahead in terms of functionality?


Caching youtube still currently requires the storeurl feature of 2.7, which
has not been ported to 3.x.
There are embedded visitor details and timestamps of when the video was
requested in the YT URL, which cause the cache to fill up with large videos
at URLs which will never be re-requested. This actively prevents totally
unrelated web objects from using the cache space.

It is a good idea to prevent the YT videos from being stored at all unless
you can de-duplicate them.
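
For reference, the 2.7-only storeurl machinery is wired up roughly as below. 
The helper path and the URL regex are hypothetical, and the actual 
de-duplication logic lives in the external rewrite helper:

```
# Squid 2.7 only -- storeurl was never ported to 3.x.
acl yt_video urlpath_regex \/videoplayback\?
storeurl_access allow yt_video
storeurl_access deny all
# Helper maps each per-visitor video URL to one canonical store key:
storeurl_rewrite_program /usr/local/bin/yt_store_rewrite.pl
storeurl_rewrite_children 5
```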

 
 Any hints, tips or suggestions would be more than welcome!
 
 Relevant version information and configuration files:
 
snip
 
 # Uncomment and adjust the following to add a disk cache directory.
 cache_dir ufs /var/squid/cache 175000 16 256
 
 # Cache Mem - ideal amount of RAM to use
 cache_mem 2048 MB
 
 # Maximum object size - default is 4MB, not nearly enough to be useful
 maximum_object_size 1024 MB
 
 # Maximum object size in memory - we have 4GB, we can handle larger
objects
 maximum_object_size_in_memory 512 MB

Um, no you have 2GB (cache_mem) in which to store these objects. All 4+ of
them.
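
Put as config, the point is that cache_mem is the whole in-memory pool, so 
maximum_object_size_in_memory needs to be a small fraction of it. Values here 
are illustrative, not from the thread:

```
cache_mem 2048 MB                    # total in-memory object pool
maximum_object_size_in_memory 8 MB   # many hot objects fit, instead of ~4 huge ones
maximum_object_size 1024 MB          # big files still cached, but on disk
```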


Amos


[squid-users] Windows updates please help

2010-02-04 Thread Hubert Choma
 Hello
  
   My squid ver. 2.6 stable Centos 2.6.18-164.el5 .
  
   I'm using the configuration of the WU from the example
   http://wiki.squid-cache.org/SquidFaq/WindowsUpdate
  
   I would like to force squid to cache all windows update (version V6)
   files e.g .cab .exe and 700MB ISO files
  
   I have noticed that Windows Media Player does not update via squid. WU
   generates error 0x8024402F.
  
   I would like to setup squid cache maximum web content, antivirus updates
   and WU.
  
   Where can I find an example of how to cache dynamic pages ?
  
   hierarchy_stoplist cgi-bin ?
   acl QUERY urlpath_regex cgi-bin \?
 
  By deleting the above, and the lines which make use of QUERY, they begin
  to cache.

 I understand that I must comment out (hash) these lines. Is that what you meant ?
 
# hierarchy_stoplist cgi-bin ?
# acl QUERY urlpath_regex cgi-bin \?
# cache deny QUERY

That's correct ?

  Also see my notes in your refresh_pattern config below
 
  
  
   Please correct my config
  
   windowsupdate.txt
   .go.microsoft.com
   .windowsupdate.microsoft.com
   .update.microsoft.com
   .update.microsoft.com/windowsupdate/v7/default.aspx
   download.windowsupdate.com
   .download.microsoft.com
   ntservicepack.microsoft.com
   activex.microsoft.com
   redir.metaservices.microsoft.com
   images.metaservices.microsoft.com
   c.microsoft.com
   crl.microsoft.com
   codecs.microsoft.com
   urs.microsoft.com
   wustat.windows.com
  
  
   squid.conf
  
  
   http_port 192.168.0.12:8080
   hierarchy_stoplist cgi-bin ?
   acl QUERY urlpath_regex cgi-bin \?
   cache deny QUERY
   acl apache rep_header Server ^Apache
   broken_vary_encoding allow apache
   cache_mem 650 MB
   maximum_object_size 4194240 KB
   cache_dir ufs /var/spool/squid 6500 16 256
   #logformat squid %tl %6tr %a %Ss/%03Hs %st %rm %ru %un %Sh/%A %mt
   access_log /var/log/squid/access.log squid
   mime_table /etc/squid/mime.conf
   refresh_pattern ^ftp: 1440 20% 10080
 
  Right here, between the FTP default handling and the general traffic
  default handling (.), you need to add this:
 
  refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
 
  to properly prevent evil dynamic content from sticking around longer
  than it should (i.e. if it's not giving cache-control and/or expiry, drop
  it; if it is, it's okay then).
 
   refresh_pattern . 0 20% 4320

 You mean like this ??

 refresh_pattern ^ftp: 1440 20% 10080
 refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
 refresh_pattern . 0 20% 4320

 ie if its not giving cache-control and/or expiry, drop
  it.

 What to drop ?


  Hmm. "." matches every URL. Squid stops processing refresh_pattern at
  the first matching pattern.
 
  -- point: no refresh_pattern below here will ever be used.

So what should I do with this ? What does "." mean ? Remove the first line and
 leave yours ? I didn't understand.

refresh_pattern -i \.(gif|jpg|jpeg|png|js|css|flv|bmp|)(\?.*)?$ 0 50% 
7200. And what about reload-into-ims ?

   refresh_pattern -i \.(gif|jpg|jpeg|png|js|css|flv|bmp|) 0 50% 7200
   reload-into-ims
 
  Ahm...
  refresh_pattern -i \.(gif|jpg|jpeg|png|js|css|flv|bmp|)(\?.*)?$ 0
  50% 7200
 
   refresh_pattern update.microsoft.com/windowsupdate/v6/.*\.(cab|exe|dll)
   43200 100% 43200 reload-into-ims
   refresh_pattern windowsupdate.com/.*\.(cab|exe|dll) 43200 100% 43200
   reload-into-ims
   refresh_pattern windowsupdate.microsoft.com/.*\.(cab|exe|dll) 43200 100%
   43200 reload-into-ims
   refresh_pattern download.microsoft.com/.*\.(cab|exe|dll) 43200 100%
   43200 reload-into-ims
   refresh_pattern au.download.windowsupdate.com/.*\.(cab|exe|dll) 43200
   100% 43200 reload-into-ims
   refresh_pattern symantecliveupdate.com/.*\.(zip|exe) 43200 100% 43200
   reload-into-ims
   refresh_pattern windowsupdate.com/.*\.(cab|exe) 43200 100% 43200
   reload-into-ims
   refresh_pattern download.microsoft.com/.*\.(cab|exe) 43200 100% 43200
   reload-into-ims
   refresh_pattern avast.com/.*\.(vpu|vpaa) 4320 100% 43200 reload-into-ims
   refresh_pattern . 0 20% 4320
 
  Aha! The dot pattern did get copied down (or cut-n-pasted from the
  wiki?).

On the Wiki I can't find these patterns. Where are they ?

 
   range_offset_limit -1 KB
   ## MOJE ACL #
   acl mojasiec src 192.168.0.0/255.255.255.0
 
  that's 192.168.0.0/24.
 
   acl dozwolone dstdomain -i /etc/squid/dozwolone.txt
   acl ograniczone_komputery src 192.168.0.3 192.168.0.6 192.168.0.17
   192.168.0.12 192.168.0.15 192.168.0.16
   acl poczta dstdom_regex .*poczta.* .*mail.*
 
  Hmm. You can drop the .* at the beginning and end of squid patterns. They
  are added automatically.
  No !!
Without the * (e.g. poczta.* mail.*) users can get to webmail, and I would like
 to deny webmail ! So the * are necessary: .*mail.* !!

   #acl sm9 src 192.168.0.3
   #http_access allow sm9
   acl WindowsUpdate dstdomain -i /etc/squid/windowsupdate.txt
   acl CONNECT method CONNECT
   http_access allow dozwolone ograniczone_komputery !poczta
   http_access allow CONNECT WindowsUpdate mojasiec
   

Re: [squid-users] Windows updates please help

2010-02-04 Thread Amos Jeffries

Hubert Choma wrote:

 Hello

My squid ver. 2.6 stable Centos 2.6.18-164.el5 .

I'm using the configuration of the WU from the example
http://wiki.squid-cache.org/SquidFaq/WindowsUpdate

I would like to force squid to cache all windows update (version V6)
files e.g .cab .exe and 700MB ISO files

I have noticed that Windows Media Player does not update via squid. WU
generates error 0x8024402F.

I would like to setup squid cache maximum web content, antivirus updates
and WU.

Where can I find an example of how to cache dynamic pages ?

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?

By deleting the above, and the lines which make use of QUERY, they begin
to cache.


 I understand that I must comment out (hash) these lines. Is that what you meant ?
 
# hierarchy_stoplist cgi-bin ?

# acl QUERY urlpath_regex cgi-bin \?
# cache deny QUERY

That's correct ?




snip

mime_table /etc/squid/mime.conf
refresh_pattern ^ftp: 1440 20% 10080

Right here, between the FTP default handling and the general traffic
default handling (.), you need to add this:

refresh_pattern -i (/cgi-bin/|\?) 0 0% 0

to properly prevent evil dynamic content from sticking around longer
than it should (i.e. if it's not giving cache-control and/or expiry, drop
it; if it is, it's okay then).


refresh_pattern . 0 20% 4320

 You mean like this ??


refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320

i.e. if it's not giving cache-control and/or expiry, drop

it.


 What to drop ?


Never mind. My small attempt to educate you about what the config means seems 
to have failed.





Hmm. "." matches every URL. Squid stops processing refresh_pattern at
the first matching pattern.

-- point: no refresh_pattern below here will ever be used.


So what should I do with this ?


Only you know what the proxy needs to be doing. Nobody can answer that 
question for you.


I do not know what or why you wrote the refresh_pattern lines the way 
they were. All I can do is tell you what they are doing because they are 
obviously not doing what you want, whatever that is.




What does "." mean ??


  refresh_pattern . 0 20% 4320


Remove the first line and leave
 yours ? I didn't understand.


"." aka "refresh_pattern ." aka "refresh_pattern -i ." are all the 
'dot' pattern.


Like I said earlier, the "." pattern matches every URL that exists. 
Squid will stop processing the refresh_pattern list at the first 
matching pattern.


Those two facts together mean that the dot "." pattern is _always_ the 
last refresh_pattern Squid will use. Even if you happen to put others 
below it. The ones below it will never be used.
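
Laid out as config, first-match-wins means any site-specific patterns must sit 
above the catch-all. An illustrative ordering:

```
# refresh_pattern is first-match-wins; "." matches every URL,
# so it must be the last line or later lines are dead.
refresh_pattern ^ftp:                            1440  20% 10080
refresh_pattern -i (/cgi-bin/|\?)                   0   0%     0
refresh_pattern windowsupdate.com/.*\.(cab|exe) 43200 100% 43200
refresh_pattern .                                   0  20%  4320
```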


Understand now?



refresh_pattern -i \.(gif|jpg|jpeg|png|js|css|flv|bmp|)(\?.*)?$ 0 50% 
7200. And what about reload-into-ims ?




I just left it off. No special reason. My point was to demonstrate the 
tricky (\?.*)?$  bit at the end of the pattern. It's needed to catch 
nasty websites obfuscating their URLs.


I don't like reload-into-ims. It prevents the users, who are the 
authoritative people actually able to see and know when a page is 
displaying badly, from refreshing the objects and causing the page to 
display properly.


Sadly some broken software sends the reload at bad times, causing 
bandwidth waste but no actual problems. So it's up to you.




refresh_pattern -i \.(gif|jpg|jpeg|png|js|css|flv|bmp|) 0 50% 7200
reload-into-ims

Ahm...
refresh_pattern -i \.(gif|jpg|jpeg|png|js|css|flv|bmp|)(\?.*)?$ 0
50% 7200


refresh_pattern update.microsoft.com/windowsupdate/v6/.*\.(cab|exe|dll)
43200 100% 43200 reload-into-ims
refresh_pattern windowsupdate.com/.*\.(cab|exe|dll) 43200 100% 43200
reload-into-ims
refresh_pattern windowsupdate.microsoft.com/.*\.(cab|exe|dll) 43200 100%
43200 reload-into-ims
refresh_pattern download.microsoft.com/.*\.(cab|exe|dll) 43200 100%
43200 reload-into-ims
refresh_pattern au.download.windowsupdate.com/.*\.(cab|exe|dll) 43200
100% 43200 reload-into-ims
refresh_pattern symantecliveupdate.com/.*\.(zip|exe) 43200 100% 43200
reload-into-ims
refresh_pattern windowsupdate.com/.*\.(cab|exe) 43200 100% 43200
reload-into-ims
refresh_pattern download.microsoft.com/.*\.(cab|exe) 43200 100% 43200
reload-into-ims
refresh_pattern avast.com/.*\.(vpu|vpaa) 4320 100% 43200 reload-into-ims
refresh_pattern . 0 20% 4320

Aha! The dot pattern did get copied down (or cut-n-pasted from the
wiki?).


On the Wiki I can't find these patterns. Where are they ?


An old example config for media content. I removed the lines from the 
wiki a while back, but people still sometimes turn up with errors 
cut-and-pasted from the old examples.



range_offset_limit -1 KB
## MOJE ACL #
acl mojasiec src 192.168.0.0/255.255.255.0

that's 192.168.0.0/24.


acl dozwolone dstdomain -i /etc/squid/dozwolone.txt
acl ograniczone_komputery src 192.168.0.3 192.168.0.6 192.168.0.17
192.168.0.12 192.168.0.15 192.168.0.16
acl poczta dstdom_regex .*poczta.* .*mail.*

Hmm. you can drop the .* at beginning and end of 

[squid-users] Windows Updates

2006-10-20 Thread Donald J Organ IV
OK, I currently have squid version 2.5stable12 running, and everything 
seems to be running fine.


I am only running into a problem when trying to run Windows Updates.  I 
am getting the infamous 0x80072F76 error when going to the site; I get 
the "Checking if your computer has the latest version of Windows 
Updates" message, then I get the error screen.


Also please be aware that I am running dansguardian as well, i am not 
sure where the problem is coming from but if anyone on the list has had 
any experience in fixing this it would be greatly appreciated.


Re: [squid-users] Windows Updates

2006-10-20 Thread Sutto Zoltan

There is a knowledgebase article about Error codes and how to solve them

"You may receive an Error 0x80072EE2, Error 0x80072EE7, Error 
0x80072EFD, Error 0x80072F76, or Error 0x80072F78 error message when 
you try to use the Windows Update Web site or the Microsoft Update Web site"

http://support.microsoft.com/kb/836941/en-us

- Original Message - 
From: Donald J Organ IV [EMAIL PROTECTED]

To: squid-users@squid-cache.org
Sent: Friday, October 20, 2006 9:40 PM
Subject: [squid-users] Windows Updates


OK, I currently have squid version 2.5stable12 running, and everything 
seems to be running fine.


I am only running into a problem when trying to run Windows Updates.  I am 
getting the infamous 0x80072F76 error when going to the site; I get the 
"Checking if your computer has the latest version of Windows Updates" message, 
then I get the error screen.


Also please be aware that I am running dansguardian as well, i am not sure 
where the problem is coming from but if anyone on the list has had any 
experience in fixing this it would be greatly appreciated.


_ NOD32 1.1820 (20061020) Information _

The message was scanned by the NOD32 antivirus system.
http://www.nod32.hu






[squid-users] windows updates

2003-02-25 Thread Rodney Richison
Windows Updates behind a transparent proxy keeps saying no updates are available,
yet when I enter the proxy settings manually in LAN settings, the update
comes right away.  Fix?


Highest Regards,

Rodney
www.rcrnet.net
918-358-





Re: [squid-users] windows updates

2003-02-25 Thread Karl Pielorz


--On 25 February 2003 14:38 -0600 Rodney Richison [EMAIL PROTECTED] 
wrote:

Windows updates behind transparent proxy keeps saying no updates
available, yet when I enter the proxy settings manually in lan
settings, the update comes right away.  Fix?
Highest Regards,

Rodney
I've noticed a 'similar' thing here - going to the Windows Update page 
takes ages, and you finally get a "You need a new component to scan for 
updates" message - which you elect to download (and it takes ages) - only to throw 
you back to "You need a new component to scan for updates" [ad infinitum]. 
Turn the proxy off, and it's an order of magnitude quicker, and you don't 
get caught in the loop.

Knowing our setup here, I'd guess it's over-anonymisation by the proxy - so I'm 
going to make sure there are no headers being stripped by Squid that perhaps 
should be there (or at least try it when it's not removing anything) - does 
your proxy strip any headers out, anonymity-wise?

-Kp


Re: [squid-users] windows updates

2003-02-25 Thread Rodney Richison
It's a fresh default install, other than being transparent. I get the feeling it's
related to the certificate Windows Update first installs. Any way to tell
squid not to cache that, but cache the updates?
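
One hedged way to express "don't cache the check, cache the updates" is a pair 
of cache ACLs. The domain names here are assumptions (the real certificate and 
update-check hosts would need confirming from access.log), and older 2.x 
spells the directive no_cache:

```
# Sketch with assumed domains:
acl wu_check dstdomain .update.microsoft.com .windowsupdate.microsoft.com
acl wu_files dstdomain download.windowsupdate.com .download.microsoft.com
cache deny wu_check     # "no_cache deny" on Squid 2.5
cache allow wu_files
```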


Highest Regards,

Rodney
www.rcrnet.net
918-358-
- Original Message -
From: Karl Pielorz [EMAIL PROTECTED]
To: Rodney Richison [EMAIL PROTECTED]; [EMAIL PROTECTED]
[EMAIL PROTECTED]
Sent: Tuesday, February 25, 2003 3:11 PM
Subject: Re: [squid-users] windows updates




 --On 25 February 2003 14:38 -0600 Rodney Richison
[EMAIL PROTECTED]
 wrote:

  Windows updates behind transparent proxy keeps saying no updates
  available, yet when I enter the proxy settings manually in lan
  settings, the update comes right away.  Fix?
 
  Highest Regards,
 
  Rodney

 I've noticed a 'similar' thing here - going to the Windows Update page
 takes ages, and you finally get a "You need a new component to scan for
 updates" message - which you elect to download (and it takes ages) - only to throw
 you back to "You need a new component to scan for updates" [ad infinitum].
 Turn the proxy off, and it's an order of magnitude quicker, and you don't
 get caught in the loop.

 Knowing our setup here, I'd guess it's over-anonymisation by the proxy - so I'm
 going to make sure there are no headers being stripped by Squid that perhaps
 should be there (or at least try it when it's not removing anything) - does
 your proxy strip any headers out, anonymity-wise?

 -Kp