[squid-users] redirect request

2010-03-15 Thread jayesh chavan
Hi,
How can I check which HTTP website is requested for redirection in my
Perl redirect program? I want to see each incoming request and write it
to my file.
Regards,
   Jayesh


[squid-users] how to reduce space

2010-03-15 Thread Malik Madni

Below you will find the free disk space on my server:
 
 
[r...@proxy ~]# df -h
Filesystem            Size  Used Avail Use% Mounted on
/dev/cciss/c0d0p2  19G   11G  7.1G  61% /
/dev/cciss/c0d0p1 483M   16M  442M   4% /boot
tmpfs 1.8G 0  1.8G   0% /dev/shm
/dev/cciss/c0d0p4 2.8G  2.3G  352M  87% /test
[r...@proxy ~]#

Most of the disk is used by the directory below:
4.7G    /usr/local/squid/var/cache
 
Is it safe to remove this directory? If yes, then how?
  

[squid-users] how to cross compile squid for arm-linux

2010-03-15 Thread joshoha

Hi,

I get the following error when trying to cross-compile the Squid source
for arm-linux:



[r...@centos squid-2.7.STABLE7]#./configure  
CC=/usr/local/arm/4.3.2/bin/arm-linux-gcc --host=arm-linux  
--prefix=/opt/squid/

..
checking for crypt... yes
checking if epoll works... configure: error: cannot run test program while  
cross compiling

See `config.log' for more details.

I don't know why.
Any ideas?

thank you!


Re: [squid-users] how to cross compile squid for arm-linux

2010-03-15 Thread Kinkie
Hi,
 Squid's build system does not support cross-compiling. You need to
build on the target system.

Sorry

On 3/15/10, joshoha josh...@gmail.com wrote:
 Hi,

 I get the following error when trying to cross-compile the Squid source
 for arm-linux:


 [r...@centos squid-2.7.STABLE7]#./configure
 CC=/usr/local/arm/4.3.2/bin/arm-linux-gcc --host=arm-linux
 --prefix=/opt/squid/
 ..
 checking for crypt... yes
 checking if epoll works... configure: error: cannot run test program while
 cross compiling
 See `config.log' for more details.

 I don't know why.
 Any ideas?

 thank you!



-- 
/kinkie


Re: [squid-users] redirect request

2010-03-15 Thread Henrik Nordström
Mon 2010-03-15 at 12:15 +0530, jayesh chavan wrote:
 Hi,
 How can I check which HTTP website is requested for redirection in my
 Perl redirect program? I want to see each incoming request and write it
 to my file.

It's sent on standard input.

http://www.squid-cache.org/Doc/config/url_rewrite_program/

http://wiki.squid-cache.org/Features/Redirectors
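As a sketch of what such a helper looks like (shell here for brevity; a Perl version is a direct translation, and the log path, script name, and function name are assumptions for the example):

```shell
#!/bin/sh
# Minimal url_rewrite_program sketch (illustrative; the log path is an
# assumption). Squid writes one request per line on stdin in the form:
#   URL client_ip/fqdn user method
# and expects one reply line per request; an empty line means "no rewrite".

log_and_pass() {
    logfile="$1"
    while read -r url rest; do
        echo "$url" >> "$logfile"   # record every requested URL
        echo ""                     # empty reply: pass the request unchanged
    done
}

# Hooked up in squid.conf as, e.g. (hypothetical path):
#   url_rewrite_program /usr/local/bin/logurls.sh --run
if [ "${1:-}" = "--run" ]; then
    log_and_pass "${2:-/tmp/requests.log}"
fi
```

Each requested URL is appended to the log while the empty reply lines tell Squid to pass every request through unchanged.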

Regards
Henrik



Re: [squid-users] how to reduce space

2010-03-15 Thread Henrik Nordström
Mon 2010-03-15 at 06:53 +0000, Malik Madni wrote:

 Most of the disk is used by the directory below:
 4.7G    /usr/local/squid/var/cache

What is your cache_dir configuration?

 Is it safe to remove this directory? If yes, then how?

Yes.

1. Stop Squid.

2. Remove cache by running

rm -rf /usr/local/squid/var/cache/*

3. Rebuild the cache by running

squid -z

4. Start Squid again.


Alternatively, with much shorter downtime:


1. Stop Squid

2. Move the old cache out of the way by running

cd /usr/local/squid/var/cache
mkdir junk
mv ?? swap.* junk

3. Rebuild the cache by running

squid -z

4. Start Squid again.

5. Remove the old cache contents by running

rm -rf junk

Regards
Henrik



Re: [squid-users] squid redirection to local apache

2010-03-15 Thread Henrik Nordström
Mon 2010-03-15 at 11:22 +0530, jayesh chavan wrote:
 Hi,
 My Apache is listening on port 80. Also, port 80 is included in the
 ssl_ports option. The options in the squid config are:

Why in ssl_ports?

Is this a normal proxy or a reverse proxy?

Is the Apache to be used as a proxy or as a web server for some sites?

Regards
Henrik



Re: [squid-users] how to cross compile squid for arm-linux

2010-03-15 Thread Henrik Nordström
mån 2010-03-15 klockan 09:49 +0100 skrev Kinkie:
 Hi,
  Squid's build system does not support cross-compiling. You need to
 build on the target system.

Or build a configure cache. This is most easily done by running configure
once on the target using the --config-cache option.
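Henrik's suggestion can be sketched roughly like this (paths mirror the original configure invocation; the exact cache contents depend on your autoconf version):

```shell
# Step 1: on the ARM target (or an emulated one), run configure once
# with caching enabled; this records the results of run-time tests such
# as "checking if epoll works" in a config.cache file.
./configure --config-cache

# Step 2: copy config.cache into the source tree on the build host and
# reuse it for the cross build, so the run-time tests are answered from
# the cache instead of being executed:
./configure --config-cache \
    CC=/usr/local/arm/4.3.2/bin/arm-linux-gcc \
    --host=arm-linux --prefix=/opt/squid/
```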

Regards
Henrik



RE: [squid-users] how to reduce space

2010-03-15 Thread Malik Madni

My cache_dir configuration is:
 
cache_dir ufs /usr/local/squid/var/cache 5120 16 256



 From: hen...@henriknordstrom.net
 To: m4madnima...@hotmail.com
 CC: squid-users@squid-cache.org
 Date: Mon, 15 Mar 2010 10:00:18 +0100
 Subject: Re: [squid-users] how to reduce space

 Mon 2010-03-15 at 06:53 +0000, Malik Madni wrote:

 most of the disk is used by the below directory.
 4.7G /usr/local/squid/var/cache

 What is your cache_dir configuration?

 Is it safe to remove this directory? If yes, then how?

 Yes.

 1. Stop Squid.

 2. Remove cache by running

 rm -rf /usr/local/squid/var/cache/*

 3. Rebuild the cache by running

 squid -z

 4. Start Squid again.


 Alternatively with much shorter downtime:


 1. Stop Squid

 2. Move the old cache out of the way by running

 cd /usr/local/squid/var/cache
 mkdir junk
 mv ?? swap.* junk

 3. Rebuild the cache by running

 squid -z

 4. Start Squid again.

 5. Remove the old cache contents by running

 rm -rf junk

 Regards
 Henrik
 

RE: [squid-users] how to reduce space

2010-03-15 Thread Henrik Nordström
Mon 2010-03-15 at 09:24 +0000, Malik Madni wrote:
 my cache_dir configuration is
  
 cache_dir ufs /usr/local/squid/var/cache 5120 16 256

So reduce that if you do not want that much in the cache; the above is
configured for 5 GB.
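For example, to cap the cache at roughly 2 GB instead (the value is in megabytes), the directive would become:

```
cache_dir ufs /usr/local/squid/var/cache 2048 16 256
```

After a restart, Squid trims the existing contents down toward the new limit over time.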

Regards
Henrik



[squid-users] Icap+clam Av

2010-03-15 Thread senthilkumaar2021

Hi All,

I have installed ICAP + ClamAV with Squid 3.0.STABLE24.

I followed documentation in
http://wiki.squid-cache.org/Features/ICAP

I am getting the following error in cache.log:
Squid got an invalid ICAP OPTIONS response from service 
icap://127.0.0.1:1344/response; error: unsupported status code of 
OPTIONS response
essential ICAP service is down after an options fetch failure: 
icap://127.0.0.1:1344/response [down,!valid]


And in the browser I am getting the following errors:
ICAP protocol error.
The system returned: [No Error]
This means that some aspect of the ICAP communication failed.
Some possible problems are:
The ICAP server is not reachable.
An Illegal response was received from the ICAP server.

Please guide me in configuring squid+icap+clamAV

Thanking you

Regards
senthil


[squid-users] Displaying squid cache urls for windows

2010-03-15 Thread jayesh chavan
Hi,
   Is there any tool like purge for Windows to display the URLs stored in
the squid cache?

Regards,
  Jayesh


Re: [squid-users] squid redirection to local apache

2010-03-15 Thread jayesh chavan
2010/3/15 jayesh chavan jayesh.jayi...@gmail.com:
 Hi,
  squid is used as a normal proxy. Port 80 is in ssl ports by
 default. Apache has to be used only to serve some files requested by
 squid, as squid cannot deliver them.
 Regards,
  Jayesh



[squid-users] Squid 3.0.STABLE25 is available

2010-03-15 Thread Amos Jeffries

The Squid HTTP Proxy team is pleased to announce the
availability of the Squid-3.0.STABLE25 release!

This release fixes a few regression issues from earlier 3.0 releases and 
resolves several digest authentication issues.


Digest authentication has been re-written for true compliance with the
standards, resolving a number of long-outstanding issues with both the
Squid-2.x and Squid-3.x series.


All Squid-3.0 users needing digest authentication are advised to upgrade 
to this release as soon as possible.



Following our planned release timetable:

 All users of Squid-3.0 are encouraged to plan for upgrades within the
year. Support for Squid-3.0 will officially cease with the release of
Squid-3.1.1 which is expected to occur in 2-4 weeks.


Please refer to the release notes at
http://www.squid-cache.org/Versions/v3/3.0/RELEASENOTES.html
if and when you are ready to make the switch to Squid-3.

This new release can be downloaded from our HTTP or FTP servers

 http://www.squid-cache.org/Versions/v3/3.0/
 ftp://ftp.squid-cache.org/pub/squid/
 ftp://ftp.squid-cache.org/pub/archive/3.0/

or the mirrors. For a list of mirror sites see

 http://www.squid-cache.org/Download/http-mirrors.dyn
 http://www.squid-cache.org/Download/mirrors.dyn

If you encounter any issues with this release please file a bug report.
 http://bugs.squid-cache.org/


Amos Jeffries


Re: [squid-users] Icap+clam Av

2010-03-15 Thread Henrik Nordström
Mon 2010-03-15 at 15:08 +0530, senthilkumaar2021 wrote:

 Squid got an invalid ICAP OPTIONS response from service 
 icap://127.0.0.1:1344/response; error: unsupported status code of 
 OPTIONS response

That URL is not a valid c-icap URL for the clamav service.

Just as with HTTP URLs, ICAP URLs are unique to their services. The
example in the wiki is only meant as an illustration and is not specific
to the clamav service of c-icap.

There was a good guide on the c-icap site on how to install c-icap, with
matching Squid configuration snippets, but I could not find it at first..
trying Google. And yes, it's still on the site, just not linked in the new
menu or moved over to the wiki yet:

http://c-icap.sourceforge.net/install.html
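As a rough sketch, the matching squid.conf lines for Squid 3.0 with c-icap's default clamav service would look something like this (the service name srv_clamav and port 1344 are c-icap defaults; verify them against your c-icap.conf):

```
icap_enable on
icap_service service_av respmod_precache 0 icap://127.0.0.1:1344/srv_clamav
icap_class class_av service_av
icap_access class_av allow all
```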

Regards
Henrik



Re: [squid-users] Displaying squid cache urls for windows

2010-03-15 Thread Henrik Nordström
Mon 2010-03-15 at 15:20 +0530, jayesh chavan wrote:
 Hi,
   Is there any tool like purge for Windows to display the URLs stored in
 the squid cache?

The purge tool should work fine on Windows. I am not sure if there are
precompiled binaries anywhere, but it should build fine under Cygwin.

You'll probably need this patch however to use it with current Squid
releases (on any platform):

http://www.henriknordstrom.net/code/purge.patch

Regards
Henrik



Re: [squid-users] Questions about portal sites such as yahoo cache with squid

2010-03-15 Thread Amos Jeffries

dave jones wrote:

Hi,

Is anyone using Squid to cache the Yahoo portal site successfully?
If so, would you tell me how? Thanks.

Best regards,
Dave.


Yahoo! use Squid as part of their deployment.
 I imagine they already have the correct HTTP protocol details to make 
the content cacheable or have good reasons for leaving it as non-cacheable.


If you want to investigate this yourself use www.redbot.org (Yahoo! 
sponsored) to see how cacheable the portal URLs are.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] Icap+clam Av

2010-03-15 Thread senthilkumaar2021

Thank you very much

In http://c-icap.sourceforge.net/install.html I am not able to find a
more detailed explanation.

Kindly help me.
Henrik Nordström wrote:

Mon 2010-03-15 at 15:08 +0530, senthilkumaar2021 wrote:

  
Squid got an invalid ICAP OPTIONS response from service 
icap://127.0.0.1:1344/response; error: unsupported status code of 
OPTIONS response



That URL is not a valid c-icap URL for the clamav service.

Just as with HTTP URLs, ICAP URLs are unique to their services. The
example in the wiki is only meant as an illustration and is not specific
to the clamav service of c-icap.

There was a good guide on the c-icap site on how to install c-icap, with
matching Squid configuration snippets, but I could not find it at first..
trying Google. And yes, it's still on the site, just not linked in the new
menu or moved over to the wiki yet:

http://c-icap.sourceforge.net/install.html

Regards
Henrik



  




RE: [squid-users] how to reduce space

2010-03-15 Thread Malik Madni

Basically I was running out of space on the Squid server; that is why I was
trying to delete files in the cache directory. Are there other irrelevant
files that I should delete?
Only the Squid server is installed on the machine. When I see a shortage of
space, I remove
/usr/local/squid/var/log/access.log
/usr/local/squid/var/log/store.log

and then run /usr/local/squid/sbin/squid -k reconfigure (restart).

After some days, when the space is used up again, the same process is
repeated. If there is a better way, please let me know.
 
 


 Subject: RE: [squid-users] how to reduce space
 From: hen...@henriknordstrom.net
 To: m4madnima...@hotmail.com
 CC: squid-users@squid-cache.org
 Date: Mon, 15 Mar 2010 10:35:26 +0100

 Mon 2010-03-15 at 09:24 +0000, Malik Madni wrote:
 my cache_dir configuration is

 cache_dir ufs /usr/local/squid/var/cache 5120 16 256

 So reduce that if you do not want that much in the cache.. the above is
 configured for 5 GB.

 Regards
 Henrik
 

Re: [squid-users] how to reduce space

2010-03-15 Thread Amos Jeffries

Malik Madni wrote:

Basically I was running out of space on the Squid server; that is why I was
trying to delete files in the cache directory. Are there other irrelevant
files that I should delete?
Only the Squid server is installed on the machine. When I see a shortage of
space, I remove
/usr/local/squid/var/log/access.log
/usr/local/squid/var/log/store.log

and then run /usr/local/squid/sbin/squid -k reconfigure (restart).

After some days, when the space is used up again, the same process is
repeated. If there is a better way, please let me know.
 
 


You should have a daily log rotation scheduled that keeps the logs sane.
Squid does it when called with squid -k rotate, but it's often better to
integrate with whatever system is rotating the OS logs.
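For example, integration with logrotate could look something like this (an illustrative /etc/logrotate.d/squid; paths assume a source install, and it presumes logfile_rotate 0 in squid.conf so that squid -k rotate merely reopens the files logrotate has renamed):

```
/usr/local/squid/var/log/*.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    postrotate
        /usr/local/squid/sbin/squid -k rotate
    endscript
}
```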


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] Questions about portal sites such as yahoo cache with squid

2010-03-15 Thread dave jones
On Mon, Mar 15, 2010 at 6:20 PM, Amos Jeffries  wrote:
 dave jones wrote:

 Hi,

 Is anyone using Squid to cache the Yahoo portal site successfully?
 If so, would you tell me how? Thanks.

 Best regards,
 Dave.

 Yahoo! use Squid as part of their deployment.
  I imagine they already have the correct HTTP protocol details to make the
 content cacheable or have good reasons for leaving it as non-cacheable.

 If you want to investigate this yourself use www.redbot.org (Yahoo!
 sponsored) to see how cacheable the portal URLs are.

Ah, the result is:

HTTP/1.1 200 OK
Date: Mon, 15 Mar 2010 10:39:12 GMT
P3P: policyref=http://info.yahoo.com/w3c/p3p.xml;, CP=CAO DSP COR CUR ADM
DEV TAI PSA PSD IVAi IVDi CONi TELo OTPi OUR DELi SAMi OTRi UNRi
PUBi IND PHY ONL UNI PUR FIN COM NAV INT DEM CNT STA POL HEA PRE
LOC GOV
Cache-Control: private
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=utf-8
Content-Encoding: gzip

It seems the content is not cacheable?

 Amos
 --
 Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18

Regards,
Dave.


Re: [squid-users] Questions about portal sites such as yahoo cache with squid

2010-03-15 Thread Amos Jeffries

dave jones wrote:

On Mon, Mar 15, 2010 at 6:20 PM, Amos Jeffries  wrote:

dave jones wrote:

Hi,

Is anyone using Squid to cache the Yahoo portal site successfully?
If so, would you tell me how? Thanks.

Best regards,
Dave.

Yahoo! use Squid as part of their deployment.
 I imagine they already have the correct HTTP protocol details to make the
content cacheable or have good reasons for leaving it as non-cacheable.

If you want to investigate this yourself use www.redbot.org (Yahoo!
sponsored) to see how cacheable the portal URLs are.


Ah, the result is:

HTTP/1.1 200 OK
Date: Mon, 15 Mar 2010 10:39:12 GMT
P3P: policyref=http://info.yahoo.com/w3c/p3p.xml;, CP=CAO DSP COR CUR ADM
DEV TAI PSA PSD IVAi IVDi CONi TELo OTPi OUR DELi SAMi OTRi UNRi
PUBi IND PHY ONL UNI PUR FIN COM NAV INT DEM CNT STA POL HEA PRE
LOC GOV
Cache-Control: private
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=utf-8
Content-Encoding: gzip

It seems the content is not cacheable?



Yes, the response apparently contains private details for a specific 
user. Caching it and sharing it with other users is not advisable.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] squid redirection to local apache

2010-03-15 Thread jayesh chavan
Hi,
   It's not ssl_ports; it is safe_ports.
  Regards,
  Jayesh

On Mon, Mar 15, 2010 at 3:21 PM, jayesh chavan jayesh.jayi...@gmail.com wrote:
 2010/3/15 jayesh chavan jayesh.jayi...@gmail.com:
 Hi,
  squid is used as a normal proxy. Port 80 is in ssl ports by
 default. Apache has to be used only to serve some files requested by
 squid, as squid cannot deliver them.
 Regards,
  Jayesh




[squid-users] /etc/init.d/squid start failed

2010-03-15 Thread Hubert Choma
Hello

I have Squid 2.7.STABLE7 on CentOS 5.3.
I downloaded it from http://people.redhat.com/~jskala/squid/ because the
CentOS repos only have version 2.6.

When I try to start Squid with /etc/init.d/squid start I receive [failed],
but in cache.log there are no warnings or errors, so I cannot pinpoint
the problem. How can I find the cause of the error?

I use default config.

Earlier I received a Squid warning, "failed to resolve 192.168.0.1 to
a fully qualified hostname", so I made an entry in /etc/hosts and added
visible_hostname proliant.geodezja.wolomin to squid.conf, but I still get
the error. It's strange: service squid status shows that squid is running!

but service squid restart shows:
service squid stop [ok]
service squid start [failed]
service squid status running

The machine Squid is running on has 2 NICs (Linux router): eth0
192.168.1.1 and eth1 192.168.0.1.
My public domain IP is 83.18.17.30, with DNS from the internet provider.
So what should a valid entry in /etc/hosts look like?

127.0.0.1   localhost.localdomain   localhost   proliant
#192.168.1.2proliant
192.168.0.1 proliant.geodezja.wolomin.pl proliant
#83.18.17.30geodezja.wolomin.pl proliant
192.168.0.2 sm2
192.168.0.3 sm3
192.168.0.4 sm4
192.168.0.6 sm19
192.168.0.8 sm9
::1 localhost6.localdomain6 localhost6


squid.conf
http_port 8080 transparent
visible_hostname proliant.geodezja.wolomin.pl
acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
#acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
#acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
#acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src 192.168.0.0/24
acl SSL_ports port 443
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localnet
http_access allow localhost
http_access deny all
icp_access allow localnet
icp_access deny all

hierarchy_stoplist cgi-bin ?
access_log /var/log/squid/access.log squid
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320
acl shoutcast rep_header X-HTTP09-First-Line ^ICY.[0-9]
upgrade_http0.9 deny shoutcast
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
coredump_dir /var/spool/squid



squid.log

2010/03/15 10:25:38| Preparing for shutdown after 0 requests
2010/03/15 10:25:38| Waiting 30 seconds for active connections to finish
2010/03/15 10:25:38| FD 13 Closing HTTP connection
2010/03/15 10:25:38| Shutting down...
2010/03/15 10:25:38| FD 14 Closing ICP connection
2010/03/15 10:25:38| Closing unlinkd pipe on FD 11
2010/03/15 10:25:38| storeDirWriteCleanLogs: Starting...
2010/03/15 10:25:38|   Finished.  Wrote 2336 entries.
2010/03/15 10:25:38|   Took 0.0 seconds (2975796.2 entries/sec).
CPU Usage: 0.049 seconds = 0.031 user + 0.018 sys
Maximum Resident Size: 0 KB
Page faults with physical i/o: 0
Memory usage for squid via mallinfo():
total space in arena:2772 KB
Ordinary blocks: 2660 KB  7 blks
Small blocks:   0 KB  5 blks
Holding blocks:   284 KB  1 blks
Free Small blocks:  0 KB
Free Ordinary blocks: 111 KB
Total in use:2944 KB 96%
Total free:   111 KB 4%
2010/03/15 10:25:38| logfileClose: closing log /var/log/squid/store.log
2010/03/15 10:25:38| logfileClose: closing log /var/log/squid/access.log
2010/03/15 10:25:38| Squid Cache (Version 2.7.STABLE7): Exiting normally.
2010/03/15 10:25:42| Starting Squid Cache version 2.7.STABLE7 for 
i386-redhat-linux-gnu...
2010/03/15 10:25:42| Process ID 11568
2010/03/15 10:25:42| With 1024 file descriptors available
2010/03/15 10:25:42| Using epoll for the IO loop
2010/03/15 10:25:42| DNS Socket created at 0.0.0.0, port 37353, FD 6
2010/03/15 10:25:42| Adding nameserver 194.204.152.34 from 
/etc/resolv.conf
2010/03/15 10:25:42| Adding nameserver 194.204.159.1 from /etc/resolv.conf
2010/03/15 10:25:42| Adding nameserver 192.168.1.1 from /etc/resolv.conf
2010/03/15 10:25:42| User-Agent logging is disabled.
2010/03/15 10:25:42| Referer logging is disabled.
2010/03/15 10:25:42| logfileOpen: opening log /var/log/squid/access.log
2010/03/15 10:25:42| Unlinkd pipe 

Re: [squid-users] /etc/init.d/squid start failed

2010-03-15 Thread fedorischev
In a message of Monday 15 March 2010 14:02:21, Hubert Choma wrote:
 Hello

 I have Squid 2.7.STABLE7 on CentOS 5.3.
 I downloaded it from http://people.redhat.com/~jskala/squid/ because
 the CentOS repos only have version 2.6.

 When I try to start Squid with /etc/init.d/squid start I receive [failed],
 but in cache.log there are no warnings or errors, so I cannot pinpoint
 the problem. How can I find the cause of the error?

 I use default config.

Looks like a permission problem.

The Squid log and cache directories must be readable and writable by the
squid user. See the cache_effective_user directive in squid.conf and
verify the directory permissions.
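A quick way to act on this advice is a small diagnostic run as the squid user (the script and its invocation are illustrative, not part of Squid):

```shell
#!/bin/sh
# check_dirs.sh -- hypothetical helper: report whether each given
# directory exists and is writable/searchable by the current user.
# Run it as the cache_effective_user, e.g.:
#   su -s /bin/sh squid -c './check_dirs.sh /var/log/squid /var/spool/squid'

check_dirs() {
    status=0
    for dir in "$@"; do
        if [ -d "$dir" ] && [ -w "$dir" ] && [ -x "$dir" ]; then
            echo "$dir: ok"
        else
            echo "$dir: missing or not writable"
            status=1
        fi
    done
    return $status
}

if [ $# -gt 0 ]; then
    check_dirs "$@"
fi
```

Any line reporting "missing or not writable" points at a directory to chown or chmod for the squid user.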

WBR.


[squid-users] transparent squid + clamav + https

2010-03-15 Thread Stefan Reible

Hi,

For my exam I want to set up a transparent proxy for HTTP and HTTPS
under Gentoo Linux.


The transparent HTTP proxy with ClamAV is working very nicely, but now
I have problems with the implementation of SSL. My first idea was to
break the encryption at the Squid and then establish a new one.


http://wiki.squid-cache.org/Features/SslBump

Is this possible? I think the problem is that if someone opens an
HTTPS-encrypted website like https://google.de, he gets the certificate
from the proxy in his browser, not from the web server. This wouldn't
be so good.
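For reference, the SslBump setup on that wiki page boils down to squid.conf lines of roughly this shape (the option spelling varies between Squid releases and the certificate path is only an example); the certificate warning described above is inherent to the approach unless clients import and trust the proxy's CA:

```
http_port 3128 ssl-bump cert=/usr/local/squid/etc/proxy-cert.pem
ssl_bump allow all
```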


Do you have any solutions, information, or ideas for this problem?

Thanks, Stefan

PS: I have a second problem with downloading big files: is it possible
to send any information about the download progress to the web browser,
e.g. by opening an Ajax script or something similar?




Re: [squid-users] /etc/init.d/squid start failed

2010-03-15 Thread Hubert Choma
On 15-03-2010 at 12:21, fedorischev wrote:
 In a message of Monday 15 March 2010 14:02:21, Hubert Choma wrote:
  Hello
 
  I have Squid 2.7.STABLE7 on CentOS 5.3.
  I downloaded it from http://people.redhat.com/~jskala/squid/ because
  the CentOS repos only have version 2.6.
 
  When I try to start Squid with /etc/init.d/squid start I receive [failed],
  but in cache.log there are no warnings or errors, so I cannot pinpoint
  the problem. How can I find the cause of the error?
 
  I use default config.
 
 Looks like a permission problem.
 
 The Squid log and cache directories must be readable and writable by the squid
 user.
 See cache_effective_user directive in squid.conf and verify directory
 permissions.
 
 WBR.
In my squid.conf there is no cache_effective_user entry, but I added it and
the error still occurs.
I checked the permissions:
ls -all /var/log/squid/
razem 1640
drwxr-x---  3 squid squid   4096 mar 15 12:55 .
drwxr-xr-x 19 root  root    4096 mar 15 04:05 ..
-rw-r-----  1 squid squid 143767 mar 15 09:35 access.log
-rw-r-----  1 squid squid 153839 mar 14 04:02 access.log.1.gz
-rw-r-----  1 squid squid 706160 mar 15 12:59 cache.log
drwxrwxrw-  2 squid squid   4096 mar 15 12:59 run
-rw-r--r--  1 squid squid   1347 mar 15 09:25 squid.out
-rw-r-----  1 squid squid 226746 mar 15 11:25 store.log
-rw-r-----  1 squid squid 396998 mar 14 04:02 store.log.1.gz

and /var/spool/squid

ls -all /var/spool/squid/
razem 100608
drwxrwx---  18 squid squid4096 mar 15 12:59 .
drwxr-xr-x  14 root  root 4096 mar  9  2009 ..
drwxr-x--- 258 squid squid4096 mar 11 16:24 00
drwxr-x--- 258 squid squid4096 mar 11 16:24 01
drwxr-x--- 258 squid squid4096 mar 11 16:24 02
drwxr-x--- 258 squid squid4096 mar 11 16:24 03
drwxr-x--- 258 squid squid4096 mar 11 16:24 04
drwxr-x--- 258 squid squid4096 mar 11 16:24 05
drwxr-x--- 258 squid squid4096 mar 11 16:24 06
drwxr-x--- 258 squid squid4096 mar 11 16:24 07
drwxr-x--- 258 squid squid4096 mar 11 16:24 08
drwxr-x--- 258 squid squid4096 mar 11 16:24 09
drwxr-x--- 258 squid squid4096 mar 11 16:24 0A
drwxr-x--- 258 squid squid4096 mar 11 16:24 0B
drwxr-x--- 258 squid squid4096 mar 11 16:24 0C
drwxr-x--- 258 squid squid4096 mar 11 16:24 0D
drwxr-x--- 258 squid squid4096 mar 11 16:24 0E
drwxr-x--- 258 squid squid4096 mar 11 16:24 0F
-rw-------   1 squid squid 4616192 mar 11 16:27 core.13064
-rw-------   1 squid squid 4616192 mar 11 16:27 core.13076
-rw-------   1 squid squid 4616192 mar 11 16:27 core.13087
-rw-------   1 squid squid 4616192 mar 11 16:27 core.13094
-rw-------   1 squid squid 4616192 mar 11 16:27 core.13110
-rw-------   1 squid squid 4612096 mar 11 16:35 core.13326
-rw-------   1 squid squid 4612096 mar 11 16:35 core.13345
-rw-------   1 squid squid 4612096 mar 11 16:35 core.13357
-rw-------   1 squid squid 4612096 mar 11 16:35 core.13364
-rw-------   1 squid squid 4612096 mar 11 16:36 core.13367
-rw-------   1 squid squid 4612096 mar 12 16:30 core.3594
-rw-------   1 squid squid 4612096 mar 12 16:30 core.3614
-rw-------   1 squid squid 4612096 mar 12 16:30 core.3622
-rw-------   1 squid squid 4612096 mar 12 16:30 core.3629
-rw-------   1 squid squid 4612096 mar 12 16:30 core.3809
-rw-------   1 squid squid 4612096 mar 12 16:30 core.3933
-rw-------   1 squid squid 4612096 mar 12 16:30 core.3991
-rw-------   1 squid squid 4612096 mar 12 16:30 core.4122
-rw-------   1 squid squid 4612096 mar 12 16:30 core.4143
-rw-------   1 squid squid 4612096 mar 11 17:06 core.4669
-rw-------   1 squid squid 4612096 mar 11 17:07 core.4687
-rw-------   1 squid squid 4612096 mar 11 17:07 core.4724
-rw-------   1 squid squid 4612096 mar 11 17:07 core.4741
-rw-------   1 squid squid 4612096 mar 11 17:07 core.4754
-rw-------   1 squid squid 4612096 mar 11 17:07 core.4762
-rw-------   1 squid squid 4612096 mar 11 17:08 core.4772
-rw-------   1 squid squid 4612096 mar 11 17:08 core.4779
-rw-------   1 squid squid 4612096 mar 11 17:08 core.5201
-rw-r-----   1 squid squid  112176 mar 15 12:59 swap.state

So the permissions seem to be OK? What else should I check?
Please help!

The squid -X output shows:
2010/03/15 13:05:53| parse_line: minimum_expiry_time 60 seconds
2010/03/15 13:05:53| parse_line: store_avg_object_size 13 KB
2010/03/15 13:05:53| parse_line: store_objects_per_bucket 20
2010/03/15 13:05:53| parse_line: request_header_max_size 20 KB
2010/03/15 13:05:53| parse_line: reply_header_max_size 20 KB
2010/03/15 13:05:53| parse_line: request_body_max_size 0 KB
2010/03/15 13:05:53| parse_line: via on
2010/03/15 13:05:53| parse_line: cache_vary on
2010/03/15 13:05:53| parse_line: collapsed_forwarding off
2010/03/15 13:05:53| parse_line: refresh_stale_hit 0 seconds
2010/03/15 13:05:53| parse_line: ie_refresh off
2010/03/15 13:05:53| parse_line: vary_ignore_expire off
2010/03/15 13:05:53| parse_line: request_entities off
2010/03/15 13:05:53| parse_line: relaxed_header_parser on
2010/03/15 

Re: [squid-users] /etc/init.d/squid start failed

2010-03-15 Thread Sakhi Louw
2010/3/15 Hubert Choma hubert...@wp.pl:
 Dnia 15-03-2010 o godz. 12:21 fedorischev napisał(a):
 In a message of Monday 15 March 2010 14:02:21, Hubert Choma wrote:
  Hello
 
  I have Squid 2.7.STABLE7 on CentOS 5.3.
  I downloaded it from http://people.redhat.com/~jskala/squid/ because
  the CentOS repos only have version 2.6.
 
  When I try to start Squid with /etc/init.d/squid start I receive [failed],
  but in cache.log there are no warnings or errors, so I cannot pinpoint
  the problem. How can I find the cause of the error?
 
  I use default config.

 Looks like a permission problem.

 Squid log and cache directories must be readable and writeable by squid
 user.
 See cache_effective_user directive in squid.conf and verify directory
 permissions.

 WBR.
 In my squid.conf there is no cache_effective_user entry, but I added it and
 the error still occurs.
 I checked the permissions:
 ls -all /var/log/squid/
 razem 1640
 drwxr-x---  3 squid squid   4096 mar 15 12:55 .
 drwxr-xr-x 19 root  root    4096 mar 15 04:05 ..
 -rw-r-  1 squid squid 143767 mar 15 09:35 access.log
 -rw-r-  1 squid squid 153839 mar 14 04:02 access.log.1.gz
 -rw-r-  1 squid squid 706160 mar 15 12:59 cache.log
 drwxrwxrw-  2 squid squid   4096 mar 15 12:59 run
 -rw-r--r--  1 squid squid   1347 mar 15 09:25 squid.out
 -rw-r-  1 squid squid 226746 mar 15 11:25 store.log
 -rw-r-  1 squid squid 396998 mar 14 04:02 store.log.1.gz

 and /var/spool/squid

 ls -all /var/spool/squid/
 razem 100608
 drwxrwx---  18 squid squid    4096 mar 15 12:59 .
 drwxr-xr-x  14 root  root     4096 mar  9  2009 ..
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 00
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 01
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 02
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 03
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 04
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 05
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 06
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 07
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 08
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 09
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 0A
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 0B
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 0C
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 0D
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 0E
 drwxr-x--- 258 squid squid    4096 mar 11 16:24 0F
 -rw---   1 squid squid 4616192 mar 11 16:27 core.13064
 -rw---   1 squid squid 4616192 mar 11 16:27 core.13076
 -rw---   1 squid squid 4616192 mar 11 16:27 core.13087
 -rw---   1 squid squid 4616192 mar 11 16:27 core.13094
 -rw---   1 squid squid 4616192 mar 11 16:27 core.13110
 -rw---   1 squid squid 4612096 mar 11 16:35 core.13326
 -rw---   1 squid squid 4612096 mar 11 16:35 core.13345
 -rw---   1 squid squid 4612096 mar 11 16:35 core.13357
 -rw---   1 squid squid 4612096 mar 11 16:35 core.13364
 -rw---   1 squid squid 4612096 mar 11 16:36 core.13367
 -rw---   1 squid squid 4612096 mar 12 16:30 core.3594
 -rw---   1 squid squid 4612096 mar 12 16:30 core.3614
 -rw---   1 squid squid 4612096 mar 12 16:30 core.3622
 -rw---   1 squid squid 4612096 mar 12 16:30 core.3629
 -rw---   1 squid squid 4612096 mar 12 16:30 core.3809
 -rw---   1 squid squid 4612096 mar 12 16:30 core.3933
 -rw---   1 squid squid 4612096 mar 12 16:30 core.3991
 -rw---   1 squid squid 4612096 mar 12 16:30 core.4122
 -rw---   1 squid squid 4612096 mar 12 16:30 core.4143
 -rw---   1 squid squid 4612096 mar 11 17:06 core.4669
 -rw---   1 squid squid 4612096 mar 11 17:07 core.4687
 -rw---   1 squid squid 4612096 mar 11 17:07 core.4724
 -rw---   1 squid squid 4612096 mar 11 17:07 core.4741
 -rw---   1 squid squid 4612096 mar 11 17:07 core.4754
 -rw---   1 squid squid 4612096 mar 11 17:07 core.4762
 -rw---   1 squid squid 4612096 mar 11 17:08 core.4772
 -rw---   1 squid squid 4612096 mar 11 17:08 core.4779
 -rw---   1 squid squid 4612096 mar 11 17:08 core.5201
 -rw-r-   1 squid squid  112176 mar 15 12:59 swap.state

 So the permissions seem to be OK, don't they? What else should I check?
 PLEASE HELP!!

 squid -X output shows
 2010/03/15 13:05:53| parse_line: minimum_expiry_time 60 seconds
 2010/03/15 13:05:53| parse_line: store_avg_object_size 13 KB
 2010/03/15 13:05:53| parse_line: store_objects_per_bucket 20
 2010/03/15 13:05:53| parse_line: request_header_max_size 20 KB
 2010/03/15 13:05:53| parse_line: reply_header_max_size 20 KB
 2010/03/15 13:05:53| parse_line: request_body_max_size 0 KB
 2010/03/15 13:05:53| parse_line: via on
 2010/03/15 13:05:53| parse_line: cache_vary on
 2010/03/15 13:05:53| parse_line: collapsed_forwarding off
 2010/03/15 13:05:53| parse_line: refresh_stale_hit 0 seconds
 2010/03/15 13:05:53| parse_line: ie_refresh off
 2010/03/15 13:05:53| parse_line: vary_ignore_expire off
 

[squid-users] Orig. request HTTP/1.0, outgoing request HTTP/1.1

2010-03-15 Thread Silamael
Hello everyone,

Just built the new Squid 3.1.0.18 and noticed that Squid now makes an
HTTP/1.1 request to the server even if the client only sent an HTTP/1.0
request. Previous versions of Squid did not do that.
Is this intended, or did I stumble over some minor bug?

-- Matthias


Re: [squid-users] /etc/init.d/squid start failed

2010-03-15 Thread Hubert Choma
On 15-03-2010 at 13:14, Sakhi Louw wrote:
 2010/3/15 Hubert Choma hubert...@wp.pl:
  On 15-03-2010 at 12:21, fedorischev wrote:
  In a message of Monday 15 March 2010 14:02:21, Hubert Choma wrote:
   Hello
  
   I have squid 2.7.STABLE7 on CentOS 5.3.
   I downloaded it from http://people.redhat.com/~jskala/squid/ because
   the CentOS repos only have version 2.6.
  
   When I try to start squid with /etc/init.d/squid start I receive [failed],
   but there are no warnings or errors in cache.log, so I cannot pinpoint
   the error. How can I find the cause of the error?
  
   I use default config.
 
  Looks like a permission problem.
 
  Squid log and cache directories must be readable and writeable by squid
  user.
  See cache_effective_user directive in squid.conf and verify directory
  permissions.
 
  WBR.
  In my squid.conf there is no entry cache_effective_user but I add it and
  error still occurs.
  I check permissions
  ls -all /var/log/squid/
  total 1640
  drwxr-x---  3 squid squid   4096 mar 15 12:55 .
  drwxr-xr-x 19 root  root    4096 mar 15 04:05 ..
  -rw-r-  1 squid squid 143767 mar 15 09:35 access.log
  -rw-r-  1 squid squid 153839 mar 14 04:02 access.log.1.gz
  -rw-r-  1 squid squid 706160 mar 15 12:59 cache.log
  drwxrwxrw-  2 squid squid   4096 mar 15 12:59 run
  -rw-r--r--  1 squid squid   1347 mar 15 09:25 squid.out
  -rw-r-  1 squid squid 226746 mar 15 11:25 store.log
  -rw-r-  1 squid squid 396998 mar 14 04:02 store.log.1.gz
 
  and /var/spool/squid
 
  ls -all /var/spool/squid/
  total 100608
  drwxrwx---  18 squid squid    4096 mar 15 12:59 .
  drwxr-xr-x  14 root  root     4096 mar  9  2009 ..
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 00
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 01
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 02
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 03
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 04
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 05
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 06
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 07
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 08
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 09
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 0A
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 0B
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 0C
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 0D
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 0E
  drwxr-x--- 258 squid squid    4096 mar 11 16:24 0F
  -rw---   1 squid squid 4616192 mar 11 16:27 core.13064
  -rw---   1 squid squid 4616192 mar 11 16:27 core.13076
  -rw---   1 squid squid 4616192 mar 11 16:27 core.13087
  -rw---   1 squid squid 4616192 mar 11 16:27 core.13094
  -rw---   1 squid squid 4616192 mar 11 16:27 core.13110
  -rw---   1 squid squid 4612096 mar 11 16:35 core.13326
  -rw---   1 squid squid 4612096 mar 11 16:35 core.13345
  -rw---   1 squid squid 4612096 mar 11 16:35 core.13357
  -rw---   1 squid squid 4612096 mar 11 16:35 core.13364
  -rw---   1 squid squid 4612096 mar 11 16:36 core.13367
  -rw---   1 squid squid 4612096 mar 12 16:30 core.3594
  -rw---   1 squid squid 4612096 mar 12 16:30 core.3614
  -rw---   1 squid squid 4612096 mar 12 16:30 core.3622
  -rw---   1 squid squid 4612096 mar 12 16:30 core.3629
  -rw---   1 squid squid 4612096 mar 12 16:30 core.3809
  -rw---   1 squid squid 4612096 mar 12 16:30 core.3933
  -rw---   1 squid squid 4612096 mar 12 16:30 core.3991
  -rw---   1 squid squid 4612096 mar 12 16:30 core.4122
  -rw---   1 squid squid 4612096 mar 12 16:30 core.4143
  -rw---   1 squid squid 4612096 mar 11 17:06 core.4669
  -rw---   1 squid squid 4612096 mar 11 17:07 core.4687
  -rw---   1 squid squid 4612096 mar 11 17:07 core.4724
  -rw---   1 squid squid 4612096 mar 11 17:07 core.4741
  -rw---   1 squid squid 4612096 mar 11 17:07 core.4754
  -rw---   1 squid squid 4612096 mar 11 17:07 core.4762
  -rw---   1 squid squid 4612096 mar 11 17:08 core.4772
  -rw---   1 squid squid 4612096 mar 11 17:08 core.4779
  -rw---   1 squid squid 4612096 mar 11 17:08 core.5201
  -rw-r-   1 squid squid  112176 mar 15 12:59 swap.state
 
  So the permissions seem to be OK, don't they? What else should I check?
  PLEASE HELP!!
 
  squid -X output shows
  2010/03/15 13:05:53| parse_line: minimum_expiry_time 60 seconds
  2010/03/15 13:05:53| parse_line: store_avg_object_size 13 KB
  2010/03/15 13:05:53| parse_line: store_objects_per_bucket 20
  2010/03/15 13:05:53| parse_line: request_header_max_size 20 KB
  2010/03/15 13:05:53| parse_line: reply_header_max_size 20 KB
  2010/03/15 13:05:53| parse_line: request_body_max_size 0 KB
  2010/03/15 13:05:53| parse_line: via on
  2010/03/15 13:05:53| parse_line: cache_vary on
  2010/03/15 13:05:53| parse_line: collapsed_forwarding off
  2010/03/15 

Re: [squid-users] Orig. request HTTP/1.0, outgoing request HTTP/1.1

2010-03-15 Thread Amos Jeffries

Silamael wrote:

Hello everyone,

Just built the new Squid 3.1.0.18 and noticed that Squid now makes an
HTTP/1.1 request to the server even if the client only sent an HTTP/1.0
request. Previous versions of Squid did not do that.
Is this intended, or did I stumble over some minor bug?

-- Matthias


RFC 2616 compliance has nearly been reached in 3.1.
Squid can now talk HTTP/1.1 to web servers. Upgrading requests like you 
saw is the final requirement to be met in that area.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] /etc/init.d/squid start failed

2010-03-15 Thread fedorischev
In a message of Monday 15 March 2010 15:30:23, you wrote:

 Thanks for advice and what now? I checked /var/log/messages and still no
 errors ?
 in access.log there are no current entry so connections to WWW work but
 not through squid.
 Please Help !


I'm afraid to ask this question, but...

Did you point your web browser at the proxy? :)


Re: [squid-users] /etc/init.d/squid start failed

2010-03-15 Thread Hubert Choma
On 15-03-2010 at 13:40, fedorischev wrote:
 In a message of Monday 15 March 2010 15:30:23, you wrote:
 
  Thanks for advice and what now? I checked /var/log/messages and still no
  errors ?
  in access.log there are no current entry so connections to WWW work but
  not through squid.
  Please Help !
 
 
 I'm afraid to ask this question, but...
 
 Did you point your web-browser on proxy ? :)

No, because I use http_port 192.168.0.1:8080 transparent, so it's transparent 
mode and there is no need to configure the web browser (eth0 = WAN, eth1 = LAN = 192.168.0.1).

iptables -A PREROUTING -t nat -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 8080
Is this a correct iptables rule?




[squid-users] Cache_dir size considerations

2010-03-15 Thread GIGO .

Dear All,
 
Relevant info:
 
An IBM System x3650 server with hardware RAID 1.
Disk space comprises 70 GB.
Caching of Windows updates is intended.
Caching of YouTube content is intended.
A single server is dedicated to running squid on Ubuntu 8.04.
 
Current settings: cache_dir aufs /usr/local/squid/var/cache 5 16 256
However, I believe it should be changed to something like this:
cache_dir aufs /usr/local/squid/var/cache 50gb 48 768
 
or
 
cache_dir aufs /usr/local/squid/var/cache 2 32 512, etc.
 
so that not too many files lie in each L2 directory.
 
The confusion is: isn't it correct to allocate this large 50 GB of disk
space to caching? Will such a large cache result in some issues? Does this
large size cause some latency issue, etc.?
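For what it's worth, in 2.7/3.0-era Squid the cache_dir size argument is a plain number of megabytes, so a 50 GB cache would be written as 51200 rather than "50gb". A sketch for this layout follows; the L1/L2 values are illustrative, and it is usual to leave part of the 70 GB free for swap.state, logs and filesystem overhead:

```
# ~50 GB cache: 51200 MB, with 32 first-level and 512 second-level directories
cache_dir aufs /usr/local/squid/var/cache 51200 32 512
```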
 
 
I am looking forward to your guidance.
 
rega...@all

  
_
Hotmail: Free, trusted and rich email service.
https://signup.live.com/signup.aspx?id=60969

Re: [squid-users] Websites not loading correctly

2010-03-15 Thread Matus UHLAR - fantomas
 On Wed, 10 Mar 2010 17:52:15 +0100, Alex Marsal alex.mar...@carglass.es
 wrote:
 Actually we are running MSIE 7; I've checked the HTTP 1.1 option and it is
 enabled. It's impossible to ask a flight company or online bank
 administrator to modify their servers because squid is unable to display
 the website properly.

 I've found that the websites that don't work usually give a javascript
 error in the bottom-left corner. I don't know if this helps, but I
 really need to find a fix for those websites.

 Would release 3.1 fix this? Any other help?

 Amos Jeffries squ...@treenet.co.nz ha escrito:
 If your browser can't run the javascript that website needs, then it's a
 broken website. Not Squid related. See if you can figure out what that
 error is about. It may solve the issue for you.

On 12.03.10 00:52, Alex Marsal wrote:
 But why is a broken website displayed properly in the same browser
 without squid, and not working with squid? Is an online bank website
 broken?

It's quite possible (and apparently also quite common) that someone makes a
website broken in such a way that it works OK when accessed directly,
but breaks when a proxy is used.

There are webmasters who don't care about their site's correctness but are
satisfied when it works with their configuration; whenever you complain,
they reply "use browser xxx" etc. and put the requirements info on the site.

The best you can do is find out what _exactly_ is broken when you access via
squid.

Is it possible for you to download the page directly and through proxy, and
then display differences?

Or is it possible to look directly at the script that appears broken to see
if it's any different?

Or is there any possibility that you use some kind of object or path
rewriting (icap, ecap, URL rewriter) that could lead to these errors?

-- 
Matus UHLAR - fantomas, uh...@fantomas.sk ; http://www.fantomas.sk/
Warning: I wish NOT to receive e-mail advertising to this address.
Varovanie: na tuto adresu chcem NEDOSTAVAT akukolvek reklamnu postu.
He who laughs last thinks slowest. 


Re: [squid-users] Orig. request HTTP/1.0, outgoing request HTTP/1.1

2010-03-15 Thread Silamael
On 03/15/2010 01:37 PM, Amos Jeffries wrote:
 RFC 2616 compliance has nearly been reached in 3.1.
 Squid can now talk HTTP/1.1 to web servers. Upgrading requests like you
 saw is the final requirement to be met in that area.

Hi Amos,

Thank you for the quick answer :)

-- Matthias


Re: [squid-users] Icap+clam Av

2010-03-15 Thread Henrik Nordström
On Mon 2010-03-15 at 15:56 +0530, senthilkumaar2021 wrote:
 Thank you very much
 
 In http://c-icap.sourceforge.net/install.html I am not able to find a more 
 detailed explanation.
 Kindly help me.


Quote:

icap_service service_avi_req reqmod_precache 0 icap://localhost:1344/srv_clamav
icap_service service_avi respmod_precache 1 icap://localhost:1344/srv_clamav


Regards
Henrik



Re: [squid-users] /etc/init.d/squid start failed

2010-03-15 Thread Henrik Nordström
On Mon 2010-03-15 at 14:00 +0100, Hubert Choma wrote:

 iptables -A PREROUTING -t nat -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 8080
 Is this a correct iptables rule?

Is eth0 the interface where client traffic is arriving?

If you are using wccp then the interface is usually a gre interface, not
ethx..
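For a plain (non-WCCP) intercept setup the rule normally names the LAN-facing interface. Assuming clients arrive on eth1 and squid listens on 8080, as described elsewhere in this thread, the rule would typically look like the sketch below; locally generated traffic from the proxy box itself does not traverse PREROUTING, so it is not re-intercepted:

```
# Intercept port-80 traffic arriving from LAN clients and hand it to
# the local proxy port.
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-port 8080
```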

Regards
Henrik



Re: [squid-users] transparent squid + clamav + https

2010-03-15 Thread Henrik Nordström
On Mon 2010-03-15 at 12:30 +0100, Stefan Reible wrote:

 The transparent http proxy with clamav is working very nicely, but now
 I have problems with the implementation of SSL. My first idea was to
 break down the encryption at the squid, and then create a new one.
 
 http://wiki.squid-cache.org/Features/SslBump
 
 Is this possible? I think the problem is that if someone opens an
 https-encrypted website like https://google.de he gets the certificate
 from the proxy in his browser, not from the webserver. That wouldn't
 be so nice...

Well, it's the only possibility; otherwise the proxy (and clamav) won't
be able to inspect the https traffic.

 PS: I have a second problem with downloading big files: is it
 possible to send any info about the download progress to the
 web browser? Like opening an ajax script or something else.

Yes. See the viralator mode of c-icap srv_clamav.

The service supports 3 different modes of download management

- Wait with the response until scanning has completed.
- Send some data of the file while scanning is performed, to keep the
client patiently waiting.
- Viralator mode: show progress while scanning is done, then
redirect to a download URL when complete.

The problem with viralator mode is that it may break some things as it
responds with another response while scanning.

Regards
Henrik



[squid-users] Problem with whitelisting

2010-03-15 Thread Frank Becker

Hi all,

I'm using squid on Debian Lenny as a porn filter. It works 
fine. I now want to use a whitelist, because there are sites which are 
banned by my filter list but are actually OK.


So I created a whitelist ACL and of course allowed access to it. But it 
doesn't work.


Below is the relevant segment of my squid.conf. Please, can someone help me 
to whitelist some sites?


Best regards and many thanks in advance

Frank


Here are my rules:
acl our_networks src 192.168.100.0/24
acl blacklist_domains dstdomain /etc/squid/blacklist_domains
acl blacklist_regexp dstdom_regex -i /etc/squid/blacklist_regexp
acl whitelist dstdomain /etc/squid/whitelist
acl blacklistuser src 192.168.100.2-192.168.100.209 
192.168.100.221-192.168.100.225

acl manager proto cache_object

acl admins proxy_auth /etc/squid/admins
acl users proxy_auth REQUIRED
http_access allow manager admins
http_access deny manager
http_access allow users

http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow whitelist
http_access deny blacklist_domains
http_access deny blacklist_regexp
http_access allow our_networks
http_access deny all


The whitelist contains:
*.openshotvideo.com
*.sexnsurf.de



--
Frank Becker



Re: [squid-users] transparent squid + clamav + https

2010-03-15 Thread Luis Daniel Lucio Quiroz
On Monday 15 March 2010 05:30:11, Stefan Reible wrote:
 Hi,
 
 for my exam I want to set up a transparent proxy with http and https
 under Gentoo Linux.
 
 The transparent http proxy with clamav is working very nicely, but now
 I have problems with the implementation of SSL. My first idea was to
 break down the encryption at the squid, and then create a new one.
 
 http://wiki.squid-cache.org/Features/SslBump
 
 Is this possible? I think the problem is that if someone opens an
 https-encrypted website like https://google.de he gets the certificate
 from the proxy in his browser, not from the webserver. That wouldn't
 be so nice...
 
 Do you have any solutions, information or ideas for this problem?
 
 Thanks, Stefan
 
 PS: I have a second problem with downloading big files: is it
 possible to send any info about the download progress to the
 web browser? Like opening an ajax script or something else.

There are 2 ways you may do that.

1. Use 3.1's sslbump capabilities. However, you need a CA certificate already
installed on your clients to avoid the browsers' untrusted-certificate warnings.
But this won't work in transparent mode, just explicit.

2. Use the DynamicSslCert branch code:
https://code.launchpad.net/~rousskov/squid/DynamicSslCert
Not available in 3.1, but in 3.2 (can Amos or Henrik confirm this?). However,
you still need the CA, and this could work in transparent mode.

LD


Re: [squid-users] transparent squid + clamav + https

2010-03-15 Thread Henrik K
On Mon, Mar 15, 2010 at 12:30:11PM +0100, Stefan Reible wrote:

 PS: I have a second problem with downloading big files: is it possible
 to send any info about the download progress to the web browser? Like
 opening an ajax script or something else.

If you don't want this limitation, you can use HAVP. It scans the file while
it's being transferred to the client, while keeping a small part of it buffered
(in case of a virus, that part is not transferred, so the client can't open an
incomplete file). It's as close to transparent as you can get.



[squid-users] using TOS values

2010-03-15 Thread Evelio Vila
hi all,

here is a situation in which I could use some help.
I have a postgresql table with user logins associated with TOS values,
e.g.: john -> 0x04, vila -> 0x08, etc.

I would like to use tcp_outgoing_tos option to mark http requests
accordingly.

For a while now I've been using something like 

acl users1 proxy_auth /etc/squid/qos/high_users
tcp_outgoing_tos 0x10 users1

without problems, but we want to use a database.

I found this,
http://osdir.com/ml/web.squid.general/2003-03/msg01273.html
and maybe with some sort of combination between http_access and
proxy_auth this can be achieved. any clues?

TIA,




-- 
evelio vila






Re: [squid-users] /etc/init.d/squid start failed

2010-03-15 Thread Hubert Choma
On 15-03-2010 at 14:42, Henrik Nordström wrote:
 On Mon 2010-03-15 at 14:00 +0100, Hubert Choma wrote:
 
  iptables -A PREROUTING -t nat -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 8080
  Is this a correct iptables rule?
 
 Is eth0 the interface where client traffic is arriving?
 
 If you are using wccp then the interface is usually a gre interface, not
 ethx..
 
 Regards
 Henrik

Hello Henrik Thanks for help !

I don't use WCCP; eth0 is the interface for the WAN. Yes, you are right: when 
eth0 is set, my site doesn't work from the WAN side or the LAN side. I 
changed it from eth0 to eth1:

iptables -A PREROUTING -t nat -i eth1 -p tcp --dport 80 -j REDIRECT --to-port 8080
but sites take very long to load!!

My topology:
UTM Router 192.168.1.1 --- eth0 192.168.1.2 (services for WAN:
apache, FTP) --- eth1 192.168.0.1 (squid runs on eth1, 192.168.0.1:8080 as
a transparent proxy) --- LAN XP clients 192.168.0.0/24
So I have 2 NATs:
first, the UTM router;
second, the Linux box (CentOS router with apache, FTP and squid services).
I would like to set up squid as a transparent proxy for LAN clients.

I have a few doubts :
1) squid uses the DNS servers from resolv.conf, so I have 3 nameservers:
nameserver 194.204.152.34 first dns from internet provider (DSL)
nameserver 194.204.159.1 second dns
nameserver 192.168.1.1 (UTM router)

My public IP 83.18.17.30 is assigned to the domain geodezja.wolomin.pl,
so what entries should I use in /etc/hosts? The CentOS machine hostname is 
proliant.
I have 2 NICs and 1 hostname=proliant, so I think maybe there is a problem 
with correct resolution of internal LAN names. Look at my /etc/hosts.

What should valid entries look like for the ProLiant machine with 2 
NICs?

# Do not remove the following line, or various programs
# that require network functionality will fail.
127.0.0.1   localhost.localdomain   localhost   proliant
192.168.1.2 proliant
192.168.0.1 proliant.geodezja.wolomin.pl proliant
#83.18.17.30geodezja.wolomin.pl proliant
192.168.0.2 sm2
192.168.0.3 sm3
192.168.0.4 sm4
192.168.0.6 sm19
192.168.0.8 sm9
::1 localhost6.localdomain6 localhost6

2) Maybe it's an iptables problem?
I still get an error with /etc/init.d/squid start [failed], and in the logs 
there are no warnings or errors!?

Sorry for my english
PLEASE HELP!




Re: [squid-users] Ignore requests from certain hosts in access_log

2010-03-15 Thread Baird, Josh
Ok, that sort of worked.  I have a pair of load balancers sitting in
front of my Squid proxy farm. The load balancers insert the
X-Forwarded-For header into each HTTP request which allows Squid to log
their connections using their real client source IP (extracted from
X-Forwarded-For).  In reality, the connections to the squid servers are
being made directly from the load balancers.

When I use log_access to deny logging to the load balancer's IP
addresses, -nothing- gets logged to access_log.  I am attempting to not
log the health HTTP checks from 10.26.100.130/10.26.100.131 but still
log the other traffic.  It doesn't seem that log_access is
X-Forwarded-For aware?  Any ideas?

acl loadbalancers src 10.26.100.130/255.255.255.255
acl loadbalancers src 10.26.100.131/255.255.255.255
log_access deny !loadbalancers

Thanks,

Josh

From: Baird, Josh jba...@follett.com
 I am trying to ignore requests from two IP addresses in my access_log.
 These two hosts connect every second and do health checks of the 
 proxy service, and I would like to eliminate the access_log spam that 
 they create.
 Here is what I am trying:
 acl loadbalancers src 172.26.100.136/255.255.255.255
 acl loadbalancers src 172.26.100.137/255.255.255.255
 access_log /var/log/squid/access.log squid !loadbalancers
 This does not seem to have any effect.  Requests from 172.26.100.136 and
 .137 are still appearing in the access_log.  Any ideas?

What about 'log_access' ?

JD


Re: [squid-users] transparent squid + clamav + https

2010-03-15 Thread Leonardo Carneiro - Veltrac

I have always read that transparent proxy + https was not possible.
Is it now? Is there a stable squid version with this feature? Are there 
any major drawbacks to using this feature?


Thanks in advance.


Henrik K wrote:

On Mon, Mar 15, 2010 at 12:30:11PM +0100, Stefan Reible wrote:
  
PS: I have a second problem with downloading big files: is it possible 
to send any info about the download progress to the web browser? Like 
opening an ajax script or something else.



If you don't want this limitation, you can use HAVP. It scans the file while
it's being transferred to the client, while keeping a small part of it buffered
(in case of a virus, that part is not transferred, so the client can't open an
incomplete file). It's as close to transparent as you can get.


  


[squid-users] Windows Live , with proxy auth

2010-03-15 Thread Augusto Casagrande
Hi,

I'm running Squid 3.0.STABLE19, using plain proxy auth (auth_param
basic program /usr/sbin/pam_auth).
Some time ago, I had troubles when trying to log in to Windows Live
Messenger. It began causing problems when I enabled proxy auth.
I could not make it work. My solution was installing Pidgin (http://pidgin.im/).

In Microsoft's page
http://support.microsoft.com/?scid=kb%3Ben-us%3B927847x=1y=8 , quote:
"Users behind some authenticating proxy servers may experience sign-in
issues. They might have to configure the proxy servers to force
authentication for the following User Agent string: MSN Explorer/9.0
(MSN 8.0; TmstmpExt). To do this, users should see the documentation
for their proxy server or contact their network administrator."

Is this the fix for this issue? How can i implement it?

Thanks!


Re: [squid-users] Problem with whitelisting

2010-03-15 Thread Amos Jeffries
On Mon, 15 Mar 2010 16:32:16 +0100, Frank Becker
computersac...@beckerwelt.de wrote:
 Hi all,
 
 I'm using squid on Debian Lenny as a porn filter. It works 
 fine. I now want to use a whitelist, because there are sites which are 
 banned by my filter list but are actually OK.
 
 So I created a whitelist ACL and of course allowed access to it. But it 
 doesn't work.
 
 Below is the relevant segment of my squid.conf. Please, can someone help
 me to whitelist some sites?
 
 Best regards and many thanks in advance
 
 Frank
 
 
 Here are my rules:
 acl our_networks src 192.168.100.0/24
 acl blacklist_domains dstdomain /etc/squid/blacklist_domains
 acl blacklist_regexp dstdom_regex -i /etc/squid/blacklist_regexp
 acl whitelist dstdomain /etc/squid/whitelist
 acl blacklistuser src 192.168.100.2-192.168.100.209 
 192.168.100.221-192.168.100.225
 acl manager proto cache_object
 
 acl admins proxy_auth /etc/squid/admins
 acl users proxy_auth REQUIRED
 http_access allow manager admins
 http_access deny manager
 http_access allow users

... authenticated users have unlimited access ...
 
 http_access deny manager
 http_access allow purge localhost
 http_access deny purge
 http_access deny !Safe_ports
 http_access deny CONNECT !SSL_ports
 http_access allow whitelist

... sites on the whitelist are accessible by anyone on the planet ...

 http_access deny blacklist_domains
 http_access deny blacklist_regexp

... certain domains are blocked ...

 http_access allow our_networks

... the LAN can get to anything not blocked above.
 http_access deny all
 
 
 The whitelist contains:
 *.openshotvideo.com
 *.sexnsurf.de


* is not valid in a domain name.

The dstdomain wildcard pattern is just this:

  .openshotvideo.com
  .sexnsurf.de
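If the intent is that the whitelist and blacklist should also apply to authenticated users, the `http_access allow users` line also has to come after the filtering rules, since http_access rules are evaluated top-down and the first match wins. A sketch of the reordered section, reusing the original ACL names:

```
http_access allow manager admins
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow whitelist
http_access deny blacklist_domains
http_access deny blacklist_regexp
http_access allow users
http_access deny all
```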


Amos


Re: [squid-users] transparent squid + clamav + https

2010-03-15 Thread Amos Jeffries
On Mon, 15 Mar 2010 14:50:54 -0300, Leonardo Carneiro - Veltrac
lscarne...@veltrac.com.br wrote:
 I have always read that transparent proxy + https was not possible.
 Is it now? Is there a stable squid version with this feature? Are there 
 any major drawbacks to using this feature?
 
 Thanks in advance.
 

Sadly, yes, it's now possible. No, there is not yet a stable version of
Squid that does it.

Yes, there are still some limits, thankfully:
 1) it is only useful for corporate environments which closely monitor
their own staff.
  1b) it has some use catching viruses etc., if that's what is monitored
for. It is a slippery-slope problem.
 2) it does not work for ISP setups.
 3) it requires a CA certificate on all client machines, which authorizes
the proxy's fake certificates.
 4) it does not work against any hidden-mole attacks (they are still invisible
and actually gain extra info about the network from the certificate
challenges).

Amos

 
 Henrik K wrote:
 On Mon, Mar 15, 2010 at 12:30:11PM +0100, Stefan Reible wrote:
   
 PS: I have a second problem with downloading big files: is it possible
 to send any info about the download progress to the web browser? Like 
 opening an ajax script or something else.
 

 If you don't want this limitation, you can use HAVP. It scans the file
 while it's being transferred to the client, while keeping a small part of
 it buffered (in case of a virus, that part is not transferred, so the
 client can't open an incomplete file). It's as close to transparent as
 you can get.





Re: [squid-users] Ignore requests from certain hosts in access_log

2010-03-15 Thread Amos Jeffries
On Mon, 15 Mar 2010 12:15:49 -0500, Baird, Josh jba...@follett.com
wrote:
 Ok, that sort of worked.  I have a pair of load balancers sitting in
 front of my Squid proxy farm. The load balancers insert the
 X-Forwarded-For header into each HTTP request which allows Squid to log
 their connections using their real client source IP (extracted from
 X-Forwarded-For).  In reality, the connections to the squid servers are
 being made directly from the load balancers.
 
 When I use log_access to deny logging to the load balancer's IP
 addresses, -nothing- gets logged to access_log.  I am attempting to not
 log the health HTTP checks from 10.26.100.130/10.26.100.131 but still
 log the other traffic.  It doesn't seem that log_access is
 X-Forwarded-For aware?  Any ideas?
 
 acl loadbalancers src 10.26.100.130/255.255.255.255
 acl loadbalancers src 10.26.100.131/255.255.255.255
 log_access deny !loadbalancers

Ah, you will require these as well:
 # to trust what the load balancers report for XFF
 follow_x_forwarded_for allow loadbalancers

 # to use the XFF details in the logs
 log_uses_indirect_client on

 # to use the XFF details in ACL tests
 # telling loadbalancer generated requests from relayed
 acl_uses_indirect_client on


Amos


[squid-users] Squid v2.6 error accessing site

2010-03-15 Thread Ivan .
Hi,

I am having some trouble accessing the site
http://www.efirstaid.com.au/. I can confirm the TCP SYN packet leaves our
edge router, but I don't see anything come back.

If I try to go direct without the squid it works fine.

1268696419.311 113830 10.xxx.xxx.xxx TCP_MISS/503 1444 GET
http://www.efirstaid.com.au/ - DIRECT/70.86.101.210 text/html [Host:
www.efirstaid.com.au\r\nUser-Agent: Mozilla/5.0 (Windows; U; Windows
NT 5.2; en-GB; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12\r\nAccept:
text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\nAccept-Language:
en-gb,en;q=0.5\r\nAccept-Encoding: gzip,deflate\r\nAccept-Charset:
ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\nKeep-Alive: 300\r\nProxy-Connection:
keep-alive\r\n] [HTTP/1.0 503 Service Unavailable\r\nServer:
squid\r\nDate: Mon, 15 Mar 2010 23:40:19 GMT\r\nContent-Type:
text/html\r\nContent-Length: 1066\r\nExpires: Mon, 15 Mar 2010
23:40:19 GMT\r\nX-Squid-Error: ERR_CONNECT_FAIL 111\r\n\r]

1268696516.331 114023 10.xxx.xxx.xxx  TCP_MISS/503 1444 GET
http://www.efirstaid.com.au/ - DIRECT/70.86.101.210 text/html [Host:
www.efirstaid.com.au\r\nUser-Agent: Mozilla/5.0 (Windows; U; Windows
NT 5.2; en-GB; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12\r\nAccept:
text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\nAccept-Language:
en-gb,en;q=0.5\r\nAccept-Encoding: gzip,deflate\r\nAccept-Charset:
ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\nKeep-Alive: 300\r\nProxy-Connection:
keep-alive\r\n] [HTTP/1.0 503 Service Unavailable\r\nServer:
squid\r\nDate: Mon, 15 Mar 2010 23:41:56 GMT\r\nContent-Type:
text/html\r\nContent-Length: 1066\r\nExpires: Mon, 15 Mar 2010
23:41:56 GMT\r\nX-Squid-Error: ERR_CONNECT_FAIL 111\r\n\r]

Trying a wget on the squid box gets the same connection error:

[r...@proxy squid]# wget http://www.efirstaid.com.au/
--2010-03-16 11:16:45--  http://www.efirstaid.com.au/
Resolving www.efirstaid.com.au... 70.86.101.210
Connecting to www.efirstaid.com.au|70.86.101.210|:80... failed:
Connection refused.


Thanks
Ivan


Re: [squid-users] Squid v2.6 error accessing site

2010-03-15 Thread Amos Jeffries
On Tue, 16 Mar 2010 11:12:44 +1100, Ivan . ivan...@gmail.com wrote:
 Hi,
 
 I am having some trouble accessing the site
 http://www.efirstaid.com.au/. I can confirm the TCP SYN packet leaves our
 edge router, but I don't see anything come back.

And what makes you think packets failing to return to your network is
caused by Squid?

Amos



Re: [squid-users] Questions about referer url cache

2010-03-15 Thread dave jones
On Sat, Mar 13, 2010 at 2:35 AM, Henrik Nordstrom wrote:
 On Fri 2010-03-12 at 23:36 +0800, dave jones wrote:
 My question is: I want to browse the index.html of foo.com offline,
 but there are many links like http://us.rd.foo.com/referurl/news/index/realtime/*
 in index.html. Would anyone tell me how to resolve such a referer URL to
 the correct one, like
 http://us.news.foo.com/article/url/d/a/100312/11/21ycr.html ?
 Thank you very much.

 See the url_rewrite_program option in squid.conf.

Thanks. I use url_rewrite_program /etc/squid/redirect_test.php,
but it seems my program doesn't work...

#!/usr/local/bin/php
<?php

$temp = array();

while ( $input = fgets(STDIN) ) {
  $temp = split(' ', $input);
  $url = split('[*]', $temp[0]);
  $output = $url . "\n";

  echo $output;
}

Would anyone tell me how to solve it? Thanks.
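For comparison, here is a minimal rewrite-helper sketch in Python that follows Squid's url_rewrite_program protocol (one request per line on stdin, one rewritten URL per line on stdout, unbuffered). The splitting rule is an assumption modelled on the script above — the real referurl encoding may differ:

```python
import sys

def rewrite(url):
    # Hypothetical rule: the redirector URLs in this thread embed the real
    # destination after a '*' separator, so return everything after it.
    # (Assumption -- adapt to however the real referurl links are encoded.)
    if '*' in url:
        return url.split('*', 1)[1]
    return url  # no match: hand the URL back unchanged

if __name__ == '__main__':
    # Squid writes one request per line ("URL client/fqdn ident method");
    # the helper must answer exactly one line per request, unbuffered.
    for line in sys.stdin:
        parts = line.split()
        if parts:
            sys.stdout.write(rewrite(parts[0]) + '\n')
            sys.stdout.flush()
```

Note the helper echoes the original URL when no rule matches, which is what Squid expects from a rewriter that declines to rewrite.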

Regards,
Dave.


Re: [squid-users] using TOS values

2010-03-15 Thread Amos Jeffries

Evelio Vila wrote:

hi all,

here is a situation in which I could use some help.
I have a postgresql table with user logins associated with TOS values
e.g: john -- 0x04, vila-- 0x08, etc.

I would like to use tcp_outgoing_tos option to mark http requests
accordingly.

For a while now I've been using something like 


acl users1 proxy_auth /etc/squid/qos/high_users
tcp_outgoing_tos 0x10 users1

without problems, but we want to use a database.

I found this,
http://osdir.com/ml/web.squid.general/2003-03/msg01273.html
and maybe with some combination of http_access and
proxy_auth this could be achieved. Any clues?



Most of the clues are in that earlier mail.
The other clue is this:
 http://www.squid-cache.org/Doc/config/external_acl_type/
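
Putting those clues together, a hedged squid.conf sketch (helper path, ACL names, cache TTL, and TOS values are all assumptions): an external_acl_type helper looks up the user's class in the PostgreSQL table, an http_access rule runs the slow lookup first so its result lands in the external-ACL cache, and the fast tcp_outgoing_tos check can then match the cached result.

```
# Hypothetical helper that queries PostgreSQL for the user's TOS class
# and answers OK/ERR; the script path is an assumption.
external_acl_type tos_lookup ttl=300 %LOGIN /usr/local/bin/tos_lookup.pl
acl high_users external tos_lookup high
acl low_users  external tos_lookup low

# http_access performs the (slow) lookup, caching the result...
http_access allow high_users
http_access allow low_users

# ...so the fast tcp_outgoing_tos check can match the cached entries.
tcp_outgoing_tos 0x10 high_users
tcp_outgoing_tos 0x08 low_users
```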


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18


Re: [squid-users] Windows Live , with proxy auth

2010-03-15 Thread Amos Jeffries

Augusto Casagrande wrote:

Hi,

I'm running Squid 3.0.STABLE19, using plain proxy auth (auth_param
basic program /usr/sbin/pam_auth).
Some time ago I had trouble logging in to Windows Live
Messenger. It began causing problems when I enabled proxy auth, and
I could not make it work. My workaround was installing Pidgin (http://pidgin.im/).

In Microsoft's page
http://support.microsoft.com/?scid=kb%3Ben-us%3B927847&x=1&y=8 ,
quote:
Users behind some authenticating proxy servers may experience sign-in
issues. They might have to configure the proxy servers to force
authentication for the following User Agent string: MSN Explorer/9.0
(MSN 8.0; TmstmpExt). To do this, users should see the documentation
for their proxy server or contact their network administrator.

Is this the fix for this issue? How can I implement it?


Depends on exactly what they mean by force. Having auth turned on is 
the problem; forcing it on will not fix that.


Pidgin is a good replacement for MSN. It lets users who want other 
non-MSN chat protocols use them without installing additional 
software.
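
For the record, the workaround usually suggested for clients that cannot complete the proxy-authentication handshake is to let that User-Agent through without auth. A sketch (the ACL name and rule ordering are assumptions; `browser` matches the User-Agent request header):

```
# Match the User-Agent string named in the Microsoft article and let it
# bypass authentication, since the client fails the auth handshake.
acl msn_agent browser -i ^MSN\ Explorer
http_access allow msn_agent

# Normal authenticated access for everyone else.
acl authed proxy_auth REQUIRED
http_access allow authed
http_access deny all
```

The trade-off is obvious: anything sending that User-Agent string gets unauthenticated access, so this trades security for Messenger connectivity.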


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE8 or 3.0.STABLE25
  Current Beta Squid 3.1.0.18