[squid-users] How to set up Squid reverse proxy for facebook?

2010-02-13 Thread fulan Peng
Hi, Squid gurus!

I am struggling to set up Squid for Facebook. At www.facebook.com,
when you click login it goes to https://login.facebook.com. After a
successful login, it goes back to www.facebook.com. I used Apache
mod_proxy_html to rewrite the URL, but it still won't work. Now I am
trying to use Squid to cache all subdomains of facebook over HTTPS.
Any hint or suggestion will be appreciated!

Fulan Peng


[squid-users] Squid reverse proxy, is there anything similar to mod_proxy_html in Apache?

2010-02-03 Thread fulan Peng
Hello guys!

I have made mod_proxy_html work with Apache 2.2
(http://apache.webthing.com/mod_proxy_html/), but many web sites won't
work because Apache mod_proxy cannot handle small errors in HTML
pages. For example, in a web page href="/onepage.html" is fine, but
href="onepage.html" won't work. Squid is better than Apache at
handling these small errors in web pages. I fooled around with Squirm
and Squid for many hours and never made it work in reverse mode. Is
Squirm the right tool for reverse-proxy URL rewriting? Is there any
tool similar to Apache's mod_proxy_html for reverse-proxy URL
rewriting?

I really cannot understand it.

Thanks a lot!

Fulan Peng


[squid-users] Absolute url links bypass Squid

2010-01-30 Thread fulan Peng
Hi, Squid-users!

I want to make a reverse proxy for a very badly behaved web site. On
this site, all content uses absolute URLs.
I can only get its home page. When I click any link on the home page,
it bypasses Squid and sends the browser straight to the backend web
site.

Say the web site is http://www.example.com. It puts
http://www.example.com in front of all of its pages, like
http://www.example.com/page1.html and http://www.example.com/page2.html.
If I were the web site administrator, I would delete the
http://www.example.com prefix and leave /page1.html and /page2.html.
That would fix the problem, right? Unfortunately, I am not.

How to fix this problem for Squid?

I failed to try with Apache mod_proxy and mod_proxy_html. And no luck
with Squid+Squirm.

Thanks a lot!

Fulan Peng


[squid-users] How to configure Squid to proxy a web site with external links to itself?

2010-01-25 Thread fulan Peng
Hi, gurus!

Some web sites use external (absolute) URLs to refer to internal
pages. For example, for a page anotherpage.html in the root directory,
/anotherpage.html would normally be fine, but the site uses
http://thiswebsite.com/anotherpage.html instead. The browser has no
problem with this, but Squid gets lost: it treats http://thiswebsite.com
as an external web site and gives up. How can we get Squid to work for
these web sites?
Thanks a lot!

Fulan Peng


[squid-users] How to make squid work for name based virtual web sites?

2009-12-19 Thread fulan Peng
We know that lots of servers host more than one web site. That means
one IP, multiple host names. The web server serves different
directories based on the domain name.
Now how can Squid proxy them?

If one IP hosts two web sites, a.www.company.com and b.www.company.com,
I want to set up Squid to proxy a.www.company.com as
https://squid.com:8000 and b.www.company.com as
https://squid.com:8001. How can I do it?

Thanks a lot!
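For what it's worth, one common layout for this is a sketch like the following (certificate paths and the backend address are invented placeholders, not taken from the thread): give each public name its own https_port with a defaultsite, then route by dstdomain to the matching origin peer.

```
https_port 8000 cert=/etc/squid/cert.pem key=/etc/squid/key.pem defaultsite=a.www.company.com
https_port 8001 cert=/etc/squid/cert.pem key=/etc/squid/key.pem defaultsite=b.www.company.com

cache_peer 10.0.0.5 parent 80 0 no-query originserver name=siteA
cache_peer 10.0.0.5 parent 80 0 no-query originserver name=siteB

acl siteA_dom dstdomain a.www.company.com
acl siteB_dom dstdomain b.www.company.com
cache_peer_access siteA allow siteA_dom
cache_peer_access siteB allow siteB_dom
http_access allow siteA_dom
http_access allow siteB_dom
http_access deny all
```

The backend web server then distinguishes the two sites by the Host header it receives, as in any name-based virtual hosting setup.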


[squid-users] Squid proxy server --SSH TUNNEL -- Squid SSL reverse proxy server: How TO?

2009-08-16 Thread fulan Peng
Hi, Squid users!

I know how to set up a Squid proxy server and an SSH tunnel for another
computer which has access to the Squid server. After that, I have to
put my browser in proxy mode. I do not want this; I do not want to
reconfigure my browser. I want to access specific sites as normal, say,
https://localhost:4443 to one site and https://localhost: to
another site. Can you help me do that? Or can Squid just not do it?

You may ask: why not make one Squid SSL reverse proxy server talk to
another Squid SSL reverse proxy server? As far as I know, Squid cannot
do this. If you know how to do this, please let me know; I want to get
rid of SSH if I can.

I have set up a j2eb proxy server to proxy a Squid SSL reverse proxy,
but it is extremely slow. When I switch to the Tomcat native library to
speed up SSL encryption, it breaks, because it is hard-coded for the
Tomcat keystore approach; when changed to the cert.pem and key.pem
approach, it is broken.

Thanks a lot!


[squid-users] squid 3.1: How to setup a Squid SSL reverse proxy for a parent SSL Squid proxy?

2009-08-10 Thread fulan Peng
Hi,

I have a Squid reverse proxy running with SSL support. People can
access it at https://domainA.com with no problem.
Now I want to set up another Squid proxy server to proxy it, also with
SSL support.
That means https://domainA -- https://domainB.

My configuration file for the parent is similar to the one below.
Please help me set up the child Squid to proxy this parent.

https_port 443 cert=/usr/newrprgate/CertAuth/testcert.cert
key=/usr/newrprgate/CertAuth/testkey.pem
defaultsite=mywebsite.mydomain.com vhost

cache_peer 10.112.62.20 parent 80 0 no-query originserver login=PASS
name=websiteA

acl sites_server_1 dstdomain websiteA.mydomain.com
cache_peer_access websiteA allow sites_server_1
http_access allow sites_server_1

http_access deny all
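A child configuration along the same lines might look like this. This is only a sketch under assumptions (the child terminates its own SSL with its own certificate, and the ssl option on cache_peer encrypts the hop to the parent); all paths and host names are placeholders:

```
https_port 443 cert=/path/to/childcert.pem key=/path/to/childkey.pem defaultsite=domainB.com vhost

cache_peer domainA.com parent 443 0 no-query originserver ssl sslflags=DONT_VERIFY_PEER name=parentA

acl child_site dstdomain domainB.com
cache_peer_access parentA allow child_site
http_access allow child_site
http_access deny all
```

sslflags=DONT_VERIFY_PEER is only for testing against the parent's self-signed certificate; it should be dropped once the parent has a verifiable one.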


[squid-users] How to proxy a parent Squid SSL proxy?

2009-07-03 Thread fulan Peng
Hi, Everybody!

I have a Squid SSL proxy server (3.0.16 stable) running, listening on 8443.
Now I want to set up another Squid to proxy it again, with SSL between
them.
The second Squid serves http requests to browsers.

When I use my browser to connect to the second Squid, it won't work.

I made self-signed certificates for the first Squid.

Could you please help me with the squid.conf below:


cache_peer proxy.website.com parent 8443 0 no-query originserver name=b2

sslproxy_flags DONT_VERIFY_PEER


hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
cache_mem 128 MB
cache_dir diskd /usr/local/squid/cache 2 64 256
debug_options ALL,1

refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320

acl proxy.website.com dstdomain proxy.website.com


acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8

acl my_ports port   8080

acl Safe_ports port 3128# http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT

http_access allow  proxy.website.com


http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !my_ports
http_access allow all
http_access deny all
http_reply_access allow all

cache_peer_access b2  allow proxy.website.com

visible_hostname second.website.com
.


I deleted some options which are not the trouble.


Re: [squid-users] How to make Squid to cache JSP redirected pages?

2009-03-15 Thread fulan Peng
Thank you, Amos!

For security reasons, Tomcat does not like to run as root.
I was using ipfw to forward port 443 to Tomcat's 8443.
On my Squid machine, I set up Squid to listen on 8443.
When the index.jsp page runs, it redirects the browser to
https://tomcat:443/the redirected page, which is really the
https://tomcat:8443/redirected page.

If we could cache just a part of a web site, such as
http://host/one-context-of-tomcat, then we could solve the problem,
because inside that context I know there is no JSP page that would
redirect us again.
I remember Apache can do this. But Apache as a reverse proxy is terrible.

Thanks!
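On the partial-caching idea: squid 2.6/3.0 can restrict what gets cached with the cache directive and an acl, roughly as below (the context name is just the poster's example; this only limits caching, it does not fix the redirect port):

```
acl one_context urlpath_regex ^/one-context-of-tomcat/
cache allow one_context
cache deny all
```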


On Sun, Mar 15, 2009 at 4:36 AM, Amos Jeffries squi...@treenet.co.nz wrote:
 fulan Peng wrote:

  Hi,
 When I want to cache a Tomcat site whose index.jsp has a command
 to redirect the browser to another page, Squid gets lost: the
 browser shows the redirected page but the port number is wrong. Is
 there any way to handle this situation? I am using Squid 3.0 and
 setting up a reverse proxy.
 Thanks!

 The tomcat JSP application does not sound proxy-aware. It's giving out
 its internal ip/fqdn:port info rather than the public details it should.

 Best fix is to correct the JSP app to not care about its operating port.

 Hack fix #1, is to get tomcat listening on an internal IP port 80, so the
 port does not get sent by the app.

 Hack fix #2, is to get Squid to listen on the public IP same ports as tomcat
 is sending out. So as to catch back into sequence the visitors who get
 redirected wrong.

 Amos
 --
 Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE13
  Current Beta Squid 3.1.0.6



[squid-users] How to make Squid to cache JSP redirected pages?

2009-03-14 Thread fulan Peng
 Hi,
When I want to cache a Tomcat site whose index.jsp has a command
to redirect the browser to another page, Squid gets lost: the
browser shows the redirected page but the port number is wrong. Is
there any way to handle this situation? I am using Squid 3.0 and
setting up a reverse proxy.
Thanks!


[squid-users] Question on redirector and Squid accelerator mode

2007-08-04 Thread fulan Peng
Hi,
I have set up Squid 3.0 PRE6 working in accelerator mode with SSL, and
I set up Squirm and tested that it works. But when I click a link on
the web page, it goes to the original address instead of my rewritten
address.

This is what I want: I have set up Squid to accelerate a web site,
say yahoo.com, under another domain name, https://proxy.mydomain.com.
In a web page there is a link to google.com; when the browser follows
this link, it goes to http://www.google.com. What I want is for it to
go to another proxy site of mine, say https://proxy.mydomain.com:.
I have set up Squirm with the pattern
regex ^http://www\.google\.com ^https://proxy.mydomain.com:
in the squirm.patterns file.
When I run squirm as root, it works: when I feed it
http://www.google.com, it feeds back
https://proxy.mydomain.com:
so I thought Squirm was working.
I went to squid.conf and added:
url_rewrite_program /usr/local/squirm/bin/squirm
url_rewrite_children 10
Now when I browse https://proxy.mydomain.com, it shows yahoo.com, but
when I click a link to http://www.google.com, it goes to
http://www.google.com instead of https://proxy.mydomain.com:.
I am thinking that when the browser follows the link, it goes straight
off to http://www.google.com and never talks to Squirm. What we have
to do is physically change the web page content, replacing each
http://www.google.com with https://proxy.mydomain.com:. Then, when the
browser clicks, it will go to https://proxy.mydomain.com:. But how can
I do this? Is this what squid url_rewrite is meant to do?
Please help me out. Should I use the Apache rewrite engine? Should I
use wget and physically replace the web content?
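What a redirector like Squirm does can be sketched as a tiny url_rewrite_program. This is an illustrative stand-in, not Squirm itself; the helper protocol assumed here is the classic one (Squid writes one request per line with the URL first, and reads back the URL to use), and proxy.mydomain.com is the poster's placeholder name:

```shell
#!/bin/sh
# Minimal url_rewrite_program sketch (assumption: classic helper protocol,
# where each stdin line looks like "URL client_ip/fqdn user method" and the
# helper prints the URL Squid should use instead).

rewrite_one() {
  case "$1" in
    http://www.google.com|http://www.google.com/*)
      # send the browser to the second proxy site instead
      echo "https://proxy.mydomain.com${1#http://www.google.com}" ;;
    *)
      echo "$1" ;;   # anything else passes through unchanged
  esac
}

# main loop: one request in, one reply out
while read url rest; do
  rewrite_one "$url"
done
```

Note that, as the poster observed, a URL rewriter only sees requests that actually reach Squid; it cannot change links embedded in the HTML bodies the origin serves.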


[squid-users] Squid 3 PRE6 SSL client authentication

2007-07-31 Thread fulan Peng
Hi,
I have made Squid 3 PRE6 work over SSL without client authentication,
and it works fine. Then I combined the cert file and key file into a
PKCS12 file, imported it into the browser, and added the CA file to
squid.conf as clientca=<the CA file used to sign the cert file>. When
I open my Squid site with MS IE, the Squid error log says the browser
won't return a certificate. With Firefox, it says there is an error. I
already have an SSL cert from InstantSSL; there were no complaints
from either IE or Firefox without client authentication.
The problem is that we do not have a procedure for setting up client
authentication for Squid. Please help!


Re: [squid-users] How can I become a

2007-07-24 Thread fulan Peng
You can start and stop Squid as user squid on port 8080 or 8443, then
use a firewall to forward all requests from 80 or 443 to 8080 or 8443.
This way you are safe, because you do not have to run Squid as root.

You can run Squid as root, but what if some hacker gets into your
system? If you run Squid as squid and you are root, do not forget to
switch to the squid user before starting Squid:

su squid
/usr/local/squid/sbin/squid

On 7/24/07, Mohan Jayaweera [EMAIL PROTECTED] wrote:
 Hi all,
 Still newbie questions,
 How can I become a privileged user to start/stop squid?
 some more details and links to read also needed.
 Thank you
 Mohan



[squid-users] How to get a SSL certificate for Squid 3.0

2007-07-22 Thread fulan Peng
Hi,

I am using a self-signed certificate to make SSL connections with
browsers. The browser shows a warning dialog complaining that my CA is
not trusted by the browser. I was thinking of authenticating the
browser to get rid of this warning dialog, but I failed. Now I am
thinking of using NCSA to authorize the client and having my
certificate certified, so customers can get rid of the security
warning dialog.

I checked many SSL certificate providers. They all deal with Apache or
other applications; no one mentions Squid. I am just wondering how to
make a certificate request so the company mails me back a good
certificate and the browser won't complain any more.

I created a PKCS12 file and imported it into the browser, but the
browser still complains. Please help!

Thanks a lot!


Re: [squid-users] How to get a SSL certificate for Squid 3.0

2007-07-22 Thread fulan Peng
Yes, it seems to work. I went to InstantSSL, somewhere called Comodo.
They returned me 5 files. I only used mydomain_com.crt to replace the
cert= file and kept everything else the same as before. Now the
browser has no complaints about https.

I will try client authentication now. I guess that because of the CA
problem, the browser may not return its certificate to Squid, failing
the SSL browser authentication.


On 7/22/07, Henrik Nordstrom [EMAIL PROTECTED] wrote:
 On Sun, 2007-07-22 at 20:04 -0400, fulan Peng wrote:

  I checked many SSL certificate providers. They all deal with Apache or
  other application. No one says about Squid. I am just wondering how to
  make a certificate request to get the company mail me back a good
  certificate so that the browser won't complain any more.

 Just follow whatever guides you find for Apache. It will work for Squid
 as well as both uses OpenSSL.

 Regards
 Henrik
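The Apache-style procedure Henrik mentions boils down to generating a key and a certificate signing request (CSR) with OpenSSL and sending the CSR to the vendor. A sketch follows; the subject fields and file names are illustrative placeholders, not values from this thread:

```shell
# Generate a private key and a CSR to submit to a commercial CA.
# All names below are placeholders.
openssl req -new -newkey rsa:2048 -nodes \
  -keyout squid.key -out squid.csr \
  -subj "/C=US/ST=SomeState/L=SomeCity/O=Example Inc/CN=www.example.com"

# Inspect the request before sending it off:
openssl req -in squid.csr -noout -subject
```

The signed certificate that comes back then goes into the existing https_port line, e.g. cert=cert.pem key=squid.key.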




[squid-users] Squid 3.0 SSL client authentication

2007-07-18 Thread fulan Peng

Hi,

I have made Squid 3.0 SSL work without client authentication. Now I
want to authenticate the client: I want the client to install the
certificate I send him and import it into his browser, so that
browsers without this certificate can never reach my server.
Right now, any browser gets a warning dialog box, and if the user hits
OK, my server lets him in.

On the server I created 3 files: a CA file, a cert file, and a key
file. In squid.conf I added cert=<location of cert file> and
key=<location of key file>.

I do not think the server CA file is the cafile the client wants,
because right now he can click the OK button and get in without the CA
file.

My guess is that I have to create a client cert file, sign it, give it
to the client, and add clientca=<that file>.

Someone please help me work this out. The following is the script that
sets up one-way certification (server authentication only). Please
help me add a couple of lines to create the client certs and to change
squid.conf.

#!/usr/local/bin/bash
MATRIX=0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz
LENGTH=$RANDOM
let "LENGTH /= 2000"
let "LENGTH += 2"
while [ ${n:=1} -le $LENGTH ]
do
   NAME=$NAME${MATRIX:$(($RANDOM%${#MATRIX})):1}
   let n+=1
done
echo $NAME
n=1
LENGTH=$RANDOM
let "LENGTH /= 2000"
let "LENGTH += 2"
while [ ${n:=1} -le $LENGTH ]
do
   COMPANY=$COMPANY${MATRIX:$(($RANDOM%${#MATRIX})):1}
   let n+=1
done
echo $COMPANY
n=1
LENGTH=$RANDOM
let "LENGTH /= 2000"
let "LENGTH += 20"
while [ ${n:=1} -le $LENGTH ]
do
   PASSWORD=$PASSWORD${MATRIX:$(($RANDOM%${#MATRIX})):1}
   let n+=1
done

echo $PASSWORD
su squid -c "/usr/local/squid/sbin/squid -k shutdown"
cd /usr/local/squid/etc
rm -f /usr/local/squid/etc/cert.pem
rm -f /usr/local/squid/etc/key.pem
rm -f /usr/local/squid/etc/demoCA/private/cacert.pem
/usr/bin/openssl req -new -x509 -keyout /usr/local/squid/etc/demoCA/private/cakey.pem -out /usr/local/squid/etc/demoCA/cacert.pem -days 365 -subj "/C=US/ST=$ST/L=$L/OU=$OU/O=$O/CN=$CN/emailAddress=[EMAIL PROTECTED]" -passout pass:$PASSWORD
/usr/bin/openssl req -new -keyout key.pem -out req.pem -days 365 -subj "/C=US/ST=$ST/L=$L/OU=$OU/O=$O/CN=$CN/[EMAIL PROTECTED]" -passout pass:$PASSWORD
cd /usr/local/squid/etc
cp key.pem key.pem.old
/usr/bin/openssl rsa -in key.pem.old -out key.pem -passin pass:$PASSWORD
/usr/bin/openssl ca -in /usr/local/squid/etc/req.pem -out /usr/local/squid/etc/cert.pem -passin pass:$PASSWORD -batch
chown -R squid:users *
chmod 400 *.pem
chmod 400 demoCA/private/*.pem
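One possible way to add the missing client side, sketched with plain OpenSSL commands. This is an untested outline rather than a known-good recipe: all file names, subjects, and the pass phrase are made-up placeholders, and the throwaway CA here stands in for whatever CA the clientca= option will point at.

```shell
# 1. client key + certificate request (placeholder subject, unencrypted key)
openssl req -new -nodes -keyout client.key -out client.csr -subj "/CN=client1"

# 2. a throwaway self-signed CA for the illustration, then sign the client
#    request with it (in a real setup, reuse the CA from squid.conf's clientca=)
openssl req -new -x509 -days 365 -nodes -keyout ca.key -out ca.crt -subj "/CN=MySquidCA"
openssl x509 -req -in client.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -out client.crt -days 365

# 3. bundle cert + key into a PKCS#12 file for import into the browser
openssl pkcs12 -export -in client.crt -inkey client.key \
  -certfile ca.crt -out client.p12 -passout pass:changeme
```

In squid.conf the matching change would be adding clientca=ca.crt to the https_port line, so Squid requests and verifies a client certificate during the SSL handshake.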


[squid-users] How to configure Squid with SSL for a virtual hosting server?

2007-07-16 Thread fulan Peng

Hi, All!

I have configured Squid 3.0 PRE4 as an accelerator to reverse-proxy a
web site, say www.google.com, under a registered domain name, say
www.google-proxy.com.
I run squid as a non-root user on port 8443 and use a firewall to
redirect 443 requests to 8443, so browsers can use
https://www.google-proxy.com to browse www.google.com. Everything is
fine so far.

Now I want to register another domain name, say yahoo-proxy.com. I
want users to type https://www.yahoo-proxy.com to reach www.yahoo.com,
and I want to use only one computer.

Now www.yahoo-proxy.com and www.google-proxy.com share the same IP
address. How does Squid figure out which is www.google.com and which
is www.yahoo.com?

I do not want to use a different port number, because my customers do
not like typing a colon, and I do not want to put an Apache in front
of Squid.

Someone please help!


[squid-users] 1000GB traffic 100MB site is free for you to install the Squid proxy for your web site.

2006-11-10 Thread fulan Peng

If you have a home computer that is on all the time, you can install a
Squid proxy for my web site. This way, your customers will access your
web site at 100MB speed, and my customers will access my web site via
a dynamic IP address. I will also offer you a Jabber instant messaging
and chat server under your domain name. See https://breakevilaxis.org.
I am working on a SIP server and minisip. When I finish, your
customers will have free international phone calls!


Re: [squid-users] Fwd: Reverse Proxy for HTTPS

2006-09-26 Thread fulan Peng

It does not make sense to encrypt the web pages twice. Why not cache a
regular page on the http port and then send it out over SSL? It seems
to me it won't work with double encryption. You can try it.
If you use Windows, 2.6S3 works. Any revision will work on Unix.

The following script makes a certificate for Squid on Windows,
followed by an example squid.conf (SSL enabled). I have a binary for
Windows XP, which I compiled with Cygwin. If you want to install on
Unix, it is very easy:

./configure --with-openssl=.../openssl/include; make; make install
cd ../squid/var; make cache
cd ../squid/sbin; squid -z; squid

Or do a squid -k parse to test the configuration file before you run.

c:\openssl\bin\openssl.exe req -new -x509 -keyout c:\squid\etc\demoCA\private\cakey.pem -out c:\squid\etc\demoCA\cacert.pem -days 365 -subj "/C=JP/ST=H2iDsZPErqitxps9V86g/L=X8KGZ3iBX5G/OU=wPAV4SQ9ZC8OaSb4S/O=s4R0TH/CN=eO1fsP9t/[EMAIL PROTECTED]" -passout pass:z4xZcLW2c4Nty
c:\openssl\bin\openssl.exe req -new -keyout key.pem -out req.pem -days 365 -subj "/C=JP/ST=H2iDsZPErqitxps9V86g/L=X8KGZ3iBX5G/OU=wPAV4SQ9ZC8OaSb4S/O=s4R0TH/CN=eO1fsP9t/[EMAIL PROTECTED]" -passout pass:z4xZcLW2c4Nty
copy key.pem key.pem.old
c:\openssl\bin\openssl.exe rsa -in key.pem.old -out key.pem -passin pass:z4xZcLW2c4Nty
c:\openssl\bin\openssl.exe ca -in c:\squid\etc\req.pem -out c:\squid\etc\cert.pem -passin pass:z4xZcLW2c4Nty -batch
c:\squid\sbin\squid.exe

squid.conf
http_port 127.0.0.1:80  defaultsite=ddint.org
https_port 443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=breakevilaxis.org
cache_peer breakevilaxis.org parent 8800  0 originserver name=futurechinaforum
cache_peer ddint.org parent 80  0 originserver name=ddint
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl breakevilaxis.org dstdomain breakevilaxis.org
acl ddint.org dstdomain ddint.org
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563     # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
http_access allow ddint.org
http_access allow breakevilaxis.org
http_access allow localhost
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_reply_access allow all
icp_access allow all
cache_peer_access futurechinaforum   allow breakevilaxis.org
cache_peer_access ddint  allow ddint.org
visible_hostname ddint.org
coredump_dir c:/squid/var/cache


On 9/26/06, Arief Kurniawan [EMAIL PROTECTED] wrote:

I'd like to accelerate our backend HTTPS server; the SSL cert is held
by the backend server (IP 192.168.1.1).
In squid.conf:

http_port 443 vhost
cache_peer 192.168.1.1 parent 443 0 originserver name=myapps
http_access allow all

The questions are:
- Will the squid.conf above be able to redirect requests from clients
  to https://192.168.1.1?
- Or should squid be configured with https_port and obtain another SSL cert?
- Which is better for this purpose, Squid 3 or Squid 2.6? Any pointers?

Regards,

Arief K



Re: [squid-users] File Descriptor

2006-09-24 Thread fulan Peng

I was struggling with this problem a couple of months ago, and I am
not sure how I got through it. Are you working on Windows? If so, you
had better move off MSYS and use Cygwin instead. I have one compiled
and running on XP. I never had this problem on Linux or FreeBSD. On
Windows it is 2.6S3; any revision will work on Linux and FreeBSD.

On 9/24/06, nonama [EMAIL PROTECTED] wrote:

Hi,
Need some help from the group. We are currently on SQUID 2.5-12.
Does 'file desc currently in use' mean the number of connections
to/from squid? We happen to experience slowness in accessing the net
when the file desc number grows. Does limiting the file desc help to
improve performance? How do I do that? Which log should I check to see
whether there is a problem? Please help. Thank you so much.

File descriptor usage for squid:
   Maximum number of file descriptors:   1024
   Largest file desc currently in use:    300
   Number of file desc currently in use:  248
   Files queued for open:                   0
   Available number of file descriptors:  776
   Reserved number of file descriptors:   100
   Store Disk files open:                   2


__
Do You Yahoo!?
Tired of spam?  Yahoo! Mail has the best spam protection around
http://mail.yahoo.com
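The ceiling in that report ("Maximum number of file descriptors: 1024") normally comes from the per-process limit of the shell that started Squid (Squid 2.x also bakes in a compile-time maximum at ./configure time). A minimal way to inspect it, assuming a POSIX shell; the 4096 figure is only an example:

```shell
ulimit -Sn   # soft limit: what squid actually inherits when started from here
ulimit -Hn   # hard limit: the most the soft limit can be raised to
# To raise it before starting squid (may need root or limits.conf changes):
# ulimit -n 4096
```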



Re: [squid-users] Reverse proxy HTTPS port on 8443

2006-09-19 Thread fulan Peng

I will show you a workable configuration file for 2.6 S3. You can
replace the relevant parts.

http_port 127.0.0.1:80  defaultsite=ddint.org
https_port 443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=zyzg.org.ru
https_port 9001 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=192.168.0.1
https_port 9003 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=www.peacehall.com
cache_peer www.peacehall.com parent 80  0 originserver name=peacehall

cache_peer 192.168.0.1 parent 5225  0 originserver name=futurechinaforum
cache_peer zyzg.org.ru parent 80  0 originserver name=zyzg
cache_peer ddint.org parent 80  0 originserver name=ddint
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl www.peacehall.com dstdomain www.peacehall.com
acl 192.168.0.1 dstdomain 192.168.0.1
acl zyzg.org.ru dstdomain zyzg.org.ru
acl ddint.org dstdomain ddint.org
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563     # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
http_access allow zyzg.org.ru
http_access allow www.peacehall.com
http_access allow ddint.org
#http_access allow www.dajiyuan.com
http_access allow 192.168.0.1
http_access allow localhost
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_reply_access allow all
icp_access allow all
cache_peer_access zyzg   allow zyzg.org.ru
cache_peer_access peacehall  allow www.peacehall.com
cache_peer_access futurechinaforum   allow 192.168.0.1
#cache_peer_access dajiyuan  allow www.dajiyuan.com
cache_peer_access ddint  allow ddint.org
visible_hostname ddint.org
coredump_dir c:/squid/var/cache


On 9/19/06, Mohamed Navas V [EMAIL PROTECTED] wrote:

hi,

We have a setup with one reverse proxy for multiple backend web
servers. All these servers are for HTTP traffic only, with accel port
80.

But one addition to the existing setup is proposed, as follows:

      request on port 8080          request on port 8080
user -----------------------> R.Proxy ---------------------> Web Server

        reply on port 8443            reply on port 8443
user <----------------------- R.Proxy <--------------------- Web Server

i.e. the user will request http://example.com:8080/abc but wants to
get an HTTPS reply as https://example.com:8443/abc.

We are using squid 2.5; all other servers except this one listen on
ports 80 and 443 only.

What changes should be made in the config file for this?

Br--
Navas



[squid-users] Re: Squid server

2006-09-06 Thread fulan Peng

I have got Squid working on FreeBSD, Linux and Windows. I have also
set up an ejabberd server and a SIP server on FreeBSD, Windows and
Linux. Now I need to make them all work together.
My site is at http://breakevilaxis.org

Squid for web proxy.
Jabber for IM proxy.
SIP server for SIP phone proxy.

On 9/6/06, Jakob Curdes [EMAIL PROTECTED] wrote:

StillFree wrote:

Hi,

This is for my friend's NetCafe. He wants simultaneous use by 75-100
people, including Yahoo chat, Skype, net telephony, etc. The question
is how much memory is required for such use, and what configuration
for the server. Can anybody give us a detailed system configuration
for this?

Regards,
Stillfree.







Re: [squid-users] Errors during make

2006-09-01 Thread fulan Peng

Probably you did not upgrade your libxml2.
If you configure with SSL, you need the newest OpenSSL.
I have set up Squid on Linux, FreeBSD and Windows, all in accelerator
mode with SSL. I can help you in my chat server at
http://breakevilaxis.org if you want to do the same mode.
I have scripts to create all the SSL certificates and files. If you
are not interested in accelerator mode with SSL, then I cannot help
you; I do not have time to learn the other modes.

Fulan Peng.


On 9/1/06, Robert Shatford [EMAIL PROTECTED] wrote:

Hey guys, this is my first post on this forum, so I would like to say
hello.

I am having trouble when I run make. It throws a bunch of errors that
I don't understand. I was wondering if it outputs those errors to a
log somewhere, and how I can find out what these errors mean. I am
very new to Linux and Squid; this is my third attempt at building a
server and it is the furthest I have gotten. Any help would be
wonderful.

Thank you for your time.
Bob Shatford
Asst. Network Administrator
Keller ISD




[squid-users] Pre-Configured Squid 2.6S3 accelerate mode with SSL on Windows.

2006-08-31 Thread fulan Peng

Hi,
I have configured Squid 2.6 STABLE3 NT in accelerator mode with SSL on
Windows. If you want to do the same thing, you can download my package
and replace a few site names, and it will be yours. All you have to do
is replace the backend website name in the /squid/etc/squid.conf file.
After you unzip the package at the C drive root directory, go to the
/squid/etc directory and run the command "go". If you want to stop
Squid, run the command "stop". The package can be downloaded at
http://breakevilaxis.org/squid-usa.zip
There is a program called ddint.exe. It produces crazy random strings
to put in the certificates, to make your certificates hard to
identify. If your web site has words like democracy and human rights,
the Communist China government will block your web site by filtering
your certificates and sending a RESET packet to you and your clients.
So I create a new certificate every time I start Squid.

I put the source code for ddint.exe here in case you worry that it is
a virus. You can compile it with C++.

At the end is a workable squid.conf file. It took me several weeks to
get this file right.

#include "stdafx.h"
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <iostream>
using namespace System;
using namespace std;
void r16string(int);
static char c[64]={'1','q','a','z','2','w','s','x','3','e','d','c','4','r','f','v','5','t','g','b','6','y','h','n','7','u','j','m','8','i','k','T','9','S','o','l','p','0','P','O','I','U','Y','T','R','E','W','Q','A','S','D','F','G','H','J','K','L','M','N','B','V','C','X','Z'};
static char s[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};
static char CN[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};
static char L[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};
static char O[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};
static char OU[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};
static char ST[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};
static char PASSWORD[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};
static char emailname[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};
static char emailcompany[64]={'1','2','3','4','5','6','7','8','9','0','1','2','3','4','5','6','7','8','9','0','1','2','3','4'};

static char* d[16]={"US","CA","DE","CN","KR","BR","FR","JP","IT","BG","TW","UK","RU","CZ","CH","AU"};
int main(void)
{
    int rand64;
    int l=5;
    char * C;
    srand((unsigned) time(NULL));
    int rand16 = rand();
    rand16 = rand();
    rand16 &= 0x000F;

    C = d[rand16];

    cout << "openssl req -new -x509 -keyout c:\\squid\\etc\\demoCA\\private\\cakey.pem -out c:\\squid\\etc\\demoCA\\cacert.pem -days 365 -subj /C=";
    cout << C << "/ST=";
    rand16 = rand();
    rand16 &= 0x000F;
    rand16 += l;
    for (int i=0; i < rand16; i++) {
        rand64 = rand();
        rand64 &= 0x003F;
        ST[i] = c[rand64];
    }
    ST[rand16]=0;

    cout << ST << "/L=";
    rand16 = rand();
    rand16 &= 0x000F;
    rand16 += l;
    for (int i=0; i < rand16; i++) {
        rand64 = rand();
        rand64 &= 0x003F;
        L[i] = c[rand64];
    }
    L[rand16]=0;
    cout << L << "/OU=";
    rand16 = rand();
    rand16 &= 0x000F;
    rand16 += l;
    for (int i=0; i < rand16; i++) {
        rand64 = rand();
        rand64 &= 0x003F;
        OU[i] = c[rand64];
    }
    OU[rand16]=0;
    cout << OU << "/O=";
    rand16 = rand();
    rand16 &= 0x000F;
    rand16 += l;
    for (int i=0; i < rand16; i++) {
        rand64 = rand();
        rand64 &= 0x003F;
        O[i] = c[rand64];
    }
    O[rand16]=0;
    cout << O << "/CN=";
    rand16 = rand();
    rand16 &= 0x000F;
    rand16 += l;
    for (int i=0; i < rand16; i++) {
        rand64 = rand();
        rand64 &= 0x003F;
        CN[i] = c[rand64];
    }
    CN[rand16]=0;
    cout << CN << "/emailAddress=";
    rand16 = rand();
    rand16 &= 0x000F;
    rand16 += l;
    for (int i=0; i < rand16; i++) {
        rand64 = rand();
        rand64 &= 0x003F;
        emailname[i] = c[rand64];
    }
    emailname[rand16]=0;
    cout << emailname << "@";
    rand16 = rand();
    rand16 &= 0x000F;
    rand16

[squid-users] Invalid Request.Squid2.6S3 accelerate with SSL on Windows

2006-08-30 Thread fulan Peng

Hi,
I used Cygwin to compile every revision of Squid 2.6 and 3.0. None of
them works with an SSL accelerator setup. In 3.0, the error is the known
bug: too many open files.
In 2.6S3, the error for https is invalid request. Http works fine.

The following are my squid.conf file and some section of the log file.
Please help me to fix this problem. I need SSL in accelerate mode for
Windows badly.

I installed Squid in a local network. The IP is 192.168.0.91 and the
gateway is 192.168.0.1. I can see a dialog box pop up when I run
squid, warning that DNS will run; the DNS IP is provided by my ISP.
Since http works, I think there should not be any problem with the
firewall. I also tried changing the external backend to a localhost web
server; still, it won't work.

http_port 127.0.0.1:80 vhost vport
cache_peer zyzg.org.ru parent 80 0 originserver
https_port 443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
cache_peer 66.29.75.20 parent 80  0 originserver
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
maximum_object_size_in_memory 80 KB
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:       1440    20%     10080
refresh_pattern ^gopher:    1440    0%      1440
refresh_pattern .           0       20%     4320
collapsed_forwarding on
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  
acl Safe_ports port 21  
acl Safe_ports port 443 563 
acl Safe_ports port 70  
acl Safe_ports port 210 
acl Safe_ports port 1025-65535  
acl Safe_ports port 280 
acl Safe_ports port 488 
acl Safe_ports port 591 
acl Safe_ports port 777 
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access allow all
http_reply_access allow all
icp_access allow all
visible_hostname ddint.org
coredump_dir c:/squid/var/cache
request_entities on
vary_ignore_expire on

=

parseHttpRequest: Client HTTP version 1.1.
2006/08/30 11:30:06| parseHttpRequest: Method is 'GET'
2006/08/30 11:30:06| parseHttpRequest: URI is '/'
2006/08/30 11:30:06| parseHttpRequest: req_hdr = {Accept: */*

Accept-Encoding: gzip, deflate

Cookie: sid=xyyqOo; _cookietime=2592000; _discuz_uid=33777;
_discuz_pw=0e63380c64dc6ef96cca941443b47c9d; _discuz_secques=0

User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1;
Media Center PC 3.0; .NET CLR 1.0.3705; .NET CLR 2.0.50727)

Host: localhost

Connection: Keep-Alive

Accept-Language: en-us



}
2006/08/30 11:30:06| parseHttpRequest: end = {}
2006/08/30 11:30:06| parseHttpRequest: prefix_sz = 381, req_line_sz = 16
2006/08/30 11:30:06| parseHttpRequest: Request Header is
Accept: */*

Accept-Encoding: gzip, deflate

Cookie: sid=xyyqOo; _cookietime=2592000; _discuz_uid=33777;
_discuz_pw=0e63380c64dc6ef96cca941443b47c9d; _discuz_secques=0

User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1;
Media Center PC 3.0; .NET CLR 1.0.3705; .NET CLR 2.0.50727)

Host: localhost

Connection: Keep-Alive

Accept-Language: en-us

2006/08/30 11:30:06| cbdataFree: 0x101e0a50
2006/08/30 11:30:06| cbdataFree: Freeing 0x101e0a50
2006/08/30 11:30:06| conn->in.offset = 0
2006/08/30 11:30:06| commSetTimeout: FD 10 timeout 86400
2006/08/30 11:30:06| clientReadRequest: FD 10 Invalid Request
2006/08/30 11:30:06| init-ing hdr: 0x1020ee1c owner: 1
2006/08/30 11:30:06| storeCreateEntry: 'error:invalid-request'
2006/08/30 11:30:06| creating rep: 0x101e2860
2006/08/30 11:30:06| init-ing hdr: 0x101e28a4 owner: 2
2006/08/30 11:30:06| 0x101e28a4 lookup for 38
2006/08/30 11:30:06| 0x101e28a4 lookup for 9
2006/08/30 11:30:06| 0x101e28a4 lookup for 38
2006/08/30 11:30:06| 0x101e28a4 lookup for 9
2006/08/30 11:30:06| 0x101e28a4 lookup for 50
2006/08/30 11:30:06| 0x101e28a4 lookup for 22
2006/08/30 11:30:06| new_MemObject: returning 0x1006c6b0
2006/08/30 11:30:06| new_StoreEntry: returning 0x101e1e10
2006/08/30 11:30:06| storeKeyPrivate: GET error:invalid-request



Re: [squid-users] Invalid Request.Squid2.6S3 accelerate with SSL on Windows

2006-08-30 Thread fulan Peng

Yes, it's working now after I added defaultsite=<the backend server
domain name>.

But there is one problem: https always goes to http's backend. If I
comment out the http port and put the two peers in the https section,
there is still only one backend I can access via Squid. It seems that one
instance of Squid can only serve one backend? Is this right? I have to
use Windows services to run multiple instances of Squid so that the
clients can access multiple backend servers via my Squid. Is this
right? I have to recompile Squid with --win32-service. Is this right?

Thanks a lot!

Fulan Peng.


On 8/30/06, Henrik Nordstrom [EMAIL PROTECTED] wrote:

On Wed 2006-08-30 at 11:48 -0400, fulan Peng wrote:
 In 2.6S3, the error for https is invalid request. Http works fine.

And I keep saying that you need to configure your https_port proper for
accelerator operation. You have not. As result Squid has no clue what to
do with the requests received on your https_port, and rejects them.

You need AT MINIMUM to specify a defaultsite=... argument to your
https_port. Maybe vhost as well, but vhost only makes sense if you have a
wildcard certificate, as SSL is a bit picky about requested domain
names.

Regards
Henrik
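For reference, the minimum Henrik describes can be sketched with the hostname and certificate paths already posted in this thread (a sketch, not a verified configuration):

```conf
# Reverse-proxy HTTPS listener: defaultsite= supplies the site name for
# requests arriving on this port. vhost only helps with a wildcard cert.
https_port 443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem defaultsite=zyzg.org.ru
cache_peer zyzg.org.ru parent 80 0 originserver
```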





Re: [squid-users] Invalid Request.Squid2.6S3 accelerate with SSL on Windows

2006-08-30 Thread fulan Peng

Please help: I cannot get both backends working. The following is the
conf file; I can only reach the second one. squid -k parse is OK.

Thanks a lot!


http_port 127.0.0.1:80 vhost vport
https_port 443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=zyzg.org.ru
https_port 8443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=breakevilaxis.org
cache_peer breakevilaxis.org parent 80  0 originserver name=breakevilaxis
cache_peer zgzg.org.ru parent 80  0 originserver name=zyzg
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:       1440    20%     10080
refresh_pattern ^gopher:    1440    0%      1440
refresh_pattern .           0       20%     4320
acl breakevilaxis.org dstdomain breakevilaxis.org
acl zyzg.org.ru dstdomain zyzg.org.ru
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  
acl Safe_ports port 21  
acl Safe_ports port 443 563 
acl Safe_ports port 70  
acl Safe_ports port 210 
acl Safe_ports port 1025-65535  
acl Safe_ports port 280 
acl Safe_ports port 488 
acl Safe_ports port 591 
acl Safe_ports port 777 
acl CONNECT method CONNECT
http_access allow zyzg.org.ru
http_access allow breakevilaxis.org
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_reply_access allow all
icp_access allow all
cache_peer_access zyzg   allow zyzg.org.ru
cache_peer_access breakevilaxis  allow breakevilaxis.org
visible_hostname ddint.org
coredump_dir c:/squid/var/cache


On 8/30/06, Henrik Nordstrom [EMAIL PROTECTED] wrote:

On Wed 2006-08-30 at 18:05 -0400, fulan Peng wrote:
 Yes. It's working now after I added defaultsite=thebackend server domain 
name

 But there is one problem:  the https always go to http's backend. If I
 comment out the http port and put two in the https section, still
 there is only one backend I can access via Squid. It seems that one
 instance of Squid can only serve one backend?

A single Squid can have as many backends you like, but it needs to be
told what to send where by cache_peer_access.

Regards
Henrik
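The routing Henrik describes can be sketched like this, reusing the two backends from the posted configuration (a sketch of the idea, not a verified config):

```conf
cache_peer zyzg.org.ru       parent 80 0 originserver name=zyzg
cache_peer breakevilaxis.org parent 80 0 originserver name=breakevilaxis

acl to_zyzg dstdomain zyzg.org.ru
acl to_beva dstdomain breakevilaxis.org

# Send each domain to its own backend, and nothing else to either peer.
cache_peer_access zyzg          allow to_zyzg
cache_peer_access zyzg          deny  all
cache_peer_access breakevilaxis allow to_beva
cache_peer_access breakevilaxis deny  all
```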





Re: [squid-users] Invalid Request.Squid2.6S3 accelerate with SSL on Windows

2006-08-30 Thread fulan Peng

Sorry, it turned out to be a typo. squid -k parse only checks the
syntax, not spelling errors. Now I can access more than one backend on
the https ports. This is the conf file.

Now http won't work. I think I can fix it.

Thank you so much!

Fulan Peng.


http_port 127.0.0.1:80 vhost vport
https_port 443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=zyzg.org.ru
https_port 8443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=breakevilaxis.org
cache_peer breakevilaxis.org parent 80  0 originserver name=breakevilaxis
cache_peer zyzg.org.ru parent 80  0 originserver name=zyzg
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:       1440    20%     10080
refresh_pattern ^gopher:    1440    0%      1440
refresh_pattern .           0       20%     4320
acl breakevilaxis.org dstdomain breakevilaxis.org
acl zyzg.org.ru dstdomain zyzg.org.ru
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  
acl Safe_ports port 21  
acl Safe_ports port 443 563 
acl Safe_ports port 70  
acl Safe_ports port 210 
acl Safe_ports port 1025-65535  
acl Safe_ports port 280 
acl Safe_ports port 488 
acl Safe_ports port 591 
acl Safe_ports port 777 
acl CONNECT method CONNECT
http_access allow zyzg.org.ru
http_access allow breakevilaxis.org
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_reply_access allow all
icp_access allow all
cache_peer_access zyzg   allow zyzg.org.ru
cache_peer_access breakevilaxis  allow breakevilaxis.org
visible_hostname ddint.org
coredump_dir c:/squid/var/cache



On 8/30/06, fulan Peng [EMAIL PROTECTED] wrote:

Please help: I cannot get both backends working. The following is the
conf file; I can only reach the second one. squid -k parse is OK.

Thanks a lot!


http_port 127.0.0.1:80 vhost vport
https_port 443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=zyzg.org.ru
https_port 8443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
defaultsite=breakevilaxis.org
cache_peer breakevilaxis.org parent 80  0 originserver name=breakevilaxis
cache_peer zgzg.org.ru parent 80  0 originserver name=zyzg
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:       1440    20%     10080
refresh_pattern ^gopher:    1440    0%      1440
refresh_pattern .           0       20%     4320
acl breakevilaxis.org dstdomain breakevilaxis.org
acl zyzg.org.ru dstdomain zyzg.org.ru
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80
acl Safe_ports port 21
acl Safe_ports port 443 563
acl Safe_ports port 70
acl Safe_ports port 210
acl Safe_ports port 1025-65535
acl Safe_ports port 280
acl Safe_ports port 488
acl Safe_ports port 591
acl Safe_ports port 777
acl CONNECT method CONNECT
http_access allow zyzg.org.ru
http_access allow breakevilaxis.org
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_reply_access allow all
icp_access allow all
cache_peer_access zyzg   allow zyzg.org.ru
cache_peer_access breakevilaxis  allow breakevilaxis.org
visible_hostname ddint.org
coredump_dir c:/squid/var/cache


On 8/30/06, Henrik Nordstrom [EMAIL PROTECTED] wrote:
 On Wed 2006-08-30 at 18:05 -0400, fulan Peng wrote:
  Yes. It's working now after I added defaultsite=thebackend server domain 
name
 
  But there is one problem:  the https always go to http's backend. If I
  comment out the http port and put two in the https section, still
  there is only one backend I can access via Squid. It seems that one
  instance of Squid can only serve one backend?

 A single Squid can have as many backends you like, but it needs to be
 told what to send where by cache_peer_access.

 Regards
 Henrik






[squid-users] What configuration should I set up?

2006-08-29 Thread fulan Peng

Hi,
I need Squid on Windows to be a proxy server so that clients in China
can access the backend server via the Squid proxy. The backend server
cannot be reached by the clients directly, and its name cannot be
resolved by the Chinese DNS. The Squid host can be accessed by the
clients. What setup should I use? I think it is accelerator mode,
i.e. reverse proxy mode. If I am wrong, please point it out. I have
never set up transparent mode. My understanding is that I have to set
up reverse proxy mode.

Fulan Peng.


[squid-users] Invalid request error of Squid 2.6 PRE3 https on Windows.

2006-08-28 Thread fulan Peng

Hi,
I have compiled Squid 2.6PRE3 in Cygwin on Windows XP SP2. Http is
OK, but there is an invalid request error in https accelerator
mode.

Here are a section of the log file.

cbdataLock: 0x10227870
2006/08/28 07:56:32| parseHttpRequest: Client HTTP version 1.1.
2006/08/28 07:56:32| parseHttpRequest: Method is 'GET'
2006/08/28 07:56:32| parseHttpRequest: URI is '/'
2006/08/28 07:56:32| parseHttpRequest: req_hdr = {Accept: */*

Accept-Encoding: gzip, deflate

Cookie: sid=aDtap9

User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1;
Media Center PC 3.0; .NET CLR 1.0.3705; .NET CLR 2.0.50727)

Host: localhost

Connection: Keep-Alive

Accept-Language: en-us



}
2006/08/28 07:56:32| parseHttpRequest: end = {}
2006/08/28 07:56:32| parseHttpRequest: prefix_sz = 277, req_line_sz = 16
2006/08/28 07:56:32| parseHttpRequest: Request Header is
Accept: */*

Accept-Encoding: gzip, deflate

Cookie: sid=aDtap9

User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1;
Media Center PC 3.0; .NET CLR 1.0.3705; .NET CLR 2.0.50727)

Host: localhost

Connection: Keep-Alive

Accept-Language: en-us




2006/08/28 07:56:32| cbdataFree: 0x1023ae80
2006/08/28 07:56:32| cbdataFree: Freeing 0x1023ae80
2006/08/28 07:56:32| conn->in.offset = 0
2006/08/28 07:56:32| commSetTimeout: FD 10 timeout 86400
2006/08/28 07:56:32| clientReadRequest: FD 10 Invalid Request
2006/08/28 07:56:32| init-ing hdr: 0x10200d84 owner: 1
2006/08/28 07:56:32| storeCreateEntry: 'error:invalid-request'
2006/08/28 07:56:32| creating rep: 0x1023d780


Re: [squid-users] Invalid request error of Squid 2.6 PRE3 https on Windows.

2006-08-28 Thread fulan Peng

Yes, I have set up https reverse mode.
The following is my squid.conf file. Please help me see if there is any
problem in it. If I type http://localhost in the browser address line,
it works. If I type https://localhost, it asks me to confirm the
certificate, then shows an invalid request error page.

http_port 127.0.0.1:80 vhost vport
cache_peer zyzg.org.ru parent 80 0 originserver
https_port 443 cert=/usr/squid/etc/cert.pem key=/usr/squid/etc/key.pem
cache_peer breakevilaxis.org parent 80  0 originserver
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:       1440    20%     10080
refresh_pattern ^gopher:    1440    0%      1440
refresh_pattern .           0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  
acl Safe_ports port 21  
acl Safe_ports port 443 563 
acl Safe_ports port 70  
acl Safe_ports port 210 
acl Safe_ports port 1025-65535  
acl Safe_ports port 280 
acl Safe_ports port 488 
acl Safe_ports port 591 
acl Safe_ports port 777 
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_reply_access allow all
icp_access allow all
visible_hostname ddint.org
coredump_dir c:/squid/var/cache






On 8/28/06, Henrik Nordstrom [EMAIL PROTECTED] wrote:

On Mon, 2006-08-28 at 08:07 -0400, fulan Peng wrote:
 Hi,
 I have compiled Squid 2.6PRE3 in Cygwin on Windows XP SP2. Http is
 OK, but there is an invalid request error in https accelerator
 mode.

Have you set your https_port proper for accelerator/reverse proxy
operation? Minimum defaultsite= should be specified (in addition to
certificates etc..)

Regards
Henrik




Re: [squid-users] Invalid request error of Squid 2.6 PRE3 https on Windows.

2006-08-28 Thread fulan Peng

Here is my squid.conf file.
I compiled squid 2.6PRE3 on Cygwin with ssl enabled, on Windows XP SP2.
I copied all necessary dll's into c:\windows\system32, then ran the
program in Windows. I also tried running Squid within Cygwin; the
same: Invalid Request. Http is OK.

http_port 127.0.0.1:80 vhost vport
cache_peer zyzg.org.ru parent 80 0 originserver
https_port 443 cert=/usr/squid/etc/cert.pem key=/usr/squid/etc/key.pem
cache_peer breakevilaxis.org parent 80  0 originserver
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:       1440    20%     10080
refresh_pattern ^gopher:    1440    0%      1440
refresh_pattern .           0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  
acl Safe_ports port 21  
acl Safe_ports port 443 563 
acl Safe_ports port 70  
acl Safe_ports port 210 
acl Safe_ports port 1025-65535  
acl Safe_ports port 280 
acl Safe_ports port 488 
acl Safe_ports port 591 
acl Safe_ports port 777 
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_reply_access allow all
icp_access allow all
visible_hostname ddint.org
coredump_dir c:/squid/var/cache


Re: [squid-users] Invalid request error of Squid 2.6 PRE3 https on Windows.

2006-08-28 Thread fulan Peng

Sorry, I am using Gmail and this is a Gmail problem. I posted the message
but it did not show up. This is the configuration file.

http_port 127.0.0.1:80 vhost vport
cache_peer zyzg.org.ru parent 80 0 originserver
https_port 443 cert=/usr/squid/etc/cert.pem key=/usr/squid/etc/key.pem
cache_peer breakevilaxis.org parent 80  0 originserver
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:       1440    20%     10080
refresh_pattern ^gopher:    1440    0%      1440
refresh_pattern .           0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  
acl Safe_ports port 21  
acl Safe_ports port 443 563 
acl Safe_ports port 70  
acl Safe_ports port 210 
acl Safe_ports port 1025-65535  
acl Safe_ports port 280 
acl Safe_ports port 488 
acl Safe_ports port 591 
acl Safe_ports port 777 
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
http_reply_access allow all
icp_access allow all
visible_hostname ddint.org
coredump_dir c:/squid/var/cache


[squid-users] Invalid request error of Squid 2.6 PRE3 on Windows.

2006-08-27 Thread fulan Peng

Hi,
I have compiled Squid 2.6PRE3 on Windows XP SP2 with the latest Cygwin
and with SSL enabled. It works with the http port but won't work with
the https port.

Attached is my squid.conf file and the log file. Could someone please
suggest a quick-and-dirty fix so I can run it somehow?

I also compiled 3.0 PRE4. That Squid won't start because of the
known maximum-number-of-file-descriptors bug. If you know of a patch
that lets me run it anyway, that would be great!

Thanks!

Fulan Peng.

Here is my squid.conf and some text cut from the log.

http_port 127.0.0.1:80 vhost vport
cache_peer zyzg.org.ru parent 80 0 originserver

https_port 443 cert=c:\squid\etc\cert.pem key=c:\squid\etc\key.pem
cache_peer breakevilaxis.org parent 80  0 originserver
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log c:/squid/var/logs/access.log squid
debug_options ALL,9
refresh_pattern ^ftp:       1440    20%     10080     # ftp
refresh_pattern ^gopher:    1440    0%      1440      # gopher
refresh_pattern .           0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80  # http
acl Safe_ports port 21  # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70  # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
# Deny requests to unknown ports
http_access deny !Safe_ports
# Deny CONNECT to other than SSL ports
http_access deny CONNECT !SSL_ports
http_access allow all

http_reply_access allow all
icp_access allow all
coredump_dir c:/squid/var/cache
==


2006/08/27 18:59:42| storeClientCopy3: Copying from memory
2006/08/27 18:59:42| memCopy: offset 0: size 4096
2006/08/27 18:59:42| storeSwapOut: lowest_offset = 0
2006/08/27 18:59:42| cbdataValid: 0x1023c128
2006/08/27 18:59:42| clientSendMoreHeaderData: error:invalid-request, 1760 bytes
2006/08/27 18:59:42| clientSendMoreHeaderData: FD 10
'error:invalid-request', out.offset=0
2006/08/27 18:59:42| creating rep: 0x1023e510
2006/08/27 18:59:42| init-ing hdr: 0x1023e554 owner: 2
2006/08/27 18:59:42| 0x1023e554 lookup for 38
2006/08/27 18:59:42| 0x1023e554 lookup for 9
2006/08/27 18:59:42| 0x1023e554 lookup for 38
2006/08/27 18:59:42| 0x1023e554 lookup for 9
2006/08/27 18:59:42| 0x1023e554 lookup for 22
2006/08/27 18:59:42| cleaning hdr: 0x1023e554 owner: 2
2006/08/27 18:59:42| init-ing hdr: 0x1023e554 owner: 2
2006/08/27 18:59:42| 0x1023e554 lookup for 38
2006/08/27 18:59:42| 0x1023e554 lookup for 9
2006/08/27 18:59:42| 0x1023e554 lookup for 38
2006/08/27 18:59:42| 0x1023e554 lookup for 9
2006/08/27 18:59:42| 0x1023e554 lookup for 22
2006/08/27 18:59:42| parsing hdr: (0x1023e554)
Server: squid/2.6.STABLE3

Date: Sun, 27 Aug 2006 22:59:42 GMT

Content-Type: text/html

Content-Length: 1547

Expires: Sun, 27 Aug 2006 22:59:42 GMT

X-Squid-Error: ERR_INVALID_REQ 0


[squid-users] How to compile Squid 3.0 for Windows?

2006-08-25 Thread fulan Peng

Hi,
I am looking for Squid on Windows with SSL and accelerator support. I
tried all the 2.6 binaries and all of them fail with a "no
OPENSSL_Applink" error. I did not try the 2.5 version.
I tried to compile all 2.6 and 3.0 revisions with MSYS/MinGW, and all
of them fail with a compile error. The following is the error message
and the C source code.
I have already done the same thing on FreeBSD and Linux without any error.
I do not know why. Thanks.


if gcc -DHAVE_CONFIG_H -I. -I. -I../include -I../include -I../include
-I../lib/cppunit-1.10.0/include -I ../lib/cppunit-1.10.0/include
-Ic:/apache2/srclib/openssl/include/include  -Werror -Wall
-Wpointer-arith -Wwrite-strings -Wmissing-prototypes
-Wmissing-declarations -Wcomments -Wall -g -O2 -mthreads -MT
getfullhostname.o -MD -MP -MF .deps/getfullhostname.Tpo -c -o
getfullhostname.o getfullhostname.c; \
then mv -f .deps/getfullhostname.Tpo .deps/getfullhostname.Po;
else rm -f .deps/getfullhostname.Tpo; exit 1; fi
getfullhostname.c: In function `getfullhostname':
getfullhostname.c:86: warning: implicit declaration of function `gethostname'
getfullhostname.c:88: warning: implicit declaration of function `gethostbyname'
getfullhostname.c:88: warning: assignment makes pointer from integer
without a cast
getfullhostname.c:89: error: dereferencing pointer to incomplete type
make[2]: *** [getfullhostname.o] Error 1
make[2]: Leaving directory `/usr/squid-3.0.PRE4-20060825/lib'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/usr/squid-3.0.PRE4-20060825/lib'
make: *** [all-recursive] Error 1


/*
* $Id: getfullhostname.c,v 1.20 2003/01/23 00:37:01 robertc Exp $
*
* DEBUG:
* AUTHOR: Harvest Derived
*
* SQUID Web Proxy Cache  http://www.squid-cache.org/
* --
*
*  Squid is the result of efforts by numerous individuals from
*  the Internet community; see the CONTRIBUTORS file for full
*  details.   Many organizations have provided support for Squid's
*  development; see the SPONSORS file for full details.  Squid is
*  Copyrighted (C) 2001 by the Regents of the University of
*  California; see the COPYRIGHT file for full details.  Squid
*  incorporates software developed and/or copyrighted by other
*  sources; see the CREDITS file for full details.
*
*  This program is free software; you can redistribute it and/or modify
*  it under the terms of the GNU General Public License as published by
*  the Free Software Foundation; either version 2 of the License, or
*  (at your option) any later version.
*
*  This program is distributed in the hope that it will be useful,
*  but WITHOUT ANY WARRANTY; without even the implied warranty of
*  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
*  GNU General Public License for more details.
*
*  You should have received a copy of the GNU General Public License
*  along with this program; if not, write to the Free Software
*  Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111, USA.
*
*/

#include "config.h"

#if HAVE_LIBC_H
#include <libc.h>
#endif
#if HAVE_STDIO_H
#include <stdio.h>
#endif
#if HAVE_STDLIB_H
#include <stdlib.h>
#endif
#if HAVE_STRING_H
#include <string.h>
#endif
#if HAVE_SYS_PARAM_H
#include <sys/param.h>
#endif
#if HAVE_SYS_TYPES_H
#include <sys/types.h>
#endif
#if HAVE_SYS_SOCKET_H
#include <sys/socket.h>
#endif
#if HAVE_NETINET_IN_H
#include <netinet/in.h>
#endif
#if HAVE_ARPA_INET_H
#include <arpa/inet.h>
#endif
#if HAVE_NETDB_H && !defined(_SQUID_NETDB_H_)   /* protect on NEXTSTEP */
#define _SQUID_NETDB_H_
#include <netdb.h>
#endif
#if HAVE_UNISTD_H
#include <unistd.h>
#endif

#include "util.h"

/*
*  getfullhostname() - Returns the fully qualified name of the current
*  host, or NULL on error.  Pointer is only valid until the next call
*  to the gethost*() functions.
*/
const char *
getfullhostname(void)
{
    const struct hostent *hp = NULL;
    static char buf[SQUIDHOSTNAMELEN + 1];

    if (gethostname(buf, SQUIDHOSTNAMELEN) < 0)
        return NULL;
    hp = gethostbyname(buf);
    if (hp != NULL)
        xstrncpy(buf, hp->h_name, SQUIDHOSTNAMELEN);
    return buf;
}


[squid-users] How to just cache the default index.html page only?

2006-07-30 Thread fulan Peng

Hi,
I just want to cache only a web site's index.html page; everything
else should come from the original web site. Can I do this?
Do I have to rewrite the web page so that the links redirect to
the original web site?

Thanks a lot!

Fulan Peng.
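One possible approach (an untested sketch using the `cache` access list; the ACL name is made up): let Squid cache only the front page and treat everything else as uncacheable, so other URLs are still proxied but always refetched from the origin.

```conf
# Cache only / and /index.html; every other object is a TCP_MISS
# fetched from the origin server on each request.
acl frontpage urlpath_regex ^/$ ^/index\.html$
cache allow frontpage
cache deny all
```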

On 7/30/06, Nicola Giosmin [EMAIL PROTECTED] wrote:

Dear all,

I just subscribed and I am a newbie, so please forgive me if my
questions are too stupid. :)

I needed to keep track of the internet navigation of 11 users of a
public machine. I installed Knoppix and then Squid (Webalizer for
the reports).

Squid works fine, Webalizer too, and the reports are generated
without any problem.

The problem is that I need a report about *who* surfs the internet
and *when* (not really *what* he/she sees). Since this is the first
time I have installed Squid, I saw in the manual that this kind of
thing can be done with an ACL (type ident).

acl users ident user1 user2 user3 ... user11
http_access allow users
ident_lookup_access allow users

The browser (Mozilla Firefox) is properly set up to use the proxy,
and Squid reports each page seen.

But it does not report the users' names. Any idea?

Thanks a lot

nicgios

ps. Ident2 is active in background...






[squid-users] Re: How to set up a reverse proxy server over SSL?

2006-07-03 Thread fulan Peng

Hello,
I succeeded in setting up a reverse proxy server over SSL.
The following is my experience:

1. Compile squid with --enable-ssl, and optionally --with-openssl= if
your ssl headers (ssl-devel) are not in /usr/include/openssl, e.g.
--with-openssl=/usr/local/include

./configure --enable-ssl --with-openssl=/usr/local/ssl/include

2. cd /usr/local/squid/etc
mkdir demoCA
cd demoCA
touch index.txt
echo 01 > serial
mkdir private
mkdir newcerts

generate CA certificate (self-signed)
/usr/local/ssl/bin/openssl req -new -x509 -keyout
/usr/local/squid/etc/demoCA/private/cakey.pem -out
/usr/local/squid/etc/demoCA/cacert.pem -days 365 -subj
/CA=US/ST=/L=x/OU=/O=/CN=yourdomain/[EMAIL PROTECTED]

3. generate certificate
/usr/local/ssl/bin/openssl req -new -keyout key.pem -out req.pem -days 365
where req.pem - certificate request

4. Remove the password from the key.
cd /usr/local/squid/etc
cp key.pem key.pem.old
/usr/local/ssl/bin/openssl rsa -in key.pem.old -out key.pem

5. Sign this certificate with your CA cert
/usr/local/ssl/bin/openssl ca -in /usr/local/squid/etc/req.pem -out
/usr/local/squid/etc/cert.pem

6. Remove unneeded lines from cert.pem (usually you only need the
lines between
-----BEGIN CERTIFICATE-----
and
-----END CERTIFICATE-----
inclusive)

7. add this in squid.conf

https_port [ip_address:]port cert=/where/cert.pem key=/where/key.pem

Here are the key lines of the squid config:

acl huanghuagang.org dstdomain huanghuagang.org
acl our_networks src 192.168.0.0/24

http_access allow huanghuagang.org
http_access allow our_networks

https_port  accel vhost cert=/usr/local/squid/etc/cert.pem
key=/usr/local/squid/etc/key.pem
cafile=/usr/local/squid/etc/demoCA/cacert.pem defaultsite=xxx.fr

cache_peer huanghuagang.org parent 80 0 no-query originserver name=huanghuagang

cache_peer_access huanghuagang allow huanghuagang.org

If I need another site, I would assign 8889 to this site and repeat
everything above. I do not know if there is a better way. But this way
is easy to understand.
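The "repeat everything above" step for a second site would look roughly like this (port 8889, the domain second-site.example, and the cert2/key2 file names are all illustrative placeholders, not values from this thread):

```conf
# Second HTTPS listener, one per published site.
https_port 8889 accel cert=/usr/local/squid/etc/cert2.pem key=/usr/local/squid/etc/key2.pem defaultsite=second-site.example

cache_peer second-site.example parent 80 0 no-query originserver name=second
acl second_site dstdomain second-site.example
http_access allow second_site
cache_peer_access second allow second_site
```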

On 7/3/06, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:

Yes, I have finished setting up the reverse proxy server without SSL. It
is fast! I love it! Now the issue is adding SSL to it. I think it will
not be hard. I will post the whole procedure and the actual working
squid.conf file once I succeed.

Thank you!



-Original Message-
From: Henrik Nordstrom [EMAIL PROTECTED]
To: fulan Peng [EMAIL PROTECTED]
Cc: squid-users@squid-cache.org; Visolve Squid [EMAIL PROTECTED]
Sent: Mon, 03 Jul 2006 07:34:54 +0200
Subject: Re: [squid-users] How to set up a reverse proxy server over
SSL?

On Mon 2006-07-03 at 09:17 +0530, Visolve Squid wrote:

 Hello Peng,

 The following steps are used to configure the squid-3.0 with SSL

 Compile squid with the ssl support option

 ./configure --prefix=/usr/local/squid --enable-ssl

 Edit the squid configuration for squid with SSL support (Reverse
proxy)

 https_port 443 protocol=http
 cert=/path/to/server/certificate/server_cert.pem
 key=/path/to/server/key/server_priv_key.pem vport=port in which the
 back end server listen

almost... you should primarily use defaultsite=your.main.site to enable
reverse proxy mode, and maybe vhost if you need to support domain-based
virtual hosting. vport is normally not needed. The port number is
specified in cache_peer.

As hinted above you also need a cache_peer line defining the origin
server address and port.
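Henrik's two points combined make a minimal pair of lines like the following sketch; the site name, certificate paths, and origin address are placeholders, and note that the origin's port (80 here) goes on the cache_peer line rather than in vport.

```
https_port 443 accel defaultsite=your.main.site cert=/path/to/server_cert.pem key=/path/to/server_priv_key.pem
cache_peer 192.0.2.10 parent 80 0 no-query originserver name=origin
```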

 acl SSL method CONNECT
 never_direct allow SSL

The CONNECT method is not applicable to reverse proxies and should
probably be denied entirely...
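Denying CONNECT outright, as suggested, is a one-line rule on top of the CONNECT acl that the stock squid.conf already defines:

```
acl CONNECT method CONNECT
http_access deny CONNECT
```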

Regards
Henrik



[squid-users] How to fix the TCP_MISS 302 678 problem--Image can not be cached.

2006-07-03 Thread fulan Peng

I have set up a Squid reverse proxy server for a web site named
http://www.dajiyuan.com
This site is OK in both IE and Netscape browsers. But when I reverse
proxy it, it pops up a message saying A license is required for all but
personal use of this code. See terms of use of Dyn-web.com. After
I click OK, many .gif image files do not display. In the access_log,
it says TCP_MISS 302 678  FIRST_UP_PARENT.
I cannot understand why Squid did not cache these images, since they can
be fetched by regular browsers.

Thanks a lot!

Fulan Peng.


[squid-users] Re: How to fix the TCP_MISS 302 678 problem--Image can not be cached.

2006-07-03 Thread fulan Peng

Yes. I am reading their terms of use. They do not allow others to use
their image icons. Now I understand: I have to get permission from
this site in order to cache it.
They probably have some anti-cache code.

Thanks a lot!

Fulan Peng.

On 7/3/06, Henrik Nordstrom [EMAIL PROTECTED] wrote:

mån 2006-07-03 klockan 17:20 -0400 skrev fulan Peng:
 I have set up a Squid reverse proxy server for a web site named
 http://www.dajiyuan.com
 This site is OK in both IE and Netscape browsers. But when I reverse
 proxy it, it pops up a message saying A license is required for all but
 personal use of this code. See terms of use of Dyn-web.com.

I would suggest you try to comply with the license requirements of their
code.

 After
 I click OK, many .gif image files do not display. In the access_log,
 it says TCP_MISS 302 678  FIRST_UP_PARENT.

302 Found indicates the object has moved temporarily to another URL.
Normally not cacheable.

Regards
Henrik




[squid-users] How to set up a reverse proxy server over SSL?

2006-07-02 Thread fulan Peng

Hi,

I have compiled Squid 3.0 pre-release 4 with its defaults.
Then I changed one line from http_access deny all to http_access allow all
and verified that the non-SSL forward proxy server worked.

Now I want to set up with SSL and a reverse proxy server.

Could you please point me to a tutorial or a sample configuration
file?

I have a few non-SSL web sites (say, http://breakevilaxis.org,
http://www.dajiyuan.com, http://zyzg.org). I want to wrap them with Squid
SSL. If it were much easier to set up a non-SSL reverse proxy server and add
stunnel on top of it, I would try stunnel. But since Squid already supports SSL,
we should use its own SSL capability.

Thanks a lot!

Frank Peng.