On 29/07/11 09:05, Scott Mace wrote:
I have a whitelist to allow users to access only the sites they require.  We 
primarily use it for FTP, either through a web browser or FileZilla-like 
clients.  Browser-based access is flawless, but FTP clients show odd behavior.
acl whitelist dstdomain "/etc/squid3/whitelist"
http_access deny !whitelist
Whitelist contains (for testing):
gatekeeper.dec.com

Here is the result:
1311886691.258  21738 192.168.100.194 TCP_MISS/200 998 CONNECT 
gatekeeper.dec.com:21 - DIRECT/192.6.29.21 -
1311886757.392      0 192.168.100.194 TCP_DENIED/403 1899 CONNECT 
192.6.29.21:51967 - NONE/- text/html

As you can see, the second request switches from the hostname to the IP 
address, which matches nothing in the whitelist, and is denied.  If I add the 
IP to the whitelist, it works perfectly.  How can I force it to always use the 
hostname?

You can't. Squid uses whatever the client sends, so that malicious clients can be denied.

As for the "change": this is how FTP works, with multiple channels being set up. That is reason #1 why it cannot be relayed by Squid. Use a proxy designed to relay FTP, such as frox, instead.
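To illustrate the multi-channel behaviour behind the log entries: in passive mode the server's reply to the PASV command encodes a separate IP address and port for the data channel, and the client then opens a second connection to that raw IP:port. That second connection is what appears in the log as the denied CONNECT to an IP address. A minimal sketch (the parser and reply string are hypothetical, with octet values chosen to match the server IP and port in the log above):

```python
# Hypothetical sketch: how an FTP client derives the data-channel
# endpoint from a "227 Entering Passive Mode" reply. The six numbers
# are four IP octets plus the port split into high and low bytes.
import re

def parse_pasv(reply: str) -> tuple[str, int]:
    """Extract (ip, port) from a 227 PASV reply."""
    nums = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply)
    h1, h2, h3, h4, p1, p2 = map(int, nums.groups())
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2

# Example reply (values made up to match the log's server IP/port):
ip, port = parse_pasv("227 Entering Passive Mode (192,6,29,21,202,255)")
print(ip, port)  # 192.6.29.21 51967
```

Since the client is handed a bare IP and an ephemeral port, it never sends the hostname on the data connection, which is why a hostname-only whitelist cannot match it.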

Squid only supports fetching FTP data and reformatting it into HTTP responses for clients, i.e. read-only access via a web browser etc.


IP added to whitelist:
1311887068.133  17458 192.168.100.194 TCP_MISS/200 1919 CONNECT 
gatekeeper.dec.com:21 - DIRECT/192.6.29.21 -
1311887072.841    124 192.168.100.194 TCP_MISS/200 0 CONNECT 192.6.29.21:51255 
- DIRECT/192.6.29.21 -


The access permissions order is important:
  http://wiki.squid-cache.org/SquidFaq/OrderIsImportant

You have also broken the basic security protections:

http://wiki.squid-cache.org/SquidFaq/SecurityPitfalls#The_Safe_Ports_and_SSL_Ports_ACL
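For reference, the stock squid.conf ships with rules along these lines (a sketch of the defaults; check your own version's bundled config), which would deny CONNECT to arbitrary ports such as 21 or 51967:

```
# Standard port protections from the default squid.conf
acl SSL_ports port 443
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 1025-65535  # unregistered ports
acl CONNECT method CONNECT

http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
```

The fact that the logged CONNECT requests to port 21 succeeded suggests these rules were removed or bypassed.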

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.14
  Beta testers wanted for 3.2.0.10
