Re: [squid-users] Problems with downloads

2008-10-25 Thread Amos Jeffries

Henrik Nordstrom wrote:

On Fri, 2008-10-24 at 08:31 -0500, Osmany Goderich wrote:


It was the range_offset_limit -1 KB line that was preventing Squid from
resuming downloads. I set it back to 0 KB, the default, and
voila! Everything is back to normal!


Good.

range_offset_limit -1 says Squid should NEVER resume a download, and
instead always download the complete file.

To use this you must also disable quick_abort, telling Squid to always
continue downloading the requested object when the client has
disconnected.

quick_abort_min -1 KB


But be warned that both these settings can cause Squid to waste
excessive amounts of bandwidth on data which may never be requested by
any client.

Also, depending on the Squid version, range_offset_limit -1 may result in
significant delays or even timeouts if the client requests a range far
into the requested file. Not sure what the status in Squid-3 is with
regard to this.


No change on them from Squid-2.

Amos
--
Please use Squid 2.7.STABLE4 or 3.0.STABLE9


RE: [squid-users] Problems with downloads

2008-10-24 Thread Osmany Goderich
I solved the problem.

It was the range_offset_limit -1 KB line that was preventing Squid from 
resuming downloads. I set it back to 0 KB, the default, and voila! Everything 
is back to normal!

Thank you very much for your support. This is one of the best mailing lists.

-----Original Message-----
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED] 
Sent: Thursday, October 23, 2008 14:07
To: Osmany Goderich
CC: squid-users@squid-cache.org
Subject: Re: [squid-users] Problems with downloads

On Thu, 2008-10-23 at 14:34 -0500, Osmany Goderich wrote:
 Hi everyone,
 
 I have Squid3.0STABLE9 installed on a CentOS5.2_x86_64 system. I have 
 problems with downloads, especially large files. Usually downloads are 
 slow in my network because of the amount of users I have but I dealt 
 with it using download accelerators like “FlashGET”. Now the downloads 
 get interrupted and they never resume and I don’t know why.

Can you try downgrading to 2.7 to see if that makes any difference. If it 
does please file a bug report.

Also check your cache.log for any errors.

  I can’t seem to find
 a pattern as to when or why the downloads get interrupted. I don’t 
 know if I explained myself well enough. I’m suspecting that there is 
 something wrong with all the configurations I did to tune the cache 
 effectiveness.

There isn't much you can do wrong at this level.

Regards
Henrik



RE: [squid-users] Problems with downloads

2008-10-24 Thread Henrik Nordstrom
On Thu, 2008-10-23 at 15:54 -0500, Osmany Goderich wrote:

 I had squid2.6STABLE6-5 before and I upgraded it thinking it was a bug in 
 that release. Should I still downgrade to 2.7?

Yes.

Regards
Henrik





RE: [squid-users] Problems with downloads

2008-10-24 Thread Henrik Nordstrom
On Fri, 2008-10-24 at 08:31 -0500, Osmany Goderich wrote:

 It was the range_offset_limit -1 KB line that was preventing Squid from
 resuming downloads. I set it back to 0 KB, the default, and
 voila! Everything is back to normal!

Good.

range_offset_limit -1 says Squid should NEVER resume a download, and
instead always download the complete file.

To use this you must also disable quick_abort, telling Squid to always
continue downloading the requested object when the client has
disconnected.

quick_abort_min -1 KB


But be warned that both these settings can cause Squid to waste
excessive amounts of bandwidth on data which may never be requested by
any client.

Also, depending on the Squid version, range_offset_limit -1 may result in
significant delays or even timeouts if the client requests a range far
into the requested file. Not sure what the status in Squid-3 is with
regard to this.
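
Putting the two directives above together, the whole-object prefetch setup
Henrik describes would look something like this in squid.conf (a sketch of
the settings discussed, not a recommendation):

```
# Ignore client Range requests and always fetch the complete object.
# WARNING: can waste large amounts of bandwidth on data that no
# client may ever ask for.
range_offset_limit -1

# Never abort the server-side fetch when the client disconnects,
# so the full object still ends up in the cache.
quick_abort_min -1 KB
```

The defaults (range_offset_limit 0, quick_abort_min 16 KB) are the safe
choice when bandwidth matters more than cache completeness.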

Regards
Henrik




[squid-users] Problems with downloads

2008-10-23 Thread Osmany Goderich
Hi everyone,

I have Squid 3.0.STABLE9 installed on a CentOS 5.2 x86_64 system. I have
problems with downloads, especially large files. Downloads are usually slow
in my network because of the number of users I have, but I dealt with that
using download accelerators like “FlashGET”. Now the downloads get
interrupted and they never resume, and I don’t know why. I can’t seem to find
a pattern as to when or why the downloads get interrupted. I don’t know if I
explained myself well enough. I’m suspecting that there is something wrong
with all the configurations I did to tune the cache effectiveness. This is
the configuration I have. I left out all of my ACLs since I presume they are
not very helpful in this case. I would really appreciate any help I can get.
Thanks.


#  TAG: http_port
http_port 192.168.0.4:3128
http_port 192.168.54.4:3128

#  TAG: icp_port
icp_port 0

# OPTIONS WHICH AFFECT THE NEIGHBOR SELECTION ALGORITHM
# -----------------------------------------------------------------------------

#  TAG: cache_peer
cache_peer 192.168.250.2 parent 1080 0 no-query default

#  TAG: hierarchy_stoplist
hierarchy_stoplist cgi-bin ?

#  TAG: nonhierarchical_direct
nonhierarchical_direct off


#  TAG: cache
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY

# OPTIONS WHICH AFFECT THE CACHE SIZE
# -----------------------------------------------------------------------------

#  TAG: cache_mem (bytes)
cache_mem 36 MB

#  TAG: cache_swap_low  (percent, 0-100)
#  TAG: cache_swap_high (percent, 0-100)
cache_swap_low 90
cache_swap_high 95

#  TAG: maximum_object_size  (bytes)
maximum_object_size 35840 KB

#  TAG: maximum_object_size_in_memory (bytes)
maximum_object_size_in_memory 16 KB

#  TAG: offline_mode
offline_mode off

#  TAG: cache_replacement_policy
cache_replacement_policy heap LFUDA 

#  TAG: memory_replacement_policy
memory_replacement_policy heap GDSF 

#  TAG: range_offset_limit  (bytes)
range_offset_limit -1 

# LOGFILE PATHNAMES AND CACHE DIRECTORIES
# -----------------------------------------------------------------------------

#  TAG: cache_dir
cache_dir ufs /var/spool/squid 20480 16 256

#  TAG: access_log
access_log /var/log/squid/access.log 

#  TAG: cache_log
cache_log /var/log/squid/cache.log

#  TAG: useragent_log
#useragent_log /var/log/squid/useragent.log

#  TAG: cache_store_log
cache_store_log /var/log/squid/store.log

#  TAG: pid_filename
pid_filename /var/run/squid.pid

# OPTIONS FOR EXTERNAL SUPPORT PROGRAMS
# -----------------------------------------------------------------------------

#  TAG: ftp_user
ftp_user [EMAIL PROTECTED]

#  TAG: ftp_list_width
ftp_list_width 32

#  TAG: ftp_passive
ftp_passive on

#  TAG: ftp_sanitycheck
ftp_sanitycheck on

#  TAG: dns_nameservers
dns_nameservers 192.168.54.3 192.168.54.4 192.168.250.1 192.168.250.3

#  TAG: hosts_file
hosts_file /etc/hosts

#  TAG: auth_param

auth_param basic program /usr/lib64/squid/ncsa_auth /etc/squid/claves 
auth_param basic realm proxy.cha.jovenclub.cu
auth_param basic children 5
auth_param basic credentialsttl 2 hours
auth_param basic casesensitive off


# OPTIONS FOR TUNING THE CACHE
# -----------------------------------------------------------------------------

#  TAG: request_body_max_size (KB)
request_body_max_size 4 MB

#  TAG: refresh_pattern
refresh_pattern ^ftp:    1440  20%   10080
refresh_pattern ^gopher: 1440  0%    1440
refresh_pattern -i .nup 960 200%    1440
refresh_pattern -i .ver 960 200%    1440
refresh_pattern -i .zip  960   200%  1440
refresh_pattern -i .exe 960 200%    1440
refresh_pattern -i .l2b 960 200%    1440
refresh_pattern -i .p7h 960 200%    1440
refresh_pattern -i .avc 960 200%    1440
refresh_pattern -i .rar  960   200%  1440
refresh_pattern -i .vpu 960 200%    1440
refresh_pattern -i .dat 960 200%    1440
refresh_pattern -i .dif 960 200%    1440
refresh_pattern -i .klz 960 200%    1440
refresh_pattern -i .kdc  960   200%  1440
refresh_pattern -i .ctf 960 200%    1440
refresh_pattern -i .bin 960 200%    1440
refresh_pattern -i .jpg 1440    250%    4320 override-expire override-lastmod
refresh_pattern -i .gif 1440    250%    4320 override-expire override-lastmod
refresh_pattern -i .png 1440    250%    4320 override-expire override-lastmod
refresh_pattern -i .js  1440    250%    4320 override-expire override-lastmod
refresh_pattern -i .ico 1440    250%    4320 override-expire override-lastmod
refresh_pattern -i .css 1440    250%    4320 override-expire override-lastmod
refresh_pattern -i .    300     250%    2880 override-expire override-lastmod

#  TAG: quick_abort (KB)
quick_abort_min 64 KB
quick_abort_max 128 KB
quick_abort_pct 90

#  TAG: reload_into_ims on|off
reload_into_ims on

#  TAG: 

Re: [squid-users] Problems with downloads

2008-10-23 Thread Henrik Nordstrom
On Thu, 2008-10-23 at 14:34 -0500, Osmany Goderich wrote:
 Hi everyone,
 
 I have Squid3.0STABLE9 installed on a CentOS5.2_x86_64 system. I have
 problems with downloads, especially large files. Usually downloads are slow
 in my network because of the amount of users I have but I dealt with it
 using download accelerators like “FlashGET”. Now the downloads get
 interrupted and they never resume and I don’t know why.

Can you try downgrading to 2.7 to see if that makes any difference. If
it does please file a bug report.

Also check your cache.log for any errors.

  I can’t seem to find
 a pattern as to when or why the downloads get interrupted. I don’t know if I
 explained myself well enough. I’m suspecting that there is something wrong
 with all the configurations I did to tune the cache effectiveness.

There isn't much you can do wrong at this level.

Regards
Henrik




RE: [squid-users] Problems with downloads

2008-10-23 Thread Osmany Goderich
On Thu, 2008-10-23 at 14:34 -0500, Osmany Goderich wrote:
 Hi everyone,
 
 I have Squid3.0STABLE9 installed on a CentOS5.2_x86_64 system. I have 
 problems with downloads, especially large files. Usually downloads are 
 slow in my network because of the amount of users I have but I dealt 
 with it using download accelerators like “FlashGET”. Now the downloads 
 get interrupted and they never resume and I don’t know why.

Can you try downgrading to 2.7 to see if that makes any difference. If it 
does please file a bug report.

I had squid2.6STABLE6-5 before and I upgraded it thinking it was a bug in that 
release. Should I still downgrade to 2.7?

Also check your cache.log for any errors.

  I can’t seem to find
 a pattern as to when or why the downloads get interrupted. I don’t 
 know if I explained myself well enough. I’m suspecting that there is 
 something wrong with all the configurations I did to tune the cache 
 effectiveness.

There isn't much you can do wrong at this level.

Regards
Henrik



FW: [squid-users] Problems with downloads

2008-10-23 Thread Osmany Goderich
On Thu, 2008-10-23 at 14:34 -0500, Osmany Goderich wrote:
 Hi everyone,
 
 I have Squid3.0STABLE9 installed on a CentOS5.2_x86_64 system. I have 
 problems with downloads, especially large files. Usually downloads are 
 slow in my network because of the amount of users I have but I dealt 
 with it using download accelerators like “FlashGET”. Now the downloads 
 get interrupted and they never resume and I don’t know why.

Can you try downgrading to 2.7 to see if that makes any difference. If it 
does please file a bug report.

I had squid2.6STABLE6-5 before and I upgraded it thinking it was a bug in that 
release. Should I still downgrade to 2.7?

Also check your cache.log for any errors.

This is what I get in the log when I try to resume the download of something big:

1224795819.227  10450 192.168.0.6 TCP_MISS/200 372 HEAD 
ftp://kucha.ru/pub/mirror/pcbsd/7.0.1/amd64/PCBSD7.0.1-amd64-DVD.iso admin 
DEFAULT_PARENT/192.168.250.2 text/plain

  I can’t seem to find
 a pattern as to when or why the downloads get interrupted. I don’t 
 know if I explained myself well enough. I’m suspecting that there is 
 something wrong with all the configurations I did to tune the cache 
 effectiveness.

There isn't much you can do wrong at this level.

Regards
Henrik



Re: [squid-users] problems detecting downloads with Squid

2004-03-16 Thread Henrik Nordstrom
On Tue, 16 Mar 2004, Luis Miguel wrote:

 Thanks, it works for blocking this kind of download.
 
 Is there any way to pass these downloads to the redirector?

They already are, but as you noticed there is no way for the redirector to
tell that this is a download. This is because redirectors are called on the
request before it is forwarded; to know the returned MIME type, the request
must first have been forwarded and the response from the web server seen by
Squid.

Regards
Henrik



[squid-users] problems detecting downloads with Squid

2004-03-15 Thread Luis Miguel
Hi all, I am using Squid 2.5.4-3 on Linux, with squidGuard as a redirector to 
block all Windows executables. Everything works fine except for some sites that 
bypass the filter: the .exe file doesn't show up in the log files, and the user 
can download it using the browser.

The only log squid generates is:

1079005403.984    377 192.168.0.167 TCP_MISS/200 3857 GET 
http://63.217.29.115/connect.php? - DIRECT/63.217.29.115 text/html
1079005404.704    544 192.168.0.167 TCP_MISS/200 9924 GET 
http://63.217.29.115/download.php? - DIRECT/63.217.29.115 application/force-download

but you get the .exe file.

If someone want to check the URL: http://63.217.29.115/connect.php?did=od-stnd179

Beware, I think the file that is downloaded is some kind of dialer/trojan.

Is there any way to detect this kind of download? Or am I forgetting something?



Greets.






Re: [squid-users] problems detecting downloads with Squid

2004-03-15 Thread Henrik Nordstrom
On Mon, 15 Mar 2004, Luis Miguel wrote:

 Hi all, I am using Squid 2.5.4-3 on Linux, with squidGuard as a redirector to 
 block all Windows executables. Everything works fine except for some sites that 
 bypass the filter: the .exe file doesn't show up in the log files, and the user 
 can download it using the browser.
 
 The only log squid generates is:
 
 1079005403.984    377 192.168.0.167 TCP_MISS/200 3857 GET 
 http://63.217.29.115/connect.php? - DIRECT/63.217.29.115 text/html

 1079005404.704    544 192.168.0.167 TCP_MISS/200 9924 GET 
 http://63.217.29.115/download.php? - DIRECT/63.217.29.115 application/force-download

You can use the rep_mime_type acl in http_reply_access to block this kind
of thing.
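
A minimal sketch of that approach (the ACL name here is made up, and
application/force-download is the MIME type from the log lines quoted above):

```
# Match replies by the Content-Type the server actually returns,
# which a URL-rewriting redirector never gets to see.
acl blocked_reply rep_mime_type application/force-download

# http_reply_access is checked once the response headers have arrived.
http_reply_access deny blocked_reply
http_reply_access allow all
```

Unlike url_regex or a redirector, this catches downloads regardless of how
the URL is disguised, because it inspects the response rather than the request.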

Regards
Henrik