Hi,

I've run into a strange problem with Squid. It had been running fine for quite
some time, but then it became extremely slow and choked up the bandwidth. On
further analysis I saw a very high number of file descriptors in use, which
normally doesn't go beyond 100, as it is not a very heavily loaded server. All
the requests were for either www.mrdouble.com or www.geocities.com. It seems
that Squid is stuck in a loop of some kind, recursively sending requests for
these sites.
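For what it's worth, here is how I checked the descriptor count on Linux (a
rough sketch; the PID lookup via pidof assumes a single squid process, adjust
as needed):

```shell
# Count open file descriptors for the squid process via the /proc filesystem.
# Falls back to the current shell's PID if squid isn't running, so the
# command still demonstrates the technique.
pid=$(pidof squid || echo $$)
ls /proc/"$pid"/fd | wc -l
```

On this box the count is normally well under 100, so seeing it spike is what
prompted the closer look at the access log.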

What is the solution to this problem? Has anyone experienced this? I am
using Squid 2.0 and Squid 1.1.22 on Linux 2.0.34 (Pentium 200 MMX).

Thanks,

Irfan Akber
