Olivier Nicole wrote:

2) As there are many connections coming from search engine spiders
  (90% of all established connections), I'd like to limit the
  resources that spiders are using. One way would be through IPFW,
  but are there better ways? Is there a way to limit/prioritize in
  Apache (not that I know of any)?
Google "robots.txt" — it ought to limit what the spiders look at (but it consequently reduces what they index, as well).
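As a sketch (the paths here are hypothetical — substitute whatever directories are actually expensive on your server), a robots.txt placed at the document root might look like:

```
# Applies to all well-behaved crawlers.
User-agent: *
# Hypothetical paths: keep spiders out of expensive dynamic content.
Disallow: /cgi-bin/
Disallow: /search/
# Non-standard directive; some crawlers honor a per-request delay in
# seconds, others (Google among them) ignore it entirely.
Crawl-delay: 10
```

Note that robots.txt is purely advisory — badly behaved spiders will ignore it, at which point IPFW or blocking by User-Agent in Apache is your fallback.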

Overall, though, your problem sounds more like a piece of software bloating as it runs; the longer it runs the more memory it consumes.

Does the machine end up swapping?  Try tracking memory usage.
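A hypothetical sketch of what I mean by tracking memory usage (not anything from your setup — adjust the process name as needed): take a timestamped snapshot of each Apache child's memory from cron every few minutes and compare. Steadily climbing RSS between restarts points at the bloating-as-it-runs problem.

```shell
#!/bin/sh
# Print a timestamped snapshot of per-process memory for httpd.
# RSS = resident size, VSZ = virtual size, both in kilobytes.
date
# The [h]ttpd bracket trick keeps this pipeline from matching itself
# in the process list; NR==1 preserves the header row.
ps -axo pid,rss,vsz,command | awk 'NR==1 || /[h]ttpd/'
```

If the children do turn out to grow without bound, setting MaxRequestsPerChild in httpd.conf to something finite will recycle each child after N requests and cap the growth, at the cost of occasional fork overhead.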

--Alex

_______________________________________________
freebsd-questions@freebsd.org mailing list
http://lists.freebsd.org/mailman/listinfo/freebsd-questions
To unsubscribe, send any mail to "[EMAIL PROTECTED]"