Re: Slow server

2006-07-03 Thread Olivier Nicole
  2) as there are many connections coming from search engine spiders
     (90% of all the established connections), I'd like to limit the
     resources that spiders are using. One way would be through IPFW,
     but are there better ways? Is there a way to limit/prioritize in
     Apache (not that I know any).

 Look up mod_security rules for Apache and mod_dosevasive. mod_evasive
 will help prevent the spiders from opening many pages at once.

Thanks for the idea. I looked at both. mod_evasive would be the one,
but it keeps traffic information on a per-web-site basis. The problem
is that I have hundreds of web sites and the spider accesses only one
page at a time, but one page from each web site...

OK, I have to dig into that further.
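
For reference, a minimal mod_evasive (mod_dosevasive) sketch showing the
per-page/per-site thresholds it exposes; the numbers are placeholders, not
recommendations, and the module/file names and LoadModule path are
assumptions from a typical port install:

    # httpd.conf (Apache 1.3); module path is an assumption
    LoadModule evasive_module libexec/apache/mod_evasive.so

    <IfModule mod_evasive.c>
        DOSHashTableSize    3097
        DOSPageCount        5     # same URI requested more than 5 times...
        DOSPageInterval     1     # ...within 1 second
        DOSSiteCount        50    # more than 50 requests from one client...
        DOSSiteInterval     1     # ...within 1 second
        DOSBlockingPeriod   60    # then block that IP for 60 seconds
    </IfModule>

Whether such thresholds would help here runs into the caveat above: a
spider that fetches only one page per site may never trip them.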

Thanks,

Olivier


Slow server

2006-06-30 Thread Olivier Nicole
Hi,

I am trying to deal with a server that is getting slower and slower.

The machine is based on an AMD Opteron(tm) Processor 244 with 4 GB of memory.

It is running MySQL and Apache 1.3, and serving about 400 web sites
written in PHP.

OK, the design of the PHP sites is certainly not the most efficient, but
as it stands the server cannot hold even 50 simultaneous HTTP connections.

I am wondering:

1) what optimizations I should look for in the system

2) as there are many connections coming from search engine spiders
   (90% of all the established connections), I'd like to limit the
   resources that spiders are using. One way would be through IPFW
   (a rough sketch follows below), but are there better ways? Is there
   a way to limit/prioritize in Apache (not that I know any).
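
For the IPFW route, a rough ipfw/dummynet sketch of rate-limiting crawler
traffic; the pipe bandwidth and the crawler netblock are placeholders, and
the kernel needs dummynet support (options DUMMYNET or the ipfw module):

    # 192.0.2.0/24 is a placeholder -- substitute the netblocks the spiders use
    ipfw pipe 1 config bw 128Kbit/s queue 10
    ipfw add 1000 pipe 1 tcp from 192.0.2.0/24 to me 80 in
    ipfw add 1001 pipe 1 tcp from me 80 to 192.0.2.0/24 out

This throttles the spiders' HTTP traffic to the pipe's bandwidth instead of
blocking it outright.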

Best regards,

Olivier


Re: Slow server

2006-06-30 Thread Alex Zbyslaw

Olivier Nicole wrote:


2) as there are many connections coming from search engine spiders
  (90% of all the established connections), I'd like to limit the
  resources that spiders are using. One way would be through IPFW,
  but are there better ways? Is there a way to limit/prioritize in
  Apache (not that I know any).

Google for robots.txt, which ought to limit what the spiders look at (but
consequently reduces what they index, as well).
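
A minimal robots.txt sketch along those lines; Crawl-delay is a
non-standard extension that some crawlers honour and others (notably
Googlebot) ignore, and the paths are placeholders:

    # one robots.txt per vhost document root; example paths only
    User-agent: *
    Crawl-delay: 10        # ask well-behaved crawlers to pause between fetches
    Disallow: /cgi-bin/
    Disallow: /search/

With hundreds of vhosts, a single shared file could be mapped into every
site with an Apache Alias rather than copied around.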


Overall, though, your problem sounds more like a piece of software
bloating as it runs; the longer it runs, the more memory it consumes.


Does the machine end up swapping?  Try tracking memory usage.
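
A few stock FreeBSD commands for that kind of tracking (a sketch; run them
periodically, e.g. from cron, and compare the numbers over time):

    swapinfo        # how much swap space is in use
    vmstat 5        # watch the "fre" column and the pi/po (page in/out) rates
    top -o res      # processes sorted by resident memory size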

--Alex






RE: Slow server

2006-06-30 Thread Tamouh H.
 
 
 Olivier Nicole wrote:

 2) as there are many connections coming from search engine spiders
    (90% of all the established connections), I'd like to limit the
    resources that spiders are using. One way would be through IPFW,
    but are there better ways? Is there a way to limit/prioritize in
    Apache (not that I know any).

Look up mod_security rules for Apache, and mod_dosevasive. mod_evasive
will help prevent the spiders from opening many pages at once.

mod_security has rules to detect some fake spiders and other bots and
block them from the get-go.

Both, though, will add a little bit of overhead to Apache.
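
For illustration, a sketch of a mod_security 1.x-style user-agent filter
(1.x was the syntax current alongside Apache 1.3); the patterns are
placeholders, not a vetted bad-bot list:

    <IfModule mod_security.c>
        SecFilterEngine On
        SecFilterDefaultAction "pass"
        # deny requests whose User-Agent matches the placeholder patterns
        SecFilterSelective HTTP_USER_AGENT "EmailSiphon|WebZIP|larbin" "deny,log,status:403"
    </IfModule>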
