>>>>> "s" == shane  <[EMAIL PROTECTED]> writes:


s> Okay, these are my thoughts, what do you think?

I just set the upper bound on the number of mod_perl processes so that
they'll use about 3/4 of RAM, based on the size of a typical httpd with
everything loaded, and then I set the maximum number of front-ends to
use something like 1/2 of RAM, based on their sizes.  Then I watch the
status of both, see how many of each we typically have running, and
note what the max of each has been.  If I top out on mod_perl processes
and that lasts a while, then I've got to adjust (i.e., add RAM).  I've
yet to get near my max front-end process limit.
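In case it helps, here's a rough sketch in Perl of that back-of-the-envelope
arithmetic.  The RAM figure, the per-process sizes, and the 3/4 and 1/2
fractions are just example numbers; plug in whatever you actually measure
on your own box (top or ps will show you the process sizes).

#!/usr/bin/perl -w
# Rough sketch of the sizing arithmetic above.  All the numbers here
# (RAM, per-process sizes, the 3/4 and 1/2 fractions) are examples;
# substitute what you actually observe on your own machine.
use strict;

my $ram_mb           = 512;   # RAM your processes are allowed to use
my $modperl_size_mb  = 12;    # typical fully loaded mod_perl httpd
my $frontend_size_mb = 1.5;   # typical lightweight front-end httpd

# cap mod_perl back-ends at ~3/4 of RAM, front-ends at ~1/2 of RAM
my $max_backends  = int( $ram_mb * 0.75 / $modperl_size_mb );
my $max_frontends = int( $ram_mb * 0.50 / $frontend_size_mb );

printf "MaxClients for mod_perl back-end: %d\n", $max_backends;
printf "MaxClients for front-end proxy:   %d\n", $max_frontends;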

This is pretty much the only way you can do it.  No amount of theory is
going to predict how the system will behave.  You have to observe and
adjust.  It just makes no sense to allow mod_perl to use more than the
physical amount of RAM on your system, though.
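For the "watch the status of both" part, something along these lines will
poll mod_status's machine-readable output on each server and report how
many children are busy.  It assumes /server-status is enabled on both the
front-end and the mod_perl server; the URLs and ports are placeholders,
so adjust them for your own setup.

#!/usr/bin/perl -w
# Sketch of the "observe" step: ask mod_status how many children are
# busy/idle on each server.  Assumes /server-status is enabled; the
# URLs and ports below are placeholders.
use strict;
use LWP::Simple qw(get);

for my $url ('http://localhost/server-status?auto',        # front-end
             'http://localhost:8080/server-status?auto') { # mod_perl back-end
    my $status = get($url);
    next unless defined $status;
    # Apache 1.3 reports BusyServers/IdleServers; 2.x uses BusyWorkers/IdleWorkers
    my ($busy) = $status =~ /Busy(?:Servers|Workers):\s*(\d+)/;
    my ($idle) = $status =~ /Idle(?:Servers|Workers):\s*(\d+)/;
    printf "%-45s busy=%s idle=%s\n", $url,
           defined $busy ? $busy : '?',
           defined $idle ? $idle : '?';
}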

(When I say physical RAM, I mean the amount your system allows your
group of processes to use.  This may differ from the actual amount of
RAM in the hardware.)

-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Vivek Khera, Ph.D.                Khera Communications, Inc.
Internet: [EMAIL PROTECTED]       Rockville, MD       +1-301-545-6996
PGP & MIME spoken here            http://www.kciLink.com/home/khera/
