We're using ithreads with mod_perl to run some complicated robots
concurrently (in a commercial environment), but there is an issue:
huge memory usage. Each ithread uses about 10 MB of RAM (an image of
Apache, an image of mod_perl, and an image of our deep-link robot), and
since we use 5 ithreads plus the original thread, each Apache process
uses about 60 MB. Because we trade on being the best and fastest at what
we do, we need to keep plenty of Apaches ready and waiting (about 20),
so we're using heaps of memory.
Does anybody know how we can reduce the amount of memory we use? Is
there some smart way to actually share the images? Up to now the problem
has been manageable because we simply paid to put 2 GB on board, but we
had nearly 1000 users yesterday. They spend about 10 minutes filling in
the forms (car insurance quote stuff), then the robots go out and do the
deep-linking, which can take up to 3 minutes (and usually does -- the
sites we deep-link into can be quite slow, and we make 10-50 requests
into each site), during which time each Apache is handling a single
request!
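On the image-sharing question: the standard trick in a pre-fork setup is to preload all the heavy modules in the parent from a startup.pl, so the compiled code is shared copy-on-write by every forked child rather than paid for per process. A minimal sketch -- the robot/parser module names here are placeholders for whatever your code actually uses:

```perl
# startup.pl -- pulled in from httpd.conf with:  PerlRequire /path/to/startup.pl
# Everything compiled here lives in the parent before the fork, so the
# children share those pages copy-on-write instead of each owning a copy.
use strict;

use LWP::UserAgent ();      # the robots' HTTP client
use HTML::TokeParser ();    # whatever parsing modules the robots need
use DBI ();
DBI->install_driver("mysql");   # preload the DBD driver too (driver name is a guess)

# Preload your own application code as well -- "SuperQuote::Robot" is
# a made-up name standing in for the real robot module.
use SuperQuote::Robot ();

1;
```

One caveat: this sharing is between forked children; ithreads still clone the interpreter's data per thread, so preloading mainly stops each child re-paying the compile cost, not the per-thread copies.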
We are simply running out of memory, which is sad because we are nowhere
near running out of processor, and it grieves me to simply buy a bigger
server when it seems that something smarter could solve the problem.
Other than that, things are working very nicely and the site serves
quickly and reliably -- see www.superquote.com (despite the warnings
that ithreads are not yet safe for commercial use).
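Since the robots are I/O-bound, one "smarter" angle worth considering is to take the 3-minute deep-link run out of the Apache child entirely: the handler just drops a job into a spool directory and returns, and a small standalone daemon -- the only process that needs the robot code loaded -- does the slow fetching. This is only an architectural sketch; the paths, job format, and function name below are all made up:

```perl
# In the mod_perl handler: queue the job and free the Apache child
# immediately instead of holding it for up to 3 minutes.
# /var/spool/robots and the file format are illustrative.
use strict;
use Fcntl qw(O_WRONLY O_CREAT O_EXCL);

sub queue_robot_job {
    my ($quote_id, $params) = @_;
    my $file = "/var/spool/robots/$quote_id.$$";
    sysopen(my $fh, "$file.tmp", O_WRONLY | O_CREAT | O_EXCL, 0644)
        or die "can't create job file: $!";
    print $fh map { "$_=$params->{$_}\n" } keys %$params;
    close $fh;
    rename "$file.tmp", $file;   # atomic hand-off to the robot daemon
    return;                      # child is free for the next request
}
```

The browser would then poll a lightweight status handler until the quotes land, instead of one 60 MB process being pinned per user while the remote sites dawdle.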
We're on Linux 2.4.20 with Apache 1.3.28 and mod_perl 1.28.
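For what it's worth, on Apache 1.3 / mod_perl 1.x you can also cap the damage with Apache::SizeLimit, so any child that balloons gets recycled, and let the spare-server pool breathe instead of pinning a fixed count. The numbers below are guesses to illustrate the knobs, not recommendations:

```perl
# startup.pl -- recycle any child that grows past ~64 MB (value is a guess)
use Apache::SizeLimit ();
$Apache::SizeLimit::MAX_PROCESS_SIZE = 65536;   # in KB
```

```
# httpd.conf
PerlFixupHandler Apache::SizeLimit
MinSpareServers     10
MaxSpareServers     20
MaxClients          40
MaxRequestsPerChild 500
```

This doesn't shrink the per-thread footprint, but it keeps one leaky or bloated child from quietly eating the 2 GB.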
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html