FWIW, I used to run into issues like this a lot under mod_perl. The
Apache process would lay claim to all the memory it ever used until a
restart (or until the child hit its request limit and was recycled).

I used a few workarounds when I needed large data processing:
- I called an external process and collected the results.
- I set a semaphore in the database or memcached that the webapp
would poll for results.
- If I needed the process to run on the same codebase, I'd run another
instance of Apache on an alternate port and proxy only the 'large
requests' to it.
- Eventually I'd refactor the code into an SOA setup and have a
dedicated daemon handle the large stuff.
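The first workaround is the quickest to wire up. A minimal sketch in Python (the helper name and the inline child script are mine, just to illustrate the shape): the heavy allocation happens in a short-lived child process, so its memory goes back to the OS when it exits instead of staying pinned in the long-lived worker.

```python
import json
import subprocess
import sys

def run_big_job(payload):
    """Run the memory-heavy work in a throwaway child process and
    collect the result from its stdout (hypothetical example job:
    the child just counts the items it was handed)."""
    proc = subprocess.run(
        [sys.executable, "-c",
         "import json, sys; "
         "data = json.load(sys.stdin); "   # the big data lives only in the child
         "print(json.dumps({'n': len(data)}))"],
        input=json.dumps(payload),
        capture_output=True, text=True, check=True,
    )
    return json.loads(proc.stdout)
```

In practice the `-c` one-liner would be a real script, but the design point is the same: the parent only ever holds the (small) serialized result, never the working set.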

-- 
You received this message because you are subscribed to the Google Groups 
"pylons-discuss" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/pylons-discuss?hl=en.
