I have to do some resource-intensive tasks on data from the web, and
would like the cgi to hand off the data to some long-running script
that manages a queue and doesn't fork, but hands out items in the queue
to a (third?) script that does the actual processing. This last script
shouldn't fork either, as I want to keep some resources free for other stuff...
I've been thinking of writing one (or perhaps two) daemons to accomplish
this, but there might be better ways... POE?
Any suggestions and/or comments are welcome
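To make the hand-off scheme concrete, here is a minimal sketch of the idea (in Python rather than Perl, purely for illustration; all names here are made up): a "queue daemon" listens on a Unix-domain socket, the CGI side just connects, writes the job, and exits, and a single worker thread drains the queue one item at a time, so nothing ever forks.

```python
import socket
import threading
import queue
import tempfile

def run_daemon(sock_path, handler, done):
    # Long-running queue manager: accepts jobs over a Unix socket and
    # feeds them, strictly one at a time, to a single worker thread.
    jobs = queue.Queue()
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(sock_path)
    srv.listen(5)

    def accept_loop():
        # Each client connection carries exactly one job payload.
        while True:
            conn, _ = srv.accept()
            data = conn.recv(4096)
            conn.close()
            if data == b"__quit__":      # hypothetical shutdown sentinel
                jobs.put(None)
                return
            jobs.put(data.decode())

    def worker_loop():
        # One job at a time: no fork, bounded resource use.
        while True:
            job = jobs.get()
            if job is None:
                done.set()
                return
            handler(job)

    threading.Thread(target=accept_loop, daemon=True).start()
    threading.Thread(target=worker_loop, daemon=True).start()

def submit(sock_path, payload):
    # What the CGI would do: fire and forget, then return to the browser.
    c = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    c.connect(sock_path)
    c.sendall(payload.encode())
    c.close()
```

In Perl the same shape falls out naturally from `IO::Socket::UNIX` plus a select loop, which is essentially what POE would manage for you.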
- Christian Huldt