On 9-Aug-06, at 4:01 PM, [EMAIL PROTECTED] wrote:
>> So rather than answer this, I'll throw the question back to you: how
>> do you currently cope with a system that expects 100 concurrent users
>> requesting pages that take 5 minutes to generate?
> The only similar thing I have is a daemon that sits and blocks on a
> pipe, waiting for requests from a process that monitors directories.
> So that I don't lose requests (assuming I even could; I'm not sure
> it's possible), as soon as a request comes in I fork and let the
> child handle it without waiting for it, while the parent goes back
> to listening on the pipe.
> When there are a lot of requests it slows the system down
> somewhat :) I wrote it about three years ago when I was learning
> Perl, so I'm more than happy to explore better methods than perlIO
> teaches.
> I presume the way to go would be to have the Danga stuff handle the
> request in a listener and have it pass the request on to one of N
> other daemons, much as Apache does now. Probably.
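The fork-per-request pattern described above can be sketched roughly like this. It's a minimal illustration in Python (the original daemon is Perl and its code isn't shown here), and the names `serve`, `handle`, and the `QUIT` sentinel are all invented for the example:

```python
import os

def serve(read_fd, handle):
    """Block on the pipe; fork one child per request line so the
    parent can go straight back to reading the pipe."""
    with os.fdopen(read_fd) as pipe:
        for line in pipe:
            req = line.strip()
            if req == "QUIT":        # made-up shutdown sentinel
                break
            pid = os.fork()
            if pid == 0:
                handle(req)          # child: do the slow work
                os._exit(0)          # exit without flushing parent buffers
            # parent: deliberately do NOT wait for the child
    # reap finished children so they don't linger as zombies
    while True:
        try:
            os.waitpid(-1, 0)
        except ChildProcessError:
            break
```

The key point is that the parent never waits between requests, so one slow child never blocks the pipe; the cost is a whole process per in-flight request, which is exactly what hurts when there are a lot of them.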
The best way to handle this, then, is to integrate your current daemon, or
re-write one end of that pipe, so that Danga::Socket handles the
results rather than your current setup. This is fairly easy to do and
I'd be happy to show you.
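The shape of what's being suggested, i.e. driving the result end of the pipe from an event loop instead of a blocking read, can be sketched with Python's stdlib `selectors` module. This is only a stand-in: Danga::Socket is Perl and its actual API (subclassing, `EventLoop`) differs, and `result_loop`/`on_result` are invented names:

```python
import os
import selectors

def result_loop(read_fd, on_result):
    """Register the pipe with a selector rather than blocking on it;
    fire a callback for each newline-terminated result."""
    sel = selectors.DefaultSelector()
    os.set_blocking(read_fd, False)
    buf = b""

    def readable(fd):
        nonlocal buf
        chunk = os.read(fd, 4096)
        if not chunk:                # writer closed: stop the loop
            return False
        buf += chunk
        while b"\n" in buf:          # split out complete results
            line, buf = buf.split(b"\n", 1)
            on_result(line.decode())
        return True

    sel.register(read_fd, selectors.EVENT_READ, readable)
    running = True
    while running:
        for key, _ in sel.select():
            if not key.data(key.fd):
                running = False
    sel.close()
    os.close(read_fd)
```

Because the loop only wakes when the fd is actually readable, the same thread can watch the listener socket and the worker pipe at once, which is the integration Danga::Socket would give you.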
Matt.
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]