I'm building an application that uses a bi-directional protocol over persistent sockets. (It is a "near-real-time" application, not so time-critical as to be truly real-time.) The application services simultaneous requests, but from a finite, limited set of clients (typically 50 or fewer). I would like each client to have its own worker process, because each client's data will sometimes require extensive processing, and the other clients shouldn't lose responsiveness when that happens.

Now, the question. Each worker process has to load a bunch of setup information from files when it starts up. In a traditional fork() model, you would load all of that before the fork and each child would get a copy. Any advice on how to work within Node's constraints while speeding that loading up? It seems wasteful to have each process do all that file reading. (PS: The files being loaded are all JavaScript, and there may be thousands of them.)
