Node.js, being an evented server, handles many concurrent connections more 
efficiently than a multi-threaded or multi-process server, which incurs 
per-thread or per-process overhead for each and every connection.

Node.js uses Chrome's V8 JavaScript engine, which by default is configured 
to use about 1GB of memory. I believe that limit can be increased in recent 
versions of node.js, which should include the V8 r9823 fix allowing the 
limit to be raised beyond 2GB on a 64-bit server (see the issue for details 
on how to give V8 more memory): 
http://code.google.com/p/v8/issues/detail?id=847  However, before 
increasing the memory you might want to consider running a cluster of node 
processes on your server; see the discussion below.


As to why your streaming might be slow, there are many things to consider:


 - Have you pre-compressed the video so that you are not serving more data 
than necessary? Choose a good codec (WebM, H.264, etc.) to keep as much 
quality as possible for the amount of data to serve. The more compression 
you use, the lower the load on your servers, bandwidth, etc.
 - Are you properly serving the data as streams and not holding too much in 
memory? (i.e. not reading in all the data, but reading it as a stream)
 - Are all users receiving the same data (a live stream), such that you can 
read it once and send it to all connected users? (rather than re-reading 
the same data for each connection)
 - Do you have sufficient bandwidth between your server and your users? And 
between your media storage and the server?
 - Have you matched the size of the chunks you send to what your devices 
can handle? The greater the mismatch in chunk size, the more buffering the 
server does and thus the more memory it uses. Is back pressure being 
applied all the way back up the chain in your code, so that you pause reads 
while waiting for data to be consumed?
 - If you are going to be scaling up to lots of users that are not 
receiving the same stream, then you might want to consider a node cluster: 
multiple node instances (managed by the cluster module) on a multi-core 
server with lots of memory. That way the load is distributed over many 
processes, each using its own heap of memory.
 - If you outgrow a node cluster on a single server, move to a 
load-balanced multi-server arrangement; you can continue to scale up, 
assuming you are not bottlenecked elsewhere (bandwidth, media server, 
database, etc.)


So those are some of the things to consider when trying to build a scalable 
streaming media node.js server.

If you consider all of these when building your system, node.js can be 
very scalable. Properly architected and coded, it should use less memory 
than a non-evented architecture. If you are serving the same stream to 
multiple clients (a live stream), it becomes even more efficient, since 
you can read once and send the same data to all.

Companies like Voxer and Transloadit use node.js to build massively 
scalable systems, so they are good examples of what can be done given the 
right architecture.



-- 
Job Board: http://jobs.nodejs.org/
Posting guidelines: 
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en
