Assuming you're using something like TLS/SSL for encryption, you may want 
to use a separate SSL terminator. This keeps the encryption/decryption 
logic out of your game server, which is good for security (if the server 
has a vulnerability, your private keys won't be exposed) and good for 
performance (when I did heavy benchmarking back on Node 0.8 this was 
critical, though I've been told Node 0.11 has much improved performance 
in this regard). I've posted a blog post about this [1].

At Cloud Party, we built a real-time virtual world with servers running on 
Node.js using our experience from developing successful MMORPGs at other 
companies.  You definitely cannot use the approach you described (any time 
any user does anything, send a packet to all other users) - the overhead of 
sending packets is pretty high.  A more tenable approach is to have a 
server tick rate (something like 15 or 30 fps is common - it's pointless to 
have the server tick rate higher than the client's FPS or refresh rate 
anyway) and send at most 1 packet to each connected client each tick, 
combining all updates relevant to that client into one packet.  You're 
certainly not going to get Node.js (or, in my experience, any native 
program) sending 200,000 unique packets per second while also running your 
game logic.
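The tick-and-batch approach can be sketched in a few lines - this is an assumed shape, not our actual code; `client.socket.write` stands in for however you really send bytes, and real code would use binary rather than JSON:

```javascript
// Per-tick update batching: queue updates, flush at most one packet per
// client per tick.
var TICK_MS = 1000 / 30;  // 30 ticks per second

var clients = [];  // each entry: { socket: ..., pending: [] }

// Instead of sending immediately, queue the update for the next tick.
function queueUpdate(client, update) {
  client.pending.push(update);
}

// Once per tick, combine everything queued for each client into a single
// packet, so each client receives at most one packet per tick.
function flushTick() {
  clients.forEach(function (client) {
    if (client.pending.length === 0) return;
    var packet = JSON.stringify(client.pending);
    client.socket.write(packet);
    client.pending = [];
  });
}

setInterval(flushTick, TICK_MS).unref();
```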

As for serializing, we built our system before Node's Buffer module had the 
.writeUInt32LE/etc API, and we built a highly optimized API that looks 
incredibly similar (although with a much more efficient writeFloat - the 
Buffer API uses a native call for writeFloat).  Using that API for 
serializing should be sufficiently efficient.  If you're doing a lot of 
serializing, make sure to pass true for the "noAssert" parameter when not 
running in development; the assertions add a lot of overhead (though 
they're invaluable when debugging).
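For example, a small serialization round trip with the Buffer API of that era (the noAssert argument was later removed from Node; the field layout here is just made up for illustration):

```javascript
// Fixed-layout binary serialization with the built-in Buffer API.
// Passing true as the third ("noAssert") argument skips range/offset
// validation; leave it false in development so bad writes are caught.
var noAssert = process.env.NODE_ENV === 'production';

var buf = new Buffer(12);
buf.writeUInt32LE(42, 0, noAssert);    // entity id
buf.writeFloatLE(1.5, 4, noAssert);    // x position
buf.writeFloatLE(-2.25, 8, noAssert);  // y position

// Reading back on the other side, same layout:
var id = buf.readUInt32LE(0, noAssert);
var x  = buf.readFloatLE(4, noAssert);
var y  = buf.readFloatLE(8, noAssert);
```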

[1] 
http://jimbesser.wordpress.com/2014/08/13/efficient-load-balancing-and-ssl-termination-for-websockets-and-node-js/

On Monday, August 11, 2014 8:42:51 PM UTC-7, Majid Arif Siddiqui wrote:
>
> When you look for resources on *Node*, most of them are centered on the 
> *web* (*http*). But *Node* is more than the *web*, so I wish there were 
> more focus on the other aspects of *Node's* networking capabilities 
> beyond *http*.
>
> When working on a *Game Server* you need high concurrency and also some 
> calculations. Some of the important steps in game server development are:
>
>
>    - Receive
>    - Frame
>    - Decrypt (because it's always encrypted from the client)
>    - Deserialise (we still use binary when passing data NOT JSON)
>    - Handle
>    - Encrypt
>    - Serialise
>    - Send
>
> The list above is what happens for every incoming request from the client, 
> and that is just the tip. Imagine 200 concurrent players in the same area: 
> when 1 player moves, the update gets sent to all other 199 players. The 
> server receives one packet and sends out 199 packets. If all players walk 5 
> steps, 5 steps = 5 packets, 5 packets * 200 players = *1,000* incoming 
> packets, and 1,000 * 200 = *200,000* outgoing packets. This can all happen 
> in 1 second, and it should still feel smooth from each player's perspective.
>
> Questions:
>
>
>    - Is it true that one should keep callbacks/functions short to avoid 
>    blocking the event loop for too long? How short should they be?
>    - When decrypting/deserialising/encrypting buffers (CPU-heavy custom 
>    routines), should I use a C++ add-on? Or should I use pure JavaScript 
>    and pass each task to a child process?
>    - Serialising is like converting a JSON structure into a buffer - 
>    should I use the Node Buffer, an array, or a typed array?
>
> Currently I am thinking of doing something like this to convert JSON to a 
> buffer.
>
> var obj = {
>   id: 1,
>   name: "obj"
> };
>
> var arr = [];
> for (var k in obj) {
>   var v = obj[k];
>   if (typeof v === 'string') {
>     // push each character's code, one byte per character
>     for (var i = 0; i < v.length; i++) {
>       arr.push(v.charCodeAt(i));
>     }
>   } else {
>     arr.push(v);
>   }
> }
>
> var buf = new Buffer(arr);
>
>  
> This way I won't have to define the buffer length first. But is this 
> proper? Of course I left out the part about doing stuff like:
>
> serialize.uint16le = function(val) {
>   var binary = [];
>   binary[0] = val & 0xff;
>   binary[1] = (val >> 8) & 0xff;
>   return binary;
> };
>
> deserialize.uint16le = function(pointer, buffer) {
>   var binary = buffer[pointer.offset++];
>   binary |= buffer[pointer.offset++] << 8;
>   binary >>>= 0;
>   return binary;
> };
>
>
> But how will this fare performance-wise against just using the built-in 
> buffer methods? 
>
> In *MMOG*s, performance is one of the make-or-break factors. Slow or 
> laggy games will make the players lose interest. I'm talking about heavy 
> lag because the server is getting slower, not because of network problems.
>
> Cheers,
> Maj
>
> PS. I'm just a *MMOG* server hobbyist.
>

-- 
You received this message because you are subscribed to the Google Groups 
"nodejs" group.